Science.gov

Sample records for quantitative temporal analysis

  1. Temporal Kinetics and Quantitative Analysis of Cryptococcus neoformans Nonlytic Exocytosis

    PubMed Central

    Stukes, Sabriya A.; Cohen, Hillel W.

    2014-01-01

    Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

  2. Microscopy environment for quantitative spatial and temporal analysis of multicellular interactions

    NASA Astrophysics Data System (ADS)

    Sudar, Damir; Parvin, Bahram; Callahan, Daniel E.; Schwarz, Richard I.; Knowles, David W.; Ortiz de Solorzano, Carlos; Barcellos-Hoff, Mary H.

    2002-05-01

Quantitative analysis of spatial and temporal concurrent responses of multiple markers in 3-dimensional cell cultures is hampered by the routine mode of sequential image acquisition, measurement and analysis of specific targets. A system was developed for detailed analysis of multi-dimensional, time-sequence responses and for relating features in novel and meaningful ways that will further our understanding of basic biology. Optical sectioning of the 3-dimensional structures is achieved with structured light illumination using the Wilson grating as described by Lanni. The automated microscopy system can image multicellular structures and track dynamic events, and is equipped for simultaneous/sequential imaging of multiple fluorescent markers. Computer-controlled perfusion of external stimuli into the culture system allows (i) real-time observations of multiple cellular responses and (ii) automatic and intelligent adjustment of experimental parameters. This creates a real-time feedback loop that directs desired responses in a given experiment. On-line image analysis routines provide cell-by-cell measurement results through segmentation and feature extraction (i.e., intensity, localization, etc.), and quantitation of meta-features such as dynamic responses of cells or correlations between different cells. Off-line image and data analysis is used to derive models of the processes involved, which will deepen the understanding of the basic biology.

  3. Temporal Evolution of Submarine Channel Trajectory and Mobility: Quantitative Analysis and Comparison to Rivers

    NASA Astrophysics Data System (ADS)

    Auchter, N.; Jobe, Z. R.; Howes, N. C.

    2015-12-01

Submarine channels are the primary conduits for the transport of clastic detritus from the continents into the deep sea. During their evolution, these channels migrate, forming channel belts that record a complex history of vertical degradation, lateral migration, and vertical aggradation. Previous work drawing on seismic, outcrop, and modeled examples of submarine channel systems has qualitatively observed a significant component of vertical aggradation and a temporal trend transitioning from a phase dominated by lateral migration to one dominated by vertical aggradation. The processes that drive these changes in channel trajectory and their stratigraphic consequences remain poorly constrained. Furthermore, comparisons/contrasts between submarine channels and rivers often do not take into account the variability in channel trajectory/mobility, which clearly affects the resultant architecture. This study compiles a global dataset of >320 channel trajectory and geometry measurements derived from a global suite of 2-D cross sections including 21 submarine channel systems and 13 fluvial systems. These data enable the quantitative treatment of the temporal evolution of channel trajectory/mobility and highlight key differences between submarine channels and rivers. Submarine channels show a strong trend of decreasing channel mobility during their evolution, resulting in early stage lateral migration and late stage aggradation. Fluvial channels show a similar evolution, but with ~5x less aggradation and ~2x larger mobility values. Fluvial channel belts contain a large proportion of lateral accretion and have similar trajectories to early phase or degradational submarine channels, while late stage or aggradational submarine channel belts contain significantly more vertically stacked channel fill deposits. These submarine trajectory/mobility measurements provide a valuable quantitative basis for explaining the evolution of stratigraphic architecture of submarine

  4. Quantitative analysis

    PubMed Central

    Nevin, John A.

    1984-01-01

    Quantitative analysis permits the isolation of invariant relations in the study of behavior. The parameters of these relations can serve as higher-order dependent variables in more extensive analyses. These points are illustrated by reference to quantitative descriptions of performance maintained by concurrent schedules, multiple schedules, and signal-detection procedures. Such quantitative descriptions of empirical data may be derived from mathematical theories, which in turn can lead to novel empirical analyses so long as their terms refer to behavioral and environmental events. Thus, quantitative analysis is an integral aspect of the experimental analysis of behavior. PMID:16812400
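One classic quantitative description of concurrent-schedule performance of the kind this abstract refers to is the generalized matching law, log(B1/B2) = a·log(R1/R2) + log b. A minimal sketch with invented response and reinforcer ratios (the data and fitted values are illustrative only, not from the article):

```python
import numpy as np

# Hypothetical sketch of the generalized matching law,
#   log(B1/B2) = a * log(R1/R2) + log(b),
# fitted by linear regression in log-log coordinates. Data are invented.
response_ratios = np.array([0.25, 0.5, 1.0, 2.0, 4.0])    # B1/B2
reinforcer_ratios = np.array([0.2, 0.45, 1.0, 2.2, 5.0])  # R1/R2

a, log_b = np.polyfit(np.log(reinforcer_ratios), np.log(response_ratios), 1)
print(f"sensitivity a = {a:.2f}, bias b = {np.exp(log_b):.2f}")
```

The fitted sensitivity a and bias b are exactly the kind of higher-order dependent variables the abstract describes: parameters of an invariant relation that can themselves be analyzed across conditions.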

  5. Quantitative kinetic analysis of lung nodules by temporal subtraction technique in dynamic chest radiography with a flat panel detector

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Yuichiro; Kodera, Yoshie; Tanaka, Rie; Sanada, Shigeru

    2007-03-01

Early detection and treatment of lung cancer is one of the most effective means to reduce cancer mortality; chest X-ray radiography has been widely used as a screening examination or health checkup. A new examination method and the development of a computer analysis system allow respiratory kinetics to be obtained with a flat panel detector (FPD), an extension of chest X-ray radiography. Through such changes, functional evaluation of respiratory kinetics in the chest has become available, and its introduction into clinical practice is expected in the future. In this study, we developed a computer analysis algorithm for the purpose of detecting lung nodules and evaluating quantitative kinetics. Breathing chest radiographs obtained by a modified FPD were converted into four static feature images by sequential temporal subtraction processing, morphologic enhancement processing, kinetic visualization processing, and lung region detection processing, after a breath synchronization process utilizing diaphragmatic analysis of the vector movement. An artificial neural network used to analyze the density patterns detected the true nodules by analyzing these static images and drew their kinetic tracks. In an evaluation of algorithm performance and clinical effectiveness with 7 normal patients and simulated nodules, the method showed sufficient detection capability and kinetic imaging function, without statistically significant difference. Our technique can quantitatively evaluate the kinetic range of nodules and is effective in detecting a nodule on a breathing chest radiograph. Moreover, the application of this technique is expected to extend computer-aided diagnosis systems and facilitate the development of an automatic planning system for radiation therapy.
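The sequential temporal subtraction step can be illustrated with a rough sketch on synthetic frames (this is not the authors' algorithm; the frame sizes, noise levels, and the simulated "nodule" are invented):

```python
import numpy as np

# Synthetic sketch of sequential temporal subtraction: static anatomy
# cancels between consecutive frames, while a moving or appearing
# structure (here a fake "nodule") leaves a large difference.
rng = np.random.default_rng(0)
frames = rng.normal(100.0, 1.0, size=(5, 64, 64))  # 5-frame breathing sequence
frames[2:, 30:34, 30:34] += 50.0                   # "nodule" appears at frame 2

diffs = np.abs(np.diff(frames, axis=0))            # sequential subtraction
motion_map = diffs.max(axis=0)                     # largest change per pixel
peak = np.unravel_index(motion_map.argmax(), motion_map.shape)
print("largest temporal change near pixel:", peak)
```

In the study the subtraction images are only one of four feature images that feed the neural network; this sketch shows only why temporal subtraction highlights temporally changing structures.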

  6. Z-score-based semi-quantitative analysis of the volume of the temporal horn of the lateral ventricle on brain CT images.

    PubMed

    Takahashi, Noriyuki; Kinoshita, Toshibumi; Ohmura, Tomomi; Lee, Yongbum; Matsuyama, Eri; Toyoshima, Hideto; Tsai, Du-Yih

    2016-01-01

The volume of the temporal horn of the lateral ventricle (THLV) on brain computed tomography (CT) images is important for neurologic diagnosis. Our purpose in this study was to develop a z-score-based semi-quantitative analysis for estimation of the THLV volume by using voxel-based morphometry. The THLV volume was estimated by use of a z-score mapping method that consisted of four main steps: anatomic standardization, construction of a normal reference database, calculation of the z score, and calculation of the mean z score in a volume of interest (VOI). A mean z score of the CT value obtained from a VOI around the THLV was used as an index for the THLV volume. CT scans from 50 subjects were evaluated. For evaluation of the accuracy of this method for estimating the THLV volume, the THLV volume was determined manually by neuroradiologists (serving as the reference volume). A mean z score was calculated from the VOI for each THLV of the 50 subjects by use of the proposed method. The accuracy of this method was evaluated by use of the relationship between the mean z score and the reference volume. The quadratic polynomial regression equation demonstrated a statistically significant correlation between the mean z score and the reference volume of the THLV (R² = 0.94; P < 0.0001). In 92 of 100 THLVs (92%), the 95% prediction interval of the regional mean z score captured the reference volume of the THLV. The z-score-based semi-quantitative analysis has the potential to quantitatively estimate the THLV volume on CT images.
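The core of the four-step z-score mapping method lends itself to a compact numerical sketch. The following uses synthetic volumes; the sizes, attenuation values, and VOI location are invented, and the anatomic-standardization step is omitted:

```python
import numpy as np

# Synthetic sketch of z-score mapping: a voxel-wise normal reference
# (mean, SD) summarizes 50 normal volumes; a patient volume is converted
# to a z map and summarized by the mean z inside a VOI.
rng = np.random.default_rng(1)
normals = rng.normal(35.0, 2.0, size=(50, 16, 16, 16))  # normal CT volumes (HU)
mu, sigma = normals.mean(axis=0), normals.std(axis=0)

patient = rng.normal(35.0, 2.0, size=(16, 16, 16))
patient[4:8, 4:8, 4:8] -= 20.0     # enlarged CSF space: lower attenuation

z = (patient - mu) / sigma         # voxel-wise z map
mean_z = z[4:8, 4:8, 4:8].mean()   # mean z in the VOI
print(f"mean z in VOI: {mean_z:.1f}")
```

A strongly negative mean z in the VOI around the temporal horn indicates lower-than-normal attenuation there, which the study maps to THLV volume via the regression against the manually determined reference volumes.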

7. Quantitative temporal analysis of 99mTc p-isopropyl-iminodiacetic acid (PIPIDA) as a measure of hepatic function in health and disease

    SciTech Connect

    Joshi, S.N.; George, E.A.; Perrillo, R.P.

    1981-01-01

Excretory liver function was analyzed in 10 healthy volunteers and 28 subjects with acute or chronic liver injury following intravenous administration of technetium-99m (99mTc) p-isopropyl iminodiacetic acid. Hepatobiliary transit of this agent was quantitated at 5-min intervals for a total of 60 min. Indices of total liver activity, liver cell uptake, liver parenchymal clearance, and bile duct clearance of 99mTc p-isopropyl iminodiacetic acid were calculated from time-activity curves over the heart, liver, extrahepatic bile ducts, and gallbladder. Seven subjects with acute viral hepatitis, 15 with extrahepatic biliary obstruction, and 6 with intrahepatic cholestasis were evaluated. Compared with healthy volunteers, a significant (p < 0.0001) reduction in total liver activity and liver parenchymal clearance was demonstrated in all patient groups. Major resolution in all liver-derived indices, particularly total liver activity, occurred during convalescence from hepatitis and after biliary drainage. Nonmeasurable bile duct clearance always indicated a diagnosis of extrahepatic obstruction in cholestatic subjects, and this index normalized in subjects following biliary drainage. Whereas visual assessment of 99mTc p-isopropyl iminodiacetic acid scans provided limited but useful information about the functional status of the liver, quantitative temporal analysis proved to be a much more effective technique.
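The indices described above are derived from region-of-interest time-activity curves sampled every 5 min for 60 min. A toy sketch (the uptake-washout curve model and the clearance index below are invented simplifications, not the authors' formulas):

```python
import numpy as np

# Invented sketch: indices from a hepatic time-activity curve sampled
# every 5 min for 60 min, as in the abstract's acquisition protocol.
t = np.arange(0.0, 65.0, 5.0)                         # minutes
liver = 100 * (1 - np.exp(-t / 8)) * np.exp(-t / 40)  # uptake, then washout

peak_idx = liver.argmax()
clearance = (liver[peak_idx] - liver[-1]) / liver[peak_idx]  # drop from peak
print(f"peak at {t[peak_idx]:.0f} min, clearance index {clearance:.2f}")
```

Impaired parenchymal clearance would show up as a smaller fractional drop from peak; in the cholestatic patients of the study, the bile-duct curve additionally showed nonmeasurable clearance.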

  8. Effect of preservative addition on sensory and dynamic profile of Lucanian dry-sausages as assessed by quantitative descriptive analysis and temporal dominance of sensations.

    PubMed

    Braghieri, Ada; Piazzolla, Nicoletta; Galgano, Fernanda; Condelli, Nicola; De Rosa, Giuseppe; Napolitano, Fabio

    2016-12-01

The quantitative descriptive analysis (QDA) was combined with temporal dominance of sensations (TDS) to assess the sensory properties of Lucanian dry-sausages manufactured either with added nitrate, nitrite, and l-ascorbic acid (NS) or without (NNS). Both QDA and TDS differentiated the two groups of sausages. NNS products were perceived with higher intensity of hardness (P<0.05) and tended to be perceived with higher intensities of flavor (P<0.10), pepper (P<0.20), and oiliness (P<0.20), while being lower in chewiness (P<0.20). TDS showed that in all the sausages hardness was the first dominant attribute; then, in NNS products flavor remained dominant until the end of tasting, whereas in NS products oiliness prevailed. In conclusion, TDS showed that the perception of some textural parameters, such as oiliness, during mastication was more dominant in NS products, whereas using conventional QDA this attribute appeared higher in sausages manufactured without preservatives. Therefore, TDS provided additional information for the description and differentiation of Lucanian sausages. PMID:27486959

  9. Temporal mapping and analysis

    NASA Technical Reports Server (NTRS)

    O'Hara, Charles G. (Inventor); Shrestha, Bijay (Inventor); Vijayaraj, Veeraraghavan (Inventor); Mali, Preeti (Inventor)

    2011-01-01

A compositing process for selecting spatial data collected over a period of time, creating temporal data cubes from the spatial data, and processing and/or analyzing the data using temporal mapping algebra functions. In some embodiments, the processing includes creating a masked cube from the data cubes and computing a composite from the masked cube by using temporal mapping algebra.
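A minimal sketch of the masked-cube idea using NumPy masked arrays (the cube dimensions, the cloud mask, and the max-composite reduction are illustrative assumptions, not the patent's specification):

```python
import numpy as np

# Sketch of a temporal data cube: stack co-registered rasters over time,
# mask invalid observations, and reduce along the time axis ("temporal
# mapping algebra") to form a composite. All data here are synthetic.
rng = np.random.default_rng(2)
cube = rng.uniform(0.0, 1.0, size=(8, 32, 32))  # 8 dates x 32x32 pixels
cloud = rng.random(cube.shape) < 0.3            # invalid (e.g. cloudy) samples

masked_cube = np.ma.masked_array(cube, mask=cloud)
composite = masked_cube.max(axis=0)             # per-pixel best value over time
print("composite shape:", composite.shape)
```

Other temporal mapping algebra functions (mean, median, first-valid) are just different reductions along the time axis of the same masked cube.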

  10. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

According to regulations relating to implementation and use of risk analysis in the petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in the activities and carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company to meet rules and regulations and to assess and describe the environmental risk in a systematic manner. In the environmental risk analysis the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  11. Quantitative methods for stochastic high frequency spatio-temporal and non-linear analysis: Assessing health effects of exposure to extreme ambient temperature

    NASA Astrophysics Data System (ADS)

    Liss, Alexander

Extreme weather events, such as heat waves and cold spells, cause substantial excess mortality and morbidity in the vulnerable elderly population, and cost billions of dollars. The accurate and reliable assessment of adverse effects of extreme weather events on human health is crucial for environmental scientists, economists, and public health officials to ensure proper protection of vulnerable populations and efficient allocation of scarce resources. However, the methodology for the analysis of large national databases is yet to be developed. The overarching objective of this dissertation is to examine the effect of extreme weather on the elderly population of the Conterminous US (ConUS) with respect to seasonality in temperature in different climatic regions by utilizing heterogeneous data of high frequency and spatio-temporal resolution. To achieve these goals the author: 1) incorporated dissimilar stochastic high frequency big data streams and distinct data types into an integrated database for use in analytical and decision support frameworks; 2) created an automated climate regionalization system based on remote sensing and machine learning to define climate regions for the Conterminous US; 3) systematically surveyed the current state of the art and identified existing gaps in the scientific knowledge; 4) assessed the dose-response relationship of exposure to temperature extremes on human health in relatively homogeneous climate regions using different statistical models, such as parametric and non-parametric, contemporaneous and asynchronous, applied to the same data; 5) assessed seasonal peak timing and synchronization delay of the exposure and the disease within the framework of contemporaneous high frequency harmonic time series analysis and modification of the effect by the regional climate; 6) modeled, using a hyperbolic functional form, the non-linear properties of the effect of exposure to extreme temperature on human health. The proposed climate

  12. Quantitative Characterization of Tissue Microstructure with Temporal Diffusion Spectroscopy

    PubMed Central

    Xu, Junzhong; Does, Mark D.; Gore, John C.

    2009-01-01

    The signals recorded by diffusion-weighted magnetic resonance imaging (DWI) are dependent on the micro-structural properties of biological tissues, so it is possible to obtain quantitative structural information non-invasively from such measurements. Oscillating gradient spin echo (OGSE) methods have the ability to probe the behavior of water diffusion over different time scales and the potential to detect variations in intracellular structure. To assist in the interpretation of OGSE data, analytical expressions have been derived for diffusion-weighted signals with OGSE methods for restricted diffusion in some typical structures, including parallel planes, cylinders and spheres, using the theory of temporal diffusion spectroscopy. These analytical predictions have been confirmed with computer simulations. These expressions suggest how OGSE signals from biological tissues should be analyzed to characterize tissue microstructure, including how to estimate cell nuclear sizes. This approach provides a model to interpret diffusion data obtained from OGSE measurements that can be used for applications such as monitoring tumor response to treatment in vivo. PMID:19616979

  13. Temporal Data Set Reduction Based on D-Optimality for Quantitative FLIM-FRET Imaging

    PubMed Central

    Omer, Travis; Intes, Xavier; Hahn, Juergen

    2015-01-01

    Fluorescence lifetime imaging (FLIM) when paired with Förster resonance energy transfer (FLIM-FRET) enables the monitoring of nanoscale interactions in living biological samples. FLIM-FRET model-based estimation methods allow the quantitative retrieval of parameters such as the quenched (interacting) and unquenched (non-interacting) fractional populations of the donor fluorophore and/or the distance of the interactions. The quantitative accuracy of such model-based approaches is dependent on multiple factors such as signal-to-noise ratio and number of temporal points acquired when sampling the fluorescence decays. For high-throughput or in vivo applications of FLIM-FRET, it is desirable to acquire a limited number of temporal points for fast acquisition times. Yet, it is critical to acquire temporal data sets with sufficient information content to allow for accurate FLIM-FRET parameter estimation. Herein, an optimal experimental design approach based upon sensitivity analysis is presented in order to identify the time points that provide the best quantitative estimates of the parameters for a determined number of temporal sampling points. More specifically, the D-optimality criterion is employed to identify, within a sparse temporal data set, the set of time points leading to optimal estimations of the quenched fractional population of the donor fluorophore. Overall, a reduced set of 10 time points (compared to a typical complete set of 90 time points) was identified to have minimal impact on parameter estimation accuracy (≈5%), with in silico and in vivo experiment validations. This reduction of the number of needed time points by almost an order of magnitude allows the use of FLIM-FRET for certain high-throughput applications which would be infeasible if the entire number of time sampling points were used. PMID:26658308
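The D-optimality criterion selects the time-point subset that maximizes det(JᵀJ), where J is the sensitivity (Jacobian) matrix of the decay model with respect to the estimated parameters. A small sketch for a bi-exponential decay with assumed lifetimes and quenched fraction (all numerical values are invented, and the authors' model and search strategy may differ):

```python
import numpy as np
from itertools import combinations

# Invented sketch of D-optimal time-point selection: for the decay
#   F(t) = A * (f*exp(-t/tau_q) + (1-f)*exp(-t/tau_u)),
# choose the subset of candidate times maximizing det(J^T J), where J is
# the Jacobian with respect to (f, A). Lifetimes/fractions are assumed.
tau_q, tau_u = 0.5, 2.5            # quenched / unquenched lifetimes (ns)
f, A = 0.4, 1.0                    # quenched fraction, amplitude

t = np.linspace(0.1, 10.0, 15)     # candidate time points (ns)
dF_df = A * (np.exp(-t / tau_q) - np.exp(-t / tau_u))
dF_dA = f * np.exp(-t / tau_q) + (1 - f) * np.exp(-t / tau_u)
J = np.column_stack([dF_df, dF_dA])

best = max(combinations(range(len(t)), 4),
           key=lambda idx: np.linalg.det(J[list(idx)].T @ J[list(idx)]))
print("D-optimal time points (ns):", t[list(best)])
```

Exhaustive search is feasible here only because the candidate set is tiny; for the 90-point sets in the study a greedy or heuristic search over subsets would presumably be used instead.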

  14. Neurochemical phenotype of corticocortical connections in the macaque monkey: quantitative analysis of a subset of neurofilament protein-immunoreactive projection neurons in frontal, parietal, temporal, and cingulate cortices

    NASA Technical Reports Server (NTRS)

    Hof, P. R.; Nimchinsky, E. A.; Morrison, J. H.; Bloom, F. E. (Principal Investigator)

    1995-01-01

    The neurochemical characteristics of the neuronal subsets that furnish different types of corticocortical connections have been only partially determined. In recent years, several cytoskeletal proteins have emerged as reliable markers to distinguish subsets of pyramidal neurons in the cerebral cortex of primates. In particular, previous studies using an antibody to nonphosphorylated neurofilament protein (SMI-32) have revealed a consistent degree of regional and laminar specificity in the distribution of a subpopulation of pyramidal cells in the primate cerebral cortex. The density of neurofilament protein-immunoreactive neurons was shown to vary across corticocortical pathways in macaque monkeys. In the present study, we have used the antibody SMI-32 to examine further and to quantify the distribution of a subset of corticocortically projecting neurons in a series of long ipsilateral corticocortical pathways in comparison to short corticocortical, commissural, and limbic connections. The results demonstrate that the long association pathways interconnecting the frontal, parietal, and temporal neocortex have a high representation of neurofilament protein-enriched pyramidal neurons (45-90%), whereas short corticocortical, callosal, and limbic pathways are characterized by much lower numbers of such neurons (4-35%). These data suggest that different types of corticocortical connections have differential representation of highly specific neuronal subsets that share common neurochemical characteristics, thereby determining regional and laminar cortical patterns of morphological and molecular heterogeneity. These differences in neuronal neurochemical phenotype among corticocortical circuits may have considerable influence on cortical processing and may be directly related to the type of integrative function subserved by each cortical pathway. Finally, it is worth noting that neurofilament protein-immunoreactive neurons are dramatically affected in the course of

  15. Quantitative Temporal Viromics: An Approach to Investigate Host-Pathogen Interaction

    PubMed Central

    Weekes, Michael P.; Tomasec, Peter; Huttlin, Edward L.; Fielding, Ceri A.; Nusinow, David; Stanton, Richard J.; Wang, Eddie C.Y.; Aicheler, Rebecca; Murrell, Isa; Wilkinson, Gavin W.G.; Lehner, Paul J.; Gygi, Steven P.

    2014-01-01

A systematic quantitative analysis of temporal changes in host and viral proteins throughout the course of a productive infection could provide dynamic insights into virus-host interaction. We developed a proteomic technique called “quantitative temporal viromics” (QTV), which employs multiplexed tandem-mass-tag-based mass spectrometry. Human cytomegalovirus (HCMV) is not only an important pathogen but a paradigm of viral immune evasion. QTV detailed how HCMV orchestrates the expression of >8,000 cellular proteins, including 1,200 cell-surface proteins, to manipulate signaling pathways and counter intrinsic, innate, and adaptive immune defenses. QTV predicted natural killer and T cell ligands, as well as 29 viral proteins present at the cell surface, potential therapeutic targets. Temporal profiles of >80% of HCMV canonical genes and 14 noncanonical HCMV open reading frames were defined. QTV is a powerful method that can yield important insights into viral infection and is applicable to any virus with a robust in vitro model. PMID:24906157

  16. A Novel Method of Quantitative Anterior Chamber Depth Estimation Using Temporal Perpendicular Digital Photography

    PubMed Central

    Zamir, Ehud; Kong, George Y.X.; Kowalski, Tanya; Coote, Michael; Ang, Ghee Soon

    2016-01-01

Purpose We hypothesize that: (1) Anterior chamber depth (ACD) is correlated with the relative anteroposterior position of the pupillary image, as viewed from the temporal side. (2) Such a correlation may be used as a simple quantitative tool for estimation of ACD. Methods Two hundred sixty-six phakic eyes had lateral digital photographs taken from the temporal side, perpendicular to the visual axis, and underwent optical biometry (Nidek AL scanner). The relative anteroposterior position of the pupillary image was expressed using the ratio between: (1) lateral photographic temporal limbus to pupil distance (“E”) and (2) lateral photographic temporal limbus to cornea distance (“Z”). In the first chronological half of patients (Correlation Series), the E:Z ratio (EZR) was correlated with optical biometric ACD. The correlation equation was then used to predict ACD in the second half of patients (Prediction Series) and compared to their biometric ACD for agreement analysis. Results A strong linear correlation was found between EZR and ACD, R = −0.91, R² = 0.81. Bland-Altman analysis showed good agreement between predicted ACD using this method and the optical biometric ACD. The mean error was −0.013 mm (range −0.377 to 0.336 mm), standard deviation 0.166 mm. The 95% limits of agreement were ±0.33 mm. Conclusions Lateral digital photography and EZR calculation is a novel method to quantitatively estimate ACD, requiring minimal equipment and training. Translational Relevance The EZ ratio may be employed in screening for angle closure glaucoma. It may also be helpful in outpatient medical clinic settings, where doctors need to judge the safety of topical or systemic pupil-dilating medications versus their risk of triggering acute angle closure glaucoma. Similarly, non-ophthalmologists may use it to estimate the likelihood of acute angle closure glaucoma in emergency presentations. PMID:27540496
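The E:Z-ratio method reduces to a simple linear regression between EZR and biometric ACD. A sketch with fabricated measurements (the numbers below are not from the study; only the negative slope mirrors the negative correlation the authors report):

```python
import numpy as np

# Fabricated sketch: EZR = E/Z is regressed against biometric ACD on a
# "correlation series", then used to predict ACD for a new eye.
E = np.array([0.30, 0.45, 0.55, 0.70, 0.85])  # limbus-to-pupil distance
Z = np.array([1.00, 1.00, 1.00, 1.00, 1.00])  # limbus-to-cornea distance
ezr = E / Z
acd = np.array([3.6, 3.2, 3.0, 2.6, 2.2])     # biometric ACD (mm)

slope, intercept = np.polyfit(ezr, acd, 1)     # correlation series
predicted = slope * 0.50 + intercept           # a new eye with EZR = 0.50
print(f"predicted ACD for EZR = 0.50: {predicted:.2f} mm")
```

The study's agreement analysis then compares such predictions to biometric ACD on the held-out Prediction Series, Bland-Altman style.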

  17. Temporal Instabilities in Amblyopic Perception: A Quantitative Approach.

    PubMed

    Thiel, Aylin; Iftime, Adrian

    2016-04-01

The purpose of this study is to quantify the temporal characteristics of spatial misperceptions in human amblyopia. Twenty-two adult participants with strabismic, anisometropic, or mixed amblyopia were asked to describe their subjective percept of static geometrical patterns with different spatial frequencies and shapes, as seen with their non-dominant eye. We generated digital reconstructions of their perception (static images or movies) that were subsequently validated by the subjects in consecutive matching sessions. We calculated the Shannon entropy variation in time for each recorded movie, as a measure of temporal instability. Nineteen of the 22 subjects perceived temporal instabilities that can be broadly classified in two categories. We found that the average frequency of the perceived temporal instabilities is ∼1 Hz. Stimuli with higher spatial frequencies more often yielded temporally unstable percepts with higher frequencies. We suggest that the type and amount of temporal instabilities in amblyopic vision are correlated with the etiology and the spatial frequency of the stimulus.
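The entropy-variation measure can be sketched as follows: compute the Shannon entropy of each reconstructed frame and take its spread over time (synthetic frames; the bin count and frame sizes are arbitrary choices, not the authors' settings):

```python
import numpy as np

# Synthetic sketch of the instability measure: Shannon entropy per frame,
# with its spread over time distinguishing a static percept from a
# fluctuating one.
def shannon_entropy(frame, bins=32):
    hist, _ = np.histogram(frame, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(3)
stable_movie = [np.full((64, 64), 0.5) for _ in range(10)]
unstable_movie = [np.full((64, 64), 0.5) if i % 2 == 0 else rng.random((64, 64))
                  for i in range(10)]

stable_std = np.std([shannon_entropy(f) for f in stable_movie])
unstable_std = np.std([shannon_entropy(f) for f in unstable_movie])
print(f"entropy spread: stable={stable_std:.2f}, unstable={unstable_std:.2f}")
```

A static percept gives a flat entropy trace (zero spread); a percept that fluctuates frame to frame gives a large one, which is what the authors quantify on their validated reconstructions.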

  18. Sobering up: A Quantitative Review of Temporal Declines in Expectations

    ERIC Educational Resources Information Center

    Sweeny, Kate; Krizan, Zlatan

    2013-01-01

    Although people's outlook on the future tends to be characterized by hope and optimism, over time this outlook often becomes more dire. We review multiple theoretical accounts of this tendency to "sober up" as feedback about outcomes draws near, and we explicate factors critical to promoting these temporal declines in expectations. We then…

  19. Temporal Expression-based Analysis of Metabolism

    PubMed Central

    Segrè, Daniel

    2012-01-01

    Metabolic flux is frequently rerouted through cellular metabolism in response to dynamic changes in the intra- and extra-cellular environment. Capturing the mechanisms underlying these metabolic transitions in quantitative and predictive models is a prominent challenge in systems biology. Progress in this regard has been made by integrating high-throughput gene expression data into genome-scale stoichiometric models of metabolism. Here, we extend previous approaches to perform a Temporal Expression-based Analysis of Metabolism (TEAM). We apply TEAM to understanding the complex metabolic dynamics of the respiratorily versatile bacterium Shewanella oneidensis grown under aerobic, lactate-limited conditions. TEAM predicts temporal metabolic flux distributions using time-series gene expression data. Increased predictive power is achieved by supplementing these data with a large reference compendium of gene expression, which allows us to take into account the unique character of the distribution of expression of each individual gene. We further propose a straightforward method for studying the sensitivity of TEAM to changes in its fundamental free threshold parameter θ, and reveal that discrete zones of distinct metabolic behavior arise as this parameter is changed. By comparing the qualitative characteristics of these zones to additional experimental data, we are able to constrain the range of θ to a small, well-defined interval. In parallel, the sensitivity analysis reveals the inherently difficult nature of dynamic metabolic flux modeling: small errors early in the simulation propagate to relatively large changes later in the simulation. We expect that handling such “history-dependent” sensitivities will be a major challenge in the future development of dynamic metabolic-modeling techniques. PMID:23209390

  20. Temporal expression-based analysis of metabolism.

    PubMed

    Collins, Sara B; Reznik, Ed; Segrè, Daniel

    2012-01-01

    Metabolic flux is frequently rerouted through cellular metabolism in response to dynamic changes in the intra- and extra-cellular environment. Capturing the mechanisms underlying these metabolic transitions in quantitative and predictive models is a prominent challenge in systems biology. Progress in this regard has been made by integrating high-throughput gene expression data into genome-scale stoichiometric models of metabolism. Here, we extend previous approaches to perform a Temporal Expression-based Analysis of Metabolism (TEAM). We apply TEAM to understanding the complex metabolic dynamics of the respiratorily versatile bacterium Shewanella oneidensis grown under aerobic, lactate-limited conditions. TEAM predicts temporal metabolic flux distributions using time-series gene expression data. Increased predictive power is achieved by supplementing these data with a large reference compendium of gene expression, which allows us to take into account the unique character of the distribution of expression of each individual gene. We further propose a straightforward method for studying the sensitivity of TEAM to changes in its fundamental free threshold parameter θ, and reveal that discrete zones of distinct metabolic behavior arise as this parameter is changed. By comparing the qualitative characteristics of these zones to additional experimental data, we are able to constrain the range of θ to a small, well-defined interval. In parallel, the sensitivity analysis reveals the inherently difficult nature of dynamic metabolic flux modeling: small errors early in the simulation propagate to relatively large changes later in the simulation. We expect that handling such "history-dependent" sensitivities will be a major challenge in the future development of dynamic metabolic-modeling techniques. PMID:23209390
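A toy version of the expression-thresholding idea behind TEAM: a reaction's flux bound is opened only when its gene's expression, judged against that gene's own reference distribution, exceeds the free threshold θ. Gene names, bounds, and distributions below are invented; the actual method embeds this logic in a genome-scale stoichiometric flux model:

```python
import numpy as np

# Invented sketch of per-gene expression thresholding. Each gene is
# judged against its OWN reference distribution (the "unique character"
# of each gene's expression mentioned in the abstract), not a global cutoff.
theta = 0.8                                    # free percentile threshold
reference = {"geneA": np.linspace(1, 100, 100),
             "geneB": np.linspace(5, 500, 100)}
timepoint_expr = {"geneA": 95.0, "geneB": 20.0}

bounds = {}
for gene, expr in timepoint_expr.items():
    percentile = (reference[gene] < expr).mean()  # rank in own distribution
    bounds[gene] = 10.0 if percentile >= theta else 0.1  # open vs restricted
print(bounds)
```

The sensitivity analysis described in the abstract amounts to sweeping θ and observing that the predicted flux distributions fall into discrete zones of distinct metabolic behavior.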

  1. Quantitative transverse flow measurement using OCT speckle decorrelation analysis

    PubMed Central

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Mathews, Scott A.; Kang, Jin U.

    2014-01-01

    We propose an inter-Ascan speckle decorrelation based method that can quantitatively assess blood flow normal to the direction of the OCT imaging beam. To validate this method, we performed a systematic study using both phantom and in vivo animal models. Results show that our speckle analysis method can accurately extract transverse flow speed with high spatial and temporal resolution. PMID:23455305
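The underlying measurement, how quickly speckle decorrelates between successive A-scans, can be sketched as below. This is an illustrative toy assuming A-scans are available as rows of an intensity array; the paper's calibration from decorrelation to absolute transverse flow speed is not reproduced.

```python
import numpy as np

def inter_ascan_decorrelation(ascans):
    """Mean decorrelation (1 - Pearson r) between successive A-scans.

    ascans: 2D array, shape (n_ascans, depth); each row is one
    intensity A-scan.  For a fixed inter-scan interval, faster
    transverse flow decorrelates the speckle more, pushing the
    returned value toward 1; static tissue stays near 0.
    """
    ascans = np.asarray(ascans, dtype=float)
    rs = []
    for a, b in zip(ascans[:-1], ascans[1:]):
        rs.append(np.corrcoef(a, b)[0, 1])
    return 1.0 - float(np.mean(rs))
```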

  2. Upper limb automatisms differ quantitatively in temporal and frontal lobe epilepsies.

    PubMed

    Silva Cunha, João P; Rémi, Jan; Vollmar, Christian; Fernandes, José M; Gonzalez-Victores, Jose A; Noachtar, Soheyl

    2013-05-01

We quantitatively evaluated the localizing and lateralizing characteristics of ictal upper limb automatisms (ULAs) in patients with temporal lobe epilepsy (TLE; n=38) and frontal lobe epilepsy (FLE; n=20). Movement speed, extent, length, and duration of ULAs were quantitatively analyzed with motion-capture techniques. Upper limb automatisms had a larger extent (p<0.001), covered more distance (p<0.05), and were faster (p<0.001) in FLE than in TLE. In TLE, the maximum speed of ULAs was higher ipsilaterally than contralaterally (173 vs. 84 pixels/s; p=0.02), with no significant difference in FLE (511 vs. 428 pixels/s). The duration of ictal automatisms relative to the total seizure duration was shorter in TLE than in FLE (median 36% vs. 63%; p<0.001), with no difference in absolute duration (26 s vs. 27 s). These results demonstrate that quantitative movement analysis of ULAs differentiates FLE from TLE, which may aid in the localization of the epileptogenic zone. PMID:23545438
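Metrics of this kind (maximum speed, extent, path length) can be computed from tracked marker coordinates roughly as follows; the function name and the bounding-box definition of "extent" are illustrative assumptions, not the study's exact definitions.

```python
import numpy as np

def movement_metrics(t, x, y):
    """Speed, extent, and path length of a tracked limb marker.

    t: timestamps in seconds; x, y: marker positions in pixels.
    Returns (max_speed_px_per_s, extent_px, path_length_px),
    mirroring the kind of motion-capture metrics compared
    between FLE and TLE.
    """
    t, x, y = (np.asarray(v, dtype=float) for v in (t, x, y))
    dx, dy, dt = np.diff(x), np.diff(y), np.diff(t)
    step = np.hypot(dx, dy)          # per-frame displacement
    speed = step / dt                # per-frame speed
    extent = np.hypot(x.max() - x.min(), y.max() - y.min())
    return float(speed.max()), float(extent), float(step.sum())
```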

  3. Automated quantitative analysis for pneumoconiosis

    NASA Astrophysics Data System (ADS)

    Kondo, Hiroshi; Zhao, Bin; Mino, Masako

    1998-09-01

Automated quantitative analysis for pneumoconiosis is presented. In this paper, Japanese standard radiographs of pneumoconiosis are categorized by measuring the area density and the number density of small rounded opacities. Furthermore, the opacities are classified by size and shape from measurements of the equivalent radius of each opacity. The proposed method includes a bi-level unsharp masking filter with a 1D uniform impulse response to eliminate undesired structures such as blood vessels and ribs in the chest X-ray image. Fuzzy contrast enhancement is also introduced for easy and exact detection of small rounded opacities. Many simulation examples show that the proposed method is more reliable than the former method.
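The 1D unsharp-masking step can be sketched as a moving-average subtraction along each row, which suppresses elongated low-frequency structures while leaving small rounded details. This shows the basic idea only; the paper's bi-level variant and the fuzzy contrast enhancement are not reproduced.

```python
import numpy as np

def unsharp_1d(image, width=15):
    """1D unsharp mask: subtract a row-wise moving average.

    Smooths each row with a uniform (boxcar) kernel of the given
    width and subtracts the result from the original, so slowly
    varying structures along the row (e.g. rib shadows) cancel
    while compact high-frequency details survive.
    """
    image = np.asarray(image, dtype=float)
    kernel = np.ones(width) / width
    smooth = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, image)
    return image - smooth
```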

  4. Visual constraints for the perception of quantitative depth from temporal interocular unmatched features.

    PubMed

    Ni, Rui; Chen, Lin; Andersen, George J

    2010-07-21

    Previous research (Brooks & Gillam, 2006) has found that temporal interocular unmatched (IOUM) features generate a perception of subjective contours and can result in a perception of quantitative depth. In the present study we examine in detail the factors important for quantitative depth perception from IOUM features. In Experiments 1 and 2 observers were shown temporal IOUM features based on three dots that disappeared behind an implicit surface. Subjects reported a perception of a subjective surface and were able to perceive qualitative depth. In Experiments 3 and 4 metrical depth was perceived when binocular disparity features were added to the display. These results suggest that quantitative depth from IOUM information is perceived when binocular matched information is present in regions adjacent to the surface. In addition, the perceived depth of the subjective surface decreased with an increase in the width of the subjective surface suggesting a limitation in the propagation of quantitative depth to surface regions where qualitative depth information is available.

  5. Spatial and temporal scene analysis

    NASA Astrophysics Data System (ADS)

    Rollins, J. Michael; Chaapel, Charles; Bleiweiss, Max P.

    1994-06-01

Current efforts to design reliable background scene generation programs require validation using real images for comparison. A crucial step in making objective comparisons is to parameterize the real and generated images into a common set of feature metrics. Such metrics can be derived from statistical and transform-based analyses and yield information about the structures and textures present in various image regions of interest. This paper presents the results of such a metrics-development process for the Smart Weapons Operability Enhancement (SWOE) Joint Test and Evaluation (JT&E) program. Statistical and transform-based techniques were applied to images obtained from two separate locations, Grayling, Michigan and Yuma, Arizona, at various times of day and under a variety of environmental conditions. Statistical analyses of scene radiance distributions and 'clutter' content were performed both spatially and temporally. Fourier and wavelet transform methods were applied as well. Results and their interpretations are given for the image analyses. The metrics that provide the clearest and most reliable distinction between feature classes are recommended.

  6. Quantitative phase imaging of biological cells using spatially low and temporally high coherent light source.

    PubMed

    Ahmad, Azeem; Dubey, Vishesh; Singh, Gyanendra; Singh, Veena; Mehta, Dalip Singh

    2016-04-01

In this Letter, we demonstrate quantitative phase imaging of biological samples, such as human red blood cells (RBCs) and onion cells, using a light source with a narrow temporal frequency spectrum and a wide angular frequency spectrum. This type of light source was synthesized by combining the spatial, angular, and temporal diversity of speckle-reduction techniques. The advantage of using a low-spatial-, high-temporal-coherence light source over broadband and narrowband sources is that it does not require any dispersion-compensation mechanism for biological samples. Further, it avoids the formation of speckle or spurious fringes that arise when using a narrowband light source. PMID:27192285

  7. Temporal fringe pattern analysis with parallel computing

    SciTech Connect

    Tuck Wah Ng; Kar Tien Ang; Argentini, Gianluca

    2005-11-20

    Temporal fringe pattern analysis is invaluable in transient phenomena studies but necessitates long processing times. Here we describe a parallel computing strategy based on the single-program multiple-data model and hyperthreading processor technology to reduce the execution time. In a two-node cluster workstation configuration we found that execution periods were reduced by 1.6 times when four virtual processors were used. To allow even lower execution times with an increasing number of processors, the time allocated for data transfer, data read, and waiting should be minimized. Parallel computing is found here to present a feasible approach to reduce execution times in temporal fringe pattern analysis.
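The single-program multiple-data pattern, identical analysis code applied to disjoint slices of the data, can be sketched as below. Threads stand in here for the paper's cluster nodes, and the per-pixel FFT workload is a placeholder, not the actual fringe-analysis kernel.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def dominant_frequency(series):
    """Toy per-pixel temporal analysis: index of the dominant FFT bin."""
    series = np.asarray(series, dtype=float)
    spectrum = np.abs(np.fft.rfft(series - series.mean()))
    return int(spectrum.argmax())

def process_frames(frames, workers=4):
    """SPMD-style processing: each worker runs the same analysis on its pixels.

    frames: array (n_frames, n_pixels).  Each pixel's intensity time
    series is analysed independently, so the workload splits cleanly
    across workers, as in the single-program multiple-data model.
    """
    pixel_series = list(np.asarray(frames, dtype=float).T)
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(dominant_frequency, pixel_series))
```

Because the per-pixel analyses share no state, the speedup is bounded mainly by data transfer and scheduling overhead, which is exactly the cost the paper recommends minimizing.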

  8. Quantitative analysis of digital microscope images.

    PubMed

    Wolf, David E; Samarasekera, Champika; Swedlow, Jason R

    2013-01-01

This chapter discusses quantitative analysis of digital microscope images and presents several exercises as illustrative examples. It covers the basic concepts of quantitative imaging analysis, which rest on a well-established foundation of signal theory and quantitative data analysis. Several examples illustrate the imaging process as a transformation from sample to image, along with the limits and considerations of quantitative analysis. The chapter introduces the concept of digitally correcting images and focuses on some of the more critical types of data transformation and some frequently encountered issues in quantization. Image processing is a form of data processing, of which there are many examples, such as fitting data to a theoretical curve. In all such cases, it is critical that care be taken during all steps of transformation, processing, and quantization. PMID:23931513

  10. Cancer detection by quantitative fluorescence image analysis.

    PubMed

    Parry, W L; Hemstreet, G P

    1988-02-01

Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with the potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods currently used by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates two diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is, individual cancer cells. When compared to routine cytopathology for detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent, and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measurement of nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is, the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method to monitor patients with diagnosed urological disease. Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these two fluorescence techniques may be complementary tools for urological screening, diagnosis and management, and that they also may be useful separately or in combination to elucidate the oncogenic process, determine the biological potential of tumors

  11. Time series analysis of temporal networks

    NASA Astrophysics Data System (ADS)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them, and, using a standard time-series forecast model, predict the properties of the temporal network at a later time instance. To this aim, we consider eight properties, such as the number of active nodes, average degree, and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as an Auto-Regressive Integrated Moving Average (ARIMA) process. We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series, obtained from spectrogram analysis, can be used to refine the prediction framework by identifying beforehand the cases where the prediction error is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks.
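The forecasting step can be illustrated with a bare-bones autoregressive fit on a property series (e.g. active-node counts per snapshot). The paper uses full ARIMA models; this least-squares AR(p) sketch is a deliberate simplification.

```python
import numpy as np

def ar_forecast(series, p=2, steps=1):
    """Forecast the next value(s) of a property series with an AR(p) fit.

    Builds a design matrix of lagged values (oldest lag first),
    solves for the AR coefficients by least squares, then iterates
    the recurrence one step at a time.
    """
    x = np.asarray(series, dtype=float)
    # Row j holds [x[j], x[j+1], ..., x[j+p-1]], predicting x[j+p].
    X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    out = list(x)
    for _ in range(steps):
        out.append(float(np.dot(coef, out[-p:])))
    return out[len(x):]
```

A linearly growing series satisfies x[t] = 2 x[t-1] - x[t-2] exactly, so AR(2) recovers the trend without the explicit differencing step ARIMA would use.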

  12. Runtime Analysis of Linear Temporal Logic Specifications

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Havelund, Klaus

    2001-01-01

This report presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL-to-Büchi-automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
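For finite traces, the observer automaton for a response property such as G(request -> F ack) reduces to a small state machine. The sketch below hand-codes that automaton (the event names are hypothetical) rather than generating it from an LTL formula as the tool does.

```python
def monitor_response(trace, trigger="request", response="ack"):
    """Finite-trace observer for the LTL property G(trigger -> F response).

    Two states: 'waiting' after an unanswered trigger, 'ok' otherwise.
    Under finite-trace semantics the property holds iff the run ends
    in 'ok', i.e. no trigger is left unanswered at end of trace.
    """
    state = "ok"
    for event in trace:
        if event == trigger:
            state = "waiting"
        elif event == response:
            state = "ok"
    return state == "ok"
```

Running such an observer alongside the program is the essence of runtime analysis: no model of all executions is needed, only the single trace being produced.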

  13. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with necessary files for LabVIEW Run-time engine. Operating systems or monitors under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
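The core computation, greyscale conversion with selectable coefficients followed by histogram moments, might look like the sketch below. It mirrors the parameters the program reports (mean, standard deviation, skewness, kurtosis) but is not the HAWGC code, which is a LabVIEW application.

```python
import numpy as np

def histogram_stats(rgb, coeffs=(0.299, 0.587, 0.114)):
    """Greyscale conversion and brightness-histogram moments.

    rgb: array of shape (h, w, 3).  The conversion coefficients are
    selectable, as in the original program; the defaults are the
    common luminance weights.  Returns mean, standard deviation,
    skewness, and excess kurtosis of the brightness distribution.
    """
    grey = np.tensordot(np.asarray(rgb, dtype=float), coeffs, axes=([2], [0]))
    v = grey.ravel()
    mu, sigma = v.mean(), v.std()
    z = (v - mu) / sigma  # standardized brightness values
    return {"mean": float(mu), "std": float(sigma),
            "skewness": float((z ** 3).mean()),
            "kurtosis": float((z ** 4).mean() - 3.0)}
```

A symmetric two-level image gives zero skewness and excess kurtosis of -2, a useful sanity check before trusting the moments on real micrographs.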

  14. Quantitative image analysis of synovial tissue.

    PubMed

    van der Hall, Pascal O; Kraan, Maarten C; Tak, Paul Peter

    2007-01-01

Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the acquisition, storage and evaluation of images with dedicated hardware and software. Major advantages of quantitative image analysis over traditional techniques include sophisticated calibration systems, interaction, speed, and control of inter- and intraobserver variation. This results in a well controlled environment, which is essential for quality control and reproducibility, and helps to optimize sensitivity and specificity. To achieve this, an optimal quantitative image analysis system combines solid software engineering with easy interactivity with the operator. Moreover, the system also needs to be as transparent as possible in generating the data, because a "black box design" will deliver uncontrollable results. In addition to these more general aspects, specifically for the analysis of synovial tissue the necessity of interactivity is highlighted by the added value of identification and quantification of information present in areas such as the intimal lining layer, blood vessels, and lymphocyte aggregates. Speed is another important aspect of digital cytometry. Currently, rapidly increasing numbers of samples, together with the accumulation of a variety of markers and detection techniques, have made the use of traditional analysis techniques such as manual quantification and semi-quantitative analysis impractical. It can be anticipated that the development of even more powerful computer systems with sophisticated software will further facilitate reliable analysis at high speed.

  15. Quantitative analysis of surface electromyography: Biomarkers for convulsive seizures.

    PubMed

    Beniczky, Sándor; Conradsen, Isa; Pressler, Ronit; Wolf, Peter

    2016-08-01

In electroencephalographic (EEG) practice, muscle activity during seizures is often considered an irritating artefact. This article discusses ways to turn it, via surface electromyography (EMG), into a valuable tool of epileptology. Muscles are in direct synaptic contact with motor neurons; therefore, EMG signals provide direct information about the electric activity in the motor cortex. Qualitative analysis of EMG has traditionally been a part of long-term video-EEG recordings. Recent developments in quantitative analysis of EMG signals have yielded valuable information on the pathomechanisms of convulsive seizures, demonstrating that they differ from maximal voluntary contraction and from convulsive psychogenic non-epileptic seizures. Furthermore, the tonic phase of generalised tonic-clonic seizures (GTCS) proved to have different quantitative features than tonic seizures. The high temporal resolution of EMG allowed detailed characterisation of the temporal dynamics of GTCS, suggesting that the same inhibitory mechanisms that try to prevent the build-up of seizure activity contribute to ending the seizure. These findings have clinical implications: the quantitative EMG features provided the pathophysiologic substrate for developing neurophysiologic biomarkers that accurately identify GTCS. This proved efficient both for seizure detection and for objective, automated distinction between convulsive and non-convulsive epileptic seizures.
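Quantitative surface-EMG features of the general kind described can be computed over consecutive windows of the signal. The two features below (RMS amplitude and zero-crossing count) are common illustrative choices, not the paper's specific biomarkers.

```python
import numpy as np

def emg_features(signal, fs, window_s=1.0):
    """Windowed RMS amplitude and zero-crossing count of an EMG trace.

    signal: 1D sample array; fs: sampling rate in Hz.  Returns one
    (rms, zero_crossings) pair per non-overlapping window.  RMS
    tracks contraction intensity; zero-crossing count is a crude
    proxy for frequency content.
    """
    x = np.asarray(signal, dtype=float)
    n = int(fs * window_s)
    feats = []
    for i in range(0, len(x) - n + 1, n):
        w = x[i:i + n]
        rms = float(np.sqrt(np.mean(w ** 2)))
        zc = int(np.count_nonzero(np.diff(np.signbit(w).astype(np.int8))))
        feats.append((rms, zc))
    return feats
```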

  16. Quantitative WDS analysis using electron probe microanalyzer

    SciTech Connect

Ul-Hamid, Anwar (e-mail: anwar@kfupm.edu.sa); Tawancy, Hani M.; Mohammed, Abdul-Rashid I.; Al-Jaroudi, Said S.; Abbas, Nureddin M.

    2006-04-15

In this paper, the procedure for conducting quantitative elemental analysis by the ZAF correction method using wavelength-dispersive X-ray spectroscopy (WDS) in an electron probe microanalyzer (EPMA) is elaborated. Analysis of a thermal barrier coating (TBC) system formed on a Ni-based single-crystal superalloy is presented as an example to illustrate the analysis of samples consisting of a large number of major and minor elements. The analysis was performed using known standards and measured peak-to-background intensity ratios. The procedure for using separate sets of acquisition conditions for major and minor element analysis is explained and its importance is stressed.

  17. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  18. Biomechanical cell analysis using quantitative phase imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wax, Adam; Park, Han Sang; Eldridge, William J.

    2016-03-01

    Quantitative phase imaging provides nanometer scale sensitivity and has been previously used to study spectral and temporal characteristics of individual cells in vitro, especially red blood cells. Here we extend this work to study the mechanical responses of individual cells due to the influence of external stimuli. Cell stiffness may be characterized by analyzing the inherent thermal fluctuations of cells but by applying external stimuli, additional information can be obtained. The time dependent response of cells due to external shear stress is examined with high speed quantitative phase imaging and found to exhibit characteristics that relate to their stiffness. However, analysis beyond the cellular scale also reveals internal organization of the cell and its modulation due to pathologic processes such as carcinogenesis. Further studies with microfluidic platforms point the way for using this approach in high throughput assays.

  19. Temporal analysis of multispectral scanner data.

    NASA Technical Reports Server (NTRS)

    Richardson, A. J.; Wiegand, C. L.; Torline, R. J.

    1973-01-01

    Multispectral scanner reflectance data were sampled for bare soil, cotton, sorghum, corn, and citrus at four dates during a growing season (April, May, June, and July 1969) to develop a time-dependent signature for crop and soil discrimination. Discrimination tests were conducted for single-date and multidate formats using training and test data sets. For classifications containing several crops, the multidate or temporal approach improved discrimination compared with the single-date approach. The multidate approach also preserved recognition accuracy better in going from training fields to test fields than the single-date analysis. The spectral distinctiveness of bare soil versus vegetation resulted in essentially equal discrimination using single-date versus multidate data for those two categories.

  20. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  1. Quantitative ADF STEM: acquisition, analysis and interpretation

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2016-01-01

Quantitative annular dark-field imaging in the scanning transmission electron microscope (ADF STEM), where image intensities are used to provide composition and thickness measurements, has enjoyed a renaissance during the last decade. Now, in a post-aberration-correction era, many aspects of the technique are being revisited. Here the recent progress and emerging best practice for such aberration-corrected quantitative ADF STEM are discussed, including issues relating to proper acquisition of experimental data and its calibration, approaches for data analysis, the utility of such data, its interpretation, and limitations.

  2. Quantitative Proteomics Analysis of Leukemia Cells.

    PubMed

    Halbach, Sebastian; Dengjel, Jörn; Brummer, Tilman

    2016-01-01

Chronic myeloid leukemia (CML) is driven by the oncogenic fusion kinase Bcr-Abl, which organizes its own signaling network with various proteins. These proteins, their interactions, and their role in relevant signaling pathways can be analyzed by quantitative mass spectrometry (MS) approaches in various model systems, e.g., in cell culture models. In this chapter, we describe in detail immunoprecipitations and quantitative proteomics analysis using stable isotope labeling by amino acids in cell culture (SILAC) of components of the Bcr-Abl signaling pathway in the human CML cell line K562. PMID:27581145

  3. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

Re-narrowing, or restenosis, of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate effects new therapies might have. A combination of complicated geometry and image variability, together with the need for high resolution and large image size, makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here was developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia, and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features, including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high-throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable and accurate while allowing the pathologist to control the measurement process.
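Once a contour such as the lumen boundary has been extracted, per-structure metrics follow from elementary geometry. This sketch computes area via the shoelace formula and perimeter from vertex-to-vertex distances; the contour-extraction step itself, the hard part of the pipeline, is assumed done.

```python
import numpy as np

def polygon_metrics(xs, ys):
    """Area and perimeter of a closed contour given its vertices in order.

    xs, ys: vertex coordinates of the polygon (closing edge from the
    last vertex back to the first is implied).  Area uses the
    shoelace formula; perimeter sums consecutive edge lengths.
    """
    xs, ys = np.asarray(xs, dtype=float), np.asarray(ys, dtype=float)
    x_next, y_next = np.roll(xs, -1), np.roll(ys, -1)
    area = 0.5 * abs(np.dot(xs, y_next) - np.dot(ys, x_next))
    perimeter = float(np.hypot(x_next - xs, y_next - ys).sum())
    return float(area), perimeter
```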

  4. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  6. Quantitative Proteomic Analysis of the Human Nucleolus.

    PubMed

    Bensaddek, Dalila; Nicolas, Armel; Lamond, Angus I

    2016-01-01

    Recent years have witnessed spectacular progress in the field of mass spectrometry (MS)-based quantitative proteomics, including advances in instrumentation, chromatography, sample preparation methods, and experimental design for multidimensional analyses. It is now possible not only to identify most of the protein components of a cell proteome in a single experiment, but also to describe additional proteome dimensions, such as protein turnover rates, posttranslational modifications, and subcellular localization. Furthermore, by comparing the proteome at different time points, it is possible to create a "time-lapse" view of proteome dynamics. By combining high-throughput quantitative proteomics with detailed subcellular fractionation protocols and data analysis techniques it is also now possible to characterize in detail the proteomes of specific subcellular organelles, providing important insights into cell regulatory mechanisms and physiological responses. In this chapter we present a reliable workflow and protocol for MS-based analysis and quantitation of the proteome of nucleoli isolated from human cells. The protocol presented is based on a SILAC analysis of human MCF10A-Src-ER cells with analysis performed on a Q-Exactive Plus Orbitrap MS instrument (Thermo Fisher Scientific). The subsequent chapter describes how to process the resulting raw MS files from this experiment using MaxQuant software and data analysis procedures to evaluate the nucleolar proteome using customized R scripts. PMID:27576725

  7. Quantitative image analysis of celiac disease

    PubMed Central

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-01-01

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients. PMID:25759524

  9. Decoding brain cancer dynamics: a quantitative histogram-based approach using temporal MRI

    NASA Astrophysics Data System (ADS)

    Zhou, Mu; Hall, Lawrence O.; Goldgof, Dmitry B.; Russo, Robin; Gillies, Robert J.; Gatenby, Robert A.

    2015-03-01

    Brain tumor heterogeneity remains a challenge for probing brain cancer evolutionary dynamics. In light of evolution, it is a priority to inspect the cancer system from a time-domain perspective since it explicitly tracks the dynamics of cancer variations. In this paper, we study the problem of exploring brain tumor heterogeneity from temporal clinical magnetic resonance imaging (MRI) data. Our goal is to discover evidence-based knowledge from such temporal imaging data, where multiple clinical MRI scans from Glioblastoma multiforme (GBM) patients are generated during therapy. In particular, we propose a quantitative histogram-based approach that builds a prediction model to measure the difference in histograms obtained from pre- and post-treatment. The study could significantly assist radiologists by providing a metric to identify distinctive patterns within each tumor, which is crucial for the goal of providing patient-specific treatments. We examine the proposed approach for a practical application - clinical survival group prediction. Experimental results show that our approach achieved 90.91% accuracy.
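The abstract does not specify the histogram features or the prediction model used; the following is a minimal sketch of the general idea, comparing pre- and post-treatment intensity histograms with a hypothetical feature set (all names and values are invented for illustration):

```python
import numpy as np

def histogram_features(pre, post, bins=32, rng=(0.0, 1.0)):
    """Scalar features describing the change between pre- and
    post-treatment intensity histograms (hypothetical feature set)."""
    h_pre, _ = np.histogram(pre, bins=bins, range=rng, density=True)
    h_post, _ = np.histogram(post, bins=bins, range=rng, density=True)
    l1 = np.abs(h_pre - h_post).sum()        # histogram L1 distance
    mean_shift = post.mean() - pre.mean()    # shift of mean intensity
    return np.array([l1, mean_shift])

# Synthetic "tumor intensity" samples before and after treatment.
gen = np.random.default_rng(0)
pre = gen.normal(0.4, 0.05, 10_000).clip(0.0, 1.0)
post = gen.normal(0.6, 0.05, 10_000).clip(0.0, 1.0)
feats = histogram_features(pre, post)
```

Such per-patient feature vectors could then feed any standard classifier for survival-group prediction.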

  10. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
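The combined prioritization could be sketched as follows; the scoring scheme and the scenario list are hypothetical illustrations, not taken from the paper:

```python
# Hypothetical scoring: each hazard scenario carries ordinal scores for
# severity, likelihood, and modeling difficulty (1 = low, 3 = high).
scenarios = [
    {"name": "wake encounter on parallel approach",
     "severity": 3, "likelihood": 2, "difficulty": 1},
    {"name": "runway incursion",
     "severity": 3, "likelihood": 1, "difficulty": 3},
    {"name": "altitude deviation",
     "severity": 2, "likelihood": 3, "difficulty": 2},
]

def priority(s):
    # Severe, likely scenarios that are also tractable to model rank first.
    return s["severity"] * s["likelihood"] / s["difficulty"]

ranked = sorted(scenarios, key=priority, reverse=True)
```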

  11. Spatial and temporal epidemiological analysis in the Big Data era.

    PubMed

    Pfeiffer, Dirk U; Stevens, Kim B

    2015-11-01

    Concurrent with global economic development in the last 50 years, the opportunities for the spread of existing diseases and the emergence of new infectious pathogens have increased substantially. The activities associated with the enormously intensified global connectivity have resulted in large amounts of data being generated, which in turn provides opportunities for generating knowledge that will allow more effective management of animal and human health risks. This so-called Big Data has, more recently, been accompanied by the Internet of Things, which highlights the increasing presence of a wide range of sensors interconnected via the Internet. Analysis of this data needs to exploit its complexity, accommodate variation in data quality and take advantage of its spatial and temporal dimensions, where available. Apart from the development of hardware technologies and networking/communication infrastructure, it is necessary to develop appropriate data management tools that make this data accessible for analysis. These include relational databases, geographical information systems and, most recently, cloud-based data storage such as Hadoop distributed file systems. While the development of analytical methodologies has not quite caught up with the data deluge, important advances have been made in a number of areas, including spatial and temporal data analysis, where the spectrum of analytical methods ranges from visualisation and exploratory analysis to modelling. While there used to be a primary focus on statistical science in terms of methodological development for data analysis, the newly emerged discipline of data science reflects the challenges presented by the need to integrate diverse data sources and exploit them using novel data- and knowledge-driven modelling methods, while simultaneously recognising the value of quantitative as well as qualitative analytical approaches.
Machine learning regression methods, which are more robust and can handle

  13. Quantitative mass spectrometry methods for pharmaceutical analysis.

    PubMed

    Loos, Glenn; Van Schepdael, Ann; Cabooter, Deirdre

    2016-10-28

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance ionization efficiency using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With a focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods capable of automated multi-residue analysis are generally the aim, (less sensitive) miniaturized set-ups have great potential owing to their suitability for in-field usage. This article is part of the themed issue 'Quantitative mass spectrometry'.
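The role of isotopically labelled internal standards in correcting ion suppression can be sketched with a generic isotope-dilution calculation (the numbers are invented; this is not a method from the paper). Because the analyte and its labelled analogue co-elute, matrix suppression scales both signals equally and cancels in the ratio:

```python
# Isotope-dilution quantitation: the analyte response is normalized to a
# co-eluting isotopically labelled internal standard, cancelling matrix
# ion suppression that affects both species equally. Values are invented.
def quantify(analyte_area, istd_area, istd_conc, response_factor=1.0):
    """Return analyte concentration from the peak-area ratio."""
    return (analyte_area / istd_area) * istd_conc / response_factor

# Suppose matrix effects suppress both signals by 40%:
clean = quantify(analyte_area=1.2e6, istd_area=1.0e6, istd_conc=50.0)
suppressed = quantify(analyte_area=0.72e6, istd_area=0.6e6, istd_conc=50.0)
```

Both calls return the same concentration, illustrating why the ratio is robust to suppression.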

  14. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U.S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience analysis through the application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  15. Quantitative Bias Analysis in Regulatory Settings.

    PubMed

    Lash, Timothy L; Fox, Matthew P; Cooney, Darryl; Lu, Yun; Forshee, Richard A

    2016-07-01

    Nonrandomized studies are essential in the postmarket activities of the US Food and Drug Administration, which, however, must often act on the basis of imperfect data. Systematic errors can lead to inaccurate inferences, so it is critical to develop analytic methods that quantify uncertainty and bias and ensure that these methods are implemented when needed. "Quantitative bias analysis" is an overarching term for methods that estimate quantitatively the direction, magnitude, and uncertainty associated with systematic errors influencing measures of associations. The Food and Drug Administration sponsored a collaborative project to develop tools to better quantify the uncertainties associated with postmarket surveillance studies used in regulatory decision making. We have described the rationale, progress, and future directions of this project. PMID:27196652
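As an illustration of the kind of computation quantitative bias analysis performs, here is the standard back-correction of a 2x2 table for nondifferential exposure misclassification (a textbook formula, not this project's specific tooling; the counts and bias parameters are invented):

```python
def correct_exposure_misclassification(a, b, c, d, se, sp):
    """Back-correct a 2x2 table for nondifferential exposure
    misclassification (a/b = exposed/unexposed cases, c/d =
    exposed/unexposed noncases; se/sp = sensitivity/specificity
    of the exposure measurement)."""
    n_cases, n_noncases = a + b, c + d
    A = (a - (1 - sp) * n_cases) / (se + sp - 1)     # true exposed cases
    C = (c - (1 - sp) * n_noncases) / (se + sp - 1)  # true exposed noncases
    return A, n_cases - A, C, n_noncases - C

A, B, C, D = correct_exposure_misclassification(60, 40, 40, 60, se=0.9, sp=0.9)
or_observed = (60 * 60) / (40 * 40)   # odds ratio from the observed table
or_corrected = (A * D) / (B * C)      # bias-corrected odds ratio
```

Nondifferential misclassification biases the observed odds ratio toward the null, so the corrected estimate is farther from 1.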

  16. Temporal Mixture Analysis of Hypertemporal Antarctic Sea Ice Data in the Sense of Machine Learning

    NASA Astrophysics Data System (ADS)

    Chi, J.; Kim, H. C.

    2015-12-01

    Hypertemporal data, or time series acquired at high temporal frequencies, are often used to determine seasonal characteristics of environmental phenomena such as sea ice concentration. However, it is difficult to analyze long-term hypertemporal remote sensing data over extensive areas without prior information. Most pixels of hypertemporal data are highly mixed and contain several distinct temporal signals that represent seasonal characteristics of substances. This study performed temporal mixture analysis, which is algebraically similar to spectral mixture analysis, but occurs in the time domain instead of the spectral domain. Temporal mixture analysis was used to investigate the temporal characteristics of Antarctic sea ice. Two general steps were used to address mixing problems: 1) finding temporally unique signatures of pure components, which are referred to as temporal endmembers, and 2) unmixing each pixel in the time series data as a linear combination of the endmember fractional abundances. Because endmember selection is critical to the success of both spectral and temporal mixture analysis, it is important to select proper endmembers from large quantities of hypertemporal data. A machine learning algorithm was introduced to successfully identify endmembers without prior knowledge. A fully linear mixing model was then implemented in an attempt to produce more robust and physically meaningful abundance estimates. Experiments that quantitatively and qualitatively evaluated the proposed approaches were conducted. A temporal mixture analysis of high-temporal-dimensional data provides a unique summary of long-term Antarctic sea ice and noise-whitened reconstruction images via inverse processing. Further, comparisons of regional sea-ice fractions provide a better understanding of the overall Antarctic sea ice changes.
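The unmixing step described above can be sketched as an ordinary least-squares problem; the two "temporal endmember" signatures and the mixed pixel below are synthetic, and the paper's fully constrained linear model additionally imposes nonnegativity and sum-to-one constraints, omitted here for brevity:

```python
import numpy as np

# Two hypothetical temporal endmembers sampled at 12 monthly time steps.
t = np.arange(12)
em_seasonal = (1 + np.cos(2 * np.pi * t / 12)) / 2   # winter-peaking ice
em_perennial = np.full(12, 0.9)                      # year-round ice
E = np.column_stack([em_seasonal, em_perennial])     # endmember matrix

# A mixed pixel: 30% seasonal and 70% perennial ice, plus noise.
gen = np.random.default_rng(1)
pixel = E @ np.array([0.3, 0.7]) + gen.normal(0.0, 0.005, 12)

# Unmix each pixel as a linear combination of endmember abundances.
abund, *_ = np.linalg.lstsq(E, pixel, rcond=None)
```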

  17. Temporal quantitative proteomics of Saccharomyces cerevisiae in response to a nonlethal concentration of furfural.

    PubMed

    Lin, Feng-Ming; Tan, Ying; Yuan, Ying-Jin

    2009-12-01

    Furfural, one of the main inhibitory compounds in lignocellulosic hydrolysates, inhibits the growth and ethanol production rate of yeast. To get a global view of the dynamic expression pattern of proteins in Saccharomyces cerevisiae during fermentation with the introduction of 8 g/L furfural, protein samples were taken before the addition of furfural, during the initial phase of furfural conversion and immediately after the conversion of furfural for comparative proteomic analysis with iTRAQ on an LC-ESI-MS/MS instrument. A comparison of the temporal expression pattern of 107 proteins related to protein synthesis between the reference cultures and the furfural-treated cultures showed that the temporal downregulation of these proteins was retarded after the addition of furfural. The expression levels of 20 enzymes in glucose fermentation and 5 enzymes in the tricarboxylic acid cycle were reduced by furfural, with notably delayed temporal downregulation of Glk1p, Tdh1p, Eno1p and Aco1p, which correlates with the reductions in ethanol formation rate and glucose consumption rate of 66.7 and 60.4%, respectively. In the presence of furfural, proteins catalyzing the upper part of the superpathway of sulfur amino acid biosynthesis were repressed at all time points, which is related to the inhibited growth of furfural-treated yeast. The expression of 18 proteins related to stress response showed increasing trends, including several highly induced heat shock proteins and proteins related to cellular signaling pathways.

  18. Spatio-temporal analysis of environmental radiation in Korea

    SciTech Connect

    Kim, J.Y.; Lee, B.C.; Shin, H.K.

    2007-07-01

    Geostatistical visualization of environmental radiation is a very powerful approach for exploring and understanding the spatio-temporal variability of environmental radiation data. Spatial patterns of environmental radiation can be described quantitatively in terms of the variogram and kriging, which are based on the idea that the statistical variation of the data is a function of distance. (authors)
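The variogram idea can be sketched as follows; the classical (Matheron) semivariance estimator is standard, while the synthetic "dose-rate" field is invented purely for illustration:

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Classical (Matheron) semivariance estimate per distance bin."""
    i, j = np.triu_indices(len(values), k=1)
    d = np.linalg.norm(coords[i] - coords[j], axis=1)
    sq = 0.5 * (values[i] - values[j]) ** 2
    which = np.digitize(d, bin_edges)
    return np.array([sq[which == k].mean() if np.any(which == k) else np.nan
                     for k in range(1, len(bin_edges))])

# Synthetic dose-rate field with a smooth spatial trend plus noise.
gen = np.random.default_rng(2)
coords = gen.uniform(0, 10, size=(200, 2))
values = np.sin(coords[:, 0] / 3) + 0.05 * gen.normal(size=200)
gamma = empirical_variogram(coords, values, np.linspace(0, 10, 11))
```

For a spatially smooth field, the semivariance grows with separation distance, which is the structure kriging then exploits.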

  19. Quantitative NIR Raman analysis in liquid mixtures.

    PubMed

    Sato-Berrú, R Ysacc; Medina-Valtierra, Jorge; Medina-Gutiérrez, Cirilo; Frausto-Reyes, Claudio

    2004-08-01

    The capability to obtain quantitative information in a simple way from Raman spectra is a subject of considerable interest. In this work, this is demonstrated for mixtures of ethanol with water and rhodamine-6G (R-6G) with methanol, which were analyzed directly in glass vessels. The Raman intensities and a simple mathematical model have been used and applied for the analysis of liquid samples. The starting point is to generate a general expression, from the experimental spectra, as the sum of the particular expressions for each pure compound; this yields an expression for the mixture that can be used to determine concentrations from its Raman spectrum.
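A minimal sketch of this mixture model follows; the band positions and widths are invented stand-ins for real pure-component spectra:

```python
import numpy as np

# The abstract's model: the mixture spectrum is the concentration-weighted
# sum of the pure-compound spectra. Band shapes here are invented.
x = np.linspace(800, 1200, 400)                      # wavenumber grid
pure_ethanol = np.exp(-0.5 * ((x - 880) / 12) ** 2)
pure_water = np.exp(-0.5 * ((x - 1100) / 60) ** 2)

c_true = 0.35                                        # ethanol fraction
mixture = c_true * pure_ethanol + (1 - c_true) * pure_water

# Since I(x; c) = c*E(x) + (1 - c)*W(x), projecting (mixture - W) onto
# (E - W) recovers the concentration from the spectrum alone.
diff = pure_ethanol - pure_water
c_est = np.dot(mixture - pure_water, diff) / np.dot(diff, diff)
```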

  20. Quantitative proteomic analysis of intact plastids.

    PubMed

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described. PMID:24136541

  2. Quantitative interactome analysis reveals a chemoresistant edgotype

    PubMed Central

    Chavez, Juan D.; Schweppe, Devin K.; Eng, Jimmy K.; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E.

    2015-01-01

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived, protein interaction network comprising 1,391 crosslinked peptides allows for 'edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype. PMID:26235782

  3. Spatial, temporal, and quantitative manipulation of intracellular hydrogen peroxide in cultured cells.

    PubMed

    Alim, Ishraq; Haskew-Layton, Renee E; Aleyasin, Hossein; Guo, Hengchang; Ratan, Rajiv R

    2014-01-01

    Hydrogen peroxide (H2O2) is produced endogenously in a number of cellular compartments, including the mitochondria, the endoplasmic reticulum, peroxisomes, and at the plasma membrane, and can play divergent roles as a second messenger or a pathological toxin. It is assumed that the tuned production of H2O2 within neuronal and nonneuronal cells regulates a discrete balance between survival and death. However, a major challenge in understanding the physiological versus pathological role of H2O2 in cells has been the lack of validated methods that can spatially, temporally, and quantitatively modulate H2O2 production. A promising means of regulating endogenous H2O2 is through the expression of the peroxide-producing enzyme d-amino acid oxidase (DAAO from Rhodotorula gracilis lacking a peroxisomal targeting sequence). Using viral vectors to express DAAO in distinct cell types and using targeting sequences to target DAAO to distinct subcellular sites, we can manipulate H2O2 production by applying the substrate d-alanine or permeable analogs of d-alanine. In this chapter, we describe the use of DAAO to produce H2O2 in culture models and the real-time visual validation of this technique using two-photon microscopy and chemoselective fluorescent probes.

  4. Spectral and Temporal Laser Fluorescence Analysis Such as for Natural Aquatic Environments

    NASA Technical Reports Server (NTRS)

    Chekalyuk, Alexander (Inventor)

    2015-01-01

    An Advanced Laser Fluorometer (ALF) can combine spectrally and temporally resolved measurements of laser-stimulated emission (LSE) for characterization of dissolved and particulate matter, including fluorescence constituents, in liquids. Spectral deconvolution (SDC) analysis of LSE spectral measurements can accurately retrieve information about individual fluorescent bands, such as can be attributed to chlorophyll-a (Chl-a), phycobiliprotein (PBP) pigments, or chromophoric dissolved organic matter (CDOM), among others. Improved physiological assessments of photosynthesizing organisms can use SDC analysis and temporal LSE measurements to assess variable fluorescence corrected for SDC-retrieved background fluorescence. Fluorescence assessments of Chl-a concentration based on LSE spectral measurements can be improved using photo-physiological information from temporal measurements. Quantitative assessments of PBP pigments, CDOM, and other fluorescent constituents, as well as basic structural characterizations of photosynthesizing populations, can be performed using SDC analysis of LSE spectral measurements.

  5. Empirical Bayes Analysis of Quantitative Proteomics Experiments

    PubMed Central

    Margolin, Adam A.; Ong, Shao-En; Schenone, Monica; Gould, Robert; Schreiber, Stuart L.; Carr, Steven A.; Golub, Todd R.

    2009-01-01

    Background Advances in mass spectrometry-based proteomics have enabled the incorporation of proteomic data into systems approaches to biology. However, development of analytical methods has lagged behind. Here we describe an empirical Bayes framework for quantitative proteomics data analysis. The method provides a statistical description of each experiment, including the number of proteins that differ in abundance between 2 samples, the experiment's statistical power to detect them, and the false-positive probability of each protein. Methodology/Principal Findings We analyzed 2 types of mass spectrometric experiments. First, we showed that the method identified the protein targets of small molecules in affinity purification experiments with high precision. Second, we re-analyzed a mass spectrometric data set designed to identify proteins regulated by microRNAs. Our results were supported by sequence analysis of the 3′ UTR regions of predicted target genes, and we found that the previously reported conclusion that a large fraction of the proteome is regulated by microRNAs was not supported by our statistical analysis of the data. Conclusions/Significance Our results highlight the importance of rigorous statistical analysis of proteomic data, and the method described here provides a statistical framework to robustly and reliably interpret such data. PMID:19829701
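To illustrate the flavour of empirical Bayes in this setting, here is a limma-style variance-moderation sketch (a common empirical Bayes device in quantitative proteomics; the paper's hierarchical model differs in detail, and all values below are synthetic):

```python
import numpy as np

# Per-protein sample variances from few replicates are very noisy; an
# empirical Bayes step shrinks them toward a pooled prior variance.
gen = np.random.default_rng(3)
n_proteins, n_reps = 500, 3
log_ratios = gen.normal(0.0, 1.0, size=(n_proteins, n_reps))
s2 = log_ratios.var(axis=1, ddof=1)   # noisy per-protein variances (2 df)

d = n_reps - 1                        # residual degrees of freedom
d0 = 4.0                              # prior df, chosen ad hoc here
s2_prior = s2.mean()                  # crude moment-based prior variance
s2_moderated = (d0 * s2_prior + d * s2) / (d0 + d)
```

The moderated variances have the same overall level but much less scatter, stabilizing downstream test statistics.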

  6. Temporal codes and computations for sensory representation and scene analysis.

    PubMed

    Cariani, Peter A

    2004-09-01

    This paper considers a space of possible temporal codes, surveys neurophysiological and psychological evidence for their use in nervous systems, and presents examples of neural timing networks that operate in the time-domain. Sensory qualities can be encoded temporally by means of two broad strategies: stimulus-driven temporal correlations (phase-locking) and stimulus-triggering of endogenous temporal response patterns. Evidence for stimulus-related spike timing patterns exists in nearly every sensory modality, and such information can be potentially utilized for representation of stimulus qualities, localization of sources, and perceptual grouping. Multiple strategies for temporal (time, frequency, and code-division) multiplexing of information for transmission and grouping are outlined. Using delays and multiplications (coincidences), neural timing networks perform time-domain signal processing operations to compare, extract and separate temporal patterns. Separation of synthetic double vowels by a recurrent neural timing network is used to illustrate how coherences in temporal fine structure can be exploited to build up and separate periodic signals with different fundamentals. Timing nets constitute a time-domain scene analysis strategy based on temporal pattern invariance rather than feature-based labeling, segregation and binding of channels. Further potential implications of temporal codes and computations for new kinds of neural networks are explored.
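The delay-and-coincidence operation at the heart of a timing net can be sketched as an all-order interval histogram over a phase-locked spike train (the spike train and all parameters below are invented for illustration):

```python
import numpy as np

# Counting spike pairs at each candidate delay recovers the common
# period carried by phase-locked spike timing.
gen = np.random.default_rng(4)
period = 10.0                                        # ms
spikes = np.sort(np.concatenate(
    [k * period + gen.normal(0.0, 0.1, 5) for k in range(40)]))

delays = np.arange(2.0, 30.0, 1.0)                   # delay-line bank (ms)
counts = np.array([np.sum(np.abs(spikes[:, None] - spikes[None, :] - d) < 0.3)
                   for d in delays])
best_delay = delays[np.argmax(counts)]               # strongest periodicity
```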

  7. Forecasting Antarctic Sea Ice Concentrations Using Results of Temporal Mixture Analysis

    NASA Astrophysics Data System (ADS)

    Chi, Junhwa; Kim, Hyun-Cheol

    2016-06-01

    Sea ice concentration (SIC) data acquired by passive microwave sensors at daily temporal frequencies over extended areas provide seasonal characteristics of sea ice dynamics and play a key role as an indicator of global climate trends; however, such long-term time series are typically challenging to study. Of the various advanced remote sensing techniques that address this issue, temporal mixture analysis (TMA) methods are often used to investigate the temporal characteristics of environmental factors, including, in the present study, SICs. This study aims to forecast daily SICs for one year using a combination of TMA and time series modeling in two stages. First, we identify temporally meaningful sea ice signatures, referred to as temporal endmembers, using machine learning algorithms; then we decompose each pixel in the time series data into a linear combination of temporal endmembers. Using the corresponding fractional abundances of these endmembers, we apply an autoregressive model fitted to the Antarctic SIC data for 1979 to 2013 to forecast SIC values for 2014. We compare the forecasts from the proposed approach against daily SIC data reconstructed from the actual fractional abundances derived by the pixel unmixing method and the temporal endmember signatures. The proposed method successfully forecasts new fractional abundance values, and the resulting images are qualitatively and quantitatively similar to the reference data.
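The second stage can be sketched as an autoregressive fit on one endmember's fractional-abundance series; the AR(1) form and the synthetic series below are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

# Simulate a mean-reverting abundance series, then fit AR(1) and forecast.
gen = np.random.default_rng(5)
n, phi_true, mu = 365, 0.8, 0.5
x = np.empty(n)
x[0] = mu
for t in range(1, n):
    x[t] = mu + phi_true * (x[t - 1] - mu) + gen.normal(0.0, 0.02)

# Least-squares AR(1) fit on centered values, then a one-step forecast.
m = x.mean()
phi = np.polyfit(x[:-1] - m, x[1:] - m, 1)[0]
forecast = m + phi * (x[-1] - m)
```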

  8. Quantitative analysis of protein turnover in plants.

    PubMed

    Nelson, Clark J; Li, Lei; Millar, A Harvey

    2014-03-01

    Proteins are constantly being synthesised and degraded as plant cells age and as plants grow, develop and adapt the proteome. Given that plants develop through a series of events from germination to fruiting and even undertake whole organ senescence, an understanding of protein turnover as a fundamental part of this process in plants is essential. Both synthesis and degradation processes are spatially separated in a cell across its compartmented structure. The majority of protein synthesis occurs in the cytosol, while synthesis of specific components occurs inside plastids and mitochondria. Degradation of proteins occurs in both the cytosol, through the action of the plant proteasome, and in organelles and lytic structures through different protease classes. Tracking the specific synthesis and degradation rate of individual proteins can be undertaken using stable isotope feeding and the ability of peptide MS to track labelled peptide fractions over time. Mathematical modelling can be used to follow the isotope signature of newly synthesised protein as it accumulates and natural abundance proteins as they are lost through degradation. Different technical and biological constraints govern the potential for the use of (13)C, (15)N, (2)H and (18)O for these experiments in complete labelling and partial labelling strategies. Future development of quantitative protein turnover analysis will involve analysis of protein populations in complexes and subcellular compartments, assessing the effect of PTMs and integrating turnover studies into wider system biology study of plants.
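The kinetics underlying stable-isotope turnover experiments can be sketched with a first-order model: after switching to a heavy label, the labelled fraction of a protein follows f(t) = 1 - exp(-k*t), and the turnover rate k is recovered from the time course (the time points and rate below are invented, noise-free values):

```python
import numpy as np

t = np.array([0.0, 6.0, 12.0, 24.0, 48.0])    # hours after label switch
k_true = 0.05                                  # turnover rate, per hour
f = 1 - np.exp(-k_true * t)                    # labelled peptide fraction

# Linearize: -ln(1 - f) = k*t, then least squares through the origin.
y = -np.log(1 - f)
k_est = np.dot(t, y) / np.dot(t, t)
```

In real data the same fit would be applied per peptide, with replicate noise and partial-labelling corrections on top.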

  9. Analysis of brain patterns using temporal measures

    DOEpatents

    Georgopoulos, Apostolos

    2015-08-11

    A set of brain data representing a time series of neurophysiologic activity acquired by spatially distributed sensors arranged to detect neural signaling of a brain (such as by the use of magnetoencephalography) is obtained. The set of brain data is processed to obtain a dynamic brain model based on a set of statistically-independent temporal measures, such as partial cross correlations, among groupings of different time series within the set of brain data. The dynamic brain model represents interactions between neural populations of the brain occurring close in time, such as with zero lag, for example. The dynamic brain model can be analyzed to obtain the neurophysiologic assessment of the brain. Data processing techniques may be used to assess structural or neurochemical brain pathologies.
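One way to compute zero-lag partial correlations between sensor time series, in the spirit of the patent's statistically-independent temporal measures, is via the precision (inverse covariance) matrix; the three synthetic channels below are invented for illustration:

```python
import numpy as np

def partial_correlations(ts):
    """Zero-lag partial correlations among sensor time series
    (rows = samples, columns = sensors) via the precision matrix."""
    prec = np.linalg.inv(np.cov(ts, rowvar=False))
    d = np.sqrt(np.diag(prec))
    pc = -prec / np.outer(d, d)
    np.fill_diagonal(pc, 1.0)
    return pc

# Three synthetic channels: z drives both x and y, so conditioning on z
# removes the apparent x-y coupling.
gen = np.random.default_rng(6)
z = gen.normal(size=5000)
x = z + 0.3 * gen.normal(size=5000)
y = z + 0.3 * gen.normal(size=5000)
pc = partial_correlations(np.column_stack([x, y, z]))
```

The raw x-y correlation is high, while the x-y partial correlation is near zero, which is exactly the distinction partial measures capture.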

  10. Temporal shape analysis via the spectral signature.

    PubMed

    Bernardis, Elena; Konukoglu, Ender; Ou, Yangming; Metaxas, Dimitris N; Desjardins, Benoit; Pohl, Kilian M

    2012-01-01

    In this paper, we adapt spectral signatures for capturing morphological changes over time. Advanced techniques for capturing temporal shape changes frequently rely on first registering the sequence of shapes and then analyzing the corresponding set of high dimensional deformation maps. Instead, we propose a simple encoding motivated by the observation that small shape deformations lead to minor refinements in the spectral signature composed of the eigenvalues of the Laplace operator. The proposed encoding does not require registration, since spectral signatures are invariant to pose changes. We apply our representation to the shapes of the ventricles extracted from 22 cine MR scans of healthy controls and Tetralogy of Fallot patients. We then measure the accuracy score of our encoding by training a linear classifier, which outperforms the same classifier based on volumetric measurements. PMID:23286031
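The pose-invariance of spectral signatures can be illustrated with a graph Laplacian as a stand-in for the Laplace operator used in the paper (the ring graph and its relabelling below are invented examples):

```python
import numpy as np

def laplacian_spectrum(adj, k=5):
    """Smallest k eigenvalues of the graph Laplacian: a signature that
    is invariant to how the shape's points happen to be ordered."""
    lap = np.diag(adj.sum(axis=1)) - adj
    return np.sort(np.linalg.eigvalsh(lap))[:k]

# Two isomorphic ring graphs in different vertex orders ("poses")
# share the same spectrum, so no registration is needed to compare them.
n = 8
ring = np.zeros((n, n))
for i in range(n):
    ring[i, (i + 1) % n] = ring[(i + 1) % n, i] = 1.0
perm = np.random.default_rng(7).permutation(n)
ring_permuted = ring[np.ix_(perm, perm)]
```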

  11. Combining qualitative and quantitative spatial and temporal information in a hierarchical structure: Approximate reasoning for plan execution monitoring

    NASA Technical Reports Server (NTRS)

    Hoebel, Louis J.

    1993-01-01

    The problem of plan generation (PG) and the problem of plan execution monitoring (PEM), including updating, queries, and resource-bounded replanning, have different reasoning and representation requirements. PEM requires the integration of qualitative and quantitative information. PEM is the receiving of data about the world in which a plan or agent is executing. The problem is to quickly determine the relevance of the data, the consistency of the data with respect to the expected effects, and whether execution should continue. Only spatial and temporal aspects of the plan are addressed for relevance in this work. Current temporal reasoning systems are deficient in either computational aspects or expressiveness. This work presents a hybrid qualitative and quantitative system that is fully expressive in its assertion language while offering certain computational efficiencies. To proceed, methods incorporating approximate reasoning using hierarchies, notions of locality, constraint expansion, and absolute parameters need to be used; these are shown to be well suited to the anytime nature of PEM.

  12. Applying temporal network analysis to the venture capital market

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.
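One temporal-network quantity mentioned above, the evolution of centrality, can be sketched by computing degree centrality per yearly snapshot. The firms and syndication ties below are invented for illustration:

```python
from collections import Counter

# Hypothetical syndication ties as (year, investor_a, investor_b) events
events = [
    (2010, "VC1", "VC2"),
    (2011, "VC1", "VC3"), (2011, "VC2", "VC3"),
    (2012, "VC1", "VC4"), (2012, "VC1", "VC5"), (2012, "VC3", "VC5"),
]

def degree_by_year(events):
    """Degree of each firm in each yearly snapshot of the temporal network."""
    snapshots = {}
    for year, a, b in events:
        deg = snapshots.setdefault(year, Counter())
        deg[a] += 1
        deg[b] += 1
    return snapshots

snaps = degree_by_year(events)
vc1_trajectory = [snaps[y]["VC1"] for y in sorted(snaps)]  # [1, 1, 2]
```

Tracking such trajectories across snapshots, rather than collapsing all ties into one static graph, is what distinguishes the temporal approach.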

  13. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  14. Error Propagation Analysis for Quantitative Intracellular Metabolomics

    PubMed Central

    Tillack, Jana; Paczia, Nicole; Nöh, Katharina; Wiechert, Wolfgang; Noack, Stephan

    2012-01-01

    Model-based analyses have become an integral part of modern metabolic engineering and systems biology in order to gain knowledge about complex and not directly observable cellular processes. For quantitative analyses, not only experimental data, but also measurement errors, play a crucial role. The total measurement error of any analytical protocol is the result of an accumulation of single errors introduced by several processing steps. Here, we present a framework for the quantification of intracellular metabolites, including error propagation during metabolome sample processing. Focusing on one specific protocol, we comprehensively investigate all currently known and accessible factors that ultimately impact the accuracy of intracellular metabolite concentration data. All intermediate steps are modeled, and their uncertainty with respect to the final concentration data is rigorously quantified. Finally, on the basis of a comprehensive metabolome dataset of Corynebacterium glutamicum, an integrated error propagation analysis for all parts of the model is conducted, and the most critical steps for intracellular metabolite quantification are detected. PMID:24957773
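The accumulation of single errors across processing steps is commonly modeled with first-order Gaussian error propagation. A minimal sketch for one step, a concentration computed as amount over volume, with hypothetical numbers:

```python
from math import sqrt

def propagate_ratio(n, sn, v, sv):
    """First-order (Gaussian) error propagation for c = n / v
    with independent uncertainties sn and sv."""
    c = n / v
    sc = c * sqrt((sn / n) ** 2 + (sv / v) ** 2)
    return c, sc

# Hypothetical metabolite amount (nmol) and extraction volume (mL)
amount, s_amount = 50.0, 2.0   # 4% relative error
volume, s_volume = 1.0, 0.03   # 3% relative error
conc, s_conc = propagate_ratio(amount, s_amount, volume, s_volume)
# relative error of the concentration: sqrt(0.04**2 + 0.03**2) = 0.05
```

Chaining this rule through every processing step yields the total uncertainty of the final concentration, which is the framework's purpose.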

  15. Quantitative gold nanoparticle analysis methods: A review.

    PubMed

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development on the preparation, characterization, and applications of gold nanoparticles (AuNPs) have burgeoned in recent years. Many of the techniques and protocols are very mature, but mass production and consumption of AuNP-based products raise two major concerns. First, how many AuNPs exist in a dispersion? Second, where do the AuNPs end up after digestion by the environment, and how many remain? To answer these two questions, reliable and reproducible methods are needed to analyze the presence and population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in moles of gold per liter) or population (in particles per mL) of AuNPs. The methods summarized include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count AuNPs directly or determine the total concentration of elemental gold in an AuNP dispersion.
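Converting between the two reporting conventions mentioned (moles of gold per liter versus particles per mL) requires an assumed particle size. A sketch assuming spherical, monodisperse particles; the sample values are hypothetical:

```python
from math import pi

N_A = 6.02214076e23   # Avogadro's number, 1/mol
RHO_AU = 19.3         # gold density, g/cm^3
M_AU = 196.97         # gold molar mass, g/mol

def particles_per_ml(molar_gold, diameter_nm):
    """Convert total gold molarity (mol Au / L) to particle number
    concentration (particles / mL), assuming spherical monodisperse AuNPs."""
    d_cm = diameter_nm * 1e-7
    atoms_per_particle = (RHO_AU * pi * d_cm ** 3 / 6) / (M_AU / N_A)
    particles_per_l = molar_gold * N_A / atoms_per_particle
    return particles_per_l / 1000.0

# e.g. 0.1 mM total gold dispersed as 15-nm particles
n = particles_per_ml(1e-4, 15.0)   # on the order of 1e11 particles per mL
```

The strong cubic dependence on diameter is why size characterization is inseparable from quantitative AuNP analysis.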

  16. Network Analysis Using Spatio-Temporal Patterns

    NASA Astrophysics Data System (ADS)

    Miranda, Gisele H. B.; Machicao, Jeaneth; Bruno, Odemir M.

    2016-08-01

    Different network models have been proposed over the years, inspired by real-world topologies. Characterizing these models requires understanding the underlying network phenomena, which involves both structural and dynamic properties. Several mathematical tools can be employed to characterize such properties, including Cellular Automata (CA), which can be defined as dynamical systems of discrete nature composed of spatially distributed units governed by deterministic rules. In this paper, we propose a method based on modeling one specific CA over distinct network topologies in order to classify the network model. The proposed methodology consists of modeling a binary totalistic CA over a network, where the transition function that governs each CA cell is based on the density of living neighbors. The distribution of the Shannon entropy is then obtained from the evolved spatio-temporal pattern of the CA and used as a network descriptor. The experiments were performed on a dataset composed of four types of networks: random, small-world, scale-free, and geographical. We used cross-validation for training purposes and evaluated the classification accuracy as a function of the initial number of living cells, and also as a function of a threshold parameter related to the density of living neighbors. The results show high accuracy in distinguishing among the network models, which demonstrates the feasibility of the proposed method.
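A minimal sketch of the method's two ingredients, a binary totalistic CA driven by the density of living neighbors and a Shannon-entropy descriptor of the resulting spatio-temporal pattern, on a toy ring network (threshold and initial state are illustrative):

```python
from collections import Counter
from math import log2

def step(state, neighbors, threshold=0.5):
    """Totalistic update: a node becomes alive iff the density of
    living neighbors meets the threshold."""
    new = {}
    for node, nbrs in neighbors.items():
        density = sum(state[v] for v in nbrs) / len(nbrs)
        new[node] = 1 if density >= threshold else 0
    return new

def shannon_entropy(bits):
    counts = Counter(bits)
    n = len(bits)
    return -sum(c / n * log2(c / n) for c in counts.values())

# A 6-node ring network with two initially living cells
ring = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
state = {0: 1, 1: 1, 2: 0, 3: 0, 4: 0, 5: 0}

# Evolve and record each node's state; the entropy of the evolved
# spatio-temporal pattern serves as the network descriptor
history = [state]
for _ in range(5):
    state = step(state, ring)
    history.append(state)
entropy = shannon_entropy([s[node] for s in history for node in range(6)])
```

On richer topologies the entropy distribution differs between random, small-world, scale-free, and geographical networks, which is what the classifier exploits.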

  17. Qualitative, quantitative and temporal study of cutting agents for cocaine and heroin over 9 years.

    PubMed

    Broséus, Julian; Gentile, Natacha; Bonadio Pont, Federica; Garcia Gongora, Juan Manuel; Gasté, Laëtitia; Esseiva, Pierre

    2015-12-01

    Forensic laboratories mainly focus on the qualification and the quantitation of the illicit drug under analysis as both aspects are used for judiciary purposes. Therefore, information related to cutting agents (adulterants and diluents) detected in illicit drugs is limited in the forensic literature. This article discusses the type and frequency of adulterants and diluents detected in more than 6000 cocaine specimens and 3000 heroin specimens, confiscated in western Switzerland from 2006 to 2014. The results show a homogeneous and quite unchanging adulteration for heroin, while for cocaine it could be characterised as heterogeneous and relatively dynamic. Furthermore, the results indicate that dilution affects more cocaine than heroin. Therefore, the results provided by this study tend to reveal differences between the respective structures of production or distribution of cocaine and heroin. This research seeks to promote the systematic analysis of cutting agents by forensic laboratories. Collecting and processing data related to the presence of cutting agents in illicit drug specimens produces relevant information to understand and to compare the structure of illicit drug markets. PMID:26448535

  19. Quantitative analysis of saccadic search strategy

    NASA Astrophysics Data System (ADS)

    Over, E. A. B.

    2007-06-01

    This thesis deals with the quantitative analysis of saccadic search strategy. The goal of the research presented was twofold: 1) to quantify overall characteristics of fixation location and saccade direction, and 2) to identify search strategies with the use of a quantitative description of eye movement parameters. Chapter 2 provides a method to quantify a general property of fixation locations. We proposed a quantitative measure based on Voronoi diagrams for characterizing the uniformity of fixation density. This measure may be thought of as indicating the clustering of fixations. We showed that during a visual search task, a structured (natural) background leads to higher clustering of fixations than a homogeneous background. In addition, in natural stimuli, a search task leads to higher clustering of fixations than the instruction to freely view the stimuli. Chapter 3 provides a method to identify the overall field of saccade directions in the viewing area. We extended the Voronoi method of chapter 2 so that it became possible to create vector maps, which indicate the preferred saccade direction for each position in the viewing area. Several measures of these vector maps were used to quantify the influence of observer-dependent and stimulus-dependent factors on saccade direction in a search task with natural scenes. The results showed that the influence of stimulus-dependent factors was larger than that of observer-dependent factors. In chapter 4 we showed that the border of the search area plays a role in the search strategy. In a search experiment with differently shaped areas we measured that search performance was poorer near the luminance edges of the search area. Fixation density, however, was higher in the edge region, and saccade direction was mainly along the edges of the search areas. In a target visibility experiment we established that the visibility of targets near a luminance edge is less than that of targets farther from the edge.
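Chapter 2's uniformity measure is built on Voronoi diagrams. As a lighter stand-in that captures the same intuition, the coefficient of variation of nearest-neighbour distances also separates uniform from clustered fixation patterns; the point sets below are synthetic:

```python
from math import dist

def clustering_index(points):
    """Coefficient of variation of nearest-neighbour distances:
    roughly constant spacing gives a low value, clustering a high one."""
    nn = []
    for i, p in enumerate(points):
        nn.append(min(dist(p, q) for j, q in enumerate(points) if j != i))
    n = len(nn)
    mean = sum(nn) / n
    var = sum((d - mean) ** 2 for d in nn) / n
    return (var ** 0.5) / mean

# Uniform "fixations" on a grid vs. a tight cluster plus isolated fixations
uniform = [(x, y) for x in range(4) for y in range(4)]
clustered = [(0.1 * x, 0.1 * y) for x in range(3) for y in range(3)] \
          + [(3.0, 3.0), (3.0, 0.0), (0.0, 3.0)]
```

The thesis's Voronoi-cell-area statistic plays the same role with better spatial coverage; this nearest-neighbour variant is only an illustration of quantifying fixation uniformity.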

  20. Quantitative and temporal definition of the Mla transcriptional regulon during barley-powdery mildew interactions.

    PubMed

    Moscou, Matthew J; Lauter, Nick; Caldo, Rico A; Nettleton, Dan; Wise, Roger P

    2011-06-01

    Barley Mildew resistance locus a (Mla) is a major determinant of immunity to the powdery mildew pathogen, Blumeria graminis f. sp. hordei. Alleles of Mla encode cytoplasmic- and membrane-localized coiled-coil, nucleotide binding site, leucine-rich repeat proteins that mediate resistance when complementary avirulence effectors (AVR(a)) are present in the pathogen. Presence of an appropriate AVR(a) protein triggers nuclear relocalization of MLA, where it binds repressive host transcription factors. Time-course expression profiles of plants harboring Mla1, Mla6, and Mla12 wild-type alleles versus paired loss-of-function mutants were compared to discover conserved transcriptional targets of MLA and downstream signaling cascades. Pathogen-dependent gene expression was equivalent or stronger in susceptible plants at 20 h after inoculation (HAI) and was attenuated at later timepoints, whereas resistant plants exhibited a time-dependent strengthening of the transcriptional response, increasing in both fold change and the number of genes differentially expressed. Deregulation at 20 HAI implicated 16 HAI as a crucial point in determining the future trajectory of this interaction, which was interrogated by quantitative analysis. In total, 28 potential transcriptional targets of the MLA regulon were identified. These candidate targets possess a diverse set of predicted functions, suggesting that multiple pathways are required to mediate the hypersensitive reaction.

  1. Geostatistical analysis of spatial and temporal variations of groundwater level.

    PubMed

    Ahmadi, Seyed Hamid; Sedghamiz, Abbas

    2007-06-01

    Groundwater and water resources management plays a key role in maintaining sustainable conditions in arid and semi-arid regions. Management tools that can reveal critical conditions are necessary given limitations such as labor and funding. In this study, a spatial and temporal analysis of monthly groundwater level fluctuations in 39 piezometric wells monitored over 12 years was carried out. Geostatistics, which has been introduced as a management and decision tool by many researchers, was applied to reveal the spatial and temporal structure of groundwater level fluctuations. Results showed a strong spatial and temporal structure for groundwater level fluctuations, indicated by very low nugget effects. Spatial analysis showed a strong structure of groundwater level drop across the study area, and temporal analysis showed that groundwater level fluctuations have temporal structure. On average, the range of the variograms for the spatial and temporal analyses was about 9.7 km and 7.2 months, respectively. Ordinary and universal kriging with cross-validation were applied to assess the accuracy of the chosen variograms in estimating the groundwater level drop and groundwater level fluctuations at the spatial and temporal scales, respectively. The results revealed that groundwater level drop and fluctuations were underestimated by 3% and 6% for the spatial and temporal analyses, respectively, which are low and acceptable errors and support the unbiasedness hypothesis of kriging. Although the spatial structure was slightly stronger than the temporal structure, estimation of groundwater level drop and fluctuations could be performed with low uncertainty at both scales. Moreover, the results showed that kriging is a capable tool for detecting critical regions that need more attention.
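The variogram ranges reported above come from fitting models to empirical semivariograms. A minimal 1-D sketch of the empirical semivariogram for a temporal series; the groundwater levels below are synthetic:

```python
def empirical_semivariogram(samples, lag, tol=0.5):
    """Average squared half-difference of values whose separation
    falls within lag +/- tol (1-D, e.g. a monthly time series)."""
    pairs = [
        0.5 * (v1 - v2) ** 2
        for i, (x1, v1) in enumerate(samples)
        for x2, v2 in samples[i + 1:]
        if abs(abs(x2 - x1) - lag) <= tol
    ]
    return sum(pairs) / len(pairs)

# Hypothetical monthly groundwater levels (month, depth in m):
# a declining trend plus a small alternating fluctuation
levels = [(t, 10 + 0.2 * t + (-1) ** t * 0.3) for t in range(24)]

gamma_1 = empirical_semivariogram(levels, lag=1, tol=0.4)
gamma_6 = empirical_semivariogram(levels, lag=6, tol=0.4)
# semivariance grows with lag for a structured (trending) series
```

Fitting a model (e.g. spherical) to such points yields the nugget, sill, and range that the study reports; a near-zero nugget is what indicates strong structure.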

  2. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density can be developed. Accurate measurements of heading distribution, using a rotating-polarization radar to enhance the wingbeat-frequency method of identification, are presented.

  3. Some Epistemological Considerations Concerning Quantitative Analysis

    ERIC Educational Resources Information Center

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that question the…

  4. Variability of Soil Temperature: A Spatial and Temporal Analysis.

    ERIC Educational Resources Information Center

    Walsh, Stephen J.; And Others

    1991-01-01

    Discusses an analysis of the relationship of soil temperatures at 3 depths to various climatic variables along a 200-kilometer transect in west-central Oklahoma. Reports that temperature readings increased from east to west. Concludes that temperature variations were explained by a combination of spatial, temporal, and biophysical factors. (SG)

  5. Quantitative analysis of planetary reflectance spectra with principal components analysis

    NASA Technical Reports Server (NTRS)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-01-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing over macroscopic mixing.
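The underlying model for the macroscopic (areal) mixing case is linear, so abundances can be recovered by least squares once the endmembers are fixed. A sketch assuming NumPy, with invented endmember spectra:

```python
import numpy as np

# Hypothetical endmember spectra (rows) sampled at 4 wavelengths
endmembers = np.array([
    [0.10, 0.20, 0.30, 0.40],   # e.g. a mature soil endmember
    [0.50, 0.45, 0.40, 0.35],   # e.g. a bright mineral endmember
])

# An observed spectrum mixed macroscopically: 70% / 30% by area
observed = 0.7 * endmembers[0] + 0.3 * endmembers[1]

# Solve observed ~ endmembers.T @ abundances in the least-squares sense
abundances, *_ = np.linalg.lstsq(endmembers.T, observed, rcond=None)
rms = np.sqrt(np.mean((abundances @ endmembers - observed) ** 2))
```

Intimate (microscopic) mixing is nonlinear in reflectance, which is why the paper distinguishes the two scales; the rms misfit plays the role of the 2 percent figure quoted above.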

  6. Quantitative analysis of planetary reflectance spectra with principal components analysis

    NASA Astrophysics Data System (ADS)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-02-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing over macroscopic mixing.

  7. Using quantitative acid-base analysis in the ICU.

    PubMed

    Lloyd, P; Freebairn, R

    2006-03-01

    The quantitative acid-base 'Strong Ion' calculator is a practical application of quantitative acid-base chemistry, as developed by Peter Stewart and Peter Constable. It quantifies the three independent factors that control acidity, calculates the concentration and charge of unmeasured ions, produces a report based on these calculations and displays a Gamblegram depicting measured ionic species. Used together with the medical history, quantitative acid-base analysis has advantages over traditional approaches.
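The first of Stewart's independent variables is the strong ion difference. A simplified sketch of the apparent SID from the common strong ions (Ca2+ and Mg2+ omitted for brevity; the values are typical normal plasma electrolytes):

```python
def apparent_sid(na, k, cl, lactate):
    """Apparent strong ion difference (mEq/L) from the common strong ions;
    simplified: Ca2+ and Mg2+ are omitted."""
    return (na + k) - (cl + lactate)

# Hypothetical normal plasma electrolytes, mEq/L
sid = apparent_sid(na=140, k=4, cl=102, lactate=1)   # 41 mEq/L
```

A full calculator would also quantify the other two independent variables (pCO2 and total weak acids) and compare apparent with effective SID to expose unmeasured ions.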

  8. Quantitative Chemical Analysis of Single Cells

    NASA Astrophysics Data System (ADS)

    Heien, Michael L.; Ewing, Andrew G.

    Exocytosis, the fusion of intracellular vesicles with the plasma membrane and subsequent release of vesicular contents, is important in intercellular communication. The release event is a rapid process (milliseconds); hence, detection of released chemicals requires a scheme that is both sensitive and temporally fast. Electrochemistry at carbon-fiber microelectrodes allows time-resolved exocytosis of electroactive catecholamines to be observed at very low levels. When coupled with constant-potential amperometry, the number of molecules released and the kinetics of quantal release can be determined. The rapid response time (milliseconds) of microelectrodes makes them well suited for monitoring the dynamic process of exocytosis.

  9. Quantitative analysis of diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) for brain disorders

    NASA Astrophysics Data System (ADS)

    Lee, Jae-Seung; Im, In-Chul; Kang, Su-Man; Goo, Eun-Hoe; Kwak, Byung-Joon

    2013-07-01

    This study aimed to quantitatively analyze data from diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) in patients with brain disorders and to assess its potential utility for analyzing brain function. DTI was obtained by performing 3.0-T magnetic resonance imaging for patients with Alzheimer's disease (AD) and vascular dementia (VD), and the data were analyzed using Matlab-based SPM software. The two-sample t-test was used for error analysis of the location of the activated pixels. We compared regions of white matter where the fractional anisotropy (FA) values were low and the apparent diffusion coefficients (ADCs) were increased. In the AD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right sub-lobar insula, and right occipital lingual gyrus, whereas the ADCs were significantly increased in the right inferior frontal gyrus and right middle frontal gyrus. In the VD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right limbic cingulate gyrus, and right sub-lobar caudate tail, whereas the ADCs were significantly increased in the left lateral globus pallidus and left medial globus pallidus. In conclusion, by using DTI and SPM analysis, we were able not only to determine the structural state of the regions affected by brain disorders but also to quantitatively analyze and assess brain function.
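The per-region comparison described here rests on the two-sample t-test applied voxel-wise by SPM. A minimal pooled-variance implementation with hypothetical FA values for one region:

```python
from math import sqrt

def two_sample_t(a, b):
    """Pooled-variance two-sample t statistic (as used per-voxel in SPM)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical FA values for one white-matter region
fa_controls = [0.48, 0.50, 0.52, 0.49, 0.51]
fa_patients = [0.40, 0.42, 0.41, 0.43, 0.39]
t = two_sample_t(fa_controls, fa_patients)   # large positive t: FA reduced in patients
```

SPM additionally corrects the resulting statistical map for the many comparisons across voxels, which this single-region sketch omits.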

  10. Quantitative proteomics analysis of adsorbed plasma proteins classifies nanoparticles with different surface properties and size

    SciTech Connect

    Zhang, Haizhen; Burnum, Kristin E.; Luna, Maria L.; Petritis, Brianne O.; Kim, Jong Seo; Qian, Weijun; Moore, Ronald J.; Heredia-Langner, Alejandro; Webb-Robertson, Bobbie-Jo M.; Thrall, Brian D.; Camp, David G.; Smith, Richard D.; Pounds, Joel G.; Liu, Tao

    2011-12-01

    In biofluids (e.g., blood plasma), nanoparticles are readily embedded in layers of proteins that can affect their biological activity and biocompatibility. Herein, we report a study on the interactions between human plasma proteins and nanoparticles with a controlled, systematic variation of properties, using stable isotope labeling and liquid chromatography-mass spectrometry (LC-MS) based quantitative proteomics. A novel protocol was developed to simplify the isolation of nanoparticle-bound proteins and improve reproducibility. Plasma proteins associated with polystyrene nanoparticles of three different surface chemistries and two sizes, as well as four different exposure times (for a total of 24 different samples), were identified and quantified by LC-MS analysis. Quantitative comparison of relative protein abundances was achieved by spiking an 18O-labeled 'universal reference' into each individually processed unlabeled sample as an internal standard, enabling simultaneous application of both label-free and isotopic-labeling quantitation across the sample set. Clustering analysis of the quantitative proteomics data resulted in a distinctive pattern that classifies the nanoparticles based on their surface properties and size. In addition, data from the temporal study indicated that the stable protein 'corona' isolated for the quantitative analysis appeared to form in less than 5 minutes. The comprehensive results obtained herein using quantitative proteomics have potential implications for predicting nanoparticle biocompatibility.

  11. Effective and efficient analysis of spatio-temporal data

    NASA Astrophysics Data System (ADS)

    Zhang, Zhongnan

    Spatio-temporal data mining, i.e., mining knowledge from large amounts of spatio-temporal data, is a highly demanding field because huge amounts of spatio-temporal data have been collected in various applications, ranging from remote sensing to geographical information systems (GIS), computer cartography, environmental assessment, and planning. The collected data far exceed human ability to analyze, which makes it crucial to develop analysis tools. Recent studies on data mining have extended its scope from relational and transactional datasets to spatial and temporal datasets. Among the various forms of spatio-temporal data, remote sensing images play an important role, owing to the growing number of satellites in orbit. In this dissertation, we propose two approaches to analyzing remote sensing data. The first applies association rule mining to image processing. Each image was divided into a number of image blocks, and a spatial relationship was built among these blocks during the dividing process. Because each image was shot in a time series, the large collection of images forms a spatio-temporal dataset. The second approach implements co-occurrence pattern discovery from these images; the generated patterns represent subsets of spatial features that are located together in space and time. A weather analysis is composed of individual analyses of several meteorological variables, including temperature, pressure, dew point, wind, clouds, and visibility. Local-scale models provide detailed analysis and forecasts of meteorological phenomena ranging from a few kilometers to about 100 kilometers in size. When some of these meteorological variables show particular change tendencies, severe weather will follow in most cases. Using association rule discovery, we found that changes in certain meteorological variables are tightly related to severe weather situations that subsequently occur.
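Association rules of the kind used above are ranked by support and confidence. A minimal sketch over invented, discretized weather observations:

```python
def support(transactions, itemset):
    """Fraction of transactions containing every item in itemset."""
    itemset = set(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """P(consequent | antecedent) estimated over the transactions."""
    return support(transactions, set(antecedent) | set(consequent)) \
         / support(transactions, antecedent)

# Hypothetical discretized weather observations, one set per time step
obs = [
    {"pressure_drop", "humidity_high", "storm"},
    {"pressure_drop", "humidity_high", "storm"},
    {"pressure_drop", "humidity_low"},
    {"pressure_steady", "humidity_high"},
]

# Rule: {pressure_drop, humidity_high} -> {storm}
conf = confidence(obs, {"pressure_drop", "humidity_high"}, {"storm"})
```

Algorithms such as Apriori simply enumerate candidate itemsets efficiently; the rule quality measures are exactly these two ratios.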

  12. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  13. Structural and quantitative analysis of Equisetum alkaloids.

    PubMed

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity for livestock. Several studies in the past addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are known alkaloids of E. palustre. An HPLC-ESI-MS/MS method in combination with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened by the HPLC-ESI-MS/MS method, and in most cases the full set of eight alkaloids was detected in all parts of the plant. A high variability of alkaloid content and distribution was found depending on plant organ, plant origin, and season, ranging from 88 to 597 mg/kg dry weight. However, palustrine (1) and palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation, and distribution analysis of Equisetum alkaloids was achieved.

  14. Investigation of Portevin-Le Chatelier band with temporal phase analysis of speckle interferometry

    NASA Astrophysics Data System (ADS)

    Jiang, Zhenyu; Zhang, Qingchuan; Wu, Xiaoping

    2003-04-01

    A new method combining temporal phase analysis with dynamic digital speckle pattern interferometry is proposed to study the Portevin-Le Chatelier (PLC) effect quantitatively. The principle is that the phase difference of interference speckle patterns is a time-dependent function related to the object deformation. The interference speckle patterns of the specimen are recorded at a high sampling rate while the PLC effect occurs, and the 2D displacement map of the PLC band and its width are obtained by analyzing the displacement of the specimen with the proposed method.

  15. Proxy-to-proxy calibration: Increasing the temporal resolution of quantitative climate reconstructions

    PubMed Central

    von Gunten, Lucien; D'Andrea, William J.; Bradley, Raymond S.; Huang, Yongsong

    2012-01-01

    High-resolution paleoclimate reconstructions are often restricted by the difficulties of sampling geologic archives in great detail and the analytical costs of processing large numbers of samples. Using sediments from Lake Braya Sø, Greenland, we introduce a new method that provides a quantitative high-resolution paleoclimate record by combining measurements of the alkenone unsaturation index (U37K') with non-destructive scanning reflectance spectroscopic measurements in the visible range (VIS-RS). The proxy-to-proxy (PTP) method exploits two distinct calibrations: the in situ calibration of U37K' to lake water temperature and the calibration of scanning VIS-RS data to down-core U37K' data. Using this approach, we produced a quantitative temperature record that is longer and has 5 times higher sampling resolution than the original U37K' time series, thereby allowing detection of temperature variability in frequency bands characteristic of the AMO over the past 7,000 years. PMID:22934132
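The PTP chain amounts to composing two calibrations: the scanning proxy is calibrated to the destructive proxy, which is itself calibrated to temperature. A sketch with invented, exactly linear calibration data (the real calibrations involve scatter and uncertainty estimates):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) \
          / sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

# Step 1: calibrate the destructive proxy to temperature (hypothetical values)
uk37 = [0.20, 0.30, 0.40, 0.50]
temp = [6.0, 9.0, 12.0, 15.0]
a1, b1 = linear_fit(uk37, temp)

# Step 2: calibrate the high-resolution scanning proxy to the destructive proxy
vis_rs = [1.0, 2.0, 3.0, 4.0]
a2, b2 = linear_fit(vis_rs, uk37)

# Chain the calibrations: temperature at every scanned depth
def temp_from_scan(v):
    return a1 * (a2 * v + b2) + b1

t = temp_from_scan(2.5)
```

Because the scanning measurement is non-destructive and fast, the chained calibration can be evaluated at far more depths than could ever be sampled for the destructive proxy, which is the source of the five-fold resolution gain.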

  16. Quantitative imaging with Fucci and mathematics to uncover temporal dynamics of cell cycle progression.

    PubMed

    Saitou, Takashi; Imamura, Takeshi

    2016-01-01

    Cell cycle progression is strictly coordinated to ensure proper tissue growth, development, and regeneration of multicellular organisms. Spatiotemporal visualization of cell cycle phases directly helps us to obtain a deeper understanding of controlled, multicellular, cell cycle progression. The fluorescent ubiquitination-based cell cycle indicator (Fucci) system allows us to monitor, in living cells, the G1 and the S/G2/M phases of the cell cycle in red and green fluorescent colors, respectively. Since the discovery of Fucci technology, it has found numerous applications in the characterization of the timing of cell cycle phase transitions under diverse conditions and various biological processes. However, due to the complexity of cell cycle dynamics, understanding of specific patterns of cell cycle progression is still far from complete. In order to tackle this issue, quantitative approaches combined with mathematical modeling seem to be essential. Here, we review several studies that attempted to integrate Fucci technology and mathematical models to obtain quantitative information regarding cell cycle regulatory patterns. Focusing on the technological development of utilizing mathematics to retrieve meaningful information from the Fucci producing data, we discuss how the combined methods advance a quantitative understanding of cell cycle regulation.

  17. Quantitative data analysis of ESAR data

    NASA Astrophysics Data System (ADS)

    Phruksahiran, N.; Chandra, M.

    2013-07-01

    Synthetic aperture radar (SAR) data processing uses the backscattered electromagnetic wave to map the radar reflectivity of the ground surface. The polarization property in radar remote sensing has been used successfully in many applications, especially in target decomposition. This paper presents a case study of experiments performed on ESAR L-band fully polarized data sets from the German Aerospace Center (DLR) to demonstrate the potential of coherent target decomposition and the possibility of using weather radar measurement parameters, such as the differential reflectivity and the linear depolarization ratio, to obtain quantitative information about the ground surface. The raw ESAR data were processed with a SAR simulator developed in MATLAB using the Range-Doppler algorithm.
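
    The heart of the Range-Doppler algorithm is pulse (range) compression: the raw echo is correlated against the transmitted chirp, so a point target collapses to a peak at its range bin. A minimal single-target sketch using a time-domain matched filter (the chirp rate, length, and target delay are arbitrary illustration values, not the ESAR waveform; production code would do this in the frequency domain):

```python
import cmath

# Range-compression sketch: correlate the received echo with the conjugate
# of the transmitted linear-FM chirp. The compressed peak lands at the
# target's delay (range bin).

n = 64
chirp = [cmath.exp(1j * cmath.pi * 0.008 * k * k) for k in range(n)]
delay = 10                       # hypothetical point target at range bin 10
echo = [0j] * n
echo[delay:] = chirp[:n - delay] # delayed (truncated) copy of the chirp

def matched_filter(signal, ref):
    """|cross-correlation| of signal with ref at every non-negative lag."""
    out = []
    for lag in range(len(signal)):
        acc = 0j
        for m in range(lag, len(signal)):
            acc += signal[m] * ref[m - lag].conjugate()
        out.append(abs(acc))
    return out

profile = matched_filter(echo, chirp)
peak = profile.index(max(profile))   # recovered target range bin
```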

  18. Qualitative and quantitative analysis of endocytic recycling.

    PubMed

    Reineke, James B; Xie, Shuwei; Naslavsky, Naava; Caplan, Steve

    2015-01-01

    Endocytosis, which encompasses the internalization and sorting of plasma membrane (PM) lipids and proteins to distinct membrane-bound intracellular compartments, is a highly regulated and fundamental cellular process by which eukaryotic cells dynamically regulate their PM composition. Indeed, endocytosis is implicated in crucial cellular processes that include proliferation, migration, and cell division as well as maintenance of tissue homeostasis such as apical-basal polarity. Once PM constituents have been taken up into the cell, either via clathrin-dependent endocytosis (CDE) or clathrin-independent endocytosis (CIE), they typically have two fates: degradation through the late-endosomal/lysosomal pathway or returning to the PM via endocytic recycling pathways. In this review, we will detail experimental procedures that allow for both qualitative and quantitative assessment of endocytic recycling of transmembrane proteins internalized by CDE and CIE, using the HeLa cervical cancer cell line as a model system. PMID:26360033

  19. Quantitative infrared analysis of hydrogen fluoride

    SciTech Connect

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant where hydrogen fluoride is produced upon the hydrolysis of UF{sub 6}. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered to be a non-ideal gas for many years. D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives. (1) Absorbance at 3877 cm{sup -1} as a function of pressure for 100% HF. (2) Absorbance at 3877 cm{sup -1} as a function of increasing partial pressure HF. Total pressure = 300 mm HgA maintained with nitrogen. (3) Absorbance at 3877 cm{sup -1} for constant partial pressure HF. Total pressure is increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm{sup -1} can be quantitatively analyzed via infrared methods.
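
    In the ideal-gas regime the quantitation logic is a linear calibration: absorbance at 3877 cm{sup -1} is proportional to HF partial pressure, and mole percent follows from the total pressure. A sketch with a hypothetical calibration constant (not the plant's measured value):

```python
# Sketch of IR quantitation in the ideal-gas range (HF partial pressure up
# to ~35 mm HgA): A = K * p_HF, so mole percent follows from one-point
# calibration. K below is a hypothetical constant, not a measured value.

K = 0.020  # absorbance per mm HgA of HF (hypothetical calibration slope)

def hf_partial_pressure(absorbance):
    """Invert the linear calibration A = K * p_HF."""
    return absorbance / K

def mole_percent(absorbance, total_pressure_mmhg):
    """Mole % HF from absorbance and total pressure."""
    return 100.0 * hf_partial_pressure(absorbance) / total_pressure_mmhg

mp = mole_percent(0.30, 300.0)   # e.g. A = 0.30 at 300 mm HgA total
```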

  20. Quantitative multi-modal NDT data analysis

    SciTech Connect

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, practitioners often resort to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  1. Using fire tests for quantitative risk analysis

    SciTech Connect

    Ling, W.C.T.; Williamson, R.B.

    1980-03-01

    Fires can be considered a causal chain-of-events in which the growth and spread of fire may cause damage and injury if it is rapid enough to overcome the barriers placed in its way. Fire tests for fire resistance of the barriers can be used in a quantitative risk assessment. The fire growth and spread is modelled in a State Transition Model (STM). The fire barriers are presented as part of the Fire Protection Model (FPM) which is based on a portion of the NFPA Decision Tree. An Emergency Equivalent Network is introduced to couple the Fire Growth Model (FGM) and the FPM so that the spread of fire beyond the room-of-origin can be computed. An example is presented in which a specific building floor plan is analyzed to obtain the shortest expected time for fire to spread between two points. To obtain the probability and time for each link in the network, data from the results of fire tests were used. These results were found to be lacking and new standards giving better data are advocated.
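
    The "shortest expected time for fire to spread between two points" is a shortest-path computation over the equivalent network, with edge weights taken from expected barrier failure times. A sketch using Dijkstra's algorithm on a hypothetical floor plan (the times, in minutes, are illustrative, not fire-test data):

```python
import heapq

# Sketch of the shortest-expected-time computation over an equivalent
# network: nodes are spaces, edges carry expected fire-barrier failure
# times, and Dijkstra's algorithm finds the fastest spread path.

def shortest_expected_time(graph, src, dst):
    """graph: {node: [(neighbor, expected_minutes), ...]}"""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

rooms = {                                 # hypothetical floor plan
    "origin": [("corridor", 20.0), ("room2", 45.0)],
    "corridor": [("room2", 10.0), ("exit", 30.0)],
    "room2": [("exit", 15.0)],
}
t = shortest_expected_time(rooms, "origin", "exit")
```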

  2. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    PubMed Central

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H; Jacobsen, Christina; Vainer, Ben

    2016-01-01

    Background: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including staining may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a mean to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is amply described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework. PMID:27141321

  3. The quantitative failure of human reliability analysis

    SciTech Connect

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In fact, the author attacks historic and current HRA as having failed to inform policy makers who must make risk-based decisions about the human contribution to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, advocating a systems analysis process that employs cogent heuristics when using opinion and tempers itself with rational debate over the weight given to subjective and empirical probabilities.

  4. Statistical analysis of the temporal properties of BL Lacertae

    NASA Astrophysics Data System (ADS)

    Guo, Yu Cheng; Hu, Shao Ming; Li, Yu Tong; Chen, Xu

    2016-08-01

    A comprehensive temporal analysis has been performed on optical light curves of BL Lacertae in the B, V and R bands. The light curves were denoised by Gaussian smoothing and decomposed into individual flares using an exponential profile. The asymmetry, duration, peak flux and equivalent energy output of flares were measured and the frequency distributions presented. Most optical flares of BL Lacertae are highly symmetric, with a weak tendency towards gradual rises and rapid decays. The distribution of flare durations is not random, but consistent with a gamma distribution. Peak fluxes and energy outputs of flares all follow a log-normal distribution. A positive correlation is detected between flare durations and peak fluxes. The temporal properties of BL Lacertae provide evidence of the stochastic magnetohydrodynamic process in the accretion disc and jet. The results presented here can serve as constraints on physical models attempting to interpret blazar variations.
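
    Flare decompositions of this kind typically model each flare as an exponential rise and decay around the peak, with an asymmetry index built from the two time constants. A sketch under that assumption (the profile and sign convention here are one common choice, not necessarily the paper's exact definitions):

```python
import math

# Sketch of an exponential flare profile and an asymmetry index.
# All numeric values are illustrative, not fitted light-curve data.

def flare(t, peak_flux, t_peak, tau_rise, tau_decay):
    """Exponential rise before the peak, exponential decay after it."""
    tau = tau_rise if t <= t_peak else tau_decay
    return peak_flux * math.exp(-abs(t - t_peak) / tau)

def asymmetry(tau_rise, tau_decay):
    """0 for a symmetric flare; negative for fast rise / slow decay,
    positive for gradual rise / rapid decay (one common convention)."""
    return (tau_rise - tau_decay) / (tau_rise + tau_decay)

a = asymmetry(6.0, 2.0)   # gradual rise, rapid decay -> positive index
```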

  5. A Quantitative Analysis of Countries' Research Strengths

    ERIC Educational Resources Information Center

    Saxena, Anurag; Brazer, S. David; Gupta, B. M.

    2009-01-01

    This study employed a multidimensional analysis to evaluate transnational patterns of scientific research to determine relative research strengths among widely varying nations. Findings from this study may inform national policy with regard to the most efficient use of scarce national research resources, including government and private funding.…

  6. Temporal SILAC-based quantitative proteomics identifies host factors involved in chikungunya virus replication.

    PubMed

    Treffers, Emmely E; Tas, Ali; Scholte, Florine E M; Van, Myrthe N; Heemskerk, Matthias T; de Ru, Arnoud H; Snijder, Eric J; van Hemert, Martijn J; van Veelen, Peter A

    2015-07-01

    Chikungunya virus (CHIKV) is an arthropod-borne reemerging human pathogen that generally causes a severe persisting arthritis. Since 2005, the virus has infected millions of people during outbreaks in Africa, Indian Ocean Islands, Asia, and South/Central America. Many steps of the replication and expression of CHIKV's 12-kb RNA genome are highly dependent on cellular factors, which thus constitute potential therapeutic targets. SILAC and LC-MS/MS were used to define the temporal dynamics of the cellular response to infection. Using samples harvested at 8, 10, and 12 h postinfection, over 4700 proteins were identified and per time point 2800-3500 proteins could be quantified in both biological replicates. At 8, 10, and 12 h postinfection, 13, 38, and 106 proteins, respectively, were differentially expressed. The majority of these proteins showed decreased abundance. Most subunits of the RNA polymerase II complex were progressively degraded, which likely contributes to the transcriptional host shut-off observed during CHIKV infection. Overexpression of four proteins that were significantly downregulated (Rho family GTPase 3 (Rnd3), DEAD box helicase 56 (DDX56), polo-like kinase 1 (Plk1), and ubiquitin-conjugating enzyme E2C (UbcH10)) reduced susceptibility of cells to CHIKV infection, suggesting that infection-induced downregulation of these proteins is beneficial for CHIKV replication. All MS data have been deposited in the ProteomeXchange with identifier PXD001330 (http://proteomecentral.proteomexchange.org/dataset/PXD001330).
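
    At its core, the differential-expression call behind such counts is a fold-change test on SILAC ratios replicated across experiments. A simplified sketch (the ratios and cutoff are hypothetical illustrations; the study's actual statistics are more involved):

```python
from math import log2

# Sketch of flagging differentially expressed proteins from SILAC ratios:
# a protein whose infected/mock ratio passes the fold-change cutoff, in the
# same direction, in both replicates is called regulated.

def differentially_expressed(ratios_rep1, ratios_rep2, fold=1.5):
    cutoff = log2(fold)
    hits = []
    for protein in ratios_rep1:
        r1 = log2(ratios_rep1[protein])
        r2 = log2(ratios_rep2[protein])
        if (r1 <= -cutoff and r2 <= -cutoff) or (r1 >= cutoff and r2 >= cutoff):
            hits.append(protein)
    return hits

rep1 = {"POLR2A": 0.4, "Rnd3": 0.5, "ACTB": 1.05}   # hypothetical ratios
rep2 = {"POLR2A": 0.45, "Rnd3": 0.55, "ACTB": 0.95}
hits = differentially_expressed(rep1, rep2)
```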

  7. Quantitative analysis of cascade impactor samples - revisited

    NASA Astrophysics Data System (ADS)

    Orlić, I.; Chiam, S. Y.; Sanchez, J. L.; Tang, S. M.

    1999-04-01

    Concentrations of aerosols collected in Singapore during the three-month-long haze period that affected the whole South-East Asian region in 1997 are reported. Aerosol samples were continuously collected by using a fine aerosol sampler (PM2.5) and occasionally with a single orifice cascade impactor (CI) sampler. Our results show that in the fine fraction (<2.5 μm) the concentrations of two well-known biomass burning products, i.e. K and S, were generally increased by a factor of 2-3 compared to the non-hazy periods. However, a discrepancy was noticed, at least for elements with lower atomic number (Ti and below), between the results obtained by the fine aerosol sampler and the cascade impactor. Careful analysis by means of Nuclear Microscopy, in particular by the Scanning Transmission Ion Microscopy (STIM) technique, revealed that thicknesses of the lower CI stages exceeded thick-target limits for 2 MeV protons. Detailed depth profiles of all CI stages were therefore measured using the STIM technique and concentrations corrected for absorption and proton energy loss. After correcting results for the actual sample thickness, concentrations of all major elements (S, Cl, K, Ca) agreed much better with the PM2.5 results. The importance of implementing thick-target corrections in the analysis of CI samples, especially those collected in urban environments, is emphasized. A broad-beam PIXE analysis approach is certainly not adequate in these cases.

  8. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  9. Multiscale recurrence analysis of spatio-temporal data

    NASA Astrophysics Data System (ADS)

    Riedl, M.; Marwan, N.; Kurths, J.

    2015-12-01

    The description and analysis of spatio-temporal dynamics is a crucial task in many scientific disciplines. In this work, we propose a method which uses the mapogram as a similarity measure between spatially distributed data instances at different time points. The resulting similarity values of the pairwise comparison are used to construct a recurrence plot in order to benefit from established tools of recurrence quantification analysis and recurrence network analysis. In contrast to other recurrence tools for this purpose, the mapogram approach allows a specific focus on different spatial scales, which can be used in a multi-scale analysis of spatio-temporal dynamics. We illustrate this approach by applying it to mixed dynamics, such as traveling parallel wave fronts with additive noise, as well as to more complicated examples: pseudo-random numbers and coupled map lattices with a semi-logistic mapping rule. The complicated examples in particular show the usefulness of the multi-scale consideration in taking spatial patterns of different scales and different rhythms into account. This mapogram approach thus promises new insights into problems of climatology, ecology, and medicine.
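
    The construction is generic: any pairwise similarity measure between spatial snapshots yields a recurrence matrix by thresholding the comparisons. A sketch with a simple mean absolute difference standing in for the mapogram measure described in the abstract:

```python
# Sketch of building a recurrence matrix from pairwise comparison of
# spatial snapshots. The dissimilarity used here (mean absolute difference)
# is a stand-in for the mapogram measure, chosen for illustration.

def dissimilarity(a, b):
    """Mean absolute difference between two equally sized spatial fields."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def recurrence_matrix(snapshots, eps):
    """R[i][j] = 1 when snapshots i and j are within tolerance eps."""
    n = len(snapshots)
    return [[1 if dissimilarity(snapshots[i], snapshots[j]) <= eps else 0
             for j in range(n)] for i in range(n)]

snaps = [[0.0, 1.0], [0.1, 1.1], [5.0, 6.0]]   # toy spatial fields
R = recurrence_matrix(snaps, eps=0.5)
```

    Recurrence quantification measures (recurrence rate, determinism, and so on) are then computed from R; varying the spatial scale at which snapshots are compared gives the multi-scale view.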

  10. Quantitative and temporal relationships of egg production and sound production by black drum Pogonias cromis.

    PubMed

    Locascio, J V; Burghart, S; Mann, D A

    2012-09-01

    The timing and levels of black drum Pogonias cromis sound production and egg production were compared in an estuarine canal basin of Cape Coral in south-west Florida. Surface plankton samples were collected hourly from 1800 to 0400 on two consecutive nights while continuous acoustic recordings were made simultaneously at five locations in the canal basin. Five pairs of nights were sampled during a part of the spawning season from late January to early April 2006. Pogonias cromis sound production and egg production occurred on all evenings that samples were collected; however, both the timing and levels of sound production were negatively associated with those of egg production. Egg production estimates ranged from a low of 4·8 eggs m(-3) in February to a high of 2889·2 eggs m(-3) in April. Conversely, maximum nightly sound pressure levels (SPL) ranged from a low of 89·5 dB in April to a high of 131·9 dB (re: 1 µPa) in February. The temporal centre of sound production was relatively stable among all nights sampled but spawning occurred earlier in the day as the season progressed and exhibited a strong, positive association with increased water temperature. The negative relationship between the levels of sound production and egg production was unexpected given the usefulness of sound production as a proxy for reproduction on a seasonal basis and may possibly be explained by differences in the spawning potential of the female population in the study area on nights sampled. Egg mortality rates increased throughout the season and were positively associated with densities of hydrozoans and ctenophores. PMID:22957862

  11. Two developmentally temporal quantitative trait loci underlie convergent evolution of increased branchial bone length in sticklebacks.

    PubMed

    Erickson, Priscilla A; Glazer, Andrew M; Cleves, Phillip A; Smith, Alyson S; Miller, Craig T

    2014-08-01

    In convergent evolution, similar phenotypes evolve repeatedly in independent populations, often reflecting adaptation to similar environments. Understanding whether convergent evolution proceeds via similar or different genetic and developmental mechanisms offers insight towards the repeatability and predictability of evolution. Oceanic populations of threespine stickleback fish, Gasterosteus aculeatus, have repeatedly colonized countless freshwater lakes and streams, where new diets lead to morphological adaptations related to feeding. Here, we show that heritable increases in branchial bone length have convergently evolved in two independently derived freshwater stickleback populations. In both populations, an increased bone growth rate in juveniles underlies the convergent adult phenotype, and one population also has a longer cartilage template. Using F2 crosses from these two freshwater populations, we show that two quantitative trait loci (QTL) control branchial bone length at distinct points in development. In both populations, a QTL on chromosome 21 controls bone length throughout juvenile development, and a QTL on chromosome 4 controls bone length only in adults. In addition to these similar developmental profiles, these QTL show similar chromosomal locations in both populations. Our results suggest that sticklebacks have convergently evolved longer branchial bones using similar genetic and developmental programmes in two independently derived populations.
  13. Temporal Analysis of Atmospheric Data Using Open Standards

    NASA Astrophysics Data System (ADS)

    Campalani, P.; Beccati, A.; Mantovani, S.; Baumann, P.

    2014-04-01

    The continuous growth of remotely sensed data raises the need for efficient ways of accessing data archives. The classical model of accessing remote sensing (satellite) archives via distribution of large files is increasingly making way for a more dynamic and interactive data service. A challenge, though, is interoperability of such services, in particular when multi-dimensional data and advanced processing are involved. Individually crafted service interfaces typically do not allow substitution and combination of services. Open standards can provide a way forward if they are powerful enough to address both the data and the processing model. The OGC Web Coverage Service (WCS) is a modular service suite which provides high-level interface definitions for data access, subsetting, filtering, and processing of spatio-temporal raster data. WCS-based service interfaces to data archives deliver data in their original semantics useful for further client-side processing, as opposed to the Web Map Service (WMS) (de la Beaujardière, 2006) which performs a pre-rendering into images only useful for display to humans. In this paper we present a case study where the OGC coverage data and service model defines the client/server interface for a climate data service. In particular, we show how flexible temporal analysis can be performed efficiently on massive spatio-temporal coverage objects. This service, which is operational on a multi-terabyte data holding, has been established as part of the EarthServer initiative focusing on Big Data in the Earth and Planetary sciences.
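
    Server-side temporal subsetting of this kind is expressed in a WCS 2.0 GetCoverage request through `subset` parameters. A sketch of building such a request (the endpoint and coverage name are hypothetical; the KVP parameters follow the WCS 2.0 core model, and a real client would also percent-encode the subset expressions):

```python
from urllib.parse import urlencode

# Sketch of a WCS 2.0 GetCoverage request with a temporal trim. The
# endpoint URL and coverage name are invented for illustration; the axis
# labels (ansi, Lat, Long) depend on the server's coordinate reference system.

base = "https://example.org/wcs"                      # hypothetical endpoint
params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "ozone_concentration",              # hypothetical coverage
}
# Trim the time axis and a spatial window; the server evaluates these
# subsets and returns only the requested slab of the coverage.
subsets = ['subset=ansi("2008-01-01","2008-12-31")',
           "subset=Lat(40,60)",
           "subset=Long(0,20)"]
url = base + "?" + urlencode(params) + "&" + "&".join(subsets)
```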

  14. Quantitative analysis of heart rate variability

    NASA Astrophysics Data System (ADS)

    Kurths, J.; Voss, A.; Saparin, P.; Witt, A.; Kleiner, H. J.; Wessel, N.

    1995-03-01

    In modern industrialized countries, several hundred thousand people die every year from sudden cardiac death. The individual risk of sudden cardiac death cannot be defined precisely by commonly available noninvasive diagnostic tools such as Holter monitoring, highly amplified ECG, and traditional linear analysis of heart rate variability (HRV). Therefore, we apply some rather unconventional methods of nonlinear dynamics to analyze the HRV. In particular, complexity measures based on symbolic dynamics, as well as a new measure, the renormalized entropy, detect abnormalities in the HRV of several patients who had been classified in the low-risk group by traditional methods. A combination of these complexity measures with the parameters in the frequency domain seems to be a promising way to reach a more precise definition of the individual risk. These findings have to be validated in a representative number of patients.
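
    A symbolic-dynamics complexity measure of the kind described can be sketched as: coarse-grain the RR-interval series into symbols, form short words, and take the Shannon entropy of the word distribution as the complexity index. The symbol rule and toy data below are illustrative only, not the paper's exact definitions:

```python
import math
from collections import Counter

# Sketch of a symbolic-dynamics HRV complexity measure: binary symbols
# (interval lengthened vs. shortened), 3-symbol words, Shannon word entropy.

def symbolize(rr):
    """1 if the RR interval lengthened versus its predecessor, else 0."""
    return [1 if b > a else 0 for a, b in zip(rr, rr[1:])]

def word_entropy(symbols, word_len=3):
    """Shannon entropy (bits) of the distribution of overlapping words."""
    words = [tuple(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    counts = Counter(words)
    total = len(words)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

rr = [800, 810, 790, 805, 795, 820, 815, 800]   # toy RR intervals (ms)
h = word_entropy(symbolize(rr))
```

    Low word entropy flags overly regular (low-complexity) heart rhythms; more elaborate schemes use multi-level symbol alphabets and forbidden-word counts.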

  15. Quantitative BOLD imaging at 3T: Temporal changes in hepatocellular carcinoma and fibrosis following oxygen challenge

    PubMed Central

    Priest, Andrew N.; Bowden, David J.; Wallace, Tess E.; Patterson, Ilse; Graves, Martin J.; Lomas, David J.

    2016-01-01

    Purpose To evaluate the utility of oxygen challenge and report on temporal changes in blood oxygenation level‐dependent (BOLD) contrast in normal liver, hepatocellular carcinoma (HCC) and background fibrosis. Materials and Methods Eleven volunteers (nine male and two female, mean age 33.5, range 27–41 years) and 10 patients (nine male and one female, mean age 68.9, range 56–87 years) with hepatocellular carcinoma on a background of diffuse liver disease were recruited. Imaging was performed on a 3T system using a multiphase, multiecho, fast gradient echo sequence. Oxygen was administered via a Hudson mask after 2 minutes of free‐breathing. Paired t‐tests were performed to determine if the mean pre‐ and post‐O2 differences were statistically significant. Results In patients with liver fibrosis (n = 8) the change in T2* following O2 administration was elevated (0.88 ± 0.582 msec, range 0.03–1.69 msec) and the difference was significant (P = 0.004). The magnitude of the BOLD response in patients with HCC (n = 10) was larger; however, the response was more variable (1.07 ± 1.458 msec, range –0.93 to 3.26 msec), and the difference was borderline significant (P = 0.046). The BOLD response in the volunteer cohort was not significant (P = 0.121, 0.59 ± 1.162 msec, range –0.81 to 2.44 msec). Conclusion This work demonstrates that the BOLD response following oxygen challenge within cirrhotic liver is consistent with a breakdown in vascular autoregulatory mechanisms. Similarly, the elevated BOLD response within HCC is consistent with the abnormal capillary vasculature within tumors and the arterialization of the blood supply. Our results suggest that oxygen challenge may prove a viable BOLD contrast mechanism in the liver. J. Magn. Reson. Imaging 2016;44:739–744. PMID:26892734
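
    The statistical comparison here is a standard paired t-test on per-subject pre/post differences, which can be sketched as follows (the T2* values are invented for illustration, not the study's measurements):

```python
from math import sqrt
from statistics import mean, stdev

# Sketch of the paired comparison of pre- vs post-O2 T2* values:
# t = mean(differences) / (sd(differences) / sqrt(n)). The resulting t
# would be looked up against the t distribution with n - 1 df for a P value.

def paired_t(pre, post):
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))

pre_t2star = [20.1, 18.4, 22.0, 19.5, 21.2]     # msec, hypothetical
post_t2star = [21.0, 19.3, 22.6, 20.6, 22.1]
t_stat = paired_t(pre_t2star, post_t2star)
```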

  16. Quantitative surface spectroscopic analysis of multicomponent polymers

    NASA Astrophysics Data System (ADS)

    Zhuang, Hengzhong

    Angle-dependent electron spectroscopy for chemical analysis (ESCA) has been successfully used to examine the surface compositional gradient of a multicomponent polymer. However, photoelectron intensities detected at each take-off angle of ESCA measurements are convoluted signals. The convoluted nature of the signal distorts depth profiles for samples having compositional gradients. To recover the true concentration profiles for the samples, a deconvolution program has been described in Chapter 2. The compositional profiles of two classes of important multicomponent polymers, i.e., poly(dimethylsiloxane urethane) (PU-DMS) segmented copolymers and fluorinated poly(amide urethane) block copolymers, are achieved using this program. The effects of the polymer molecular structure and the processing variation on its surface compositional profile have been studied. Besides surface composition, it is desirable to know whether the distribution of segment or block lengths at the surface is different than in the bulk, because this aspect of surface structure may lead to properties different than those predicted simply by knowledge of the surface composition and the bulk structure. In Chapter 3, we pioneered the direct determination of the distribution of polydimethylsiloxane (PDMS) segment lengths at the surface of PU-DMS using time-of-flight secondary ion mass spectrometry (ToF-SIMS). Exciting preliminary results are provided: for the thick film of PU-DMS with nominal MW of PDMS = 1000, the distribution of the PDMS segment lengths at the surface is nearly identical to that in the bulk, whereas in the case of the thick films of PU-DMS with nominal MW of PDMS = 2400, only those PDMS segments with MW of ca. 1000 preferentially segregated at the surface. As a potential minimal fouling coating or biocompatible cardiovascular material, PU-DMS copolymers eventually come into contact with water once in use. Could such an environmental change (from air to aqueous) induce any undesirable

  17. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate count methods. PMID:26456251
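
    Combining independent relative uncertainty components "mathematically" is conventionally done in quadrature (root-sum-of-squares). A sketch with hypothetical component magnitudes, since the abstract does not give the individual values:

```python
from math import sqrt

# Sketch of combining independent relative uncertainty components in
# quadrature. Component values (in %) are hypothetical illustrations of
# factors such as microorganism type, product matrix, and reading error.

def combined_relative_uncertainty(components):
    """Root-sum-of-squares of independent relative uncertainties (%)."""
    return sqrt(sum(u * u for u in components))

u_total = combined_relative_uncertainty([20.0, 15.0, 10.0])
```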

  19. Quantitative signal analysis in pulsed resonant photoacoustics

    NASA Astrophysics Data System (ADS)

    Schäfer, Stefan; Miklós, András; Hess, Peter

    1997-05-01

    The pulsed excitation of acoustic resonances was studied by means of a high-Q photoacoustic resonator with different types of microphone. The signal strength of the first radial mode was calculated by the basic theory as well as by a modeling program, which takes into account the acoustic impedances of the resonator, the acoustic filter system, and the influence of the microphone coupling on the photoacoustic cavity. When the calculated signal strength is used, the high-Q system can be calibrated for trace-gas analysis without a certified gas mixture. The theoretical results were compared with measurements and show good agreement for different microphone configurations. From the measured pressure signal (in pascals per joule), the absorption coefficient of ethylene was calculated; it agreed within 10% with literature values. In addition, a Helmholtz configuration with a highly sensitive 1-in. (2.54-cm) microphone was realized. Although the Q factor was reduced, the sensitivity could be increased by the Helmholtz resonator in the case of pulsed experiments. A maximum sensitivity of the coupled system of 341 mV/Pa was achieved.

  20. Quantitative analysis of in vivo cell proliferation.

    PubMed

    Cameron, Heather A

    2006-11-01

    Injection and immunohistochemical detection of 5-bromo-2'-deoxyuridine (BrdU) has become the standard method for studying the birth and survival of neurons, glia, and other cell types in the nervous system. BrdU, a thymidine analog, becomes stably incorporated into DNA during the S-phase of mitosis. Because DNA containing BrdU can be specifically recognized by antibodies, this method allows dividing cells to be marked at any given time and then identified at time points from a few minutes to several years later. BrdU immunohistochemistry is suitable for cell counting to examine the regulation of cell proliferation and cell fate. It can be combined with labeling by other antibodies, allowing confocal analysis of cell phenotype or expression of other proteins. The potential for nonspecific labeling and toxicity are discussed. Although BrdU immunohistochemistry has almost completely replaced tritiated thymidine autoradiography for labeling dividing cells, this method and situations in which it is still useful are also described. PMID:18428635

  1. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representations of the 'Absorbance-Wavenumber-Retention time' is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. In these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and concentration of caffeine are discussed at two steps of the data treatment.

  2. An adaptive gyroscope-based algorithm for temporal gait analysis.

    PubMed

    Greene, Barry R; McGrath, Denise; O'Neill, Ross; O'Donovan, Karol J; Burns, Adrian; Caulfield, Brian

    2010-12-01

    Body-worn kinematic sensors have been widely proposed as the optimal solution for portable, low-cost, ambulatory monitoring of gait. This study aims to evaluate an adaptive gyroscope-based algorithm for automated temporal gait analysis using body-worn wireless gyroscopes. Gyroscope data from nine healthy adult subjects, each performing four walks at four different speeds, were compared against data acquired simultaneously using two force plates and an optical motion capture system. Data from a poliomyelitis patient exhibiting pathological gait, walking with and without the aid of a crutch, were also compared to the force plate. Results show that the mean true error between the adaptive gyroscope algorithm and the force plate was -4.5 ± 14.4 ms and 43.4 ± 6.0 ms for initial contact (IC) and terminal contact (TC) points, respectively, in healthy subjects. Similarly, the mean true error when data from the polio patient were compared against the force plate was -75.61 ± 27.53 ms and 99.20 ± 46.00 ms for IC and TC points, respectively. A comparison of the present algorithm against temporal gait parameters derived from an optical motion analysis system showed good agreement for nine healthy subjects at four speeds. These results show that the algorithm reported here could constitute the basis of a robust, portable, low-cost system for ambulatory monitoring of gait.
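    A heavily simplified sketch of how IC and TC candidates can be pulled out of a shank angular-velocity trace is shown below. The mid-swing-peak-plus-flanking-minima heuristic is a common choice in gyroscope gait analysis, but the signal here is synthetic and the rule is not the paper's adaptive algorithm.

```python
import numpy as np

def detect_gait_events(gyro, fs):
    """Simplified gyroscope heuristic: locate the mid-swing peak of
    the shank angular-velocity signal, then take the nearest local
    minimum before it (terminal contact, TC) and after it (initial
    contact, IC). Returns event times in seconds."""
    peak = int(np.argmax(gyro))
    d = np.diff(gyro)
    # local minima: first difference changes sign from negative to non-negative
    minima = np.where((d[:-1] < 0) & (d[1:] >= 0))[0] + 1
    tc = minima[minima < peak][-1]
    ic = minima[minima > peak][0]
    return tc / fs, ic / fs

# Synthetic single-swing signal (illustrative shape, not real data):
# a mid-swing peak at t = 1.0 s flanked by two dips.
fs = 100.0
t = np.arange(0.0, 2.0, 1.0 / fs)
gyro = (np.exp(-(t - 1.0) ** 2 / 0.1)
        - 0.5 * np.exp(-(t - 0.7) ** 2 / 0.02)
        - 0.5 * np.exp(-(t - 1.3) ** 2 / 0.02))
tc_t, ic_t = detect_gait_events(gyro, fs)
```

    On this synthetic swing the detected TC and IC land near the two dips at roughly 0.65 s and 1.35 s.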

  3. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
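    The effect of imperfect coverage can be illustrated with a toy model rather than the F18 digraph itself: a duplex system whose first component failure is successfully handled with probability c. A three-state Markov chain gives a closed form, and even small departures from perfect coverage change the answer noticeably.

```python
import math

def duplex_unreliability(lam, t, c):
    """Mission unreliability of a duplex (one-of-two) system with
    imperfect coverage c. From a 3-state Markov model: both-good ->
    one-good at rate 2*lam*c, both-good -> failed at rate
    2*lam*(1-c), one-good -> failed at rate lam."""
    r = (math.exp(-2 * lam * t)
         + 2 * c * math.exp(-lam * t) * (1 - math.exp(-lam * t)))
    return 1.0 - r

lam, t = 1e-4, 10.0   # hypothetical failure rate (1/h) and mission time (h)
for c in (1.0, 0.999, 0.99):
    print(f"coverage {c}: unreliability {duplex_unreliability(lam, t, c):.3e}")
```

    With c = 1 the formula collapses to the familiar parallel-system result (1 - e^(-lam*t))^2; dropping coverage to 99.9% already multiplies the unreliability severalfold, which is the qualitative point the paper makes.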

  4. Quantitative methods for the analysis of zoosporic fungi.

    PubMed

    Marano, Agostina V; Gleason, Frank H; Bärlocher, Felix; Pires-Zottarelli, Carmen L A; Lilje, Osu; Schmidt, Steve K; Rasconi, Serena; Kagami, Maiko; Barrera, Marcelo D; Sime-Ngando, Télesphore; Boussiba, Sammy; de Souza, José I; Edwards, Joan E

    2012-04-01

    Quantitative estimation of zoosporic fungi in the environment has historically received little attention, primarily due to methodological challenges and their complex life cycles. Conventional methods for quantitative analysis of zoosporic fungi have mainly relied on direct observation and baiting techniques, with subsequent fungal identification in the laboratory using morphological characteristics. Although these methods are still fundamentally useful, there has been an increasing preference for quantitative microscopic methods based on staining with fluorescent dyes, as well as the use of hybridization probes. More recently, however, PCR-based methods for profiling and quantification (semi-quantitative and absolute) have proven to be rapid and accurate diagnostic tools for assessing zoosporic fungal assemblages in environmental samples. Further application of next-generation sequencing technologies will advance our quantitative understanding not only of zoosporic fungal ecology but also of their function, through the analysis of their genomes and gene expression, as resources and databases expand in the future. Nevertheless, it is still necessary to complement these molecular-based approaches with cultivation-based methods in order to gain a fuller quantitative understanding of the ecological and physiological roles of zoosporic fungi.
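    As an illustration of the absolute-quantification side of the PCR-based methods mentioned above, the conventional approach fits threshold cycle (Ct) against log10 copy number for a dilution series and inverts the fitted line for unknowns. All numbers below are invented.

```python
def fit_standard_curve(log10_copies, ct_values):
    """Least-squares line Ct = slope * log10(copies) + intercept,
    the conventional qPCR standard curve. A slope near -3.32
    corresponds to ~100% amplification efficiency."""
    n = len(ct_values)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    slope = sxy / sxx
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve for an unknown sample."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical ten-fold dilution series (Ct values invented).
logs = [3.0, 4.0, 5.0, 6.0, 7.0]
cts = [31.4, 28.1, 24.8, 21.5, 18.2]
slope, intercept = fit_standard_curve(logs, cts)
efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 means 100% efficient
```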

  5. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton1 and Carey N. Pope2
    1US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    2Department of...

  6. Early Child Grammars: Qualitative and Quantitative Analysis of Morphosyntactic Production

    ERIC Educational Resources Information Center

    Legendre, Geraldine

    2006-01-01

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is…

  7. Quantitative phosphoproteomics unveils temporal dynamics of thrombin signaling in human endothelial cells

    PubMed Central

    van den Biggelaar, Maartje; Hernández-Fernaud, Juan Ramon; van den Eshof, Bart L.; Neilson, Lisa J.; Meijer, Alexander B.; Mertens, Koen

    2014-01-01

    Thrombin is the key serine protease of the coagulation cascade and a potent trigger of protease-activated receptor 1 (PAR1)-mediated platelet aggregation. In recent years, PAR1 has become an appealing target for anticoagulant therapies. However, the inhibitors that have been developed so far increase bleeding risk in patients, likely because they interfere with endogenous PAR1 signaling in the endothelium. Because of its complexity, thrombin-induced signaling in endothelial cells has remained incompletely understood. Here, we have combined stable isotope amino acids in cell culture, affinity-based phosphopeptide enrichment, and high-resolution mass spectrometry and performed a time-resolved analysis of the thrombin-induced signaling in human primary endothelial cells. We identified 2224 thrombin-regulated phosphorylation sites, the majority of which have not been previously related to thrombin. Those sites were localized on proteins that are novel to thrombin signaling, but also on well-known players such as PAR1, Rho-associated kinase 2, phospholipase C, and proteins related to actin cytoskeleton, cell-cell junctions, and Weibel-Palade body release. Our study provides a unique resource of phosphoproteins and phosphorylation sites that may generate novel insights into an intimate understanding of thrombin-mediated PAR signaling and the development of improved PAR1 antagonists that affect platelet but not endothelial cell function. PMID:24501219

  8. Quantitative phosphoproteomics unveils temporal dynamics of thrombin signaling in human endothelial cells.

    PubMed

    van den Biggelaar, Maartje; Hernández-Fernaud, Juan Ramon; van den Eshof, Bart L; Neilson, Lisa J; Meijer, Alexander B; Mertens, Koen; Zanivan, Sara

    2014-03-20

    Thrombin is the key serine protease of the coagulation cascade and a potent trigger of protease-activated receptor 1 (PAR1)-mediated platelet aggregation. In recent years, PAR1 has become an appealing target for anticoagulant therapies. However, the inhibitors that have been developed so far increase bleeding risk in patients, likely because they interfere with endogenous PAR1 signaling in the endothelium. Because of its complexity, thrombin-induced signaling in endothelial cells has remained incompletely understood. Here, we have combined stable isotope amino acids in cell culture, affinity-based phosphopeptide enrichment, and high-resolution mass spectrometry and performed a time-resolved analysis of the thrombin-induced signaling in human primary endothelial cells. We identified 2224 thrombin-regulated phosphorylation sites, the majority of which have not been previously related to thrombin. Those sites were localized on proteins that are novel to thrombin signaling, but also on well-known players such as PAR1, Rho-associated kinase 2, phospholipase C, and proteins related to actin cytoskeleton, cell-cell junctions, and Weibel-Palade body release. Our study provides a unique resource of phosphoproteins and phosphorylation sites that may generate novel insights into an intimate understanding of thrombin-mediated PAR signaling and the development of improved PAR1 antagonists that affect platelet but not endothelial cell function. PMID:24501219

  9. Quantitating the subtleties of microglial morphology with fractal analysis

    PubMed Central

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F.

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between “ramified resting” and “activated amoeboid” has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology. PMID:23386810
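    A minimal version of the box-counting procedure reviewed here: count the boxes of side s that contain any part of the binary object, then take the fractal dimension as the negative slope of log N(s) versus log s. This is a bare-bones sketch, not a substitute for a full lacunarity or multifractal analysis.

```python
import numpy as np

def box_count_dimension(img, sizes=(1, 2, 4, 8, 16)):
    """Box-counting dimension of a binary image: count occupied boxes
    at each box size, then fit log(count) against log(size)."""
    counts = []
    for s in sizes:
        h, w = img.shape
        c = sum(img[i:i + s, j:j + s].any()
                for i in range(0, h, s)
                for j in range(0, w, s))
        counts.append(c)
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Sanity checks: a filled square is 2-D, a straight line is 1-D.
square = np.ones((64, 64), dtype=bool)
line = np.zeros((64, 64), dtype=bool)
line[0, :] = True
```

    A segmented microglial silhouette fed to the same function would yield an intermediate, non-integer dimension, which is what makes the measure useful for distinguishing ramified from amoeboid forms.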

  10. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis are given. A group of controls determined by use of a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms are presented. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  11. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single element and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quarternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  12. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965
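    The yearly relative-frequency computation behind such trend curves can be sketched in a few lines; the two-'book' corpus below is invented for illustration.

```python
from collections import Counter

def ngram_frequency(records, word):
    """Relative frequency of `word` per year: occurrences divided by
    the total number of tokens printed that year, the normalization
    used for culturomics-style trend curves."""
    hits, totals = Counter(), Counter()
    for year, text in records:
        tokens = text.lower().split()
        totals[year] += len(tokens)
        hits[year] += tokens.count(word)
    return {year: hits[year] / totals[year] for year in totals}

corpus = [
    (1900, "the telegraph was the wonder of the age"),
    (1950, "the television replaced the radio in many homes"),
]
freqs = ngram_frequency(corpus, "the")
```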

  13. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  14. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  15. Cognitive control for language switching in bilinguals: A quantitative meta-analysis of functional neuroimaging studies.

    PubMed

    Luk, Gigi; Green, David W; Abutalebi, Jubin; Grady, Cheryl

    2011-11-17

    In a quantitative meta-analysis using the activation likelihood estimation method, we examined the neural regions involved in bilingual cognitive control, particularly when engaging in switching between languages. The purpose of this study was to evaluate the bilingual cognitive control model based on a qualitative analysis [Abutalebi, J., & Green, D. W. (2008). Control mechanisms in bilingual language production: Neural evidence from language switching studies. Language and Cognitive Processes, 23, 557-582.]. After reviewing 128 peer-reviewed articles, ten neuroimaging studies met our inclusion criteria; in each study, bilinguals switched between languages in response to cues. We isolated regions involved in voluntary language switching by including reported contrasts between the switching conditions and high-level baseline conditions involving similar tasks but requiring the use of only one language. Eight brain regions showed significant and reliable activation: left inferior frontal gyrus, left middle temporal gyrus, left middle frontal gyrus, right precentral gyrus, right superior temporal gyrus, midline pre-SMA, and bilateral caudate nuclei. This quantitative result is consistent with bilingual aphasia studies that report switching deficits associated with lesions to the caudate nuclei or prefrontal cortex. It also extends the previously reported qualitative model. We discuss the implications of the findings for accounts of bilingual cognitive control.

  16. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, both for quantitative assays and for qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes glycosylation-targeted enrichment technologies, mainly solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. PMID:24889823

  17. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, both for quantitative assays and for qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes glycosylation-targeted enrichment technologies, mainly solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies.

  18. New insight in quantitative analysis of vascular permeability during immune reaction (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Kalchenko, Vyacheslav; Molodij, Guillaume; Kuznetsov, Yuri; Smolyakov, Yuri; Israeli, David; Meglinski, Igor; Harmelin, Alon

    2016-03-01

    The use of fluorescence imaging of vascular permeability has become a gold standard for assessing the inflammatory process during an experimental immune response in vivo. Optical fluorescence imaging provides a very useful and simple tool for this purpose. The motivation comes from the need for robust, simple quantification and presentation of inflammation data based on vascular permeability. The change of fluorescence intensity as a function of time is a widely accepted measure of vascular permeability during inflammation related to the immune response. In the present study we propose to add a new dimension by applying a more sophisticated approach to the analysis of the vascular reaction: a quantitative analysis based on methods derived from astronomical observations, in particular space-time Fourier filtering followed by a polynomial orthogonal mode decomposition. We demonstrate that the temporal evolution of the fluorescence intensity observed at certain pixels correlates quantitatively with blood flow circulation under normal conditions. The approach allows the regions of permeability to be determined and both the fast kinetics, related to the distribution of the contrast material in the circulatory system, and the slow kinetics, associated with extravasation of the contrast material, to be monitored. Thus, we introduce a simple and convenient method for fast quantitative visualization of the leakage related to the inflammatory (immune) reaction in vivo.
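    The separation of fast and slow kinetics can be illustrated with a plain temporal Fourier band-pass applied pixel-wise to an image stack. This is a simplified stand-in for the authors' space-time filtering and orthogonal-mode decomposition, shown on synthetic data.

```python
import numpy as np

def temporal_fourier_filter(stack, fs, f_lo, f_hi):
    """Keep only temporal-frequency components in [f_lo, f_hi] for
    every pixel of a (T, H, W) movie, e.g. to isolate fast
    circulation-related oscillations from slow extravasation drift."""
    T = stack.shape[0]
    freqs = np.fft.rfftfreq(T, d=1.0 / fs)
    F = np.fft.rfft(stack, axis=0)
    F[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(F, n=T, axis=0)

# Synthetic 4x4-pixel movie: slow linear drift plus a 2 Hz oscillation.
fs = 20.0
t = np.arange(0.0, 10.0, 1.0 / fs)
fast = np.sin(2 * np.pi * 2.0 * t)
movie = (0.05 * t + fast)[:, None, None] * np.ones((1, 4, 4))
fast_part = temporal_fourier_filter(movie, fs, 1.5, 2.5)
```

    Band-passing around 2 Hz recovers the oscillatory component at each pixel while suppressing the slow drift; the complementary band would isolate the extravasation-like trend.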

  19. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods on the skin of a human foot and face. The full source code of the developed application is also provided as an attachment, together with the main window of the program during dynamic analysis of a foot thermal image. PMID:26556680

  20. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods on the skin of a human foot and face. The full source code of the developed application is also provided as an attachment, together with the main window of the program during dynamic analysis of a foot thermal image.

  1. Exploratory analysis of the spatio-temporal deformation of the myocardium during systole from tagged MRI.

    PubMed

    Clarysse, Patrick; Han, Meimei; Croisille, Pierre; Magnin, Isabelle E

    2002-11-01

    Myocardial contractile function is, together with perfusion, one of the main factors affected in ischemic heart disease. In this paper, we propose an original framework based on functional data analysis for the quantitative study of spatio-temporal parameters related to the mechanics of myocardial contraction. The mechanical strains in the left-ventricular (LV) myocardium are computed from tagged magnetic resonance imaging cardiac sequences. A statistical functional model of the normal contractile function of the LV is built from the study of eight examinations of healthy subjects. We show that it is possible to detect abnormal strain patterns relative to this model by generating distance maps at rest and under pharmacological stress. We demonstrate the consistency of the results for the circumferential deformation parameter on healthy and pathological data sets. PMID:12450363

  2. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107
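    The core numerical step of such an analysis, crossing a fragility curve with a hazard curve, can be sketched as below. The lognormal fragility form is standard; the parameter values and hazard points are invented for illustration.

```python
import math

def lognormal_fragility(pga, median, beta):
    """P(failure | ground motion) in the standard lognormal form."""
    return 0.5 * (1.0 + math.erf(math.log(pga / median)
                                 / (beta * math.sqrt(2.0))))

def annual_failure_probability(hazard, fragility):
    """Discretized convolution of a hazard curve with a fragility
    curve. `hazard` is a list of (pga, annual exceedance rate) pairs
    sorted by increasing pga; each bin's event rate is weighted by
    the mid-bin failure probability."""
    p = 0.0
    for (a0, r0), (a1, r1) in zip(hazard, hazard[1:]):
        p += (r0 - r1) * fragility(0.5 * (a0 + a1))
    return p

# Invented hazard curve (PGA in g, annual exceedance rate) and fragility.
hazard = [(0.05, 1e-2), (0.1, 4e-3), (0.2, 1e-3), (0.4, 2e-4), (0.8, 1e-5)]
tank = lambda pga: lognormal_fragility(pga, median=0.3, beta=0.4)
p_fail = annual_failure_probability(hazard, tank)
```

    The resulting annual failure probability is then fed into the consequence analysis for the loss-of-containment scenarios.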

  3. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer, and different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant output are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability of the quantitative analysis associated with each sampling strategy to be assessed, and allow a proper number of replicates to be defined for future quantitative analyses. PMID:27358910

  4. A Quantitative 3D Motility Analysis of Trypanosoma brucei by Use of Digital In-line Holographic Microscopy

    PubMed Central

    Weiße, Sebastian; Heddergott, Niko; Heydt, Matthias; Pflästerer, Daniel; Maier, Timo; Haraszti, Tamás; Grunze, Michael; Engstler, Markus; Rosenhahn, Axel

    2012-01-01

    We present a quantitative 3D analysis of the motility of the blood parasite Trypanosoma brucei. Digital in-line holographic microscopy has been used to track single cells with high temporal and spatial accuracy to obtain quantitative data on their behavior. Comparing bloodstream-form and insect-form trypanosomes, as well as mutant and wildtype cells under varying external conditions, we were able to derive a general two-state run-and-tumble model for trypanosome motility. Differences in the motility of distinct strains indicate that adaptation of the trypanosomes to their natural environments involves a change in their mode of swimming. PMID:22629379
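    A two-state run-and-tumble description can be operationalized on a 3D track by thresholding the turning angle between successive displacement vectors. The threshold and the toy trajectory below are invented; they are not the authors' classification criteria.

```python
import numpy as np

def classify_run_tumble(positions, angle_thresh_deg=60.0):
    """Label each interior step of a 3D trajectory 'run' or 'tumble'
    from the turning angle between successive displacement vectors."""
    v = np.diff(positions, axis=0)
    u = v / np.linalg.norm(v, axis=1, keepdims=True)
    cos_ang = np.clip(np.sum(u[:-1] * u[1:], axis=1), -1.0, 1.0)
    angles = np.degrees(np.arccos(cos_ang))
    return np.where(angles > angle_thresh_deg, "tumble", "run")

# Toy trajectory: straight run, one sharp 90-degree turn, straight run.
track = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0],
                  [2, 1, 0], [2, 2, 0]], dtype=float)
labels = classify_run_tumble(track)
```

    Run durations and tumble rates for each strain then follow from simple run-length statistics over the labels.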

  5. Local and global analysis of endocytic patch dynamics in fission yeast using a new “temporal superresolution” realignment method

    PubMed Central

    Berro, Julien; Pollard, Thomas D.

    2014-01-01

    Quantitative microscopy is a valuable tool for inferring molecular mechanisms of cellular processes such as clathrin-mediated endocytosis, but, for quantitative microscopy to reach its potential, both data collection and analysis needed improvement. We introduce new tools to track and count endocytic patches in fission yeast to increase the quality of the data extracted from quantitative microscopy movies. We present a universal method to achieve "temporal superresolution" by aligning temporal data sets at a resolution finer than the measurement interval. These methods allowed us to extract new information about endocytic actin patches in wild-type cells from measurements of the fluorescence of fimbrin-mEGFP. We show that the time course of actin assembly and disassembly varies by <600 ms between patches. Actin polymerizes during vesicle formation, but we show that polymerization does not participate in vesicle movement other than to limit the complex diffusive motions of newly formed endocytic vesicles, which move faster as the surrounding actin meshwork decreases in size over time. Our methods also show that the number of patches in fission yeast is proportional to cell length and that the variability in the distribution of patches between the tips of interphase cells has been underestimated. PMID:25143395
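    The realignment idea can be sketched with synthetic traces: each trace samples the same underlying curve at the measurement interval but with an unknown sub-interval phase, so estimating a fiducial time for each trace (here the peak, via parabolic interpolation) and subtracting it pools all samples onto a common axis with finer-than-interval spacing. This is a simplified stand-in for the authors' method, on invented data.

```python
import numpy as np

def parabolic_peak(t, y):
    """Sub-sample peak time from a parabola through the maximum
    sample and its two neighbours."""
    i = int(np.argmax(y))
    delta = 0.5 * (y[i - 1] - y[i + 1]) / (y[i - 1] - 2 * y[i] + y[i + 1])
    return t[i] + delta * (t[1] - t[0])

def realign(traces, dt):
    """Pool several traces of the same event onto one time axis.
    Each trace is (start_time, samples) taken at interval dt with an
    unknown sub-interval phase; subtracting each trace's estimated
    peak time yields an effective sampling finer than dt."""
    pooled_t, pooled_y = [], []
    for start, y in traces:
        t = start + dt * np.arange(len(y))
        peak = parabolic_peak(t, y)
        pooled_t.extend(t - peak)
        pooled_y.extend(y)
    order = np.argsort(pooled_t)
    return np.asarray(pooled_t)[order], np.asarray(pooled_y)[order]

# Three traces of a Gaussian event peaking at t = 5.3 s, sampled
# every 1 s with different phases (values invented for illustration).
curve = lambda t: np.exp(-(t - 5.3) ** 2 / 2.0)
traces = [(off, curve(off + np.arange(11.0))) for off in (0.0, 0.3, 0.7)]
t_fine, y_fine = realign(traces, dt=1.0)
```

    After realignment the pooled axis contains gaps well below the 1 s sampling interval, which is the "temporal superresolution" effect.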

  6. Single-Molecule Sensors: Challenges and Opportunities for Quantitative Analysis.

    PubMed

    Gooding, J Justin; Gaus, Katharina

    2016-09-12

    Measurement science has been converging to smaller and smaller samples, such that it is now possible to detect single molecules. This Review focuses on the next generation of analytical tools that combine single-molecule detection with the ability to measure many single molecules simultaneously and/or process larger and more complex samples. Such single-molecule sensors constitute a new type of quantitative analytical tool, as they perform analysis by molecular counting and thus potentially capture the heterogeneity of the sample. This Review outlines the advantages and potential of these new, quantitative single-molecule sensors, the measurement challenges in making single-molecule devices suitable for analysis, the inspiration biology provides for overcoming these challenges, and some of the solutions currently being explored.

  8. Quantitative NMR Analysis of Partially Substituted Biodiesel Glycerols

    SciTech Connect

    Nagy, M.; Alleman, T. L.; Dyer, T.; Ragauskas, A. J.

    2009-01-01

    Phosphitylation of hydroxyl groups in biodiesel samples with 2-chloro-4,4,5,5-tetramethyl-1,3,2-dioxaphospholane followed by 31P-NMR analysis provides a rapid quantitative analytical technique for the determination of substitution patterns on partially esterified glycerols. The unique 31P-NMR chemical shift data was established with a series of mono- and di-substituted fatty acid esters of glycerol and then utilized to characterize an industrial sample of partially processed biodiesel.

  9. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    NASA Astrophysics Data System (ADS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.
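    The spectral analysis behind such an experiment can be sketched as follows: two identical masses m on springs k, coupled by a spring kc, have normal-mode angular frequencies w_sym = sqrt(k/m) and w_asym = sqrt((k + 2*kc)/m), and an FFT of one mass's acceleration shows both lines when the modes are mixed. All parameter values here are invented for illustration:

    ```python
    import numpy as np

    # normal modes of two wall-tethered masses coupled by a central spring
    m, k, kc = 0.1, 10.0, 5.0
    w_sym, w_asym = np.sqrt(k / m), np.sqrt((k + 2 * kc) / m)

    fs, T = 200.0, 40.0                          # sample rate (Hz) and duration (s)
    t = np.arange(0.0, T, 1.0 / fs)
    a = np.cos(w_sym * t) + 0.8 * np.cos(w_asym * t)   # acceleration of one mass

    spec = np.abs(np.fft.rfft(a * np.hanning(a.size)))
    freqs = np.fft.rfftfreq(a.size, 1.0 / fs)
    i1 = int(np.argmax(spec))                    # strongest spectral line
    spec2 = spec.copy()
    spec2[max(0, i1 - 10):i1 + 10] = 0.0         # mask its leakage, find the other
    i2 = int(np.argmax(spec2))
    f_low, f_high = sorted((freqs[i1], freqs[i2]))
    print(f_low, f_high)                         # ~1.6 Hz and ~2.25 Hz
    ```

    In the classroom version, `a` would be the recorded accelerometer trace rather than a synthetic signal.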

  10. Temporal and spatial analysis of solar signatures in cloud cover

    NASA Astrophysics Data System (ADS)

    Voiculescu, M.; Usoskin, I.

    2012-12-01

    The persistence of solar signals in cloud cover is analyzed for the time interval 1984-2009, covering two full solar cycles, 22 and 23. A spatial and temporal investigation of the response of low, middle and high cloud data to cosmic ray induced ionization (CRII) and UV irradiance (UVI) is performed in terms of coherence analysis of the two signals for various regions of the globe where correlation is observed between clouds and solar proxies. For some key geographical regions the response of clouds to UVI and CRII is persistent over the entire time interval, which indicates a real link. In other regions the relation is not consistent, being intermittent or out of phase, suggesting that some correlations are not real. However, constant in-phase or anti-phase relationships between clouds and solar proxies are observed for some regions. Low cloud cover correlates with both UVI and CRII, middle clouds seem to be related to UVI, while high clouds respond to CRII. Correlation and coherence analysis cannot give definitive answers to questions related to solar effects on clouds but could, nonetheless, pinpoint some possible solar effects on climate and could suggest directions for future research.
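    The kind of coherence analysis named here can be sketched with magnitude-squared coherence between two synthetic series sharing an idealized 11-year cycle plus independent noise; all series and parameters below are invented, not the study's cloud or solar data:

    ```python
    import numpy as np
    from scipy.signal import coherence

    rng = np.random.default_rng(0)
    fs = 12.0                                   # monthly sampling (per year)
    t = np.arange(600) / fs                     # 50 years, for a stable estimate
    cycle = np.sin(2.0 * np.pi * t / 11.0)      # idealized 11-year solar cycle
    solar = cycle + 0.2 * rng.standard_normal(t.size)
    cloud = 0.6 * cycle + 0.5 * rng.standard_normal(t.size)

    f, Cxy = coherence(solar, cloud, fs=fs, nperseg=256)
    i = int(np.argmin(np.abs(f - 1.0 / 11.0)))  # frequency bin nearest the cycle
    print(round(f[i], 3), round(Cxy[i], 2))     # coherence is high at the shared cycle
    ```

    A persistent physical link shows up as coherence near 1 at the shared frequency; intermittent or spurious correlations do not.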

  11. Temporal instability analysis of inviscid compound jets falling under gravity

    NASA Astrophysics Data System (ADS)

    Mohsin, Muhammad; Uddin, Jamal; Decent, Stephen P.; Afzaal, Muhammad F.

    2013-01-01

    Compound liquid jets can be used in a variety of industrial applications ranging from capsule production in pharmaceutics to enhanced printing methods in ink-jet printing. An appreciation of how instability along compound jets can lead to breakup and droplet formation is thus critical in many fields in science and engineering. In this paper, we perform a theoretical analysis to examine the instability of an axisymmetric inviscid compound liquid jet which falls vertically under the influence of gravity. We use a long-wavelength, slender-jet asymptotic expansion to reduce the governing equations of the problem into a set of one-dimensional partial differential equations, which describe the evolution of the leading-order axial velocity of the jet as well as the radii of both the inner and the outer interfaces. We first determine the steady-state solutions of the one-dimensional model equations and then we perform a linear temporal instability analysis to obtain a dispersion relation, which gives us useful information about the maximum growth rate and the maximum wavenumber of the imposed wave-like disturbance. We use our results to estimate the location and qualitative nature of breakup and then compare our results with numerical simulations.
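    Reading the most-amplified wavenumber off a dispersion relation can be illustrated with the classical long-wavelength relation for a single inviscid jet, lambda^2 = (k^2 - k^4)/2, in units where lengths are scaled by the unperturbed radius and time by sqrt(rho a^3 / sigma). This is a toy stand-in for the compound-jet relation derived in the paper, shown only to make the procedure concrete:

    ```python
    import numpy as np

    # scan dimensionless wavenumbers 0 < k < 1 (k > 1 is stable: lambda^2 < 0)
    k = np.linspace(1e-4, 1.0, 2001)
    growth = np.sqrt(np.maximum(0.5 * (k**2 - k**4), 0.0))
    k_max = float(k[np.argmax(growth)])
    print(k_max)    # ~0.707 = 1/sqrt(2); Rayleigh's full linear theory gives ~0.697
    ```

    The fastest-growing wavenumber sets the expected drop spacing, and its growth rate gives a breakup-length estimate.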

  12. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies.

    PubMed

    Chau, Siu-Leung; Huang, Zhi-Bing; Song, Yan-Gang; Yue, Rui-Qi; Ho, Alan; Lin, Chao-Zhan; Huang, Wen-Hua; Han, Quan-Bin

    2016-01-01

    Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profile. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product are quantitatively determined, in which small molecules and monosaccharide/sucrose account for 0.18%-0.21%, and 53.49%-58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the therapeutic efficacy and safety of complex Chinese medicine injections. PMID:27548134

  13. Quantitation of glycerophosphorylcholine by flow injection analysis using immobilized enzymes.

    PubMed

    Mancini, A; Del Rosso, F; Roberti, R; Caligiana, P; Vecchini, A; Binaglia, L

    1996-09-20

    A method for quantitating glycerophosphorylcholine by flow injection analysis is reported in the present paper. Glycerophosphorylcholine phosphodiesterase and choline oxidase, immobilized on controlled porosity glass beads, are packed in a small reactor inserted in a flow injection manifold. When samples containing glycerophosphorylcholine are injected, glycerophosphorylcholine is hydrolyzed into choline and sn-glycerol-3-phosphate. The free choline produced in this reaction is oxidized to betaine and hydrogen peroxide. Hydrogen peroxide is detected amperometrically. Quantitation of glycerophosphorylcholine in samples containing choline and phosphorylcholine is obtained by inserting, ahead of the reactor, a small column packed with a mixed bed ion exchange resin. The time needed for each determination does not exceed one minute. The present method, applied to quantitate glycerophosphorylcholine in samples of seminal plasma, gave results comparable with those obtained using the standard enzymatic-spectrophotometric procedure. An alternative procedure, making use of co-immobilized glycerophosphorylcholine phosphodiesterase and glycerol-3-phosphate oxidase for quantitating glycerophosphorylcholine, glycerophosphorylethanolamine and glycerophosphorylserine, is also described. PMID:8905629

  14. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    PubMed

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-01

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates current available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide of feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery.

  16. The 'when' and 'where' of semantic coding in the anterior temporal lobe: Temporal representational similarity analysis of electrocorticogram data.

    PubMed

    Chen, Y; Shimotake, A; Matsumoto, R; Kunieda, T; Kikuchi, T; Miyamoto, S; Fukuyama, H; Takahashi, R; Ikeda, A; Lambon Ralph, M A

    2016-06-01

    Electrocorticograms (ECoG) provide a unique opportunity to monitor neural activity directly at the cortical surface. Ten patients with subdural electrodes covering ventral and lateral anterior temporal regions (ATL) performed a picture naming task. Temporal representational similarity analysis (RSA) was used, for the first time, to compare spatio-temporal neural patterns from the ATL surface with pre-defined theoretical models. The results indicate that the neural activity in the ventral subregion of the ATL codes semantic representations from 250 msec after picture onset. The observed activation similarity was not related to the visual similarity of the pictures or the phonological similarity of their names. In keeping with convergent evidence for the importance of the ATL in semantic processing, these results provide the first direct evidence of semantic coding from the surface of the ventral ATL and its time-course. PMID:27085891
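    The temporal RSA computation described here — building a neural dissimilarity matrix at each time point and correlating it with a model matrix — can be sketched as follows. Array shapes and the toy category model are assumptions for illustration, not the study's data or exact pipeline:

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    def temporal_rsa(patterns, model_rdm):
        # patterns: (time, items, channels); model_rdm: (items, items).
        # At each time point, build a correlation-distance RDM across items
        # and Spearman-correlate its upper triangle with the model RDM.
        iu = np.triu_indices(model_rdm.shape[0], k=1)
        out = []
        for X in patterns:
            rdm = 1.0 - np.corrcoef(X)
            rho, _ = spearmanr(rdm[iu], model_rdm[iu])
            out.append(rho)
        return np.array(out)

    # toy demo: 4 "pictures" in 2 semantic categories, 6 "electrodes";
    # only at the later time point do the patterns become category-structured
    rng = np.random.default_rng(1)
    a, b = rng.standard_normal(6), rng.standard_normal(6)
    early = rng.standard_normal((4, 6))
    late = np.stack([a, a, b, b]) + 0.05 * rng.standard_normal((4, 6))
    model = np.array([[0, 0, 1, 1], [0, 0, 1, 1],
                      [1, 1, 0, 0], [1, 1, 0, 0]], float)
    rhos = temporal_rsa(np.stack([early, late]), model)
    print(rhos)       # model fit is high only at the structured time point
    ```

    Running this per time bin over the ECoG epoch is what yields the time course of semantic coding.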

  18. Temporal Expectation and Information Processing: A Model-Based Analysis

    ERIC Educational Resources Information Center

    Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…

  19. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  20. A Temporal and Spatial Analysis of Urban Heat Island in Basin City Utilizing Remote Sensing Techniques

    NASA Astrophysics Data System (ADS)

    Chang, Hsiao-Tung

    2016-06-01

    The urban heat island (UHI) has become a key factor in the deterioration of the urban ecological environment. A spatial-temporal analysis of a basin city's UHI, together with a quantitative evaluation of the effect of rapid urbanization, provides a theoretical foundation for relieving the UHI effect. Based on Landsat 8, ETM+ and TM images of the Taipei basin from 1990 to 2015, this article retrieves the land surface temperature (LST) around the summer solstice of each year and then analyses the spatial-temporal pattern and evolution of the UHI in the Taipei basin over this period. The results show that, as the built-up district expanded, the UHI area spread steadily from the city centre to the suburbs. Besides higher temperatures in the city centre, relatively high temperatures also gathered along the basin boundaries at the foot of the surrounding mountains, a pattern referred to as a "sinking heat island". From 1990 to 2000, the areas of stronger UHI differed markedly among land-use types, reflecting public infrastructure works. In the following 15 years, to 2015, the building density of the urban area increased gradually, and UHI intensity rose with urban land-use density; the UHI hot spots in the Taipei basin show the same characteristics. The results suggest that anthropogenic heat release probably plays a significant role in the UHI effect, and must be considered in urban planning adaptation strategies.
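    The brightness-temperature step of a Landsat 8 TIRS Band 10 LST retrieval converts digital numbers to top-of-atmosphere radiance and then to Kelvin. The gain/offset and thermal constants below are the commonly published Band 10 values; in practice they are read from each scene's MTL metadata file, and a full LST product would additionally correct for emissivity and atmosphere:

    ```python
    import numpy as np

    ML, AL = 3.342e-4, 0.1            # band-10 radiance rescaling gain / offset
    K1, K2 = 774.8853, 1321.0789      # band-10 thermal conversion constants

    def brightness_temp(dn):
        # DN -> TOA spectral radiance (W m-2 sr-1 um-1) -> brightness temp (K)
        L = ML * np.asarray(dn, dtype=float) + AL
        return K2 / np.log(K1 / L + 1.0)

    # emissivity correction would follow, e.g. the common single-channel
    # approximation LST = BT / (1 + (lam * BT / 1.438e-2) * ln(emissivity))
    print(brightness_temp(25000))     # a mid-range DN gives a plausible ~292 K
    ```

    Mapping this over whole scenes from different years is what produces the multi-temporal LST/UHI maps described above.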

  1. Quantitative Assessment of Spatio-Temporal Desertification Rates in Azerbaijan during 2014-2015 Using Time-Series Landsat-8 Satellite Images

    NASA Astrophysics Data System (ADS)

    Bayramov, Emil; Mammadov, Ramiz

    2016-07-01

    The main goals of this research are the object-based land-cover classification of LANDSAT-8 multi-spectral satellite images in 2014 and 2015, quantification of Normalized Difference Vegetation Index (NDVI) rates within the land-cover classes, change detection analysis between the NDVIs derived from multi-temporal LANDSAT-8 satellite images, the quantification of those changes within the land-cover classes, and detection of changes between land-cover classes. The object-based classification accuracy of the land-cover classes was validated through the standard confusion matrix, which revealed 80% land-cover classification accuracy for both years. The analysis revealed that the area of agricultural lands increased from 30911 sq. km. in 2014 to 31999 sq. km. in 2015. The area of barelands increased from 3933 sq. km. in 2014 to 4187 sq. km. in 2015. The area of forests increased from 8211 sq. km. in 2014 to 9175 sq. km. in 2015. The area of grasslands decreased from 27176 sq. km. in 2014 to 23294 sq. km. in 2015. The area of urban areas increased from 12479 sq. km. in 2014 to 12956 sq. km. in 2015. The decrease in the area of grasslands was mainly explained by land-use shifts of grasslands to agricultural and urban lands. The quantification of low and medium NDVI rates revealed an increase within the agricultural, urban and forest land-cover classes in 2015. However, the high NDVI rates within agricultural, urban and forest land-cover classes in 2015 proved to be lower relative to 2014. The change detection analysis between land-cover types of 2014 and 2015 allowed us to determine that 7740 sq. km. of grasslands shifted to the agricultural land-cover type whereas 5442 sq. km. of agricultural lands shifted to rangelands. This means that spatio-temporal shifts of agricultural activity occurred in Azerbaijan: some areas reduced agricultural activity whereas others changed their land-use type to agricultural.
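    The three quantitative ingredients of this workflow — NDVI, its year-to-year difference, and overall accuracy from a confusion matrix — can be sketched on tiny synthetic arrays standing in for the LANDSAT-8 reflectance bands and the validation counts (all numbers invented):

    ```python
    import numpy as np

    def ndvi(nir, red):
        # normalized difference vegetation index, guarded against divide-by-zero
        return (nir - red) / (nir + red + 1e-9)

    nir14 = np.array([[0.5, 0.4], [0.3, 0.6]]); red14 = np.array([[0.1, 0.2], [0.2, 0.1]])
    nir15 = np.array([[0.6, 0.4], [0.2, 0.7]]); red15 = np.array([[0.1, 0.2], [0.3, 0.1]])
    dndvi = ndvi(nir15, red15) - ndvi(nir14, red14)     # NDVI change, 2014 -> 2015

    # overall accuracy = trace / total of the class confusion matrix
    conf = np.array([[80, 5], [15, 100]])               # rows: reference, cols: mapped
    overall_acc = np.trace(conf) / conf.sum()
    print(overall_acc)                                  # 0.9, i.e. 90% accuracy
    ```

    Cross-tabulating the two years' class maps the same way yields the class-to-class transition areas reported above.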

  2. Label-Free Technologies for Quantitative Multiparameter Biological Analysis

    PubMed Central

    Qavi, Abraham J.; Washburn, Adam L.; Byeon, Ji-Yeon; Bailey, Ryan C.

    2009-01-01

    In the post-genomic era, information is king and information-rich technologies are critically important drivers in both fundamental biology and medicine. It is now known that single-parameter measurements provide only limited detail and that quantitation of multiple biomolecular signatures can more fully illuminate complex biological function. Label-free technologies have recently attracted significant interest for sensitive and quantitative multiparameter analysis of biological systems. There are several different classes of label-free sensors that are currently being developed both in academia and in industry. In this critical review, we highlight, compare, and contrast some of the more promising approaches. We will describe the fundamental principles of these different methodologies and discuss advantages and disadvantages that might potentially help one in selecting the appropriate technology for a given bioanalytical application. PMID:19221722

  3. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    PubMed

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cells morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept parallel orientations to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells. PMID:26039484
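    Once microtubule segment angles relative to the growth axis have been extracted from the images, their parallelism can be summarized with a 2D nematic order parameter S = <cos 2θ>: S = 1 for perfectly parallel, 0 for random, −1 for perpendicular. This is a hypothetical stand-in for the paper's own image-processing pipeline, which also performs the angle extraction itself:

    ```python
    import numpy as np

    def order_parameter(theta):
        # theta: microtubule angles (radians) relative to the cell growth axis
        return float(np.mean(np.cos(2.0 * np.asarray(theta))))

    parallel = np.deg2rad([2, -5, 3, 1, -4])                      # nearly aligned
    random_angles = np.deg2rad(np.linspace(-90, 90, 180, endpoint=False))
    print(order_parameter(parallel))        # close to 1
    print(order_parameter(random_angles))   # close to 0
    ```

    Tracking S through pavement-cell growth quantifies how well microtubules stay parallel to the growth axis.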

  5. Common and distinct neural correlates of personal and vicarious reward: A quantitative meta-analysis

    PubMed Central

    Morelli, Sylvia A.; Sacchet, Matthew D.; Zaki, Jamil

    2015-01-01

    Individuals experience reward not only when directly receiving positive outcomes (e.g., food or money), but also when observing others receive such outcomes. This latter phenomenon, known as vicarious reward, is a perennial topic of interest among psychologists and economists. More recently, neuroscientists have begun exploring the neuroanatomy underlying vicarious reward. Here we present a quantitative whole-brain meta-analysis of this emerging literature. We identified 25 functional neuroimaging studies that included contrasts between vicarious reward and a neutral control, and subjected these contrasts to an activation likelihood estimate (ALE) meta-analysis. This analysis revealed a consistent pattern of activation across studies, spanning structures typically associated with the computation of value (especially ventromedial prefrontal cortex) and mentalizing (including dorsomedial prefrontal cortex and superior temporal sulcus). We further quantitatively compared this activation pattern to activation foci from a previous meta-analysis of personal reward. Conjunction analyses yielded overlapping VMPFC activity in response to personal and vicarious reward. Contrast analyses identified preferential engagement of the nucleus accumbens in response to personal as compared to vicarious reward, and in mentalizing-related structures in response to vicarious as compared to personal reward. These data shed light on the common and unique components of the reward that individuals experience directly and through their social connections. PMID:25554428
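    The core math of activation likelihood estimation can be sketched in one dimension: each reported focus is blurred into a Gaussian "modeled activation" probability map, and maps are combined as a probabilistic union. Real ALE operates on 3-D brains with sample-size-dependent kernels and a permutation-based null; the foci and kernel width below are invented for illustration:

    ```python
    import numpy as np

    x = np.linspace(0.0, 100.0, 1001)           # toy 1-D "brain" coordinate
    foci, sigma = [42.0, 45.0, 80.0], 3.0       # reported peak locations
    mas = [0.9 * np.exp(-0.5 * ((x - f) / sigma) ** 2) for f in foci]
    ale = 1.0 - np.prod([1.0 - m for m in mas], axis=0)   # union of probabilities
    peak = float(x[np.argmax(ale)])
    print(peak)     # convergence falls between the two nearby foci (42-45)
    ```

    Clusters of nearby foci reinforce each other under the union, which is why ALE highlights regions of cross-study convergence.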

  6. Right ventricular strain analysis from three-dimensional echocardiography by using temporally diffeomorphic motion estimation

    PubMed Central

    Zhu, Meihua; Ashraf, Muhammad; Broberg, Craig S.; Sahn, David J.; Song, Xubo

    2014-01-01

    Purpose: Quantitative analysis of right ventricle (RV) motion is important for study of the mechanism of congenital and acquired diseases. Unlike left ventricle (LV), motion estimation of RV is more difficult because of its complex shape and thin myocardium. Although attempts of finite element models on MR images and speckle tracking on echocardiography have shown promising results on RV strain analysis, these methods can be improved since the temporal smoothness of the motion is not considered. Methods: The authors have proposed a temporally diffeomorphic motion estimation method in which a spatiotemporal transformation is estimated by optimization of a registration energy functional of the velocity field in their earlier work. The proposed motion estimation method is a fully automatic process for general image sequences. The authors apply the method, combined with a semiautomatic myocardium segmentation method, to the RV strain analysis of three-dimensional (3D) echocardiographic sequences of five open-chest pigs under different steady states. Results: The authors compare the peak two-point strains derived by their method with those estimated from sonomicrometry, and the results show that they have high correlation. The motion of the right ventricular free wall is studied by using segmental strains. The baseline sequence results show that the segmental strains in their method are consistent with results obtained by other image modalities such as MRI. The image sequences of pacing steady states show that segments with the largest strain variation coincide with the pacing sites. Conclusions: The high correlation of the peak two-point strains of their method and sonomicrometry under different steady states demonstrates that their RV motion estimation has high accuracy. The closeness of the segmental strain of their method to those from MRI shows the feasibility of their method in the study of RV function by using 3D echocardiography.

  7. Multi-Temporal Analysis of WWII Reconnaissance Photos

    NASA Astrophysics Data System (ADS)

    Meixner, P.; Eckstein, M.

    2016-06-01

    There are millions of aerial photographs from the period of the Second World War available in the Allied archives, obtained by aerial photo reconnaissance and covering most of today's European countries. They span the time from 1938 until the end of the war and even beyond. Photo reconnaissance provided intelligence for the Allied headquarters and accompanied the bombing offensive against the German homeland and the occupied territories. One of the initial principal targets in Bohemia was the synthetic fuel works STW AG (Sudetenländische Treibstoffwerke AG) in Zaluzi (formerly Maltheuren) near Most (formerly Brück), Czech Republic. The STW AG synthetic fuel plant was not only subject to bombing raids but also to quite intensive photo reconnaissance, long before the start of the bombing campaign. With a multi-temporal analysis of the available imagery from international archives we demonstrate the factory build-up during 1942 and 1943, the effects of the bombing raids in 1944, and the struggle to keep the plant working in the last year of the war. Furthermore, we show the impact the bombings still have today, in the form of potential unexploded ordnance in the adjacent area of the open-cast mines.

  8. Temporal trend analysis of RCRA groundwater monitoring data

    SciTech Connect

    Need, E.A.

    1994-04-01

    Statistical analysis of RCRA groundwater monitoring data at a uranium hexafluoride processing facility showed a statistically significant increase in the concentration of gross beta activity in monitor wells downgradient of surface impoundments storing calcium fluoride sludge and high-pH water. Because evidence of leakage had not been detected in lysimeters installed beneath the impoundments, the operator sought an evaluation of other potential causes of the result, including natural variability. This study determined that all five data sets showed either long-term, excursionary (spike-like), or seasonal forms of temporal variation. Gross beta had an upward long-term trend with multiple excursions that almost appeared to be seasonal. Gross alpha had an upward long-term trend with multiple excursions that were clearly not seasonal. Specific conductance had both upward and downward long-term trends but no other variations. pH had a downward long-term trend with multiple excursions that were clearly not seasonal. Fluoride had a downward long-term trend without excursions but with clear seasonal variations. The gross beta result that appeared to be a significant change was a spike event on the upward long-term trend.
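    A standard nonparametric choice for detecting the long-term trends described here is the Mann-Kendall test; the abstract does not name the specific tests used, so this is an illustrative sketch (no tie correction) rather than the study's method:

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(y):
        # S counts concordant minus discordant pairs over all time-ordered
        # pairs; the normal approximation with continuity correction gives a
        # two-sided p-value for a monotonic trend.
        y = np.asarray(y, dtype=float)
        n = y.size
        s = 0.0
        for i in range(n - 1):
            s += np.sign(y[i + 1:] - y[i]).sum()
        var = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var) if s != 0 else 0.0
        p = 2.0 * (1.0 - norm.cdf(abs(z)))
        return s, p

    print(mann_kendall(np.arange(30.0)))   # S = 435.0, p ~ 0: clear upward trend
    ```

    Being rank-based, the test is robust to the spike-like excursions noted in the data; seasonal variation is usually handled with a seasonal variant applied per month or quarter.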

  9. Quantitative analysis of in vivo confocal microscopy images: a review.

    PubMed

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software packages are available, each with its own advantages and disadvantages. Criteria for analyzing corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definition. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  10. Temporal dynamics and subpopulation analysis of Theileria orientalis genotypes in cattle.

    PubMed

    Jenkins, C; Micallef, M; Alex, S M; Collins, D; Djordjevic, S P; Bogema, D R

    2015-06-01

    In Australia, outbreaks of clinical theileriosis caused by Theileria orientalis have been largely associated with the Ikeda genotype which can occur as a sole infection, or more commonly, as a mixture of genotypes. The most prevalent genotype, Chitose, frequently co-occurs with type Ikeda, however the role of this genotype in clinical disease has not been clearly established. Furthermore, the dynamics of individual genotypes in field infection of cattle have not been examined. In this study we developed quantitative PCR (qPCR) and genotyping methods to examine the role of the Chitose genotype in clinical disease and to investigate the temporal dynamics of T. orientalis Ikeda, Chitose and Buffeli genotypes in naïve animals introduced to a T. orientalis-endemic area. Analysis of the major piroplasm surface protein (MPSP) genes of Chitose isolates revealed the presence of two distinct phylogenetic clusters, Chitose A and Chitose B. A genotyping assay aimed at determining Chitose A/B allele frequency revealed that the Chitose A phylogenetic cluster is strongly associated with clinical disease but nearly always co-occurs with the Ikeda genotype. qPCR revealed that the Chitose genotype (particularly Chitose A), undergoes temporal switching in conjunction with the Ikeda genotype and contributes substantially to the overall parasite burden. The benign Buffeli genotype can also undergo temporal switching but levels of this genotype appear to remain low relative to the Ikeda and Chitose types. Interplay between vector and host immunological factors is presumed to be critical to the population dynamics observed in this study. Genotypic switching likely contributes to the persistence of T. orientalis in the host.
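    Relative parasite burden from qPCR threshold cycles is commonly computed with the 2^(-ddCt) method (assuming ~100% amplification efficiency). The abstract does not give its quantification formula or any Ct values, so the numbers below are invented purely to illustrate a genotype rising between two sampling dates:

    ```python
    def fold_change(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
        # delta-delta-Ct: normalize the target assay to a reference assay,
        # then to a calibrator sample (e.g., the first sampling date)
        d_sample = ct_target - ct_ref
        d_cal = ct_target_cal - ct_ref_cal
        return 2.0 ** -(d_sample - d_cal)

    # hypothetical: Ikeda-type Ct drops 8 cycles vs. an unchanged reference assay
    print(fold_change(22.0, 18.0, 30.0, 18.0))   # 256.0 (2**8-fold increase)
    ```

    Applying this per genotype over successive sampling dates is how the temporal "switching" of genotype abundance can be quantified.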

  11. Quantitative analysis of agricultural land use change in China

    NASA Astrophysics Data System (ADS)

    Chou, Jieming; Dong, Wenjie; Wang, Shuyu; Fu, Yuqing

    This article reviews the potential impacts of climate change on land use change in China. Crop sown area is used as an index to quantitatively analyze the temporal-spatial changes and utilization of agricultural land. A new concept, the potential multiple cropping index, is defined to reflect potential sowing capacity. The impact mechanisms, land use status, and surplus capacity are investigated as well. The main conclusions are as follows. During 1949-2010, agricultural land was greatest in amount in central China, followed by the eastern and western regions. The most rapid increase and decrease of agricultural land were observed in Xinjiang and North China respectively, and Northwest China and South China also changed rapidly. The variation trend before 1980 differed significantly from that after 1980. Agricultural land was affected by both natural and social factors, such as regional climate and environmental changes, population growth, economic development, and implementation of policies. In this paper, the effects of temperature and urbanization on the coverage of agricultural land are evaluated; the results show that urbanization greatly affects the amount of agricultural land in South China, Northeast China, Xinjiang, and Southwest China. From 1980 to 2009, the extent of agricultural land use increased as the surplus capacity decreased. A large potential space still remains, but future utilization of agricultural land should proceed with scientific planning and management for sustainable development.

  12. Quantitative analysis of chromosome condensation in fission yeast.

    PubMed

    Petrova, Boryana; Dehler, Sascha; Kruitwagen, Tom; Hériché, Jean-Karim; Miura, Kota; Haering, Christian H

    2013-03-01

    Chromosomes undergo extensive conformational rearrangements in preparation for their segregation during cell divisions. Insights into the molecular mechanisms behind this still poorly understood condensation process require the development of new approaches to quantitatively assess chromosome formation in vivo. In this study, we present a live-cell microscopy-based chromosome condensation assay in the fission yeast Schizosaccharomyces pombe. By automatically tracking the three-dimensional distance changes between fluorescently marked chromosome loci at high temporal and spatial resolution, we analyze chromosome condensation during mitosis and meiosis and deduce defined parameters to describe condensation dynamics. We demonstrate that this method can determine the contributions of condensin, topoisomerase II, and Aurora kinase to mitotic chromosome condensation. We furthermore show that the assay can identify proteins required for mitotic chromosome formation de novo by isolating mutants in condensin, DNA polymerase ε, and F-box DNA helicase I that are specifically defective in pro-/metaphase condensation. Thus, the chromosome condensation assay provides a direct and sensitive system for the discovery and characterization of components of the chromosome condensation machinery in a genetically tractable eukaryote.
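    The core measurement described, tracking three-dimensional distances between marked loci and deriving a condensation parameter, can be sketched as below. The synthetic tracks, the first/last-quartile averaging, and the end/start distance ratio are illustrative assumptions, not the authors' exact parameter definitions.

    ```python
    import math

    def distance3d(a, b):
        """Euclidean distance between two 3D locus positions."""
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    def condensation_ratio(track_a, track_b):
        """Mean locus-locus distance over the last quartile of time points
        divided by the mean over the first quartile; < 1 indicates the loci
        moved closer together (condensation)."""
        d = [distance3d(p, q) for p, q in zip(track_a, track_b)]
        n = max(1, len(d) // 4)
        return (sum(d[-n:]) / n) / (sum(d[:n]) / n)

    # Hypothetical tracks of two marked loci drifting closer during mitosis.
    track_a = [(0.0, 0.0, 0.0)] * 8
    track_b = [(2.0 - 0.2 * t, 0.0, 0.0) for t in range(8)]
    ratio = condensation_ratio(track_a, track_b)
    ```
    
    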

  13. Fluorescent foci quantitation for high-throughput analysis

    PubMed Central

    Ledesma-Fernández, Elena; Thorpe, Peter H.

    2015-01-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells. PMID:26290880
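    A minimal sketch of background-subtracted focus quantitation in the spirit described above (the radii, the annulus-based background estimate, and the synthetic image are assumptions; FociQuant's actual algorithm may differ):

    ```python
    def focus_intensity(image, cy, cx, r_in=2, r_out=4):
        """Integrated intensity of a focus at (cy, cx), with the local
        background estimated from an annulus around it and subtracted."""
        inner, ring = [], []
        for y, row in enumerate(image):
            for x, v in enumerate(row):
                d2 = (y - cy) ** 2 + (x - cx) ** 2
                if d2 <= r_in ** 2:
                    inner.append(v)
                elif d2 <= r_out ** 2:
                    ring.append(v)
        bg = sum(ring) / len(ring)           # local background per pixel
        return sum(inner) - bg * len(inner)  # background-subtracted sum

    # Synthetic test image: flat background of 10 with 100 extra counts of
    # focus signal at the center.
    img = [[10.0] * 11 for _ in range(11)]
    img[5][5] += 100.0
    signal = focus_intensity(img, 5, 5)
    ```

    On the synthetic image the recovered signal equals the injected 100 counts, independent of the flat background level.
    
    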

  14. Binary Imaging Analysis for Comprehensive Quantitative Assessment of Peripheral Nerve

    PubMed Central

    Hunter, Daniel A.; Moradzadeh, Arash; Whitlock, Elizabeth L.; Brenner, Michael J.; Myckatyn, Terence M.; Wei, Cindy H.; Tung, Thomas H.H.; Mackinnon, Susan E.

    2007-01-01

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques. PMID:17675163
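    Per-fiber myelin thickness can be derived from paired axon and fiber diameters as below; the g-ratio (axon/fiber diameter) is a commonly reported derived measure included here for illustration and is not named in the abstract.

    ```python
    def myelin_metrics(axon_diam, fiber_diam):
        """Myelin thickness from paired diameters (half the difference), plus
        the g-ratio, a standard derived index (illustrative addition)."""
        return {
            "myelin_thickness": (fiber_diam - axon_diam) / 2.0,
            "g_ratio": axon_diam / fiber_diam,
        }

    # Hypothetical diameters in micrometers.
    m = myelin_metrics(axon_diam=6.0, fiber_diam=10.0)
    ```
    
    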

  15. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options for landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of its public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented in five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public private partnership is not the most feasible option, unlike the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised, after which a long term operating contract may follow. PMID:27354014
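    The triage of risk elements into mitigate/monitor/ignore categories reduces to an expected-impact score (probability × magnitude) compared against thresholds; the scores and thresholds below are illustrative, not those used in the study.

    ```python
    def risk_score(probability, impact):
        """Expected impact of a risk element: probability times magnitude,
        both expressed on a 0-1 scale (illustrative convention)."""
        return probability * impact

    def triage(probability, impact, mitigate_at=0.6, monitor_at=0.2):
        """Map a risk element to an action based on its expected impact.
        Threshold values are hypothetical."""
        s = risk_score(probability, impact)
        if s >= mitigate_at:
            return "mitigate"
        if s >= monitor_at:
            return "monitor"
        return "ignore"
    ```

    For example, a near-certain, high-magnitude risk is mitigated, while a rare, moderate one is merely watched or dropped.
    
    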

  16. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
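    Constructing a pseudo-predator signature by bootstrap-sampling prey signatures can be sketched as follows. The prey names, signature values, and the fixed bootstrap size are illustrative; the paper's contribution is precisely to choose bootstrap sample sizes objectively rather than fixing them arbitrarily as done here.

    ```python
    import random

    def pseudo_predator(prey_sigs, diet, n_boot=30, rng=None):
        """Build one pseudo-predator fatty acid signature: bootstrap-sample
        n_boot signatures per prey type, average them, and mix the per-type
        means according to the assumed diet proportions."""
        rng = rng or random.Random(0)          # seeded for reproducibility
        n_fa = len(next(iter(prey_sigs.values()))[0])
        sig = [0.0] * n_fa
        for prey, proportion in diet.items():
            sample = [rng.choice(prey_sigs[prey]) for _ in range(n_boot)]
            mean = [sum(s[i] for s in sample) / n_boot for i in range(n_fa)]
            sig = [sig[i] + proportion * mean[i] for i in range(n_fa)]
        return sig

    # Hypothetical two-prey, two-fatty-acid library (each signature sums to 1).
    prey_sigs = {"seal": [[0.7, 0.3]] * 5, "fish": [[0.2, 0.8]] * 5}
    sig = pseudo_predator(prey_sigs, {"seal": 0.5, "fish": 0.5})
    ```

    Because each prey type's library is constant in this toy example, the pseudo-predator signature is the exact diet-weighted mixture and still sums to one.
    
    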

  17. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100 and 60 micrometers to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
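    One simple pseudometric on maps, an L1 distance between normalized column-density histograms, illustrates the idea of quantitatively comparing any two maps; the binning scheme and this particular distance are assumptions for illustration, not the authors' output functions.

    ```python
    def density_distance(map_a, map_b, bins=16):
        """Pseudometric on 2D maps: half the L1 distance between their
        min-max-normalized value histograms. Distinct maps can sit at
        distance zero (e.g. one is a rescaling of the other), which is
        exactly what makes this a pseudometric rather than a metric."""
        def hist(m):
            vals = [v for row in m for v in row]
            lo, hi = min(vals), max(vals)
            h = [0] * bins
            for v in vals:
                i = min(bins - 1, int((v - lo) / (hi - lo + 1e-12) * bins))
                h[i] += 1
            n = len(vals)
            return [c / n for c in h]
        return sum(abs(a - b) for a, b in zip(hist(map_a), hist(map_b))) / 2
    ```
    
    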

  18. Facegram - Objective quantitative analysis in facial reconstructive surgery.

    PubMed

    Gerós, Ana; Horta, Ricardo; Aguiar, Paulo

    2016-06-01

    Evaluation of effectiveness in reconstructive plastic surgery has become an increasingly important asset in comparing and choosing the most suitable medical procedure to handle facial disfigurement. Unfortunately, traditional methods to assess the results of surgical interventions are mostly qualitative and lack information about movement dynamics. Along with this, the few existing methodologies tailored to objectively quantify surgery results are not practical in the medical field due to constraints in terms of cost, complexity and poor suitability to clinical environments. These limitations enforce an urgent need for a new system to quantify facial movement that allows easy interpretation by medical experts. With this in mind, we present here a novel method capable of quantitatively and objectively assessing complex facial movements, using a set of morphological, static and dynamic measurements. For this purpose, RGB-D cameras are used to acquire both color and depth images, and a modified block matching algorithm, combining depth and color information, was developed to track the position of anatomical landmarks of interest. The algorithms are integrated into a user-friendly graphical interface and the analysis outcomes are organized into an innovative medical tool, named facegram. This system was developed in close collaboration with plastic surgeons and the methods were validated using control subjects and patients with facial paralysis. The system was shown to provide useful and detailed quantitative information (static and dynamic) making it an appropriate solution for objective quantitative characterization of facial movement in a clinical environment. PMID:26994664
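    The landmark-tracking step rests on block matching. A minimal sum-of-squared-differences version on a single intensity channel (without the paper's depth/color combination, which is their modification) might look like:

    ```python
    def ssd(prev, curr, cy, cx, dy, dx, r):
        """Sum of squared differences between the (2r+1)^2 block around
        (cy, cx) in prev and the block displaced by (dy, dx) in curr."""
        s = 0.0
        for oy in range(-r, r + 1):
            for ox in range(-r, r + 1):
                s += (prev[cy + oy][cx + ox] - curr[cy + dy + oy][cx + dx + ox]) ** 2
        return s

    def match_block(prev, curr, cy, cx, r=1, search=2):
        """Exhaustively search a (2*search+1)^2 window for the displacement
        minimizing SSD; returns the best (dy, dx)."""
        return min(((dy, dx) for dy in range(-search, search + 1)
                    for dx in range(-search, search + 1)),
                   key=lambda d: ssd(prev, curr, cy, cx, d[0], d[1], r))

    # Synthetic frames: a bright landmark moves one pixel down.
    prev = [[0.0] * 7 for _ in range(7)]
    curr = [[0.0] * 7 for _ in range(7)]
    prev[3][3] = 1.0
    curr[4][3] = 1.0
    shift = match_block(prev, curr, 3, 3)
    ```
    
    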

  19. Reference standard and statistical model for intersite and temporal comparisons of CT attenuation in a multicenter quantitative lung study

    PubMed Central

    Sieren, J. P.; Newell, J. D.; Judy, P. F.; Lynch, D. A.; Chan, K. S.; Guo, J.; Hoffman, E. A.

    2012-01-01

    Purpose: The purpose of this study was to detect and analyze anomalies between a large number of computed tomography (CT) scanners, tracked over time, utilized to collect human pulmonary CT data for a national multicenter study: chronic obstructive pulmonary disease genetic epidemiology study (COPDGene). Methods: A custom designed CT reference standard “Test Object” has been developed to evaluate the relevant differences in CT attenuation between CT scanners in COPDGene. The materials used in the Test Object to assess CT scanner accuracy and precision included lung equivalent foam (−856 HU), internal air (−1000 HU), water (0 HU), and acrylic (120 HU). Nineteen examples of the Test Object were manufactured. Initially, all Test Objects were scanned on the same CT scanner before the Test Objects were sent to the 20 specific sites and 42 individual CT scanners that were used in the study. The Test Objects were scanned over 17 months while the COPDGene study continued to recruit subjects. A mixed linear effect statistical analysis of the CT scans on the 19 Test Objects was performed. The statistical model reflected influence of reconstruction kernels, tube current, individual Test Objects, CT scanner models, and temporal consistency on CT attenuation. Results: Depending on the Test Object material, there were significant differences between reconstruction kernels, tube current, individual Test Objects, CT scanner models, and temporal consistency. The two Test Object materials of most interest were lung equivalent foam and internal air. With lung equivalent foam, there were significant (p < 0.05) differences between the Siemens B31 (−856.6 ± 0.82; mean ± SE) and the GE Standard (−856.6 ± 0.83) reconstruction kernel relative to the Siemens B35 reference standard (−852.5 ± 1.4). Comparing lung equivalent foam attenuation there were also significant differences between CT scanner models (p < 0.01), tube current (p < 0.005), and in temporal consistency (p
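    Checking a scanner's measured attenuation against the Test Object reference materials reduces to a per-material bias computation. The reference HU values are those stated in the abstract; the measured ROI means below are hypothetical.

    ```python
    # Reference attenuation values (HU) of the Test Object materials, as
    # given in the study abstract.
    REFERENCE_HU = {"lung_foam": -856, "air": -1000, "water": 0, "acrylic": 120}

    def attenuation_bias(measured_hu):
        """Per-material difference between a scanner's measured mean HU and
        the Test Object reference values; large biases flag anomalies."""
        return {m: measured_hu[m] - REFERENCE_HU[m] for m in REFERENCE_HU}

    # Hypothetical ROI means from one scanner/kernel combination.
    bias = attenuation_bias({"lung_foam": -852.5, "air": -999.0,
                             "water": 0.4, "acrylic": 121.0})
    ```
    
    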

  20. Qualitative and temporal reasoning in engine behavior analysis

    NASA Technical Reports Server (NTRS)

    Dietz, W. E.; Stamps, M. E.; Ali, M.

    1987-01-01

    Numerical simulation models, engine experts, and experimental data are used to generate qualitative and temporal representations of abnormal engine behavior. Engine parameters monitored during operation are used to generate qualitative and temporal representations of actual engine behavior. Similarities between the representations of failure scenarios and the actual engine behavior are used to diagnose fault conditions which have already occurred, or are about to occur; to increase the surveillance by the monitoring system of relevant engine parameters; and to predict likely future engine behavior.

  1. Quantitative Northern Blot Analysis of Mammalian rRNA Processing.

    PubMed

    Wang, Minshi; Pestov, Dimitri G

    2016-01-01

    Assembly of eukaryotic ribosomes is an elaborate biosynthetic process that begins in the nucleolus and requires hundreds of cellular factors. Analysis of rRNA processing has been instrumental for studying the mechanisms of ribosome biogenesis and effects of stress conditions on the molecular milieu of the nucleolus. Here, we describe the quantitative analysis of the steady-state levels of rRNA precursors, applicable to studies in mammalian cells and other organisms. We include protocols for gel electrophoresis and northern blotting of rRNA precursors using procedures optimized for the large size of these RNAs. We also describe the ratio analysis of multiple precursors, a technique that facilitates the accurate assessment of changes in the efficiency of individual pre-rRNA processing steps. PMID:27576717
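    The ratio analysis of multiple precursors can be sketched as precursor/product band ratios compared against a control: a rise in a ratio flags a slowed processing step. The precursor labels and level values below are illustrative, not a complete mammalian pre-rRNA processing map.

    ```python
    def step_ratios(levels, steps):
        """For each processing step (precursor -> product), compute the
        precursor/product band-intensity ratio."""
        return {f"{pre}/{post}": levels[pre] / levels[post] for pre, post in steps}

    # Hypothetical two-step pathway with band intensities from a blot.
    steps = [("47S", "30S"), ("30S", "18S")]
    control = step_ratios({"47S": 1.0, "30S": 2.0, "18S": 4.0}, steps)
    treated = step_ratios({"47S": 3.0, "30S": 2.0, "18S": 4.0}, steps)

    # Fold change of each step ratio versus control: values > 1 suggest the
    # corresponding processing step is less efficient under treatment.
    fold = {k: treated[k] / control[k] for k in control}
    ```

    In this toy case the accumulated 47S precursor raises the 47S/30S ratio threefold while the downstream step is unchanged, localizing the defect to the first step.
    
    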

  2. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with reference to recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Rapid Commun. Mass Spec Rev 34:148–165, 2015. PMID:24889823

  3. Quantitative analysis of motion control in long term microgravity.

    PubMed

    Baroni, G; Ferrigno, G; Anolli, A; Andreoni, G; Pedotti, A

    1998-01-01

    In the frame of the 179-day EUROMIR '95 space mission, two in-flight experiments called for quantitative three-dimensional human movement analysis in microgravity. For this aim, a space-qualified opto-electronic motion analyser based on passive markers was installed onboard the Russian space station MIR, and 8 in-flight sessions were performed. The technology and methods for the collection of kinematic data are described, and the accuracy of three-dimensional marker localisation is evaluated. Results confirm the suitability of opto-electronic technology for quantitative human motion analysis on orbital modules and yield a set of "lessons learned" that improve motion analyser performance while speeding up on-board operations. Within the experimental program of T4, results of three voluntary posture perturbation protocols are described. The analysis suggests that a short-term reinterpretation of proprioceptive information and re-calibration of sensorimotor mechanisms ends within the first weeks of flight, while a continuous long-term adaptation process allows the refinement of motor performance within never-abandoned terrestrial strategies.

  4. Computer compensation for NMR quantitative analysis of trace components

    SciTech Connect

    Nakayama, T.; Fujiwara, Y.

    1981-07-22

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as the theoretical curve of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA.
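    Because a Lorentzian model with known position and width is linear in its amplitude, the least-squares coefficient has a closed form. A sketch of that one-coefficient fit (the sampling grid and peak parameters are illustrative; the actual program fits multiple overlapping peaks):

    ```python
    def lorentzian(x, x0, gamma):
        """Unit-amplitude Lorentzian line shape centered at x0 with
        half-width gamma."""
        return gamma ** 2 / ((x - x0) ** 2 + gamma ** 2)

    def fit_amplitude(xs, ys, x0, gamma):
        """Least-squares amplitude for a Lorentzian of known position and
        width: A = sum(y*f) / sum(f*f), the closed-form solution for a
        model linear in its single coefficient."""
        fs = [lorentzian(x, x0, gamma) for x in xs]
        return sum(y * f for y, f in zip(ys, fs)) / sum(f * f for f in fs)

    # Synthetic peak of known amplitude 2.5; the fit should recover it.
    xs = [i * 0.1 - 5.0 for i in range(101)]
    ys = [2.5 * lorentzian(x, 0.0, 1.0) for x in xs]
    amp = fit_amplitude(xs, ys, 0.0, 1.0)
    ```
    
    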

  5. Flow quantitation by radio frequency analysis of contrast echocardiography.

    PubMed

    Rovai, D; Lombardi, M; Mazzarisi, A; Landini, L; Taddei, L; Distante, A; Benassi, A; L'Abbate, A

    1993-03-01

    Contrast echocardiography has the potential for measuring cardiac output and regional blood flow. However, accurate quantitation is limited both by the use of non-standard contrast agents and by the electronic signal distortion inherent to the echocardiographic instruments. Thus, the aim of this study is to quantify flow by combining a stable contrast agent and a modified echo equipment, able to sample the radio frequency (RF) signal from a region of interest (ROI) in the echo image. The contrast agent SHU-454 (0.8 ml) was bolus injected into an in vitro calf vein, at 23 flow rates (ranging from 376 to 3620 ml/min) but constant volume and pressure. The ROI was placed in the centre of the vein, the RF signal was processed in real time and transferred to a personal computer to generate time-intensity curves. In the absence of recirculation, contrast washout slope and mean transit time (MTT) of curves (1.11-8.52 seconds) yielded excellent correlations with flow: r = 0.93 and 0.95, respectively. To compare the accuracy of RF analysis with that of conventional image processing as to flow quantitation, conventional images were collected in the same flow model by two different scanners: a) the mechanical sector scanner used for RF analysis, and b) a conventional electronic sector scanner. These images were digitized off-line, mean videodensity inside an identical ROI was measured and time-intensity curves were built. MTT by RF was shorter than by videodensitometric analysis of the images generated by the same scanner (p < 0.001). In contrast, MTT by RF was longer than by the conventional scanner (p < 0.001). Significant differences in MTT were also found with changes in the gain setting controls of the conventional scanner. To study the stability of the contrast effect, 6 contrast injections (20 ml) were performed at a constant flow rate during recirculation: the spontaneous decay in RF signal intensity (t1/2 = 64 +/- 8 seconds) was too long to affect MTT significantly
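    The mean transit time of a time-intensity curve, one of the two flow correlates used above, is the intensity-weighted mean of the time axis. A minimal sketch with synthetic data (the curve values are illustrative, not RF measurements):

    ```python
    def mean_transit_time(times, intensities):
        """MTT of a time-intensity curve: sum(t * I) / sum(I). Flow rate is
        expected to correlate inversely with MTT, as in the study."""
        return (sum(t * i for t, i in zip(times, intensities))
                / sum(intensities))

    # Synthetic washout curve symmetric about t = 3 s.
    mtt = mean_transit_time([1, 2, 3, 4, 5], [1, 2, 4, 2, 1])
    ```
    
    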

  6. Quantitative proteomic analysis of drug-induced changes in mycobacteria.

    PubMed

    Hughes, Minerva A; Silva, Jeffrey C; Geromanos, Scott J; Townsend, Craig A

    2006-01-01

    A new approach for qualitative and quantitative proteomic analysis using capillary liquid chromatography and mass spectrometry to study the protein expression response in mycobacteria following isoniazid treatment is discussed. In keeping with known effects on the fatty acid synthase II pathway, proteins encoded by the kas operon (AcpM, KasA, KasB, Accd6) were significantly overexpressed, as were those involved in iron metabolism and cell division suggesting a complex interplay of metabolic events leading to cell death. PMID:16396495

  7. [Quantitative analysis for mast cells in obstructive sialadenitis].

    PubMed

    Diao, G X

    1993-03-01

    Quantitative analysis of mast cells in 27 cases of obstructive sialadenitis, 12 cases of approximately normal salivary gland tissue, and 5 cases of lymphoepithelial lesion of the salivary glands shows that the number of mast cells increases slightly with the severity grade of obstructive sialadenitis. This is closely related to fibrosis of the salivary glands and to the grade of inflammatory cell infiltration (dominated by lymphocytes), but not to patient age. In cases of benign lymphoepithelial lesion of the salivary glands with malignant changes, whether malignant lymphoma or squamous cell carcinoma, the number of mast cells is markedly decreased.

  8. Altered resting-state functional activity in posttraumatic stress disorder: A quantitative meta-analysis

    PubMed Central

    Wang, Ting; Liu, Jia; Zhang, Junran; Zhan, Wang; Li, Lei; Wu, Min; Huang, Hua; Zhu, Hongyan; Kemp, Graham J.; Gong, Qiyong

    2016-01-01

    Many functional neuroimaging studies have reported differential patterns of spontaneous brain activity in posttraumatic stress disorder (PTSD), but the findings are inconsistent and have not so far been quantitatively reviewed. The present study set out to determine consistent, specific regional brain activity alterations in PTSD, using the Effect Size Signed Differential Mapping technique to conduct a quantitative meta-analysis of resting-state functional neuroimaging studies of PTSD that used either a non-trauma (NTC) or a trauma-exposed (TEC) comparison control group. Fifteen functional neuroimaging studies were included, comparing 286 PTSD patients, 203 TECs and 155 NTCs. Compared with NTC, PTSD patients showed hyperactivity in the right anterior insula and bilateral cerebellum, and hypoactivity in the dorsal medial prefrontal cortex (mPFC); compared with TEC, PTSD showed hyperactivity in the ventral mPFC. The pooled meta-analysis showed hypoactivity in the posterior insula, superior temporal, and Heschl’s gyrus in PTSD. Additionally, subgroup meta-analysis (non-medicated subjects vs. NTC) identified abnormal activation in the prefrontal-limbic system. In meta-regression analyses, mean illness duration was positively associated with activity in the right cerebellum (PTSD vs. NTC), and illness severity was negatively associated with activity in the right lingual gyrus (PTSD vs. TEC). PMID:27251865

  9. Lipid biomarker analysis for the quantitative analysis of airborne microorganisms

    SciTech Connect

    Macnaughton, S.J.; Jenkins, T.L.; Cormier, M.R.

    1997-08-01

    There is ever increasing concern regarding the presence of airborne microbial contaminants within indoor air environments. Exposure to such biocontaminants can give rise to a range of health effects including infectious diseases, allergenic responses, and respiratory problems. Biocontaminants typically found in indoor air environments include bacteria, fungi, algae, protozoa, and dust mites. Mycotoxins, endotoxins, pollens, and residues of organisms are also known to cause adverse health effects. A quantitative detection/identification technique independent of culturability, which assays both culturable and non-culturable biomass including endotoxin, is critical in defining risks from indoor air biocontamination. Traditionally, methods employed for monitoring microorganism numbers in indoor air environments involve classical culture-based techniques and/or direct microscopic counting. It has been repeatedly documented that viable microorganism counts account for only 0.1-10% of the total community detectable by direct counting. The classic viable microbiological approach does not provide accurate estimates of microbial fragments or other indoor air components that can act as antigens and induce or potentiate allergic responses. Although bioaerosol samplers are designed to damage the microbes as little as possible, microbial stress has been shown to result from air sampling, aerosolization, and microbial collection. Higher collection efficiency results in greater cell damage, while less cell damage often results in lower collection efficiency. Filtration can collect particulates at almost 100% efficiency, but captured microorganisms may become dehydrated and damaged, resulting in non-culturability. The lipid biomarker assays described herein, however, do not rely on cell culture: lipids are universally distributed throughout cells, providing a means of assessment independent of culturability.

  10. Spatial and Temporal Variation in Selection of Genes Associated with Pearl Millet Varietal Quantitative Traits In situ

    PubMed Central

    Mariac, Cédric; Ousseini, Issaka S.; Alio, Abdel-Kader; Jugdé, Hélène; Pham, Jean-Louis; Bezançon, Gilles; Ronfort, Joelle; Descroix, Luc; Vigouroux, Yves

    2016-01-01

    Ongoing global climate changes imply new challenges for agriculture. Whether plants and crops can adapt to such rapid changes is still a widely debated question. We previously showed adaptation in the form of earlier flowering in pearl millet at the scale of a whole country over three decades. However, this analysis did not deal with variability of year to year selection. To understand and possibly manage plant and crop adaptation, we need more knowledge of how selection acts in situ. Is selection gradual, abrupt, and does it vary in space and over time? In the present study, we tracked the evolution of allele frequency in two genes associated with pearl millet phenotypic variation in situ. We sampled 17 populations of cultivated pearl millet over a period of 2 years. We tracked changes in allele frequencies in these populations by genotyping more than seven thousand individuals. We demonstrate that several allele frequency changes are compatible with selection, after correcting for changes attributable to genetic drift. We found marked variation in allele frequencies from year to year, suggesting a variable selection effect in space and over time. We estimated the strength of selection associated with variations in allele frequency. Our results suggest that the polymorphism maintained at the genes we studied is partially explained by the spatial and temporal variability of selection. In response to environmental changes, traditional pearl millet varieties could rapidly adapt thanks to this available functional variability. PMID:27507986
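    Screening observed allele-frequency changes against drift can be sketched with the Wright-Fisher drift variance p(1-p)/(2Ne); changes much larger than the drift standard deviation make selection a candidate explanation. The effective population size, threshold, and frequencies below are illustrative, not the study's estimates.

    ```python
    import math

    def drift_sd(p, ne, generations=1):
        """Standard deviation of allele-frequency change expected from drift
        alone over the given number of generations (Wright-Fisher
        approximation, valid for small changes)."""
        return math.sqrt(p * (1 - p) * generations / (2 * ne))

    def exceeds_drift(p0, p1, ne, z=2.0):
        """True if the observed change from p0 to p1 is larger than z drift
        standard deviations, flagging a candidate for selection."""
        return abs(p1 - p0) > z * drift_sd(p0, ne)
    ```

    For example, with Ne = 1000 a jump from 0.5 to 0.9 in one generation far exceeds drift expectations, while a shift to 0.51 does not.
    
    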

  11. Temporal and multiple quantitative trait loci analyses of resistance to bacterial wilt in tomato permit the resolution of linked loci.

    PubMed

    Mangin, B; Thoquet, P; Olivier, J; Grimsley, N H

    1999-03-01

    Ralstonia solanacearum is a soil-borne bacterium that causes the serious disease known as bacterial wilt in many plant species. In tomato, several QTL controlling resistance have been found, but in different studies, markers spanning a large region of chromosome 6 showed strong association with the resistance. By using two different approaches to analyze the data from a field test F3 population, we show that at least two separate loci approximately 30 cM apart on this chromosome are most likely involved in the resistance. First, a temporal analysis of the progression of symptoms reveals a distal locus early in the development of the disease. As the disease progresses, the maximum LOD peak observed shifts toward the proximal end of the chromosome, obscuring the distal locus. Second, although classical interval mapping could only detect the presence of one locus, a statistical "two-QTL model" test, specifically adapted for the resolution of linked QTL, strongly supported the hypothesis for the presence of two loci. These results are discussed in the context of current molecular knowledge about disease resistance genes on chromosome 6 and observations made by tomato breeders during the production of bacterial wilt-resistant varieties. PMID:10049932

  12. Temporal and multiple quantitative trait loci analyses of resistance to bacterial wilt in tomato permit the resolution of linked loci.

    PubMed Central

    Mangin, B; Thoquet, P; Olivier, J; Grimsley, N H

    1999-01-01

    Ralstonia solanacearum is a soil-borne bacterium that causes the serious disease known as bacterial wilt in many plant species. In tomato, several QTL controlling resistance have been found, but in different studies, markers spanning a large region of chromosome 6 showed strong association with the resistance. By using two different approaches to analyze the data from a field test F3 population, we show that at least two separate loci approximately 30 cM apart on this chromosome are most likely involved in the resistance. First, a temporal analysis of the progression of symptoms reveals a distal locus early in the development of the disease. As the disease progresses, the maximum LOD peak observed shifts toward the proximal end of the chromosome, obscuring the distal locus. Second, although classical interval mapping could only detect the presence of one locus, a statistical "two-QTL model" test, specifically adapted for the resolution of linked QTL, strongly supported the hypothesis for the presence of two loci. These results are discussed in the context of current molecular knowledge about disease resistance genes on chromosome 6 and observations made by tomato breeders during the production of bacterial wilt-resistant varieties. PMID:10049932

  13. Spatial and Temporal Variation in Selection of Genes Associated with Pearl Millet Varietal Quantitative Traits In situ.

    PubMed

    Mariac, Cédric; Ousseini, Issaka S; Alio, Abdel-Kader; Jugdé, Hélène; Pham, Jean-Louis; Bezançon, Gilles; Ronfort, Joelle; Descroix, Luc; Vigouroux, Yves

    2016-01-01

    Ongoing global climate changes imply new challenges for agriculture. Whether plants and crops can adapt to such rapid changes is still a widely debated question. We previously showed adaptation in the form of earlier flowering in pearl millet at the scale of a whole country over three decades. However, this analysis did not deal with year-to-year variability of selection. To understand and possibly manage plant and crop adaptation, we need more knowledge of how selection acts in situ. Is selection gradual or abrupt, and does it vary in space and over time? In the present study, we tracked the evolution of allele frequency in two genes associated with pearl millet phenotypic variation in situ. We sampled 17 populations of cultivated pearl millet over a period of 2 years. We tracked changes in allele frequencies in these populations by genotyping more than seven thousand individuals. We demonstrate that several allele frequency changes are compatible with selection, after correcting for allele frequency changes associated with genetic drift. We found marked variation in allele frequencies from year to year, suggesting a variable selection effect in space and over time. We estimated the strength of selection associated with variations in allele frequency. Our results suggest that the polymorphism maintained at the genes we studied is partially explained by the spatial and temporal variability of selection. In response to environmental changes, traditional pearl millet varieties could rapidly adapt thanks to this available functional variability. PMID:27507986
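
    A minimal sketch of the kind of calculation involved, assuming the standard Wright-Fisher drift variance and a weak genic-selection approximation; the frequencies and effective population size below are hypothetical, not values from the study.

```python
import math

def drift_sd(p, ne):
    """Standard deviation of the allele-frequency change expected from
    genetic drift alone in one generation (Wright-Fisher sampling)."""
    return math.sqrt(p * (1 - p) / (2 * ne))

def selection_coefficient(p0, p1):
    """Selection coefficient implied by a one-generation change p0 -> p1
    under weak genic selection, where E[dp] ~ s * p * (1 - p)."""
    return (p1 - p0) / (p0 * (1 - p0))

p0, p1, ne = 0.30, 0.36, 500   # hypothetical frequencies and effective size
dp = p1 - p0
z = dp / drift_sd(p0, ne)      # how many drift SDs the observed change represents
s = selection_coefficient(p0, p1)
```

    A change several drift standard deviations away from zero (large `z`) is what makes an allele-frequency shift "compatible with selection" rather than drift alone.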

  14. Drought analysis in Switzerland: spatial and temporal features

    NASA Astrophysics Data System (ADS)

    Di Franca, Gaetano; Molnar, Peter; Burlando, Paolo; Bonaccorso, Brunella; Cancelliere, Antonino

    2015-04-01

    Drought as a natural hazard may have negative impacts even in regions characterized by a general abundance of water resources. The Swiss Alpine region has experienced several extreme meteorological events (heat waves, droughts) during the last fifty years that have caused human and economic losses. Though Swiss climate is far from arid or semi-arid, natural climatic variability, exacerbated by climate change, could lead to more severe impacts from naturally occurring meteorological droughts (i.e. lack or significant reduction of precipitation) in the future. In this work, spatial and temporal features of meteorological droughts in Switzerland have been explored by the identification and probabilistic characterization of historic drought events on gridded precipitation data during the period 1961-2012. The run method has been applied to both monthly and annual precipitation time series to probabilistically characterize drought occurrences as well as to analyze their spatial variability. Spatial features have also been investigated by means of Principal Components Analysis (PCA) applied to Standardized Precipitation Index (SPI) series at 3, 6, and 12-month aggregated time scale, in order to detect areas with distinct precipitation patterns, accounting for seasonality throughout year and including both wet and dry conditions. Furthermore, a probabilistic analysis of drought areal extent has been carried out by applying an SPI-based procedure to derive Severity-Area-Frequency (SAF) curves. The application of run method reveals that Ticino and Valais are the most potentially drought-prone Swiss regions, since accumulated deficit precipitation is significantly higher (up to two times) than in the rest of the country. Inspection of SPI series reveals many events in which precipitation has shown significant anomalies from the average in the period 1961-2012 at the investigated time scales. 
Anomalies in rainfall seem to exhibit high spatial correlation, showing uniform sub
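
    The run method described above can be sketched in a few lines: a drought event is a maximal run of consecutive periods with precipitation below a threshold, characterized by its duration and accumulated deficit. The series and threshold below are invented for illustration.

```python
def drought_runs(precip, threshold):
    """Identify drought events with the run method: maximal runs of
    consecutive periods below a precipitation threshold. Returns
    (start_index, duration, accumulated_deficit) per event."""
    events, start, deficit = [], None, 0.0
    for i, x in enumerate(precip):
        if x < threshold:
            if start is None:
                start, deficit = i, 0.0
            deficit += threshold - x
        elif start is not None:
            events.append((start, i - start, deficit))
            start = None
    if start is not None:  # series ends inside a drought
        events.append((start, len(precip) - start, deficit))
    return events

# hypothetical monthly precipitation (mm) against a 50 mm threshold
series = [80, 40, 30, 60, 20, 10, 25, 70]
events = drought_runs(series, 50.0)
# two events: months 1-2 (deficit 30 mm) and months 4-6 (deficit 95 mm)
```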

  15. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain largely unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) model in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare its power with that of multivariate pairwise interaction analysis and single-trait interaction analysis by a univariate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes.
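
    A simplified, single-variant version of a gene-gene interaction test (not the MFRG model itself, which operates on genotype functions) can be sketched as a nested-model F test on the interaction term; the genotypes and effect size below are simulated for illustration.

```python
import numpy as np

def interaction_f_stat(g1, g2, y):
    """F statistic for the g1 x g2 interaction term, comparing the additive
    model y ~ 1 + g1 + g2 against the full model with the g1*g2 product."""
    n = len(y)
    ones = np.ones(n)
    X0 = np.column_stack([ones, g1, g2])            # additive model
    X1 = np.column_stack([ones, g1, g2, g1 * g2])   # with interaction
    rss = []
    for X in (X0, X1):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        rss.append(float(r @ r))
    df1, df2 = 1, n - X1.shape[1]
    return ((rss[0] - rss[1]) / df1) / (rss[1] / df2)

rng = np.random.default_rng(1)
n = 300
g1 = rng.integers(0, 3, n).astype(float)  # genotype dosages 0/1/2
g2 = rng.integers(0, 3, n).astype(float)
y = 0.5 * g1 * g2 + rng.normal(0, 1, n)   # purely epistatic effect
f = interaction_f_stat(g1, g2, y)         # large F => evidence of epistasis
```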

  16. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain largely unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) model in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare its power with that of multivariate pairwise interaction analysis and single-trait interaction analysis by a univariate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes. PMID:27104857

  17. Match analysis and temporal patterns of fatigue in rugby sevens.

    PubMed

    Granatelli, Giampietro; Gabbett, Tim J; Briotti, Gianluca; Padulo, Johnny; Buglione, Antonio; D'Ottavio, Stefano; Ruscello, Bruno M

    2014-03-01

    Rugby sevens is a rapidly growing sport. Match analysis is increasingly being used by sport scientists and coaches to improve the understanding of the physical demands of this sport. This study investigated the physical and physiological demands of elite men's rugby sevens, with special reference to the temporal patterns of fatigue during match play. Nine players, 4 backs and 5 forwards (age 25.1 ± 3.1 years) participated during 2 "Roma 7" international tournaments (2010 and 2011). All the players were at the professional level in the highest Italian rugby union division, and 5 of these players also competed at the international level. During the matches (n = 15), the players were filmed to assess game performance. Global positioning system, heart rate (HR), and blood lactate (BLa) concentration data were measured and analyzed. The mean total distance covered throughout matches was 1,221 ± 118 m (first half = 643 ± 70 m and second half = 578 ± 77 m; a decrease of 11.2%, p > 0.05, Effect Size [ES] = 0.29). The players achieved 88.3 ± 4.2 and 87.7 ± 3.4% of the HRmax during the first and second halves, respectively. The BLa for the first and second halves was 3.9 ± 0.9 and 11.2 ± 1.4 mmol·L⁻¹, respectively. The decreases in performance occurred consistently in the final 3 minutes of the matches (-40.5% in the distance covered per minute). The difference found in relation to playing position, although not statistically significant (p = 0.11), showed a large ES (η = 0.20), suggesting possible practical implications. These results demonstrate that rugby sevens is a demanding sport that places stress on both the anaerobic glycolytic and aerobic oxidative energy systems. Strength and conditioning programs designed to train these energy pathways may prevent fatigue-induced reductions in physical performance.

  18. Temporal Analysis of Remotely Sensed Precipitation Products for Hydrological Applications

    NASA Astrophysics Data System (ADS)

    Tobin, K. J.; Bennett, M. E.

    2011-12-01

    No study has systematically evaluated streamflow modeling between monthly and daily timescales. This study examines streamflow from eight watersheds across the United States where five different precipitation products were used as primary input into the Soil and Water Assessment Tool (SWAT) to generate simulated streamflow. Timescales examined include monthly, dekad (10 day), pentad (5 day), triad (3 day), and daily. The eight basins studied are the San Pedro (Arizona); Cimarron (north-central Oklahoma); mid-Nueces (south Texas); mid-Rio Grande (south Texas and northern Mexico), Yocano (northern Mississippi); Alapaha (south Georgia); Upper Tar (North Carolina) and mid-St. Francis (eastern Arkansas). The precipitation products used to drive simulations include rain gauge, NWS Multisensor Precipitation Estimator, Tropical Rainfall Measuring Mission (TRMM) Multi-Satellite Precipitation Analysis, TRMM 3B42-V6, and Climate Prediction Center Morphing Method (CMORPH). Understanding how streamflow varies at sub-monthly timescales is important because there are a host of hydrological applications such as flood forecast guidance and reservoir inflow forecasts that reside in a temporal domain between monthly and daily timescales. The major finding of this study is the quantification of a strong positive correlation between performance metrics and the time step at which model performance deteriorates. Basically, better-performing simulations, with Nash-Sutcliffe values of 0.80 and above, can support modeling at finer timescales to at least daily and perhaps beyond into the sub-daily realm. These findings are significant in that they clearly document the ability of SWAT to support modeling at sub-monthly time steps, which is beyond the capability for which SWAT was initially designed.
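
    The Nash-Sutcliffe efficiency used as the performance metric above has a simple closed form; a minimal sketch with invented observed and simulated flows:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum((obs - sim)^2) / sum((obs - mean)^2).
    1.0 is a perfect fit; 0.0 means no better than predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# hypothetical observed vs. simulated streamflow (m^3/s)
obs = [3.0, 5.0, 9.0, 6.0, 2.0]
sim = [2.5, 5.5, 8.0, 6.5, 2.5]
nse = nash_sutcliffe(obs, sim)  # ~0.93, in the "0.80 and above" range discussed
```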

  19. Application of a temporal reasoning framework tool in analysis of medical device adverse events.

    PubMed

    Clark, Kimberly K; Sharma, Deepak K; Chute, Christopher G; Tao, Cui

    2011-01-01

    The Clinical Narrative Temporal Relation Ontology (CNTRO) project offers a semantic-web-based reasoning framework that represents temporal events and relationships within clinical narrative texts and infers new knowledge from them. In this paper, the CNTRO reasoning framework is applied to temporal analysis of medical device adverse event files. One specific adverse event was used as a test case: late stent thrombosis. Adverse event narratives were obtained from the Food and Drug Administration's (FDA) Manufacturer and User Facility Device Experience (MAUDE) database. Fifteen adverse event files in which late stent thrombosis was confirmed were randomly selected across multiple drug-eluting stent devices. From these files, 81 events and 72 temporal relations were annotated. 73 temporal questions were generated, of which 65 were correctly answered by the CNTRO system, an overall accuracy of 89%. This system should be pursued further to continue assessing its potential benefits in temporal analysis of medical device adverse events.
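
    One elementary kind of temporal inference such a reasoner performs is transitive closure over "before" relations; a minimal sketch with hypothetical event names (not taken from the MAUDE files):

```python
def before_closure(pairs):
    """Transitive closure of 'before' relations: if a precedes b and b
    precedes c, infer that a precedes c (a simple form of the temporal
    inference a framework like CNTRO's automates)."""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        for a, b in list(closure):
            for c, d in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

# annotated events from a hypothetical adverse-event narrative
annotated = {("stent_implant", "chest_pain"), ("chest_pain", "thrombosis_dx")}
inferred = before_closure(annotated)
# the reasoner can now answer: did implantation precede the diagnosis?
```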

  20. Quantitative analysis of Caenorhabditis elegans chemotaxis using a microfluidic device.

    PubMed

    Hu, Liang; Ye, Jinjuan; Tan, Haowei; Ge, Anle; Tang, Lichun; Feng, Xiaojun; Du, Wei; Liu, Bi-Feng

    2015-08-01

    Caenorhabditis elegans, one of the most widely studied model organisms, senses external chemical cues and performs corresponding chemotaxis behaviors through its simple chemosensory neuronal system. To study the mechanism underlying chemosensory behavior, a rapid and reliable method for quantitatively analyzing the worms' behaviors is essential. In this work, we demonstrated a microfluidic approach for investigating chemotaxis responses of worms to chemical gradients. The flow-based microfluidic chip consisted of circular tree-like microchannels, which were able to generate eight flow streams containing stepwise chemical concentrations without differences in flow velocity. Worms' upstream swimming into microchannels with various concentrations was monitored for quantitative analysis of the chemotaxis behavior. By using this microfluidic chip, the attractive and repellent responses of C. elegans to NaCl were successfully quantified within several minutes. The results demonstrated wild-type-like repellent responses and severely impaired attractive responses in grk-2 mutant animals with defects in calcium influx. In addition, the chemotaxis analysis of third-stage larvae revealed that their gustatory response differs from that in the adult stage. Thus, our microfluidic method provided a useful platform for studying the chemosensory behaviors of C. elegans and screening of chemosensation-related drugs.
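
    Chemotaxis responses of this kind are conventionally summarized by a chemotaxis index; the sketch below uses the classical (A - B)/(A + B) definition with invented counts, and is not necessarily the exact metric used in this study.

```python
def chemotaxis_index(n_toward, n_away):
    """Classical chemotaxis index: (A - B) / (A + B), where A is the number
    of worms moving toward the cue and B the number moving away.
    +1 is full attraction, -1 full repulsion, 0 indifference."""
    total = n_toward + n_away
    if total == 0:
        raise ValueError("no worms scored")
    return (n_toward - n_away) / total

ci_attract = chemotaxis_index(42, 8)  # e.g. wild type toward low NaCl
ci_repel = chemotaxis_index(5, 45)    # e.g. away from a high NaCl gradient
```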

  1. Quantitative analysis of Caenorhabditis elegans chemotaxis using a microfluidic device.

    PubMed

    Hu, Liang; Ye, Jinjuan; Tan, Haowei; Ge, Anle; Tang, Lichun; Feng, Xiaojun; Du, Wei; Liu, Bi-Feng

    2015-08-01

    Caenorhabditis elegans, one of the most widely studied model organisms, senses external chemical cues and performs corresponding chemotaxis behaviors through its simple chemosensory neuronal system. To study the mechanism underlying chemosensory behavior, a rapid and reliable method for quantitatively analyzing the worms' behaviors is essential. In this work, we demonstrated a microfluidic approach for investigating chemotaxis responses of worms to chemical gradients. The flow-based microfluidic chip consisted of circular tree-like microchannels, which were able to generate eight flow streams containing stepwise chemical concentrations without differences in flow velocity. Worms' upstream swimming into microchannels with various concentrations was monitored for quantitative analysis of the chemotaxis behavior. By using this microfluidic chip, the attractive and repellent responses of C. elegans to NaCl were successfully quantified within several minutes. The results demonstrated wild-type-like repellent responses and severely impaired attractive responses in grk-2 mutant animals with defects in calcium influx. In addition, the chemotaxis analysis of third-stage larvae revealed that their gustatory response differs from that in the adult stage. Thus, our microfluidic method provided a useful platform for studying the chemosensory behaviors of C. elegans and screening of chemosensation-related drugs. PMID:26320797

  2. Quantitative analysis of echogenicity for patients with thyroid nodules

    PubMed Central

    Wu, Ming-Hsun; Chen, Chiung-Nien; Chen, Kuen-Yuan; Ho, Ming-Chih; Tai, Hao-Chih; Wang, Yu-Hsin; Chen, Argon; Chang, King-Jen

    2016-01-01

    Hypoechogenicity has been described qualitatively and is potentially subject to intra- and inter-observer variability. The aim of this study was to clarify whether quantitative echoic indexes (EIs) are useful for the detection of malignant thyroid nodules. Overall, 333 participants with 411 nodules were included in the final analysis. Quantification of echogenicity was performed using commercial software (AmCAD-UT; AmCad BioMed, Taiwan). The coordinates of three defined regions, the nodule, thyroid parenchyma, and strap muscle regions, were recorded in the database separately for subsequent analysis. The results showed that ultrasound echogenicity (US-E), as assessed by clinicians, identified hypoechogenicity as an independent factor for malignancy. The EI, adjusted EI (EIN-T; EIN-M) and automatic EI(N-R)/R values between benign and malignant nodules were all significantly different, with lower values for malignant nodules. All of the EIs showed similar sensitivity and specificity and had better accuracies than US-E. In conclusion, the proposed quantitative EI appears to be an important advancement over the conventional qualitative US-E, allowing for a more reliable distinction between benign and malignant thyroid nodules. PMID:27762299
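
    The echoic indexes can be illustrated as intensity statistics over the three annotated regions; the formulas and pixel values below are hypothetical stand-ins, since the study's exact definitions are implemented in the AmCAD-UT software.

```python
def mean_intensity(region):
    """Mean gray-scale intensity of a list of pixel values."""
    return sum(region) / len(region)

def echoic_indexes(nodule, thyroid, muscle):
    """Illustrative echoic indexes (hypothetical formulas): a raw EI and
    EIs adjusted to thyroid parenchyma and to strap muscle references."""
    ei = mean_intensity(nodule)
    ei_n_t = ei - mean_intensity(thyroid)  # nodule relative to parenchyma
    ei_n_m = ei - mean_intensity(muscle)   # nodule relative to muscle
    return ei, ei_n_t, ei_n_m

nodule = [55, 60, 58, 57]   # hypoechoic nodule pixels
thyroid = [90, 95, 92, 93]  # surrounding parenchyma
muscle = [40, 45, 42, 43]   # strap muscle reference
ei, ei_n_t, ei_n_m = echoic_indexes(nodule, thyroid, muscle)
# ei_n_t < 0: nodule darker than parenchyma, consistent with hypoechogenicity
```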

  3. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP), and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.
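
    One common way to separate saccades from smooth pursuit in EOG traces is a velocity threshold on the signal derivative; a minimal sketch with an invented trace and threshold, not the study's proposed methods:

```python
def detect_saccades(eog, fs, vel_threshold):
    """Flag samples where the EOG derivative (deg/s) exceeds a velocity
    threshold: a minimal velocity-based saccade detector."""
    saccade = [False]  # no derivative available at the first sample
    for i in range(1, len(eog)):
        velocity = (eog[i] - eog[i - 1]) * fs
        saccade.append(abs(velocity) > vel_threshold)
    return saccade

fs = 100.0  # sampling rate, Hz
# slow pursuit (~20 deg/s) followed by a fast saccade (~500 deg/s)
signal = [0.0, 0.2, 0.4, 0.6, 5.6, 10.6, 10.8, 11.0]
flags = detect_saccades(signal, fs, vel_threshold=100.0)
```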

  4. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    SciTech Connect

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  5. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
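
    Principal component regression, one of the factor-analysis calibration methods discussed, can be sketched with a plain SVD: project mean-centred spectra onto the leading principal components and regress concentration on the scores. The synthetic spectra below are invented for illustration.

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: project mean-centred spectra onto
    the top principal components, then regress y on the scores."""
    x_mean = X.mean(axis=0)
    Xc = X - x_mean
    # principal axes from the SVD of the centred data matrix
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T  # loadings (p x k)
    T = Xc @ V               # scores   (n x k)
    coef, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
    return x_mean, y.mean(), V, coef

def pcr_predict(model, X):
    x_mean, y_mean, V, coef = model
    return (X - x_mean) @ V @ coef + y_mean

rng = np.random.default_rng(2)
n, p = 60, 30
conc = rng.uniform(0, 1, n)          # analyte concentrations
pure = np.sin(np.linspace(0, 3, p))  # a synthetic pure-component spectrum
X = np.outer(conc, pure) + rng.normal(0, 0.01, (n, p))  # spectra + noise
model = pcr_fit(X, conc, n_components=2)
pred = pcr_predict(model, X)
```

    PLS differs from PCR in choosing components that maximize covariance with the concentrations rather than spectral variance alone.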

  6. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    PubMed Central

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-01-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  7. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    DOE PAGES

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  8. Quantitative analysis of intermolecular interactions in orthorhombic rubrene.

    PubMed

    Hathwar, Venkatesha R; Sist, Mattia; Jørgensen, Mads R V; Mamakhel, Aref H; Wang, Xiaoping; Hoffmann, Christina M; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-09-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H-H interactions. The electron density features of H-H bonding, and the interaction energy of molecular dimers connected by H-H interaction clearly demonstrate an importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  9. Temporal analysis of remotely sensed turbidity in a coastal archipelago

    NASA Astrophysics Data System (ADS)

    Suominen, Tapio; Tolvanen, Harri

    2016-07-01

    A topographically fragmented archipelago with dynamic waters sets the preconditions for assessing coherent remotely sensed information. We generated a turbidity dataset for an archipelago coast in the Baltic Sea from MERIS data (FSG L1b), using CoastColour L1P, L2R and L2W processors. We excluded land and mixed pixels by masking the imagery with accurate (1:10 000) shoreline data. Using temporal linear averaging (TLA), we produced satellite-imagery datasets applicable to temporal composites for the summer seasons of three years. The turbidity assessments and temporally averaged data were compared to in situ observations obtained with coastal monitoring programs. The ability of TLA to estimate missing pixel values was further assessed by cross-validation with the leave-one-out method. The correspondence between L2W turbidity and in situ observations was good (r = 0.89), and even after applying TLA the correspondence remained acceptable (r = 0.78). The datasets revealed spatially divergent temporal water characteristics, which may be relevant to management, the design of monitoring, and habitat models. Monitoring observations may be spatially biased if the temporal succession of water properties is not taken into account in coastal areas with anisotropic dispersion of waters and asynchronous annual cycles. Accordingly, areas of varying turbidity may offer a different habitat for aquatic biota than areas of static turbidity, even though they may appear similar if water properties are measured for short annual periods.
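
    Temporal linear averaging can be illustrated per pixel: fit a line to the available observations over time and fill missing dates from the fit. The one-pixel series below is invented; the real processing operates on whole MERIS scenes.

```python
def linear_fill(times, values):
    """Temporal linear averaging in miniature: fit value = a + b*t by least
    squares to the available observations of one pixel, then fill missing
    entries (None) from the fitted line."""
    obs = [(t, v) for t, v in zip(times, values) if v is not None]
    n = len(obs)
    mt = sum(t for t, _ in obs) / n
    mv = sum(v for _, v in obs) / n
    sxx = sum((t - mt) ** 2 for t, _ in obs)
    sxy = sum((t - mt) * (v - mv) for t, v in obs)
    b = sxy / sxx
    a = mv - b * mt
    return [v if v is not None else a + b * t for t, v in zip(times, values)]

# one pixel's turbidity (FNU) across a summer, with two cloud-covered gaps
days = [0, 10, 20, 30, 40]
turbidity = [2.0, None, 3.0, None, 4.0]
filled = linear_fill(days, turbidity)
```

    Leave-one-out cross-validation of this scheme would hold out each real observation in turn, predict it from the remaining ones, and compare.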

  10. Automated monitoring and quantitative analysis of feeding behaviour in Drosophila.

    PubMed

    Itskov, Pavel M; Moreira, José-Maria; Vinnik, Ekaterina; Lopes, Gonçalo; Safarik, Steve; Dickinson, Michael H; Ribeiro, Carlos

    2014-08-04

    Food ingestion is one of the defining behaviours of all animals, but its quantification and analysis remain challenging. This is especially the case for feeding behaviour in small, genetically tractable animals such as Drosophila melanogaster. Here, we present a method based on capacitive measurements, which allows the detailed, automated and high-throughput quantification of feeding behaviour. Using this method, we were able to measure the volume ingested in single sips of an individual, and monitor the absorption of food with high temporal resolution. We demonstrate that flies ingest food by rhythmically extending their proboscis with a frequency that is not modulated by the internal state of the animal. Instead, hunger and satiety homeostatically modulate the microstructure of feeding. These results highlight similarities of food intake regulation between insects, rodents, and humans, pointing to a common strategy in how the nervous systems of different animals control food intake.

  11. Automated monitoring and quantitative analysis of feeding behaviour in Drosophila

    PubMed Central

    Itskov, Pavel M.; Moreira, José-Maria; Vinnik, Ekaterina; Lopes, Gonçalo; Safarik, Steve; Dickinson, Michael H.; Ribeiro, Carlos

    2014-01-01

    Food ingestion is one of the defining behaviours of all animals, but its quantification and analysis remain challenging. This is especially the case for feeding behaviour in small, genetically tractable animals such as Drosophila melanogaster. Here, we present a method based on capacitive measurements, which allows the detailed, automated and high-throughput quantification of feeding behaviour. Using this method, we were able to measure the volume ingested in single sips of an individual, and monitor the absorption of food with high temporal resolution. We demonstrate that flies ingest food by rhythmically extending their proboscis with a frequency that is not modulated by the internal state of the animal. Instead, hunger and satiety homeostatically modulate the microstructure of feeding. These results highlight similarities of food intake regulation between insects, rodents, and humans, pointing to a common strategy in how the nervous systems of different animals control food intake. PMID:25087594

  12. Segmentation and quantitative analysis of individual cells in developmental tissues.

    PubMed

    Nandy, Kaustav; Kim, Jusub; McCullough, Dean P; McAuliffe, Matthew; Meaburn, Karen J; Yamaguchi, Terry P; Gudla, Prabhakar R; Lockett, Stephen J

    2014-01-01

    Image analysis is vital for extracting quantitative information from biological images and is used extensively, including investigations in developmental biology. The technique commences with the segmentation (delineation) of objects of interest from 2D images or 3D image stacks and is usually followed by the measurement and classification of the segmented objects. This chapter focuses on the segmentation task and here we explain the use of ImageJ, MIPAV (Medical Image Processing, Analysis, and Visualization), and VisSeg, three freely available software packages for this purpose. ImageJ and MIPAV are extremely versatile and can be used in diverse applications. VisSeg is a specialized tool for performing highly accurate and reliable 2D and 3D segmentation of objects such as cells and cell nuclei in images and stacks.

  13. Quantitatively understanding cellular uptake of gold nanoparticles via radioactivity analysis

    PubMed Central

    Shao, Xia; Schnau, Paul; Qian, Wei; Wang, Xueding

    2015-01-01

    The development of multifunctional gold nanoparticles (AuNPs) underwent an explosion in the last two decades. However, many questions regarding detailed surface chemistry and how it affects the behavior of AuNPs in vivo and in vitro still need to be addressed before AuNPs can be widely adopted in clinical settings. In this work, radioactivity analysis was employed for quantitative evaluation of the uptake of I-125-radiolabeled AuNPs by cancer cells. Facilitated with this new method, we have conducted initial bioevaluation of surfactant-free AuNPs produced by femtosecond laser ablation. Cellular uptake of AuNPs as a function of the RGD density on the AuNP surface, as well as a function of time, has been quantified. The radioactivity analysis may shed light on the dynamic interactions of AuNPs with cancer cells, and help achieve optimized designs of AuNPs for future clinical applications. PMID:26505012

  14. [Quantitative analysis of butachlor, oxadiazon and simetryn by gas chromatography].

    PubMed

    Liu, F; Mu, W; Wang, J

    1999-03-01

    The quantitative analysis of the ingredients in 26% B-O-S (butachlor, oxadiazon and simetryn) emulsion was carried out by gas chromatography on a glass column (2 m × 3 mm i.d.) packed with 5% SE-30 on Chromosorb AW DMCS, at a column temperature of 210 °C and a detector temperature of 230 °C. Di-n-butyl sebacate was used as the internal standard. The retention times of simetryn, the internal standard, butachlor and oxadiazon were 6.5, 8.3, 9.9 and 11.9 min respectively. This method has a recovery of 98.62%-100.77%, and the coefficients of variation of the analysis of butachlor, oxadiazon and simetryn were 0.46%, 0.32% and 0.57% respectively. All coefficients of linear correlation were higher than 0.999.
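Internal-standard quantification of the kind described follows a standard pattern: calibrate a response factor from a mixture of known concentrations, then ratio each analyte's peak area against the internal standard's. A minimal sketch (function names and numbers are illustrative, not from the paper):

```python
def response_factor(area_analyte, conc_analyte, area_is, conc_is):
    """Relative response factor RF = (A_a / C_a) / (A_is / C_is),
    calibrated from a standard mixture of known concentrations."""
    return (area_analyte / conc_analyte) / (area_is / conc_is)

def quantify(area_analyte, area_is, conc_is, rf):
    """Concentration of the analyte in an unknown sample, from peak
    areas, the known internal-standard concentration, and the RF."""
    return (area_analyte / area_is) * conc_is / rf
```

The internal standard cancels run-to-run variation in injection volume and detector response, which is why the coefficients of variation stay below 1%.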

  15. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  16. The effect of pedigree complexity on quantitative trait linkage analysis.

    PubMed

    Dyer, T D; Blangero, J; Williams, J T; Göring, H H; Mahaney, M C

    2001-01-01

    Due to the computational difficulties of performing linkage analysis on large complex pedigrees, most investigators resort to simplifying such pedigrees by some ad hoc strategy. In this paper, we suggest an analytical method to compare the power of various pedigree simplification schemes by using the asymptotic distribution of the likelihood-ratio statistic. We applied the method to the large Hutterite pedigree. Our results indicate that the breaking and reduction of inbreeding loops can greatly diminish the power to localize quantitative trait loci. We also present an efficient Monte Carlo method for estimating identity-by-descent allele sharing in large complex pedigrees. This method is used to facilitate a linkage analysis of serum IgE levels in the Hutterites without simplifying the pedigree.

  17. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    PubMed

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) helps determine the type of a material because it quantifies the content of its constituents. QPA by the Rietveld method requires neither measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with the Materials Analysis Using Diffraction (MAUD) program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case which was solved successfully with the help of the Rietveld QPA method is also introduced. This method will allow forensic investigators to acquire detailed information about material evidence, which can point the direction for case investigation and court proceedings.
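In Rietveld QPA the weight fraction of each phase is commonly recovered from the refined scale factors via the Hill-Howard relation, W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j, where Z is the number of formula units per cell, M the formula mass, and V the unit-cell volume. A minimal sketch of that relation (the standard formula, not the paper's code):

```python
def weight_fractions(scales, zmv):
    """Hill-Howard relation for Rietveld QPA:
    W_i = S_i * (ZMV)_i / sum_j S_j * (ZMV)_j,
    given refined scale factors S and per-phase ZMV products."""
    terms = [s * z for s, z in zip(scales, zmv)]
    total = sum(terms)
    return [t / total for t in terms]
```

Because the fractions are normalized to sum to one, no internal standard or calibration data are needed, exactly as the abstract notes.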

  18. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  19. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    NASA Astrophysics Data System (ADS)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high or low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support a significant seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia, we discuss our results in the western Mediterranean framework, aiming to contribute to the understanding of its tectonic context. With our results, we suggest that the main reason for the sparse and scarce seismicity of the area, in contrast with adjacent parts of the Nubia-Eurasia boundary, is its extended
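Of the morphometric indices mentioned, the hypsometric integral is the simplest to compute: it is well approximated by the elevation-relief ratio (mean − min)/(max − min) over a basin's elevations, with high values typically indicating youthful, tectonically active relief. A sketch under that standard approximation (not the authors' code):

```python
def hypsometric_integral(elevations):
    """Elevation-relief ratio approximation of the hypsometric
    integral: (mean - min) / (max - min) over a basin's elevations."""
    zmin, zmax = min(elevations), max(elevations)
    zmean = sum(elevations) / len(elevations)
    return (zmean - zmin) / (zmax - zmin)
```

In practice the elevations would come from the DEM cells of each drainage basin, and basins with anomalously high integrals would be flagged for field checking.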

  20. Temporal scaling behavior of sea-level change in Hong Kong - Multifractal temporally weighted detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Yuanzhi; Ge, Erjia

    2013-01-01

    The rise in global sea levels has been recognized by many scientists as an important global research issue. The process of sea-level change has demonstrated a complex scaling behavior in space and time. Large numbers of tide gauge stations have been built to measure sea-level change in the North Pacific Ocean, Indian Ocean, North Atlantic Ocean, and Antarctic Ocean. Extensive studies have been devoted to exploring sea-level variation in Asia concerning the Bohai Gulf (China), the Yellow Sea (China), the Mekong Delta (Vietnam), and Singapore. Hong Kong, however, a mega city with a population of over 7 million situated at the mouth of the Pearl River Estuary to the west and the South China Sea to the east, has yet to be studied, particularly in terms of the temporal scaling behavior of sea-level change. This article presents an approach to studying the temporal scaling behavior of sea-level change over multiple time scales by analyzing the time series of sea-level change in Tai Po Kou, Tsim Bei Tsui, and Quarry Bay from the periods of 1964-2010, 1974-2010, and 1986-2010, respectively. The detection of long-range correlation and multi-fractality of sea-level change seeks answers to the following questions: (1) Is the current sea-level rise associated with and responsible for the next rise over time? (2) Does the sea-level rise have specific temporal patterns manifested by multi-scaling behaviors? and (3) Is the sea-level rise temporally heterogeneous in the different parts of Hong Kong? Multi-fractal temporally weighted de-trended fluctuation analysis (MF-TWDFA), an extension of multi-fractal de-trended fluctuation analysis (MF-DFA), has been applied in this study to identify long-range correlation and multi-scaling behavior of the sea-level rise in Hong Kong. The experimental results show that the sea-level rise is long-range correlated and multi-fractal. The temporal patterns are heterogeneous over space. This finding implies that mechanisms associated with the
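MF-TWDFA builds on ordinary detrended fluctuation analysis. The monofractal DFA-1 core (cumulative profile, piecewise linear detrending, RMS fluctuation per window size) can be sketched as below; the multifractal, temporally weighted extensions replace the plain least-squares fit with locally weighted fits and generalize the RMS to q-th-order moments, which this sketch deliberately omits:

```python
def dfa_fluctuation(x, s):
    """DFA-1 fluctuation F(s): build the cumulative profile of the
    mean-removed series, split it into windows of length s, remove a
    least-squares linear trend in each window, and return the RMS of
    the residuals."""
    n = len(x)
    mean = sum(x) / n
    profile, acc = [], 0.0
    for v in x:                      # cumulative profile
        acc += v - mean
        profile.append(acc)
    nseg = n // s
    sq = 0.0
    for k in range(nseg):
        seg = profile[k * s:(k + 1) * s]
        t = range(s)
        tbar = (s - 1) / 2.0
        ybar = sum(seg) / s
        denom = sum((ti - tbar) ** 2 for ti in t)
        slope = sum((ti - tbar) * (yi - ybar)
                    for ti, yi in zip(t, seg)) / denom
        for ti, yi in zip(t, seg):   # detrended residuals
            resid = yi - (ybar + slope * (ti - tbar))
            sq += resid * resid
    return (sq / (nseg * s)) ** 0.5
```

The scaling (Hurst-type) exponent is then the slope of log F(s) against log s over a range of window sizes; long-range correlation shows up as an exponent above 0.5.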

  1. The workflow for quantitative proteome analysis of chloroplast development and differentiation, chloroplast mutants, and protein interactions by spectral counting.

    PubMed

    Friso, Giulia; Olinares, Paul Dominic B; van Wijk, Klaas J

    2011-01-01

    This chapter outlines a quantitative proteomics workflow using a label-free spectral counting technique. The workflow has been tested on different aspects of chloroplast biology in maize and Arabidopsis, including chloroplast mutant analysis, cell-type specific chloroplast differentiation, and the proplastid-to-chloroplast transition. The workflow involves one-dimensional SDS-PAGE of the proteomes of leaves or chloroplast subfractions, tryptic digestions, online LC-MS/MS using a mass spectrometer with high mass accuracy and duty cycle, followed by semiautomatic data processing. The bioinformatics analysis can effectively select the best gene models and deal with quantification of closely related proteins; the workflow avoids overidentification of proteins and results in more accurate protein quantification. The final output includes pairwise comparative quantitative analysis, as well as hierarchical clustering for discovery of temporal and spatial patterns of protein accumulation. A brief discussion about potential pitfalls, as well as the advantages and disadvantages of spectral counting, is provided.
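A common way to turn raw spectral counts into comparable abundances is the normalized spectral abundance factor (NSAF), which length-corrects each protein's count before normalizing across the sample. The abstract does not name a specific normalization, so NSAF here is an assumption standing in for whichever scheme the workflow uses:

```python
def nsaf(spectral_counts, lengths):
    """Normalized Spectral Abundance Factor:
    NSAF_i = (SpC_i / L_i) / sum_j (SpC_j / L_j),
    where SpC is the spectral count and L the protein length."""
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]
```

The length correction matters because longer proteins yield more tryptic peptides and therefore more spectra at equal molar abundance.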

  2. Quantitative analysis of lead in aqueous solutions by ultrasonic nebulizer assisted laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Zhong, Shi-Lei; Lu, Yuan; Kong, Wei-Jin; Cheng, Kai; Zheng, Ronger

    2016-08-01

    In this study, an ultrasonic nebulizer unit was established to improve the quantitative analysis ability of laser-induced breakdown spectroscopy (LIBS) for liquid samples detection, using solutions of the heavy metal element Pb as an example. An analytical procedure was designed to guarantee the stability and repeatability of the LIBS signal. A series of experiments were carried out strictly according to the procedure. The experimental parameters were optimized based on studies of the pulse energy influence and temporal evolution of the emission features. The plasma temperature and electron density were calculated to confirm the LTE state of the plasma. Normalizing the intensities by background was demonstrated to be an appropriate method in this work. The linear range of this system for Pb analysis was confirmed over a concentration range of 0-4150 ppm by measuring 12 samples with different concentrations. The correlation coefficient of the fitted calibration curve was as high as 99.94% in the linear range, and the LOD of Pb was confirmed as 2.93 ppm. Concentration prediction experiments were performed on a further six samples. The excellent quantitative ability of the system was demonstrated by comparison of the real and predicted concentrations of the samples. The lowest relative error was 0.043% and the highest was no more than 7.1%.
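The calibration workflow described, fitting a linear curve of normalized line intensity against concentration and then inverting it for prediction, reduces to ordinary least squares; the LOD is conventionally taken as 3σ of the blank signal over the slope (the 3σ criterion is the common choice and is assumed here, as the abstract does not state it):

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = slope * x + intercept."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))
    return slope, ybar - slope * xbar

def predict_conc(intensity, slope, intercept):
    """Invert the calibration curve for an unknown sample."""
    return (intensity - intercept) / slope

def lod_3sigma(sigma_blank, slope):
    """Conventional 3-sigma limit of detection."""
    return 3.0 * sigma_blank / slope
```

With 12 calibration samples, the reported r and LOD follow directly from this fit and the blank noise.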

  3. Digital Holographic Microscopy: Quantitative Phase Imaging and Applications in Live Cell Analysis

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Langehanenberg, Patrik; Kosmeier, Sebastian; Schlichthaber, Frank; Remmersmann, Christian; von Bally, Gert; Rommel, Christina; Dierker, Christian; Schnekenburger, Jürgen

    The analysis of complex processes in living cells creates a high demand for fast and label-free methods for online monitoring. Widely used fluorescence methods require specific labeling and are often restricted to chemically fixed samples. Thus, methods that offer label-free and minimally invasive detection of live cell processes and cell state alterations are of particular interest. In combination with light microscopy, digital holography provides label-free, multi-focus quantitative phase imaging of living cells. In this overview, several methods for digital holographic microscopy (DHM) are presented. First, different experimental setups for the recording of digital holograms and the modular integration of DHM into common microscopes are described. Then the numerical processing of digitally captured holograms is explained. This includes the description of spatial and temporal phase shifting techniques, spatial filtering based reconstruction, holographic autofocusing, and the evaluation of self-interference holograms. Furthermore, the usage of partially coherent light and multi-wavelength approaches is discussed. Finally, potentials of digital holographic microscopy for quantitative cell imaging are illustrated by results from selected applications. It is shown that DHM can be used for automated tracking of migrating cells and cell thickness monitoring as well as for refractive index determination of cells and particles. Moreover, the use of DHM for label-free analysis in fluidics and micro-injection monitoring is demonstrated. The results show that DHM is a highly relevant method that allows novel insights in dynamic cell biology, with applications in cancer research and for drugs and toxicity testing.

  4. Quantitative image analysis in sonograms of the thyroid gland

    NASA Astrophysics Data System (ADS)

    Catherine, Skouroliakou; Maria, Lyra; Aristides, Antoniou; Lambros, Vlahos

    2006-12-01

    High-resolution, real-time ultrasound is a routine examination for assessing the disorders of the thyroid gland. However, the current diagnosis practice is based mainly on qualitative evaluation of the resulting sonograms, therefore depending on the physician's experience. Computerized texture analysis is widely employed in sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing a better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features that are calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each one of these matrices are: contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the images' characteristics. The larger number of components arises mainly from correlation at very short or very long separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
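Haralick's features are computed from a grey-level co-occurrence matrix: count how often grey levels i and j occur at a fixed pixel offset, normalize the counts to probabilities, then reduce the matrix to scalar descriptors. A minimal sketch for a single horizontal offset (the study used 52 separation vectors; correlation is omitted for brevity and the function names are illustrative):

```python
def glcm(img, levels):
    """Symmetric, normalized grey-level co-occurrence matrix for the
    horizontal neighbour offset (dy, dx) = (0, 1)."""
    P = [[0.0] * levels for _ in range(levels)]
    for row in img:
        for a, b in zip(row, row[1:]):
            P[a][b] += 1
            P[b][a] += 1          # symmetric counting
    total = sum(sum(r) for r in P)
    return [[v / total for v in r] for r in P]

def contrast(P):
    n = len(P)
    return sum(P[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))

def energy(P):
    return sum(v * v for row in P for v in row)

def homogeneity(P):
    n = len(P)
    return sum(P[i][j] / (1 + abs(i - j)) for i in range(n) for j in range(n))
```

Each of the 52 separation vectors yields one such matrix, and the feature values across matrices form the descriptor set fed to principal component analysis.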

  5. Epistasis analysis for quantitative traits by functional regression model.

    PubMed

    Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao

    2014-06-01

    The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for testing the interaction between common variants and are difficult to apply to rare variants because of their prohibitive computational time and low power. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm of interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions and collectively test interactions between all possible pairs of SNPs within two genomic regions. In other words, we take a genome region as a basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model to collectively test interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genome regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of the quantitative trait have the correct type 1 error rates and much higher power to detect interactions than the current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10⁻¹⁰) in the ESP, and 11 were replicated in the CHARGE-S study.

  6. Functional Linear Models for Association Analysis of Quantitative Traits

    PubMed Central

    Fan, Ruzong; Wang, Yifan; Mills, James L.; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao

    2014-01-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. PMID:24130119

  7. Fractal Spectrum Technique for Quantitative Analysis of Volcanic Particle Shapes

    NASA Astrophysics Data System (ADS)

    Maria, A. H.; Carey, S. N.

    2001-12-01

    The shapes of volcanic particles reflect numerous eruptive parameters (e.g. magma viscosity, volatile content, degree of interaction with water) and are useful for understanding fragmentation and transport processes associated with volcanic eruptions. However, quantitative analysis of volcanic particle shapes has proven difficult due to their morphological complexity and variability. Shape analysis based on fractal geometry has been successfully applied to a wide variety of particles and appears to be well suited for describing complex features. The technique developed and applied to volcanic particles in this study uses fractal data produced by dilation of the 2-D particle boundary to produce a full spectrum of fractal dimensions over a range of scales for each particle. Multiple fractal dimensions, which can be described as a fractal spectrum curve, are calculated by taking the first derivative of data points on a standard Richardson plot. Quantitative comparisons are carried out using multivariate statistical techniques such as cluster and principal components analysis. Compared with previous fractal methods that express shape in terms of only one or two fractal dimensions, use of multiple fractal dimensions results in more effective discrimination between samples. In addition, the technique eliminates the subjectivity associated with selecting linear segments on Richardson plots for fractal dimension calculation, and allows direct comparison of particles as long as instantaneous dimensions used as input to multivariate analyses are selected at the same scales for each particle. Applications to samples from well documented eruptions (e.g. Mt. St. Helens, Tambora, Surtsey) indicate that the fractal spectrum technique provides a useful means of characterizing volcanic particles and can be helpful for identifying the products of specific fragmentation processes (volatile exsolution, phreatomagmatic, quench granulation) and modes of volcanic deposition (tephra fall
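The fractal spectrum amounts to taking local slopes along the Richardson plot: for divider- or dilation-based measurements, the local fractal dimension at scale s is D(s) = 1 − d(log P)/d(log s), where P(s) is the measured boundary length at ruler/dilation scale s. A sketch using simple finite differences (a real implementation would smooth the derivative, and the function name is illustrative):

```python
import math

def fractal_spectrum(scales, perimeters):
    """Local fractal dimensions along a Richardson plot:
    D(s) = 1 - d(log P)/d(log s), estimated by finite differences
    between successive (log s, log P) points."""
    ls = [math.log(s) for s in scales]
    lp = [math.log(p) for p in perimeters]
    return [1 - (lp[i + 1] - lp[i]) / (ls[i + 1] - ls[i])
            for i in range(len(ls) - 1)]
```

A pure fractal boundary gives a flat spectrum (constant D); real particles give a curve whose shape at fine versus coarse scales separates textural from structural complexity.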

  8. Functional linear models for association analysis of quantitative traits.

    PubMed

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study.

  9. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai, also called the Detroit of India, hosts an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai Automotive Industry Cluster before (2001-2002) and after the CDA (2008-2009). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire, analyzed using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables, and reveal the approximate relationship in a closer form. The KWT shows that there is no significant difference between the three location clusters with respect to Net Profit, Production Cost, Marketing Costs, Procurement Costs and Gross Output. This indicates that each location has contributed to the development of the automobile component cluster uniformly. The FMT shows that there is no significant difference between industrial units with respect to costs such as Production, Infrastructure, Technology, Marketing and Net Profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and are now exporting their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity

  10. Final Report for Dynamic Models for Causal Analysis of Panel Data. Methods for Temporal Analysis. Part I, Chapter 1.

    ERIC Educational Resources Information Center

    Hannan, Michael T.; Tuma, Nancy Brandon

    This document is part of a series of chapters described in SO 011 759. Working from the premise that temporal analysis is indispensable for the study of change, the document examines major alternatives in research design of this nature. Five sections focus on the features, advantages, and limitations of temporal analysis. Four designs which…

  11. Advances in Temporal Analysis in Learning and Instruction

    ERIC Educational Resources Information Center

    Molenaar, Inge

    2014-01-01

    This paper focuses on a trend to analyse temporal characteristics of constructs important to learning and instruction. Different researchers have indicated that we should pay more attention to time in our research to enhance explanatory power and increase validity. Constructs formerly viewed as personal traits, such as self-regulated learning and…

  12. Cubic map algebra functions for spatio-temporal analysis

    USGS Publications Warehouse

    Mennis, J.; Viger, R.; Tomlin, C.D.

    2005-01-01

    We propose an extension of map algebra to three dimensions for spatio-temporal data handling. This approach yields a new class of map algebra functions that we call "cube functions." Whereas conventional map algebra functions operate on data layers representing two-dimensional space, cube functions operate on data cubes representing two-dimensional space over a third-dimensional period of time. We describe the prototype implementation of a spatio-temporal data structure and selected cube function versions of conventional local, focal, and zonal map algebra functions. The utility of cube functions is demonstrated through a case study analyzing the spatio-temporal variability of remotely sensed, southeastern U.S. vegetation character over various land covers and during different El Niño/Southern Oscillation (ENSO) phases. Like conventional map algebra, the application of cube functions may demand significant data preprocessing when integrating diverse data sets, and is subject to limitations related to data storage and algorithm performance. Solutions to these issues include extending data compression and computing strategies for calculations on very large data volumes to spatio-temporal data handling.
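
    A minimal sketch of what local and zonal cube functions might look like, treating a NumPy array indexed (time, row, col) as the data cube; the layer count, zone mask, and values are invented for illustration:

```python
import numpy as np

# Hypothetical data cube: 12 monthly layers of a 4x4 grid.
# Values are random stand-ins for, e.g., a vegetation index.
rng = np.random.default_rng(0)
cube = rng.random((12, 4, 4))

# Local cube function: a cell-by-cell operation along the time axis,
# collapsing the cube back to a single 2-D layer.
temporal_mean = cube.mean(axis=0)

# Zonal cube function: aggregate over time within a land-cover zone.
zones = np.array([[0, 0, 1, 1]] * 4)      # invented two-zone mask
zone1_mean = cube[:, zones == 1].mean()   # mean over zone 1, all times

print(temporal_mean.shape)  # (4, 4)
```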

  13. Temporal proteomic analysis of HIV infection reveals remodelling of the host phosphoproteome by lentiviral Vif variants

    PubMed Central

    Greenwood, Edward JD; Matheson, Nicholas J; Wals, Kim; van den Boomen, Dick JH; Antrobus, Robin; Williamson, James C; Lehner, Paul J

    2016-01-01

    Viruses manipulate host factors to enhance their replication and evade cellular restriction. We used multiplex tandem mass tag (TMT)-based whole cell proteomics to perform a comprehensive time course analysis of >6500 viral and cellular proteins during HIV infection. To enable specific functional predictions, we categorized cellular proteins regulated by HIV according to their patterns of temporal expression. We focussed on proteins depleted with similar kinetics to APOBEC3C, and found the viral accessory protein Vif to be necessary and sufficient for CUL5-dependent proteasomal degradation of all members of the B56 family of regulatory subunits of the key cellular phosphatase PP2A (PPP2R5A-E). Quantitative phosphoproteomic analysis of HIV-infected cells confirmed Vif-dependent hyperphosphorylation of >200 cellular proteins, particularly substrates of the aurora kinases. The ability of Vif to target PPP2R5 subunits is found in primate and non-primate lentiviral lineages, and remodeling of the cellular phosphoproteome is therefore a second ancient and conserved Vif function. DOI: http://dx.doi.org/10.7554/eLife.18296.001 PMID:27690223

  14. Bayesian robust analysis for genetic architecture of quantitative traits

    PubMed Central

    Yang, Runqing; Wang, Xin; Li, Jian; Deng, Hongwen

    2009-01-01

    Motivation: In most quantitative trait locus (QTL) mapping studies, phenotypes are assumed to follow normal distributions. Deviations from this assumption may affect the accuracy of QTL detection and lead to detection of spurious QTLs. To improve the robustness of QTL mapping methods, we replaced the normal distribution for residuals in multiple interacting QTL models with the normal/independent distributions that are a class of symmetric and long-tailed distributions and are able to accommodate residual outliers. Subsequently, we developed a Bayesian robust analysis strategy for dissecting genetic architecture of quantitative traits and for mapping genome-wide interacting QTLs in line crosses. Results: Through computer simulations, we showed that our strategy had a similar power for QTL detection compared with traditional methods assuming normal-distributed traits, but had a substantially increased power for non-normal phenotypes. When this strategy was applied to a group of traits associated with physical/chemical characteristics and quality in rice, more main and epistatic QTLs were detected than traditional Bayesian model analyses under the normal assumption. Contact: runqingyang@sjtu.edu.cn; dengh@umkc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18974168
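
    The robustness argument can be illustrated outside the Bayesian machinery. A minimal sketch, using a Student-t likelihood (one member of the normal/independent family) and made-up residuals containing one gross outlier, not the rice data from the paper:

```python
import math
import numpy as np

# Invented data: five residuals near zero plus one gross outlier.
data = np.array([-0.3, 0.1, 0.2, -0.1, 0.0, 10.0])

def t_loglik(mu, x, df=3.0, s=1.0):
    """Log-likelihood of a location parameter under a Student-t model."""
    z = (x - mu) / s
    c = (math.lgamma((df + 1) / 2) - math.lgamma(df / 2)
         - 0.5 * math.log(df * math.pi * s**2))
    return float(np.sum(c - (df + 1) / 2 * np.log1p(z**2 / df)))

# Grid-search MLE under the heavy-tailed model vs. the normal MLE
# (which for a location parameter is just the sample mean).
grid = np.linspace(-2, 5, 1401)
mu_t = grid[np.argmax([t_loglik(m, data) for m in grid])]
mu_normal = data.mean()

print(round(mu_normal, 2), round(mu_t, 2))  # t estimate stays near 0
```

The long-tailed model largely ignores the outlier, which is the property the authors exploit to avoid spurious QTLs under non-normal phenotypes.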

  15. Quantitative analysis of the polarization characteristics of atherosclerotic plaques

    NASA Astrophysics Data System (ADS)

    Gubarkova, Ekaterina V.; Kirillin, Michail Y.; Dudenkova, Varvara V.; Kiseleva, Elena B.; Moiseev, Alexander A.; Gelikonov, Grigory V.; Timofeeva, Lidia B.; Fiks, Ilya I.; Feldchtein, Felix I.; Gladkova, Natalia D.

    2016-04-01

    In this study we demonstrate the capability of cross-polarization optical coherence tomography (CP OCT) to assess the condition of collagen and elastin fibers in atherosclerotic plaques based on the ratio of OCT signal levels in cross- and co-polarizations. We consider the depolarization factor (DF) and the effective birefringence (Δn) as quantitative characteristics of CP OCT images. We revealed that calculation of both DF and Δn in the region of interest (fibrous cap) yields a statistically significant difference between stable and unstable plaques (0.46±0.21 vs 0.09±0.04 for DF; (4.7±1.0)×10⁻⁴ vs (2.5±0.7)×10⁻⁴ for Δn; p<0.05). In parallel with CP OCT we used nonlinear microscopy for analysis of thin cross-sections of atherosclerotic plaques, revealing different average isotropy indices of collagen and elastin fibers for stable and unstable plaques (0.30±0.10 vs 0.70±0.08; p<0.001). The proposed approach for quantitative assessment of CP OCT images allows cross-scattering and birefringence characterization of stable and unstable atherosclerotic plaques.

  16. [Development of rapid methods for quantitative analysis of proteolytic reactions].

    PubMed

    Beloivan, O A; Tsvetkova, M N; Bubriak, O A

    2002-01-01

    Approaches to developing express methods for quantitative control of proteolytic reactions are discussed. Recently, these reactions have taken on special significance for many important problems of theoretical and practical medicine and biology, as well as for technological, pharmacological and ecological monitoring. Traditional methods can be improved both by the use of immobilized enzymes and substrates and by combining various classic biochemical and immunological approaches. The synthesis of substrates with specified properties allows new methods to be realized for studying proteinase activity and the kinetic characteristics of the corresponding reactions both in vitro and in vivo. The application of biosensor technology is a promising trend, since it saves analysis time and cost, allows the direct interaction between enzymes and their inhibitors and activators to be studied in real time, and permits quantitative measurements both in liquids and in air. Besides, biosensor techniques are well compatible with computer data processing. PMID:12924013

  17. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  18. Quantitative analysis of incipient mineral loss in hard tissues

    NASA Astrophysics Data System (ADS)

    Matvienko, Anna; Mandelis, Andreas; Hellen, Adam; Jeon, Raymond; Abrams, Stephen; Amaechi, Bennett

    2009-02-01

    A coupled diffuse-photon-density-wave and thermal-wave theoretical model was developed to describe the biothermophotonic phenomena in multi-layered hard tissue structures. Photothermal Radiometry was applied as a safe, non-destructive, and highly sensitive tool for the detection of early tooth enamel demineralization to test the theory. Extracted human tooth was treated sequentially with an artificial demineralization gel to simulate controlled mineral loss in the enamel. The experimental setup included a semiconductor laser (659 nm, 120 mW) as the source of the photothermal signal. Modulated laser light generated infrared blackbody radiation from teeth upon absorption and nonradiative energy conversion. The infrared flux emitted by the treated region of the tooth surface and sub-surface was monitored with an infrared detector, both before and after treatment. Frequency scans with a laser beam size of 3 mm were performed in order to guarantee one-dimensionality of the photothermal field. TMR images showed clear differences between sound and demineralized enamel, however this technique is destructive. Dental radiographs did not indicate any changes. The photothermal signal showed clear change even after 1 min of gel treatment. As a result of the fittings, thermal and optical properties of sound and demineralized enamel were obtained, which allowed for quantitative differentiation of healthy and non-healthy regions. In conclusion, the developed model was shown to be a promising tool for non-invasive quantitative analysis of early demineralization of hard tissues.

  19. Application of Synchrotron-XRF to Quantitative Elemental Aerosol Analysis

    NASA Astrophysics Data System (ADS)

    Cliff, S. S.; Perry, K. D.; Jimenez-Cruz, M. P.; Cahill, T. A.

    2001-12-01

    Recent advances in synchrotron x-ray fluorescence (s-XRF) analysis of atmospheric particulate matter have improved elemental sensitivity, quantification and time-resolution. Analysis of both filter- and impactor-based aerosol samples has yielded quantitative data for elements Na-U, if present, in ambient aerosols. The increased sensitivity allows higher time resolution through either smaller spatial analysis of time-resolved impactor samples or shorter sample time-integration using filter-based samplers. Of particular interest is the application of s-XRF to aerodynamically sized rotating-substrate impactor samples. These samplers, 8- and 3-stage DRUMs, have the ability to aerodynamically size-classify particles into either 8 or 3 categories, respectively. In addition, the rotating substrate allows time-resolved analysis of samples with little or no loss in elemental sensitivity. The s-XRF analyses are performed on Beamline 10.3.1 at the Advanced Light Source-Lawrence Berkeley Laboratory (ALS-LBL). Beamline 10.3.1, originally designed for materials analysis, has been supplemented with aerosol analysis capability for several substrate options. Typical analysis involves Teflon filters or Mylar impaction substrates. The newly formed Participating Research Team (PRT) for beamline 10.3.1 encompasses both global climate and materials science research. The s-XRF capabilities of beamline 10.3.1 are now available to PRT researchers and independent investigators through a proposal process to the ALS. The technology, its application to aerosol research and monitoring, and the availability of the facility to the aerosol research community will be presented.

  20. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
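
    The mean-square criterion described above can be illustrated numerically. A minimal sketch, assuming unit sample spacing and a sine test signal of our own choosing (not from the paper): it reconstructs the signal with a linear (tent) kernel and with Keys' cubic convolution kernel (a = -0.5), one of the interpolants the analysis covers, and compares the resulting mean square errors.

```python
import numpy as np

def keys_cubic(x, a=-0.5):
    """Keys' cubic convolution kernel, support |x| < 2."""
    x = np.abs(x)
    return np.where(x <= 1, (a + 2) * x**3 - (a + 3) * x**2 + 1,
           np.where(x < 2, a * (x**3 - 5 * x**2 + 8 * x - 4), 0.0))

def linear(x):
    """Tent kernel: standard linear interpolation."""
    x = np.abs(x)
    return np.maximum(1 - x, 0.0)

f = lambda x: np.sin(2 * np.pi * x / 7.0)  # test signal, below Nyquist
n = np.arange(0, 40)                       # unit-spaced sample grid
samples = f(n)
xs = np.linspace(5, 34, 500)               # interior points (no edge effects)

def reconstruct(kernel):
    # shift-invariant reconstruction: sum_k s[k] * h(x - k)
    return (samples[None, :] * kernel(xs[:, None] - n[None, :])).sum(axis=1)

mse_lin = np.mean((reconstruct(linear) - f(xs))**2)
mse_cub = np.mean((reconstruct(keys_cubic) - f(xs))**2)
assert mse_cub < mse_lin  # the higher-order kernel reconstructs better
```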

  1. Quantitative genetic analysis of injury liability in infants and toddlers

    SciTech Connect

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  2. Quantitative image analysis of WE43-T6 cracking behavior

    NASA Astrophysics Data System (ADS)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7 wt.% Zr, 0.8 wt.% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  3. Preparation of Buffers. An Experiment for Quantitative Analysis Laboratory

    NASA Astrophysics Data System (ADS)

    Buckley, P. T.

    2001-10-01

    In our experience, students who have a solid grounding in the theoretical aspects of buffers, buffer preparation, and buffering capacity are often at a loss when required to actually prepare a buffer in a research setting. However, there are very few published laboratory experiments pertaining to buffers. This laboratory experiment for the undergraduate quantitative analysis lab gives students hands-on experience in the preparation of buffers. By preparing a buffer to a randomly chosen pH value and comparing the theoretical pH to the actual pH, students apply their theoretical understanding of the Henderson-Hasselbalch equation, activity coefficients, and the effect of adding acid or base to a buffer. This experiment gives students experience in buffer preparation for research situations and helps them in advanced courses such as biochemistry where a fundamental knowledge of buffer systems is essential.
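
    The calculation at the heart of the experiment can be sketched directly. A minimal example, assuming an acetate buffer (pKa ≈ 4.76) and arbitrary target values of our choosing; a real preparation would also account for activity coefficients, which this sketch omits:

```python
import math

# Henderson-Hasselbalch sketch: how much conjugate base and weak acid
# give a buffer of the target pH? pKa for acetic acid ~4.76; the
# target pH and total concentration are arbitrary examples.
pKa, target_pH, c_total = 4.76, 4.90, 0.10  # mol/L total buffer species

ratio = 10 ** (target_pH - pKa)        # [A-]/[HA] from pH = pKa + log(ratio)
base = c_total * ratio / (1 + ratio)   # conjugate base concentration
acid = c_total - base                  # weak acid concentration

# back-check: recompute the theoretical pH from the two concentrations
pH = pKa + math.log10(base / acid)
print(f"[A-] = {base:.4f} M, [HA] = {acid:.4f} M, pH = {pH:.2f}")  # pH = 4.90
```

Comparing this theoretical pH to the meter reading is exactly the discrepancy (activity effects, added acid or base) the experiment asks students to explain.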

  4. Quantitative Image Analysis of HIV-1 Infection in Lymphoid Tissue

    NASA Astrophysics Data System (ADS)

    Haase, Ashley T.; Henry, Keith; Zupancic, Mary; Sedgewick, Gerald; Faust, Russell A.; Melroe, Holly; Cavert, Winston; Gebhard, Kristin; Staskus, Katherine; Zhang, Zhi-Qiang; Dailey, Peter J.; Balfour, Henry H., Jr.; Erice, Alejo; Perelson, Alan S.

    1996-11-01

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment.

  5. Quantitative microstructure analysis of polymer-modified mortars.

    PubMed

    Jenni, A; Herwegh, M; Zurbriggen, R; Aberle, T; Holzer, L

    2003-11-01

    Digital light, fluorescence and electron microscopy in combination with wavelength-dispersive spectroscopy were used to visualize individual polymers, air voids, cement phases and filler minerals in a polymer-modified cementitious tile adhesive. In order to investigate the evolution and processes involved in formation of the mortar microstructure, quantifications of the phase distribution in the mortar were performed including phase-specific imaging and digital image analysis. The required sample preparation techniques and imaging related topics are discussed. As a form of case study, the different techniques were applied to obtain a quantitative characterization of a specific mortar mixture. The results indicate that the mortar fractionates during different stages ranging from the early fresh mortar until the final hardened mortar stage. This induces process-dependent enrichments of the phases at specific locations in the mortar. The approach presented provides important information for a comprehensive understanding of the functionality of polymer-modified mortars.

  6. Quantitative analysis of forest island pattern in selected Ohio landscapes

    SciTech Connect

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  7. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy.

    PubMed

    Singh, Vivek K; Singh, Vinita; Rai, Awadhesh K; Thakur, Surya N; Rai, Pradeep K; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  8. Large-Scale Quantitative Analysis of Painting Arts

    NASA Astrophysics Data System (ADS)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.

  9. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877

  10. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy

    SciTech Connect

    Singh, Vivek K.; Singh, Vinita; Rai, Awadhesh K.; Thakur, Surya N.; Rai, Pradeep K.; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  11. Quantitative multielement analysis using high energy particle bombardment

    NASA Technical Reports Server (NTRS)

    Clark, P. J.; Neal, G. F.; Allen, R. O.

    1974-01-01

    Charged particles ranging in energy from 0.8 to 4.0 MeV are used to induce resonant nuclear reactions, Coulomb excitation (gamma X-rays), and X-ray emission in both thick and thin targets. Quantitative analysis is possible for elements from Li to Pb in complex environmental samples, although the matrix can severely reduce the sensitivity. It is necessary to use a comparator technique for the gamma-rays, while for X-rays an internal standard can be used. A USGS standard rock is analyzed for a total of 28 elements. Water samples can be analyzed either by nebulizing the sample doped with Cs or Y onto a thin formvar film or by extracting the sample (with or without an internal standard) onto ion exchange resin which is pressed into a pellet.

  12. Mass spectrometry-based quantitative analysis and biomarker discovery.

    PubMed

    Suzuki, Naoto

    2011-01-01

    Mass spectrometry-based quantitative analysis and biomarker discovery using a metabolomics approach represent one of the major platforms in clinical fields, including prognosis and diagnosis, assessment of severity and response to therapy in a number of clinical disease states, as well as therapeutic drug monitoring (TDM). This review first summarizes our mass spectrometry-based research strategy and some results on the relationship between cysteinyl leukotriene (cysLT), thromboxane (TX), 12-hydroxyeicosatetraenoic acid (12-HETE) and other metabolites of arachidonic acid and diseases such as atopic dermatitis, rheumatoid arthritis and diabetes mellitus. For the purpose of evaluating the role of these metabolites of arachidonic acid in disease status, we have developed sensitive determination methods with simple solid-phase extraction and applied them in clinical settings. In addition to these endogenous compounds, we have used mass spectrometry to develop practically applicable quantitative methods for TDM. A representative example was a TDM method for sirolimus, an immunosuppressant agent for organ-transplant recipients that requires rigorous monitoring of blood levels. Having recognized the great potential of mass spectrometry during these studies, we became interested in metabolomics as the non-targeted analysis of metabolites. Our established strategy for metabolomics investigation now applies to samples from cells, animals and humans to separate groups based on altered patterns of metabolites in biological fluids and to identify metabolites as potential biomarkers discriminating groups. We would be honored if our research using mass spectrometry contributed useful information to the field of medical pharmacy. PMID:21881303

  13. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis to calculate the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
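
    A toy sketch of the machinery underlying the INN, latent semantic analysis: terms are embedded via an SVD of a term-by-context count matrix and compared by cosine similarity. The term list and counts here are invented for illustration, not data from the study:

```python
import numpy as np

# Invented term-by-context count matrix: how often each term appears
# in each of four publication contexts. Terms echo the abstract's
# examples; the numbers are purely illustrative.
terms = ["transition", "adaptation", "coping", "enzyme"]
X = np.array([[3, 2, 0, 1],
              [2, 3, 1, 0],
              [1, 1, 2, 0],
              [0, 0, 0, 4]], dtype=float)

# Rank-2 latent term vectors from the singular value decomposition.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
emb = U[:, :2] * s[:2]

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_syn = cos(emb[0], emb[1])  # transition vs. adaptation (near-synonyms)
sim_far = cos(emb[0], emb[3])  # transition vs. an unrelated term
assert sim_syn > sim_far       # similar contexts -> closer latent vectors
```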

  14. Hydrodynamics Flow and Transport Characterization of a Karstified Physical Model Using Temporal Moment Analysis

    NASA Astrophysics Data System (ADS)

    Anaya, A. A.; Padilla, I. Y.

    2013-12-01

    High productivity of karst groundwater systems is often associated with conduit flow and high matrix permeability. Spatial heterogeneities and anisotropy, among other factors, result in highly complex flow patterns in these systems. The same characteristics that make these aquifers very productive also make them highly vulnerable to contamination and a likely pathway for contaminant exposure. Understanding contaminant fate and transport processes in these complex aquifers demands different statistical and numerical approaches, such as Temporal Moment Analysis (TMA). TMA of solute breakthrough curves provides qualitative and quantitative results to characterize hydrodynamic variables that affect the release, mobility, persistence, and possible pathways of contaminants in karst groundwater systems. The general objective of this work is to characterize flow and transport processes in conduit- and diffusion-dominated flow under low and high flow conditions using TMA in a karstified physical model. A multidimensional, laboratory-scale Geo-Hydrobed model (GHM) containing a karstified limestone block collected from the karst aquifer formation of northern Puerto Rico is used for this purpose. Experimental work entails injecting dissolved CaCl2 and trichloroethene (TCE) at the upstream boundary of the GHM while monitoring their concentrations spatially and temporally in the limestone under different groundwater flow regimes. Results from the TMA show a highly heterogeneous system with large preferential flow components and specific mass-transfer-limited zones, especially in diffuse flow areas. Flow variables such as velocity and Reynolds number indicate defined preferential flow paths that increase spatially as the flow rate increases. TMA results are qualitatively consistent with a previous novel statistical approach developed using mixed models. Comparison between the dissolved CaCl2 tracer and TCE shows implications for reactive contaminants in the karst
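
    The temporal moments themselves are straightforward to compute. A minimal sketch on a synthetic breakthrough curve (a Gaussian pulse standing in for measured tracer concentrations; not data from this study):

```python
import numpy as np

def trapz(y, x):
    """Trapezoid-rule integral of y(x) over the sampled grid."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# Synthetic breakthrough curve: concentration vs. time at an
# observation point; centered at t = 40 h with 5 h spread.
t = np.linspace(0, 100, 2001)                      # hours
c = np.exp(-((t - 40.0) ** 2) / (2.0 * 5.0 ** 2))  # arbitrary units

m0 = trapz(c, t)                              # zeroth moment: recovered mass
t_mean = trapz(t * c, t) / m0                 # first normalized moment: mean arrival
var = trapz((t - t_mean) ** 2 * c, t) / m0    # second central moment: spread

print(f"mean arrival {t_mean:.1f} h, variance {var:.1f} h^2")
```

Earlier mean arrival indicates preferential (conduit) flow, while a larger second moment reflects dispersion and mass-transfer limitation, which is how moments characterize the flow regimes described above.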

  15. Quantitative analysis of protein-ligand interactions by NMR.

    PubMed

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. 
This contrasts with the NMR methods that are used
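
    For the fast-exchange case, the chemical-shift titration described above reduces to fitting the observed shift change against total ligand concentration with a 1:1 binding isotherm. A sketch with synthetic, purely illustrative concentrations (not values from the paper):

```python
import numpy as np
from scipy.optimize import curve_fit

P_TOT = 0.1  # total protein concentration, mM (hypothetical)

def shift_change(L, Kd, dd_max):
    """Population-averaged chemical-shift change for 1:1 binding in
    fast exchange; L is the total ligand concentration (mM)."""
    b = P_TOT + L + Kd
    bound_frac = (b - np.sqrt(b ** 2 - 4.0 * P_TOT * L)) / (2.0 * P_TOT)
    return dd_max * bound_frac

# Synthetic titration generated with Kd = 0.5 mM, dd_max = 0.2 ppm
L = np.array([0.0, 0.05, 0.1, 0.2, 0.4, 0.8, 1.6, 3.2])
dd = shift_change(L, 0.5, 0.2)

# Recover Kd and the limiting shift change from the titration curve
(Kd_fit, dd_max_fit), _ = curve_fit(shift_change, L, dd,
                                    p0=[1.0, 0.1], bounds=(0.0, np.inf))
```

    Note this only applies in the fast-exchange regime; for intermediate or slow exchange the relaxation-based experiments discussed above are required.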

  16. Temporal coherence and attention in auditory scene analysis

    PubMed Central

    Shamma, Shihab A.; Elhilali, Mounya; Micheyl, Christophe

    2010-01-01

    Humans and other animals can attend to one of multiple sounds, and follow it selectively over time. The neural underpinnings of this perceptual feat remain mysterious. Some studies have concluded that sounds are heard as separate streams when they activate well-separated populations of central auditory neurons, and that this process is largely pre-attentive. Here, we argue instead that stream formation depends primarily on temporal coherence between responses that encode various features of a sound source. Furthermore, we postulate that only when attention is directed towards a particular feature (e.g., pitch) do all other temporally coherent features of that source (e.g., timbre and location) become bound together as a stream that is segregated from the incoherent features of other sources. PMID:21196054

  17. Quantitative Medical Image Analysis for Clinical Development of Therapeutics

    NASA Astrophysics Data System (ADS)

    Analoui, Mostafa

    There has been significant progress in the development of therapeutics for the prevention and management of several disease areas in recent years, leading to increased average life expectancy, as well as quality of life, globally. However, due to the complexity of addressing a number of medical needs and the financial burden of developing new classes of therapeutics, there is a need for better tools for decision making and for validation of the efficacy and safety of new compounds. Numerous biological markers (biomarkers) have been proposed either as adjuncts to current clinical endpoints or as surrogates. Imaging biomarkers are among the most rapidly growing classes of biomarkers being examined to expedite effective and rational drug development. Clinical imaging often involves complex, multi-modality data sets that require rapid and objective analysis, independent of the reviewer's bias and training. In this chapter, an overview of imaging biomarkers for drug development is offered, along with the challenges that necessitate quantitative and objective image analysis. Examples of automated and semi-automated analysis approaches are provided, along with a technical review of such methods. These examples include the use of 3D MRI for osteoarthritis, ultrasound vascular imaging, and dynamic contrast-enhanced MRI for oncology. Additionally, a brief overview of regulatory requirements is discussed. In conclusion, this chapter highlights key challenges and future directions in this area.

  18. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
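
    Parameters commonly measured from first-pass time-intensity profiles include peak enhancement, time to peak, and maximum upslope. A minimal sketch on a synthetic profile (the exact parameter set of the authors' method may differ):

```python
import numpy as np

def perfusion_params(t, s):
    """Peak intensity, time to peak, and maximum upslope of a
    myocardial time-intensity profile during the first pass."""
    i_peak = int(np.argmax(s))
    upslope = np.max(np.diff(s[:i_peak + 1]) / np.diff(t[:i_peak + 1]))
    return float(s[i_peak]), float(t[i_peak]), float(upslope)

# Synthetic first-pass curve: one frame per second, peak at t = 12 s
t = np.arange(0.0, 30.0, 1.0)
s = 100.0 * np.exp(-0.5 * ((t - 12.0) / 4.0) ** 2)
peak, ttp, up = perfusion_params(t, s)
```

    In a full analysis these values would be computed per myocardial sector after registration and boundary detection, then rendered as the color overlays described above.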

  19. Multiple Trait Analysis of Genetic Mapping for Quantitative Trait Loci

    PubMed Central

    Jiang, C.; Zeng, Z. B.

    1995-01-01

    We present in this paper models and statistical methods for performing multiple trait analysis on mapping quantitative trait loci (QTL) based on the composite interval mapping method. By taking into account the correlated structure of multiple traits, this joint analysis has several advantages, compared with separate analyses, for mapping QTL, including the expected improvement on the statistical power of the test for QTL and on the precision of parameter estimation. Also this joint analysis provides formal procedures to test a number of biologically interesting hypotheses concerning the nature of genetic correlations between different traits. Among the testing procedures considered are those for joint mapping, pleiotropy, QTL by environment interaction, and pleiotropy vs. close linkage. The test of pleiotropy (one pleiotropic QTL at a genome position) vs. close linkage (multiple nearby nonpleiotropic QTL) can have important implications for our understanding of the nature of genetic correlations between different traits in certain regions of a genome and also for practical applications in animal and plant breeding because one of the major goals in breeding is to break unfavorable linkage. Results of extensive simulation studies are presented to illustrate various properties of the analyses. PMID:7672582

  20. A Temporal Examination of the Planktonic and Biofilm Proteome of Whole Cell Pseudomonas aeruginosa PAO1 Using Quantitative Mass Spectrometry*

    PubMed Central

    Park, Amber J.; Murphy, Kathleen; Krieger, Jonathan R.; Brewer, Dyanne; Taylor, Paul; Habash, Marc; Khursigara, Cezar M.

    2014-01-01

    Chronic polymicrobial lung infections are the chief complication in patients with cystic fibrosis. The dominant pathogen in late-stage disease is Pseudomonas aeruginosa, which forms recalcitrant, structured communities known as biofilms. Many aspects of biofilm biology are poorly understood; consequently, effective treatment of these infections is limited, and cystic fibrosis remains fatal. Here we combined in-solution protein digestion of triplicate growth-matched samples with a high-performance mass spectrometry platform to provide the most comprehensive proteomic dataset known to date for whole cell P. aeruginosa PAO1 grown in biofilm cultures. Our analysis included protein–protein interaction networks and PseudoCAP functional information for unique and significantly modulated proteins at three different time points. Secondary analysis of a subgroup of proteins using extracted ion currents validated the spectral counting data of 1884 high-confidence proteins. In this paper we demonstrate a greater representation of proteins related to metabolism, DNA stability, and molecular activity in planktonically grown P. aeruginosa PAO1. In addition, several virulence-related proteins were increased during planktonic growth, including multiple proteins encoded by the pyoverdine locus, uncharacterized proteins with sequence similarity to mammalian cell entry protein, and a member of the hemagglutinin family of adhesins, HecA. Conversely, biofilm samples contained an uncharacterized protein with sequence similarity to an adhesion protein with self-association characteristics (AidA). Increased levels of several phenazine biosynthetic proteins, an uncharacterized protein with sequence similarity to a metallo-beta-lactamase, and lower levels of the drug target gyrA support the putative characteristics of in situ P. aeruginosa infections, including competitive fitness and antibiotic resistance. This quantitative whole cell approach advances the existing P. aeruginosa

  1. System for quantitative analysis of coronary calcification via electron-beam computed tomography

    NASA Astrophysics Data System (ADS)

    Reed, Judd E.; Rumberger, John A.; Davitt, Patrick J.; Kaufman, R. B.; Sheedy, Patrick F., II

    1994-05-01

    Electron beam computed tomography (EBCT) has provided a new tool for identification and possible quantification of coronary arterial plaque calcium. EBCT is the only imaging modality currently available which generates images of the spatial, temporal, and contrast resolution required for the identification of small foci of calcium and the potential for accurate quantification of calcium. Meanwhile, interest in quantification of coronary arterial calcium via EBCT and its correlation with severity of coronary atherosclerosis is increasing. Data remain inconclusive, but it appears that the reproducibility of quantitative grading of the extent of calcification by EBCT may be limited, in part, by the arbitrary nature of the scoring algorithm employed within the analysis tools currently provided by the EBCT manufacturer. It has not been possible to objectively determine optimum values for minimum plaque area and brightness threshold or to quantitatively determine whether single optimal values even exist. Also, although the current system tabulates the score, area, and mean attenuation for each plaque, the locations of the plaques are not reported.
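
    The scoring algorithm referred to here is, in common practice, the Agatston method: each connected plaque above a brightness threshold and a minimum area contributes its area times a density weight. A sketch of that scheme, with the conventional 130 HU threshold and 1 mm^2 minimum area as assumed parameter values (the manufacturer's exact defaults may differ):

```python
import numpy as np
from scipy import ndimage

def agatston_score(slice_hu, pixel_area_mm2, hu_thresh=130.0, min_area_mm2=1.0):
    """Agatston-style calcium score for one CT slice: each connected
    plaque above the HU threshold contributes area x density weight."""
    labels, n = ndimage.label(slice_hu >= hu_thresh)
    score = 0.0
    for i in range(1, n + 1):
        region = labels == i
        area = region.sum() * pixel_area_mm2
        if area < min_area_mm2:            # drop sub-threshold foci (noise)
            continue
        peak = slice_hu[region].max()
        # density weight: 130-199 HU -> 1, 200-299 -> 2, 300-399 -> 3, >=400 -> 4
        weight = 1 + min(int(peak // 100) - 1, 3)
        score += area * weight
    return score
```

    The sensitivity of the score to these two arbitrary parameters (minimum area and threshold) is exactly the reproducibility concern raised in the abstract.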

  2. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assessing image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates components with sparse loadings (as compared to standard principal component analysis, PCA), used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
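
    The Hotelling T2 step can be illustrated independently of the SPCA details: a scanner's metric vector is scored by its Mahalanobis-type distance from a reference population of baseline measurements, and large distances flag faults. A sketch with synthetic metrics (the dimensionality and values are illustrative):

```python
import numpy as np

def hotelling_t2(X_ref, x_new):
    """Hotelling T^2 distance of a new metric vector from a reference
    population of image-quality measurements (rows = scans)."""
    mu = X_ref.mean(axis=0)
    S = np.cov(X_ref, rowvar=False)   # sample covariance of the metrics
    d = x_new - mu
    return float(d @ np.linalg.solve(S, d))

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 3))              # baseline scanner metrics
t2_in = hotelling_t2(X, np.zeros(3))                 # nominal scan
t2_out = hotelling_t2(X, np.array([5.0, 5.0, 5.0]))  # degraded scan
```

    In practice the threshold on T2 would come from its F-distribution under the reference population, not an ad hoc cutoff.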

  3. Quantitative analysis of night skyglow amplification under cloudy conditions

    NASA Astrophysics Data System (ADS)

    Kocifaj, Miroslav; Solano Lamphar, Héctor Antonio

    2014-10-01

    The radiance produced by artificial light is a major source of nighttime over-illumination. It can, however, be treated experimentally using ground-based and satellite data. These two types of data complement each other and together have a high information content. For instance, the satellite data enable upward light emissions to be normalized, and this in turn allows skyglow levels at the ground to be modelled under cloudy or overcast conditions. Excessive night lighting imposes an unacceptable burden on nature, humans and professional astronomy. For this reason, there is a pressing need to determine the total amount of downwelling diffuse radiation. Undoubtedly, cloudy periods can cause a significant increase in skyglow as a result of amplification owing to diffuse reflection from clouds. While it is recognized that the amplification factor (AF) varies with cloud cover, the effects of different types of clouds, of atmospheric turbidity and of the geometrical relationships between the positions of an individual observer, the cloud layer, and the light source are in general poorly known. In this paper the AF is quantitatively analysed considering different aerosol optical depths (AODs), urban layout sizes and cloud types with specific albedos and altitudes. The computational results show that the AF peaks near the edges of a city rather than at its centre. In addition, the AF appears to be a decreasing function of AOD, which is particularly important when modelling the skyglow in regions with apparent temporal or seasonal variability of atmospheric turbidity. The findings in this paper will be useful to those designing engineering applications or modelling light pollution, as well as to astronomers and environmental scientists who aim to predict the amplification of skyglow caused by clouds. 
In addition, the semi-analytical formulae can be used to estimate the AF levels, especially in densely populated metropolitan regions for which detailed computations may be CPU

  4. Temporal-resolved characterization of laser-induced plasma for spectrochemical analysis of gas shales

    NASA Astrophysics Data System (ADS)

    Xu, Tao; Zhang, Yong; Zhang, Ming; He, Yi; Yu, Qiaoling; Duan, Yixiang

    2016-07-01

    Optical emission from laser ablation plasma on a shale target surface provides sensitive laser-induced breakdown spectrometry (LIBS) detection of major, minor or trace elements. An exploratory study of the characterization of plasma induced on shale materials was carried out with the aim of triggering a crucial step towards quantitative LIBS measurement. In this work, the experimental strategies that optimize plasma generation on a pressed shale pellet surface are presented. The temporal evolution properties of the plasma induced by a ns Nd:YAG laser pulse at the fundamental wavelength in air were investigated using time-resolved, space-integrated optical emission spectroscopy. The electron density as well as the temperatures of the plasma were diagnosed as functions of the decay time for the bulk plasma analysis. In particular, the values of time-resolved atomic and ionic temperatures of shale elements, such as Fe, Mg, Ca, and Ti, were extracted with the well-known Boltzmann or Saha-Boltzmann plot method. Further comparison of these temperatures validated local thermodynamic equilibrium (LTE) within a specific interval of the delay time. In addition, the temporal behaviors of the signal-to-noise ratio of shale elements, including Si, Al, Fe, Ca, Mg, Ba, Li, Ti, K, Na, Sr, V, Cr, and Ni, revealed the coincidence of their maximum values with the LIBS LTE condition in the time frame, providing practical implications for an optimized LIBS detection of shale elements. The analytical performance of LIBS was further evaluated with a linear calibration procedure for the trace elements of most concern (Sr, V, Cr, and Ni) present in different shales. The limits of detection obtained are element-dependent and can be lower than tens of parts per million with the present LIBS experimental configurations. However, the occurrence of a saturation effect in the calibration curve is still observable with increasing trace element content, indicating that, due to the
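
    The Boltzmann-plot temperature extraction mentioned above fits ln(I*lambda/(gA)) against upper-level energy; the slope is -1/(kT). A sketch on synthetic line data (the level energies, statistical weights and transition probabilities are illustrative, not real Fe lines):

```python
import numpy as np

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def boltzmann_temperature(E_up, intensity, g, A, wavelength):
    """Excitation temperature from a Boltzmann plot:
    ln(I * lambda / (g * A)) vs upper-level energy, slope = -1/(kT)."""
    y = np.log(intensity * wavelength / (g * A))
    slope, _intercept = np.polyfit(E_up, y, 1)
    return -1.0 / (K_B_EV * slope)

# Synthetic emission lines generated at T = 10000 K (hypothetical values)
T_true = 10000.0
E = np.array([3.2, 3.6, 4.1, 4.6, 5.0])               # upper-level energy, eV
g = np.array([5.0, 7.0, 9.0, 7.0, 5.0])               # statistical weights
A = np.array([2e7, 5e7, 1e8, 3e7, 8e7])               # transition prob., s^-1
lam = np.array([404.6, 400.5, 396.9, 390.3, 385.0])   # wavelength, nm
I = g * A / lam * np.exp(-E / (K_B_EV * T_true))
T_fit = boltzmann_temperature(E, I, g, A, lam)
```

    Agreement between atomic and ionic (Saha-Boltzmann) temperatures computed this way at a given delay is the LTE check the authors describe.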

  5. Quantitative DNA Methylation Analysis of Candidate Genes in Cervical Cancer

    PubMed Central

    Siegel, Erin M.; Riggs, Bridget M.; Delmas, Amber L.; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D.

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97–1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459

  6. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    PubMed

    Siegel, Erin M; Riggs, Bridget M; Delmas, Amber L; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.
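
    The cut-point evaluation described above can be reproduced mechanically: sensitivity and specificity at the >15% threshold, and AUC via the Mann-Whitney rank relation. A sketch on hypothetical methylation-index values (not the study's data):

```python
import numpy as np

def sens_spec(cases, controls, cut=15.0):
    """Sensitivity and specificity of a binary methylation cut-point."""
    sens = float(np.mean(np.asarray(cases) > cut))
    spec = float(np.mean(np.asarray(controls) <= cut))
    return sens, spec

def auc_rank(cases, controls):
    """ROC AUC via the Mann-Whitney relation: the probability that a
    randomly chosen case scores above a randomly chosen control."""
    p = np.asarray(cases, float)[:, None]
    n = np.asarray(controls, float)[None, :]
    return float((p > n).mean() + 0.5 * (p == n).mean())

# Hypothetical methylation-index values (%) for one gene
cases = [40, 35, 22, 18, 9, 55, 30]      # invasive cancer
controls = [2, 5, 8, 12, 3, 16]          # normal cytology
sens, spec = sens_spec(cases, controls)
auc = auc_rank(cases, controls)
```

    The study's panel AUC would additionally combine several genes with HPV status and age in a joint classifier rather than thresholding one gene.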

  7. Quantitative Analysis Of Acoustic Emission From Rock Fracture Experiments

    NASA Astrophysics Data System (ADS)

    Goodfellow, Sebastian David

    This thesis aims to advance the methods of quantitative acoustic emission (AE) analysis by calibrating sensors, characterizing sources, and applying the results to solve engineering problems. In the first part of this thesis, we built a calibration apparatus and successfully calibrated two commercial AE sensors. The ErgoTech sensor was found to have broadband velocity sensitivity and the Panametrics V103 was sensitive to surface normal displacement. These calibration results were applied to two AE data sets from rock fracture experiments in order to characterize the sources of AE events. The first data set was from an in situ rock fracture experiment conducted at the Underground Research Laboratory (URL). The Mine-By experiment was a large scale excavation response test where both AE (10 kHz - 1 MHz) and microseismicity (MS) (1 Hz - 10 kHz) were monitored. Using the calibration information, magnitude, stress drop, dimension and energy were successfully estimated for 21 AE events recorded in the tensile region of the tunnel wall. Magnitudes were in the range -7.5 < Mw < -6.8, which is consistent with other laboratory AE results, and stress drops were within the range commonly observed for induced seismicity in the field (0.1 - 10 MPa). The second data set was AE collected during a true-triaxial deformation experiment, where the objectives were to characterize laboratory AE sources and identify issues related to moving the analysis from ideal in situ conditions to more complex laboratory conditions in terms of the ability to conduct quantitative AE analysis. We found AE magnitudes in the range -7.8 < Mw < -6.7 and as with the in situ data, stress release was within the expected range of 0.1 - 10 MPa. We identified four major challenges to quantitative analysis in the laboratory, which inhibited our ability to study parameter scaling (M0 ∝ fc^-3 scaling). These challenges were: (1) limited knowledge of attenuation which we proved was continuously evolving, (2
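
    The magnitude and stress-drop figures quoted above follow standard source relations: the Hanks-Kanamori moment magnitude, and the Brune stress drop from seismic moment and corner frequency. A sketch with illustrative AE-scale values (the corner frequency, shear velocity and Brune constant are assumptions, not numbers from the thesis):

```python
import math

def moment_magnitude(M0):
    """Moment magnitude from seismic moment M0 in N*m (Hanks-Kanamori)."""
    return (2.0 / 3.0) * (math.log10(M0) - 9.1)

def brune_stress_drop(M0, fc, beta, k=0.37):
    """Brune-model stress drop (Pa) from moment (N*m), corner frequency
    (Hz) and shear-wave velocity (m/s); k is the Brune constant."""
    r = k * beta / fc                 # source radius, m
    return 7.0 * M0 / (16.0 * r ** 3)

M0 = 0.04                                               # N*m, AE-scale moment
Mw = moment_magnitude(M0)                               # about -7
dsigma = brune_stress_drop(M0, fc=300e3, beta=3500.0)   # 300 kHz corner
```

    With these assumed inputs the stress drop lands inside the 0.1-10 MPa band reported for both data sets, which is the consistency check such studies rely on.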

  8. Quantitative analysis of fault slip evolution in analogue transpression models

    NASA Astrophysics Data System (ADS)

    Leever, Karen; Gabrielsen, Roy H.; Schmid, Dani; Braathen, Alvar

    2010-05-01

    A quantitative analysis of fault slip evolution in crustal scale brittle and brittle-ductile analogue models of doubly vergent transpressional wedges was performed by means of Particle Image Velocimetry (PIV). The kinematic analyses allow detailed comparison between model results and field kinematic data. This novel approach leads to better understanding of the evolution of transpressional orogens such as the Tertiary West Spitsbergen fold and thrust belt in particular and will advance the understanding of transpressional wedge mechanics in general. We ran a series of basal-driven models with convergence angles of 4, 7.5, 15 and 30 degrees. In these crustal scale models, brittle rheology was represented by quartz sand; in one model a viscous PDMS layer was included at shallow depth. Total sand pack thickness was 6cm, its extent 120x60cm. The PIV method was used to calculate a vector field from pairs of images that were recorded from the top of the experiments at a 2mm displacement increment. The slip azimuth on discrete faults was calculated and visualized by means of a directional derivative of this vector field. From this data set, several stages in the evolution of the models could be identified. The stages were defined by changes in the degree of displacement partitioning, i.e. slip along-strike and orthogonal to the plate boundary. A first stage of distributed strain (with no visible faults at the model surface) was followed by a shear lens stage with oblique displacement on pro- and retro-shear. The oblique displacement became locally partitioned during progressive displacement. During the final stage, strain was more fully partitioned between a newly formed central strike slip zone and reverse faults at the sides. Strain partitioning was best developed in the 15 degrees model, which shows near-reverse faults along both sides of the wedge in addition to strike slip displacement in the center. In further analysis we extracted average slip vectors for
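
    The degree of displacement partitioning can be quantified from a PIV vector field by separating the boundary-parallel and boundary-orthogonal components of slip. A toy sketch on a uniform field (purely illustrative, not the experiment's data):

```python
import numpy as np

def slip_azimuth(u, v):
    """Per-cell slip azimuth in degrees, measured from the
    plate-boundary-parallel x axis, from PIV components u, v."""
    return np.degrees(np.arctan2(v, u))

def partitioning_ratio(u, v):
    """Fraction of total displacement that is orthogonal to the plate
    boundary -- a simple scalar measure of strain partitioning."""
    return float(np.abs(v).sum() / (np.abs(u).sum() + np.abs(v).sum()))

# Hypothetical increment for 15-degree oblique convergence:
# displacement mostly along-strike, with a small orthogonal part
u = np.full((4, 4), 1.0)                      # along-strike component
v = np.full((4, 4), np.tan(np.radians(15.0))) # boundary-orthogonal component
az = slip_azimuth(u, v)
ratio = partitioning_ratio(u, v)
```

    Full partitioning would drive the azimuth toward 0 degrees on the central strike-slip zone and toward 90 degrees on the flanking reverse faults, which is the late-stage pattern described above.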

  9. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling

    PubMed Central

    Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T. M.; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies. PMID:26745281

  10. Quantitative analysis of bloggers' collective behavior powered by emotions

    NASA Astrophysics Data System (ADS)

    Mitrović, Marija; Paltoglou, Georgios; Tadić, Bosiljka

    2011-02-01

    Large-scale data resulting from users' online interactions provide the ultimate source of information to study emergent social phenomena on the Web. From individual actions of users to observable collective behaviors, different mechanisms involving emotions expressed in the posted text play a role. Here we combine approaches of statistical physics with machine-learning methods of text analysis to study the emergence of emotional behavior among Web users. Mapping the high-resolution data from digg.com onto bipartite networks of users and their comments onto posted stories, we identify user communities centered around certain popular posts and determine emotional contents of the related comments by the emotion classifier developed for this type of text. Applied over different time periods, this framework reveals strong correlations between the excess of negative emotions and the evolution of communities. We observe avalanches of emotional comments exhibiting significant self-organized critical behavior and temporal correlations. To explore the robustness of these critical states, we design a network-automaton model on realistic network connections and several control parameters, which can be inferred from the dataset. Dissemination of emotions by a small fraction of very active users appears to critically tune the collective states.

  11. Semiautomatic Software For Quantitative Analysis Of Cardiac Positron Tomography Studies

    NASA Astrophysics Data System (ADS)

    Ratib, Osman; Bidaut, Luc; Nienaber, Christoph; Krivokapich, Janine; Schelbert, Heinrich R.; Phelps, Michael E.

    1988-06-01

    In order to derive accurate values for true tissue radiotracer concentrations from gated positron emission tomography (PET) images of the heart, which are critical for noninvasively quantifying regional myocardial blood flow and metabolism, appropriate corrections for the partial volume effect (PVE) and contamination from adjacent anatomical structures are required. We therefore developed an integrated software package for quantitative analysis of tomographic images which provides such corrections. A semiautomatic edge detection technique outlines and partitions the myocardium into sectors. Myocardial wall thickness is measured on the images perpendicularly to the detected edges and used to correct for PVE. The programs automatically correct for radioactive decay, activity calibration and cross contamination for both static and dynamic studies. Parameters derived with these programs include tracer concentrations and their changes over time. They are used for calculating regional metabolic rates and can be further displayed as color-coded parametric images. The approach was validated for PET imaging in 11 dog experiments. 2D echocardiograms (Echo) were recorded simultaneously to validate the edge detection and wall thickness measurement techniques. After correction for PVE using automatic wall thickness measurement, regional tissue tracer concentrations derived from PET images correlated well with true tissue concentrations as determined by well counting (r=0.98). These preliminary studies indicate that the developed automatic image analysis technique allows accurate and convenient evaluation of cardiac PET images for the measurement of both regional tracer tissue concentrations and regional myocardial function.
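
    One common way to implement a wall-thickness-based PVE correction is a recovery coefficient for a slab imaged with a Gaussian point-spread function; whether this matches the authors' exact correction is an assumption, so treat this as a sketch of the general idea only:

```python
import math

def recovery_coefficient(wall_mm, fwhm_mm):
    """Approximate recovery coefficient for a slab of given thickness
    imaged at Gaussian resolution (FWHM): the fraction of true
    activity recovered at the slab centre."""
    sigma = fwhm_mm / 2.3548          # FWHM -> standard deviation
    return math.erf(wall_mm / (2.0 * math.sqrt(2.0) * sigma))

def pve_corrected(measured, wall_mm, fwhm_mm):
    """Divide the measured myocardial concentration by the recovery
    coefficient estimated from the measured wall thickness."""
    return measured / recovery_coefficient(wall_mm, fwhm_mm)
```

    The thinner the wall relative to scanner resolution, the smaller the recovery coefficient and the larger the upward correction, which is why accurate wall-thickness measurement (validated here against Echo) matters.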

  12. Inside Single Cells: Quantitative Analysis with Advanced Optics and Nanomaterials

    PubMed Central

    Cui, Yi; Irudayaraj, Joseph

    2014-01-01

    Single cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single cell activity. In order to obtain quantitative information (e.g. molecular quantity, kinetics and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single cell studies both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live cell analysis. Although a considerable proportion of single cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single cell analysis. PMID:25430077

  13. [Quantitative Analysis of Mn in Soil Samples Using LIBS].

    PubMed

    Zhang, Bao-hua; Jiang, Yong-cheng; Zhang, Xian-yan; Cui, Zhi-feng

    2015-06-01

    The trace element manganese (Mn) in agricultural farm soil (Anhui Huaiyuan Nongkang) was quantitatively analyzed by laser-induced breakdown spectroscopy (LIBS). The Mn line at 403.1 nm was selected as the analysis line, and the soil matrix element Fe was chosen as the internal calibration element, with 407.2 nm as its analysis line. Ten soil samples were used to construct calibration curves with the traditional method and the internal standard method, and four soil samples were selected as test samples. The experimental results showed that with the traditional method the fitting correlation coefficient (r) is 0.954, the maximum relative error of the test samples is 5.72%, and the detection limit of Mn in soil is 93 mg/kg. With the internal standard method, the fitting correlation coefficient (r) is 0.983, the relative error of the test samples is reduced to 4.1%, and the detection limit of Mn in soil is 71 mg/kg. The results indicate that LIBS can be used to detect the trace element Mn in soil and that, to a certain extent, the internal standard method improves measurement accuracy.
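    The internal-standard workflow described above (normalize the Mn 403.1 nm line by the Fe 407.2 nm matrix line, fit a linear calibration curve, then invert it for unknowns) can be sketched in a few lines. The intensity ratios, blank noise level, and the common 3σ/slope detection-limit convention below are illustrative assumptions, not the study's data.

```python
# Internal-standard calibration sketch for LIBS: the analyte signal
# (Mn I 403.1 nm) is normalized by a matrix line (Fe 407.2 nm) before
# fitting a linear calibration curve. All numbers are hypothetical.

def fit_line(x, y):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Certified Mn concentrations (mg/kg) and measured Mn/Fe intensity ratios
conc  = [200.0, 400.0, 600.0, 800.0, 1000.0]
ratio = [0.41, 0.81, 1.21, 1.61, 2.01]       # hypothetical

a, b = fit_line(conc, ratio)

def predict(r):
    """Invert the calibration to get a concentration from a ratio."""
    return (r - a) / b

# Detection limit via the common 3*sigma/slope convention,
# with a hypothetical blank standard deviation.
sigma_blank = 0.047
lod = 3 * sigma_blank / b
print(predict(1.01), lod)
```

A real application would replace the hypothetical ratios with integrated line intensities from background-corrected spectra.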

  14. Quantitative analysis of polyethylene blends by Fourier transform infrared spectroscopy.

    PubMed

    Cran, Marlene J; Bigger, Stephen W

    2003-08-01

    The quantitative analysis of binary polyethylene (PE) blends by Fourier transform infrared (FT-IR) spectroscopy has been achieved based on the ratio of two absorbance peaks in an FT-IR spectrum. The frequencies for the absorbance ratio are selected based on structural entities of the PE components in the blend. A linear relationship between the absorbance ratio and the blend composition was found to exist if one of the absorbance peaks is distinct to one of the components and the other peak is common to both components. It was also found that any peak resulting from short-chain branching in copolymers, such as linear low-density polyethylene (LLDPE) or metallocene-catalyzed LLDPE (mLLDPE), is suitable for use as the peak that is distinct to that component. To optimize the linearity of the relationship, however, the selection of the second, common peak is most important and depends on the blend system studied. Indeed, under certain circumstances peaks that are not spectrally distinct can be used successfully to apply the method. The method exhibits potential for the routine analysis of PE blends for which it has been calibrated prior to application.
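    The linear absorbance-ratio relationship described above lends itself to a short sketch: calibrate R = A_distinct / A_common against blends of known composition, then invert the line for an unknown blend. The calibration slope, intercept, and absorbance values below are hypothetical, not taken from the paper.

```python
# FT-IR absorbance-ratio sketch: with one peak distinct to the branched
# component (e.g. a short-chain-branch band of LLDPE) and one peak
# common to both PE components, the ratio R = A_distinct / A_common
# varies linearly with blend composition. All values are hypothetical.

def absorbance_ratio(a_distinct, a_common):
    return a_distinct / a_common

# Hypothetical calibration line R = m * w + c, fitted beforehand from
# blends of known LLDPE weight fraction w:
m, c = 0.85, 0.05

def lldpe_fraction(a_distinct, a_common):
    """Invert the calibration to estimate the LLDPE weight fraction."""
    return (absorbance_ratio(a_distinct, a_common) - c) / m

print(lldpe_fraction(0.1425, 0.300))  # estimated weight fraction
```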

  15. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    PubMed Central

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for midpalatal suture maturation evaluation. Methods The study included 131 subjects over 18 years of age (range, 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimensions were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794, p < 0.001). A test in which fractal dimension was used to predict the variable that splits maturation stages into stages A–C versus D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
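    An optimal ROC cut-off like the 1.0235 reported above is commonly chosen by maximizing Youden's J (sensitivity + specificity - 1). A minimal sketch, using hypothetical fractal-dimension values rather than the study's data, and assuming the mature (D/E) sutures score low, consistent with the reported negative correlation:

```python
# ROC cut-off sketch: choose the fractal-dimension threshold that
# maximizes Youden's J = sensitivity + specificity - 1. The two groups
# below are hypothetical values, not the study's data. A sample is
# called positive (mature) when its value is <= the cut-off.

def youden_cutoff(pos, neg):
    """pos: values of the positive class (scores LOW here);
    neg: values of the negative class. Returns (best_cutoff, best_j)."""
    best_cut, best_j = None, -1.0
    for cut in sorted(set(pos) | set(neg)):
        sens = sum(v <= cut for v in pos) / len(pos)
        spec = sum(v > cut for v in neg) / len(neg)
        j = sens + spec - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

mature   = [0.98, 1.00, 1.01, 1.02, 1.05]   # stages D/E (hypothetical)
immature = [1.04, 1.07, 1.09, 1.11, 1.14]   # stages A-C (hypothetical)
cut, j = youden_cutoff(mature, immature)
print(cut, j)
```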

  17. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    SciTech Connect

    Charland, P.; Peters, T.

    1996-10-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters (namely the shape, color, and depth cue associated with a cursor) as well as the image filtering and observer position have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can affect the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions.

  18. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  19. Analysis of temporal jitter in a copper vapor laser system.

    PubMed

    Durga Praveen Kumar, D; Gantayet, L M; Singh, Sunita; Rawat, A S; Rana, Paramjit; Rajasree, V; Agarwalla, Sandeep K; Chakravarthy, D P

    2012-02-01

    Temporal jitter in a magnetic pulse compression based copper vapor laser (CVL) system is analyzed by considering ripple present in the input dc power supply and ripple present in the magnetic core resetting power supply. It is shown that the jitter is a function of the ratio of operating voltage to the designed voltage, percentage ripple, and the total propagation delay of the magnetic pulse compression circuit. Experimental results from a CVL system operating at a repetition rate of 9 kHz are presented. PMID:22380123

  20. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  1. Quantitative increases in temporal lobe symptoms in human males are proportional to postnatal geomagnetic activity: verification by canonical correlation.

    PubMed

    Hodge, K A; Persinger, M A

    1991-04-29

    Enhanced geomagnetic activity during episodes of biochemical stress has been correlated with inferences of increased lability within deep temporal lobe structures. Because adult limbic epilepsy is frequently associated with perinatal hypoxia or metabolic disruption within this region, a weak positive correlation was expected between possible signs of mesiobasal temporal lobe lability in normal adults and perinatal geomagnetic activity. Canonical correlation demonstrated that young adult males (n = 243) displayed a positive (r = 0.31) relationship between the intensity of geomagnetic disturbance on the day after birth only and a history of subjective depersonalization and anomalous visual and olfactory experiences. The effect was very clear when aa values exceeded 30 nT (gamma). Temporal lobe signs for these males were similar to those reported by normal young adult females (n = 313), who did not display any consistent correlation between these measures and perinatal geomagnetic disturbance. The results suggest that interactions between perinatal neurochemistry and the correlates of geomagnetic activity might permanently alter portions of the male limbic system. PMID:1881599

  2. Ontology Driven Analysis of Spatio-temporal Phenomena, Aimed At Spatial Planning And Environmental Forecasting

    NASA Astrophysics Data System (ADS)

    Iwaniak, A.; Łukowicz, J.; Strzelecki, M.; Kaczmarek, I.

    2013-10-01

    Spatial planning is a crucial area for balancing civilization development with environmental protection. It is multidisciplinary by nature and must take into account the dynamics of processes that could affect the integrity of the environmental system. That is why a new approach to modelling phenomena occurring in space is needed. Such an approach is offered by ontologies based on Description Logic (DL) and related inference systems. An ontology is a system of knowledge representation comprising a conceptual scheme and, based on this scheme, a representation of reality. Ontologies can be enriched with additional logical systems. The authors present a way of building domain ontologies for spatial planning, including the representation of spatio-temporal phenomena. Description Logic is supplemented by structures of temporal logic. As a result, analysis that explores the topological relations between spatial objects can be extended to include temporal relationships: coincidence, precedence and succession, and cause-and-effect. Spatio-temporal models with temporal logic structures, encoded in ontologies, can be the subject of inference performed by semantic reasoners (reasoner engines). Spatio-temporal representations are offered by so-called upper ontologies such as GFO, BFO, OCHRE and others. The temporal structures provided in such ontologies are useful for analyzing data obtained from environmental and development monitoring systems and for describing and representing historical phenomena. They allow the creation of models and scenarios of expected spatial transformations, and they will support analysis for spatial development design, decision-making in spatial planning, and forecasting of environmental impact.

  3. Research of spatio-temporal analysis of agricultural pest

    NASA Astrophysics Data System (ADS)

    Wang, Changwei; Li, Deren; Hu, Yueming; Wu, Xiaofang; Qi, Yu

    2009-10-01

    The increase in agricultural pest disasters in recent years has become one of the major problems in agricultural harvests; how to predict and control agricultural pest disasters has thus attracted great research interest. Although a series of works have been done and some achievements have been attained, knowledge in this area remains limited. The migration of agricultural pests is related not only to time but also to space; consequently, the pest population has complex spatio-temporal characteristics, and spatial and temporal factors must be considered together when studying the dynamics of pest populations. Using plant hoppers as the object of study, this work employed the biological analogy deviation model to study the distribution of the pest population in Guangdong Province over different periods. It is demonstrated that the distribution of the plant hopper population is related not only to spatial location but also has a certain directionality. The results reported here can support the monitoring, prevention and control of plant hoppers in Guangdong Province.

  4. Spatio-temporal curvature measures for flow-field analysis

    NASA Astrophysics Data System (ADS)

    Zetzsche, Christoph; Barth, Erhardt; Berkmann, Joachim

    1991-09-01

    Intrinsic signal dimensionality, a property closely related to Gaussian curvature, is shown to be an important conceptual tool in multi-dimensional image processing for both biological and engineering sciences. Intrinsic dimensionality can reveal the relationship between recent theoretical developments in the definition of optic flow and the basic neurophysiological concept of 'end-stopping' of visual cortical cells. It is further shown how the concept may help to avoid certain problems typically arising from the common belief that an explicit computation of a flow field has to be the essential first step in the processing of spatio-temporal image sequences. Signals which cause difficulties in the computation of optic flow, mainly the discontinuities of the motion vector field, are shown to be detectable directly in the spatio-temporal input by evaluation of its three-dimensional curvature. The relevance of the suggested concept is supported by the fact that fast and efficient detection of such signals is of vital importance for ambulant observers in both the biological and the technical domain.

  5. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    PubMed

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of this article is to present trends in patent filings for applications of nanotechnology in the automobile sector across the world, using keyword-based patent searches. An overview of patents related to nanotechnology in the automobile industry is provided. The work started from a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Next, various graphs were produced to give insight into trends, and the patents were analyzed under various classifications. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. Patents were classified by the solution they provide after reading their claims, titles, abstracts and full texts. The patentability of nanotechnology inventions is discussed with a view to giving an idea of the requirements for, and statutory bars to, patenting nanotechnology inventions. A further objective of the current work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a strategy for patenting related inventions. For example, the US patent with number US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified under automobile parts; it addresses the problem of friction in the engine. One classification is based on the automobile part, the other on the problem being solved; hence two classifications, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a similar matrix has been created

  6. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis

    PubMed Central

    Radzikowski, Jacek; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    Background The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. Objective This paper presents a study of the Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, in terms of both its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as to present a quantitative interdisciplinary approach to analyzing such open-source data in the context of health narratives. Methods We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. Results The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. Analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. Conclusions The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more

  7. Timing of omitted events: an analysis of temporal control of inhibitory behavior.

    PubMed

    Denniston, James C; Miller, Ralph R

    2007-02-22

    This paper reviews research designed to investigate the temporal control of inhibitory responding using rats as subjects. One area of investigation has focused on the role of temporal variables in conditioned inhibition produced using Pavlov's [Pavlov, I.P., 1927. Conditioned Reflexes. Oxford University Press, London, 430 pp.] procedure. These studies have found that evidence of conditioned inhibition obtained by negative summation testing is strongest when the conditioned inhibitor signals the omission of the unconditioned stimulus (US) at the same temporal location as a transfer excitor signals presentation of the US [e.g., Barnet, R.C., Miller, R.R., 1996. Temporal encoding as a determinant of inhibitory control. Learn. Motiv. 27, 73-91]. Similarly, retardation of acquisition of behavioral control by a previously inhibitory conditioned stimulus (CS) is maximal when the inhibitory CS is paired with the US at the same temporal location as the inhibitor had previously signaled US omission [Burger, D., Denniston, J.C., Miller, R.R., 2001. Temporal coding in condition inhibition: retardation tests. Anim. Learn. Behav. 29, 281-290]. Other lines of research designed to assess the associative structure of temporal control of inhibition [e.g., Denniston, J.C., Blaisdell, A.P., Miller, R.R., 2004. Temporal control in conditioned inhibition: analysis of associative structure of inhibition. J. Exp. Psychol. Anim. Behav. Process. 30, 190-202] are reviewed, as is the assessment of temporal control of inhibition produced through extinction [Denniston, J.C., Miller, R.R., 2003. The role of temporal variables in inhibition produced through extinction. Learn. Behav. 31, 35-48]. These collective observations are discussed in terms of the temporal coding hypothesis [Matzel, L.D., Held, F.P., Miller, R.R., 1988. Reexamination of simultaneous and backward conditioning: Implications for contiguity theory. Learn. Motiv. 19, 317-344]. PMID:17194549

  8. Teaching Quantitative Literacy through a Regression Analysis of Exam Performance

    ERIC Educational Resources Information Center

    Lindner, Andrew M.

    2012-01-01

    Quantitative literacy is increasingly essential for both informed citizenship and a variety of careers. Though regression is one of the most common methods in quantitative sociology, it is rarely taught until late in students' college careers. In this article, the author describes a classroom-based activity introducing students to regression…

  9. Quantitative Analysis of Cellular Metabolic Dissipative, Self-Organized Structures

    PubMed Central

    de la Fuente, Ildefonso Martínez

    2010-01-01

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life. PMID:20957111

  10. Comparison of multivariate calibration methods for quantitative spectral analysis

    SciTech Connect

    Thomas, E.V.; Haaland, D.M.

    1990-05-15

    The quantitative prediction abilities of four multivariate calibration methods for spectral analyses are compared by using extensive Monte Carlo simulations. The calibration methods compared include inverse least-squares (ILS), classical least-squares (CLS), partial least-squares (PLS), and principal component regression (PCR) methods. ILS is a frequency-limited method while the latter three are capable of full-spectrum calibration. The simulations were performed assuming Beer's law holds and that spectral measurement errors and concentration errors associated with the reference method are normally distributed. Eight different factors that could affect the relative performance of the calibration methods were varied in a two-level, eight-factor experimental design in order to evaluate their effect on the prediction abilities of the four methods. It is found that each of the three full-spectrum methods has its range of superior performance. The frequency-limited ILS method was never the best method, although in the presence of relatively large concentration errors it sometimes yields comparable analysis precision to the full-spectrum methods for the major spectral component. The importance of each factor in the absolute and relative performances of the four methods is compared.
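    Of the four methods compared, classical least-squares (CLS) is the most compact to sketch: under Beer's law the calibration spectra factor as A = C K (C: concentrations, K: pure-component spectra), K is estimated by least squares from calibration mixtures, and concentrations of an unknown are recovered from its spectrum. The pure-component spectra and mixtures below are noise-free toy data, not the paper's Monte Carlo simulations.

```python
# Classical least-squares (CLS) sketch with tiny hand-rolled matrix
# helpers. Calibration: K_hat = (C^T C)^-1 C^T A.
# Prediction: c_hat = a K^T (K K^T)^-1. All data are hypothetical.

def T(m):               # transpose
    return [list(r) for r in zip(*m)]

def mul(a, b):          # matrix product
    bt = T(b)
    return [[sum(x * y for x, y in zip(row, col)) for col in bt] for row in a]

def inv2(m):            # inverse of a 2x2 matrix
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Pure-component spectra at 4 wavelengths (rows of K), hypothetical:
K_true = [[1.0, 0.5, 0.2, 0.0],
          [0.0, 0.3, 0.6, 1.0]]

# Calibration concentrations (each row: [c1, c2]) and resulting spectra:
C = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [0.8, 0.2]]
A = mul(C, K_true)

# Calibration step:
K_hat = mul(inv2(mul(T(C), C)), mul(T(C), A))

# Prediction step for an unknown spectrum:
def predict(spectrum):
    row = mul([spectrum], T(K_hat))
    return mul(row, inv2(mul(K_hat, T(K_hat))))[0]

unknown = mul([[0.3, 0.7]], K_true)[0]   # true concentrations 0.3 / 0.7
print([round(c, 6) for c in predict(unknown)])  # -> [0.3, 0.7]
```

With noisy spectra or more components the same normal-equation structure applies, which is why CLS is a full-spectrum method: every wavelength contributes to the concentration estimate.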

  11. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    SciTech Connect

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand-side resource programs is challenging. Due to the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g. shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  12. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    SciTech Connect

    Haase, A.T.; Zupancic, M.; Cavert, W.

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.

  13. Quantitative analysis of plasma interleukin-6 by immunoassay on a microchip

    NASA Astrophysics Data System (ADS)

    Abe, K.; Hashimoto, Y.; Yatsushiro, S.; Yamamura, S.; Tanaka, M.; Ooie, T.; Baba, Y.; Kataoka, M.

    2012-03-01

    Sandwich enzyme-linked immunosorbent assay (ELISA) is one of the most frequently employed assays for clinical diagnosis, since it enables the investigator to identify specific protein biomarkers. However, the conventional assay using a 96-well microtitration plate is time- and sample-consuming, and is therefore not suitable for rapid diagnosis. To overcome these drawbacks, we performed a sandwich ELISA on a microchip. We employed piezoelectric inkjet printing for deposition and fixation of the first antibody on the microchannel surface (300 μm width and 100 μm depth). The model analyte was interleukin-6 (IL-6), an inflammatory cytokine. After blocking the microchannel, the antigen, biotin-labeled second antibody, and avidin-labeled peroxidase were infused into the microchannel and incubated for 20 min, 10 min, and 5 min, respectively. This assay could detect 2 pg/ml and quantitatively measure IL-6 over the range of 0-32 pg/ml. Linear regression analysis of plasma IL-6 concentrations obtained by the microchip and conventional methods exhibited a significant relationship (R2 = 0.9964). This assay reduced the time for the antigen-antibody reaction to 1/6, and the consumption of samples and reagents to 1/50, compared with the conventional method. This assay enables us to determine plasma IL-6 with accuracy, high sensitivity, time savings, and low consumption of sample and reagents, and thus will be applicable to clinical diagnosis.
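The quantification step described above, fitting a standard curve over the working range and inverting it to read off an unknown, can be sketched as follows. This is a minimal illustration, not the authors' code; the standard concentrations and signal values are invented for the example.

```python
# Linear standard curve for a sandwich ELISA: fit signal = a*conc + b by
# ordinary least squares, report R^2, and invert to quantify an unknown.

def linear_fit(xs, ys):
    """Least-squares fit y = a*x + b; returns (a, b, r_squared)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

def concentration_from_signal(signal, a, b):
    """Invert the standard curve to estimate an unknown concentration."""
    return (signal - b) / a

# Hypothetical standards spanning the 0-32 pg/ml working range.
conc = [0, 2, 4, 8, 16, 32]                   # pg/ml
sig = [0.02, 0.11, 0.20, 0.41, 0.79, 1.60]    # arbitrary signal units
a, b, r2 = linear_fit(conc, sig)
unknown = concentration_from_signal(0.50, a, b)  # sample signal -> pg/ml
```

The same fit applied to paired microchip and plate measurements would yield the R2 = 0.9964 agreement statistic reported in the abstract.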

  14. Quantitative produced water analysis using mobile 1H NMR

    NASA Astrophysics Data System (ADS)

    Wagner, Lisabeth; Kalli, Chris; Fridjonsson, Einar O.; May, Eric F.; Stanwix, Paul L.; Graham, Brendan F.; Carroll, Matthew R. J.; Johns, Michael L.

    2016-10-01

    Measurement of the oil contamination of produced water is required in the oil and gas industry down to the parts-per-million (ppm) level prior to discharge, in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile 1H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided a comparable 1H NMR signal intensity for the oil and the solvent (chloroform), and hence an internal reference 1H signal from the chloroform, making the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1-30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography.
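The self-calibrating aspect reduces to a ratio: the oil 1H peak integral is referenced to the chloroform peak, whose concentration in the solvent is fixed at 1% v/v. A minimal sketch, with the function name, peak areas, and the ppm-equivalence factor all illustrative assumptions rather than values from the paper:

```python
# Internal-reference quantification: oil content is proportional to the
# ratio of the oil 1H peak area to the chloroform reference peak area.

def oil_ppm(oil_peak_area, ref_peak_area, ref_equiv_ppm):
    """Estimate oil content from the oil/reference signal ratio.

    ref_equiv_ppm is the oil concentration whose 1H signal would equal
    the chloroform reference signal (fixed once by calibration).
    """
    return ref_equiv_ppm * oil_peak_area / ref_peak_area

# Example: the reference peak corresponds to 10 ppm-equivalent of oil.
estimate = oil_ppm(oil_peak_area=2.4, ref_peak_area=1.2, ref_equiv_ppm=10.0)
```

Because both peaks are measured in the same spectrum, drifts in receiver gain or sample volume cancel in the ratio, which is the point of the internal reference.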

  15. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate the distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify trace elements and matrix elements at the single-cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to incorporation of foreign metal elements. Substantia nigra (SN) tissue obtained from autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and from control cases was examined. Quantitative XRF analysis showed that neuromelanin granules of parkinsonian SN contained higher levels of Fe than those of the controls; the concentrations were in the ranges of 2300-3100 ppm and 2000-2400 ppm, respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than in the controls. In particular, Zn was less than 40 ppm in SN tissue from the PDC case, versus 560-810 ppm in the controls. These changes are considered to be closely related to the neurodegeneration and cell death.

  16. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    PubMed Central

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.

    2012-01-01

    Abstract. Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data may enable noninvasive detection of cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences between the reflectance properties of cancer and those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488

  17. Quantitative SERS sensors for environmental analysis of naphthalene.

    PubMed

    Péron, O; Rinnert, E; Toury, T; Lamy de la Chapelle, M; Compère, C

    2011-03-01

    In the investigation of chemical pollutants, such as PAHs (polycyclic aromatic hydrocarbons) at low concentration in aqueous media, surface-enhanced Raman scattering (SERS) offers an alternative to the inherently low cross-section of normal Raman scattering. Indeed, SERS is a very sensitive spectroscopic technique owing to the excitation of the surface plasmon modes of a nanostructured metallic film. The surface of quartz substrates was coated with a hydrophobic film obtained by silanization and subsequently reacted with polystyrene (PS) beads coated with gold nanoparticles. The hydrophobic surface of the SERS substrates pre-concentrates non-polar molecules such as naphthalene. Under laser excitation, the SERS-active substrates allow the detection and identification of target molecules localized close to the gold nanoparticles. The morphology of the SERS substrates based on polystyrene beads surrounded by gold nanoparticles was characterized by scanning electron microscopy (SEM). Furthermore, the Raman fingerprint of the polystyrene serves as an internal spectral reference. On this basis, an innovative method to detect and quantify organic molecules, such as naphthalene in the range of 1 to 20 ppm, in aqueous media was carried out. Such SERS-active substrates point towards an application as quantitative SERS sensors for the environmental analysis of naphthalene. PMID:21165476

  18. Active contour approach for accurate quantitative airway analysis

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Slabaugh, Greg G.; Novak, Carol L.; Naidich, David P.; Lerallut, Jean-Francois

    2008-03-01

    Chronic airway disease causes structural changes in the lungs, including peribronchial thickening and airway dilatation. Multi-detector computed tomography (CT) yields detailed, near-isotropic images of the lungs, and thus the potential to obtain quantitative measurements of lumen diameter and airway wall thickness. Such measurements would allow standardized assessment, helping physicians diagnose and locate airway abnormalities, adapt treatment, and monitor progress over time. However, due to the sheer number of airways per patient, systematic analysis is infeasible in routine clinical practice without automation. We have developed an automated, real-time method based on active contours to estimate both airway lumen and wall dimensions; the method does not require manual contour initialization, only a starting point on the targeted airway. While the lumen contour segmentation is purely region-based, the estimation of the outer diameter considers the inner wall segmentation as well as local intensity variation, in order to anticipate the presence of nearby arteries and exclude them. These properties make the method more robust than the full-width half-maximum (FWHM) approach. Results are demonstrated on a phantom dataset with known dimensions and on a human dataset where the automated measurements are compared against two human operators. The average error on the phantom measurements was 0.10 mm and 0.14 mm for inner and outer diameters, showing sub-voxel accuracy. Similarly, the mean variation from the average manual measurement was 0.14 mm and 0.18 mm for inner and outer diameters, respectively.
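The FWHM baseline that the active-contour method is compared against can be sketched in a few lines: take a 1D intensity profile across the airway wall, find the half-maximum level, and measure the width between the two half-maximum crossings, interpolating between samples. The profile values and sample spacing below are illustrative, not from the paper.

```python
# Full width at half maximum of a single-peaked 1D intensity profile,
# with linear interpolation between samples at the two crossings.

def fwhm(profile, spacing=1.0):
    """Return the width (in physical units) at half maximum."""
    peak = max(profile)
    base = min(profile)
    half = base + (peak - base) / 2.0
    # First sample at or above the half-maximum level (left crossing).
    i = next(k for k in range(len(profile)) if profile[k] >= half)
    left = i - (profile[i] - half) / (profile[i] - profile[i - 1]) if i > 0 else 0.0
    # Last sample at or above the half-maximum level (right crossing).
    j = next(k for k in range(len(profile) - 1, -1, -1) if profile[k] >= half)
    if j < len(profile) - 1:
        right = j + (profile[j] - half) / (profile[j] - profile[j + 1])
    else:
        right = float(len(profile) - 1)
    return (right - left) * spacing

profile = [0, 0, 1, 4, 9, 10, 9, 4, 1, 0, 0]  # synthetic wall profile
width = fwhm(profile, spacing=0.5)            # spacing in mm per sample
```

A nearby artery adds a second bump to such a profile and shifts the outer crossing, which is exactly the failure mode the paper's locally intensity-aware active contour is designed to avoid.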

  19. Quantitative analysis of flagellar proteins in Drosophila sperm tails.

    PubMed

    Mendes Maia, Teresa; Paul-Gilloteaux, Perrine; Basto, Renata

    2015-01-01

    The cilium has a well-defined structure, which can still accommodate some diversity in morphology and molecular composition to suit the functional requirements of different cell types. The sperm flagellum of the fruit fly Drosophila melanogaster is a good model for studying the genetic regulation of axoneme assembly and motility, owing to the wealth of genetic tools publicly available for this organism. In addition, the fruit fly's sperm flagellum displays quite a long axoneme (∼1.8 mm), which may facilitate both histological and biochemical analyses. Here, we present a protocol for imaging and quantitatively analyzing proteins that associate with differentiating and mature fly sperm flagella. As an example, we use the quantification of tubulin polyglycylation in wild-type testes and in Bug22 mutant testes, which present defects in the deposition of this posttranslational modification. During sperm biogenesis, flagella appear tightly bundled, which makes it challenging to obtain accurate measurements of protein levels from immunostained specimens. The method we present is based on a novel semiautomated macro installed in the image processing software ImageJ. It allows measurement of fluorescence levels in closely associated sperm tails through an exact distinction between positive and background signals, and provides background-corrected pixel intensity values that can be used directly for data analysis. PMID:25837396

  20. Copulation patterns in captive hamadryas baboons: a quantitative analysis.

    PubMed

    Nitsch, Florian; Stueckle, Sabine; Stahl, Daniel; Zinner, Dietmar

    2011-10-01

    For primates, as for many other vertebrates, copulation that results in ejaculation is a prerequisite for reproduction. The probability of ejaculation is affected by various physiological and social factors, for example, the reproductive states of the male and female and the operational sex ratio. In this paper, we present quantitative and qualitative data on patterns of sexual behaviour in a captive group of hamadryas baboons (Papio hamadryas), a species with a polygynous-monandric mating system. We observed more than 700 copulations and analysed factors that can affect the probability of ejaculation. Multilevel logistic regression analysis and Akaike's information criterion (AIC) model selection procedures revealed that the probability of successful copulation increased as the size of female sexual swellings increased, indicating increased probability of ovulation, and as the number of females per one-male unit (OMU) decreased. In contrast, the occurrence of female copulation calls, the sex of the copulation initiator, and previous male aggression toward females did not affect the probability of ejaculation. Synchrony of oestrus cycles also had no effect (most likely because the sample size was too small). We also observed 29 extra-group copulations by two non-adult males. Our results indicate that male hamadryas baboons copulated more successfully around the time of ovulation and that males in large OMUs with many females may be confronted by time- or energy-allocation problems.
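The analysis pattern above, logistic regression of a binary outcome on a predictor plus AIC for model comparison, can be sketched minimally as follows. The real analysis was multilevel (random effects per individual); this single-predictor, fixed-effects version with synthetic data only illustrates the fitting and the AIC = 2k − 2 ln L computation.

```python
# Logistic regression by gradient ascent on the log-likelihood, plus AIC.

import math

def fit_logistic(xs, ys, lr=0.1, iters=5000):
    """Fit P(y=1|x) = 1/(1+exp(-(b0 + b1*x))) to 0/1 outcomes."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient wrt intercept
            g1 += (y - p) * x    # gradient wrt slope
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

def log_likelihood(xs, ys, b0, b1):
    ll = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
        ll += y * math.log(p) + (1 - y) * math.log(1 - p)
    return ll

# Synthetic data: larger swelling size -> higher ejaculation probability.
xs = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
ys = [0, 0, 0, 1, 0, 1, 1, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
ll = log_likelihood(xs, ys, b0, b1)
model_aic = 2 * 2 - 2 * ll  # k = 2 fitted parameters
```

In AIC-based selection, candidate models (e.g., with and without the swelling-size term) are fitted in this way and the one with the lowest AIC is preferred.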

  1. Quantitative image analysis of cell colocalization in murine bone marrow.

    PubMed

    Mokhtari, Zeinab; Mech, Franziska; Zehentmeier, Sandra; Hauser, Anja E; Figge, Marc Thilo

    2015-06-01

    Long-term antibody production is a key property of humoral immunity and is accomplished by long-lived plasma cells. They mainly reside in the bone marrow, whose importance as an organ hosting immunological memory is becoming increasingly evident. Signals provided by stromal cells and eosinophils may play an important role for plasma cell maintenance, constituting a survival microenvironment. In this joint study of experiment and theory, we investigated the spatial colocalization of plasma cells, eosinophils and B cells by applying an image-based systems biology approach. To this end, we generated confocal fluorescence microscopy images of histological sections from murine bone marrow that were subsequently analyzed in an automated fashion. This quantitative analysis was combined with computer simulations of the experimental system for hypothesis testing. In particular, we tested the observed spatial colocalization of cells in the bone marrow against the hypothesis that cells are found within available areas at positions that were drawn from a uniform random number distribution. We find that B cells and plasma cells highly colocalize with stromal cells, to an extent larger than in the simulated random situation. While B cells are preferentially in contact with each other, i.e., form clusters among themselves, plasma cells seem to be solitary or organized in aggregates, i.e., loosely defined groups of cells that are not necessarily in direct contact. Our data suggest that the plasma cell bone marrow survival niche facilitates colocalization of plasma cells with stromal cells and eosinophils, respectively, promoting plasma cell longevity.
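The hypothesis test described above, observed colocalization versus positions drawn uniformly at random, can be sketched as a Monte Carlo test. All coordinates and the summary statistic (mean nearest stromal-cell distance) are illustrative stand-ins for the paper's image-derived data.

```python
# Monte Carlo test of colocalization: is the observed mean distance from
# plasma cells to their nearest stromal cell smaller than expected if
# plasma-cell positions were uniformly random in the tissue area?

import random

def mean_nearest_distance(cells, targets):
    total = 0.0
    for (x, y) in cells:
        total += min(((x - u) ** 2 + (y - v) ** 2) ** 0.5 for (u, v) in targets)
    return total / len(cells)

def null_distribution(n_cells, targets, width, height, n_sim, rng):
    """Statistic under the uniform-random-position null hypothesis."""
    stats = []
    for _ in range(n_sim):
        cells = [(rng.uniform(0, width), rng.uniform(0, height))
                 for _ in range(n_cells)]
        stats.append(mean_nearest_distance(cells, targets))
    return stats

rng = random.Random(42)
stromal = [(20, 20), (80, 30), (50, 70)]            # fixed target cells
plasma = [(21, 19), (79, 31), (49, 71), (22, 22)]   # observed, colocalized
observed = mean_nearest_distance(plasma, stromal)
null = null_distribution(len(plasma), stromal, 100, 100, 500, rng)
# One-sided Monte Carlo p-value: fraction of null runs at least as close.
p_value = sum(1 for s in null if s <= observed) / len(null)
```

A small p-value rejects the uniform-random null, which is the logic behind the paper's conclusion that plasma cells colocalize with stromal cells more than chance predicts.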

  2. Quantitative genetic analysis of flowering time in tomato.

    PubMed

    Jiménez-Gómez, José M; Alonso-Blanco, Carlos; Borja, Alicia; Anastasio, Germán; Angosto, Trinidad; Lozano, Rafael; Martínez-Zapater, José M

    2007-03-01

    Artificial selection of cultivated tomato (Solanum lycopersicum L.) has resulted in the generation of early-flowering, day-length-insensitive cultivars, despite its close relationship to other Solanum species that need more time and specific photoperiods to flower. To investigate the genetic mechanisms controlling flowering time in tomato and related species, we performed a quantitative trait locus (QTL) analysis for flowering time in an F2 mapping population derived from S. lycopersicum and its late-flowering wild relative S. chmielewskii. Flowering time was scored as the number of days from sowing to the opening of the first flower (days to flowering), and as the number of leaves under the first inflorescence (leaf number). QTL analyses detected 2 QTLs affecting days to flowering, which explained 55.3% of the total phenotypic variance, and 6 QTLs for leaf number, accounting for 66.7% of the corresponding phenotypic variance. Four of the leaf number QTLs had not previously been detected for this trait in tomato. Colocation of some QTLs with flowering-time genes included in the genetic map suggests PHYB2, FALSIFLORA, and a tomato FLC-like sequence as candidate genes that might have been targets of selection during the domestication of tomato.

  3. Early child grammars: qualitative and quantitative analysis of morphosyntactic production.

    PubMed

    Legendre, Géraldine

    2006-09-10

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is argued that acquisition of morphosyntax proceeds via overlapping grammars (rather than through abrupt changes), which OT formalizes in terms of partial rather than total constraint rankings. Initially, economy of structure constraints take priority over faithfulness constraints that demand faithful expression of a speaker's intent, resulting in child production of tense that is comparable in level to that of child-directed speech. Using the independent Predominant Length of Utterance measure of syntactic development proposed in Vainikka, Legendre, and Todorova (1999), production of agreement is shown first to lag behind tense then to compete with tense at an intermediate stage of development. As the child's development progresses, faithfulness constraints become more dominant, and the overall production of tense and agreement becomes adult-like.

  4. Limits of normality of quantitative thoracic CT analysis

    PubMed Central

    2013-01-01

    Introduction Although computed tomography (CT) is widely used to investigate different pathologies, quantitative data from normal populations are scarce. Reference values may be useful to estimate the anatomical or physiological changes induced by various diseases. Methods We analyzed 100 helical CT scans taken for clinical purposes and reported as nonpathological by the radiologist. Lung profiles were manually outlined on each CT scan slice and each voxel was classified according to its gas/tissue ratio. For regional analysis, the lungs were divided into 10 sterno-vertebral levels. Results We studied 53 males and 47 females (age 64 ± 13 years); males had a greater total lung volume, lung gas volume and lung tissue. Noninflated tissue averaged 7 ± 4% of the total lung weight, poorly inflated tissue averaged 18 ± 3%, normally inflated tissue averaged 65 ± 8% and overinflated tissue averaged 11 ± 7%. We found a significant correlation between lung weight and subject's height (P <0.0001, r2 = 0.49); the total lung capacity in a supine position was 4,066 ± 1,190 ml, ~1,800 ml less than the predicted total lung capacity in a sitting position. Superimposed pressure averaged 2.6 ± 0.5 cmH2O. Conclusion Subjects without lung disease present significant amounts of poorly inflated and overinflated tissue. Normal lung weight can be predicted from the subject's height with reasonable confidence. PMID:23706034

  5. Dynamics and spatio-temporal variability of environmental factors in Eastern Australia using functional principal component analysis

    USGS Publications Warehouse

    Szabo, J.K.; Fedriani, E.M.; Segovia-Gonzalez, M. M.; Astheimer, L.B.; Hooper, M.J.

    2010-01-01

    This paper introduces a new technique in ecology for analyzing spatial and temporal variability in environmental variables. Using simple statistics, we explore the relations between the abiotic and biotic variables that influence animal distributions. However, spatial and temporal variability in rainfall, a key variable in ecological studies, can complicate any basic model that includes time evolution. The study was at a landscape scale (three million square kilometers in eastern Australia), mainly over the period 1998-2004. We simultaneously considered qualitative spatial (soil and habitat types) and quantitative temporal (rainfall) variables in a Geographical Information System environment. In addition to some techniques commonly used in ecology, we applied a new method, Functional Principal Component Analysis, which proved to be very suitable for this case, as it explained more than 97% of the total variance of the rainfall data, providing us with substitute variables that are easier to manage and are even able to explain rainfall patterns. The main variable came from a habitat classification that showed strong correlations with rainfall values and soil types. © 2010 World Scientific Publishing Company.
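The dimensionality-reduction idea behind such an analysis can be illustrated in miniature: treat each site's rainfall series as a curve, center the data, and extract the leading principal component together with its explained-variance fraction. This toy sketch uses plain PCA via power iteration on synthetic data; true FPCA additionally works with smooth basis expansions of the curves.

```python
# Leading principal component of (sites x time) data by power iteration,
# with the fraction of total variance it explains.

def leading_component(data, iters=200):
    """Return (leading eigenvector, explained-variance fraction)."""
    n, t = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(t)]
    centered = [[row[j] - means[j] for j in range(t)] for row in data]
    # Sample covariance matrix (t x t) across time points.
    cov = [[sum(centered[i][a] * centered[i][b] for i in range(n)) / (n - 1)
            for b in range(t)] for a in range(t)]
    v = [1.0] * t
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(t)) for a in range(t)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    eigval = sum(v[a] * sum(cov[a][b] * v[b] for b in range(t))
                 for a in range(t))
    total_var = sum(cov[a][a] for a in range(t))
    return v, eigval / total_var

# Synthetic rainfall: every site follows one seasonal pattern, rescaled,
# so a single component should capture essentially all the variance.
pattern = [1.0, 3.0, 5.0, 3.0, 1.0, 0.5]
data = [[s * p for p in pattern] for s in (0.5, 1.0, 1.5, 2.0, 2.5)]
component, explained = leading_component(data)
```

The paper's result that a few functional components explain more than 97% of rainfall variance corresponds to `explained` (summed over the retained components) exceeding 0.97.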

  6. Advanced spatio-temporal filtering techniques for photogrammetric image sequence analysis in civil engineering material testing

    NASA Astrophysics Data System (ADS)

    Liebold, F.; Maas, H.-G.

    2016-01-01

    The paper shows advanced spatial, temporal and spatio-temporal filtering techniques which may be used to reduce noise effects in photogrammetric image sequence analysis tasks and tools. As a practical example, the techniques are validated in a photogrammetric spatio-temporal crack detection and analysis tool applied in load tests in civil engineering material testing. The load test technique is based on monocular image sequences of a test object under varying load conditions. The first image of a sequence is defined as a reference image under zero load, wherein interest points are determined and connected in a triangular irregular network structure. For each epoch, these triangles are compared to the reference image triangles to search for deformations. The result of the feature point tracking and triangle comparison process is a spatio-temporally resolved strain value field, wherein cracks can be detected, located and measured via local discrepancies. The strains can be visualized as a color-coded map. In order to improve the measuring system and to reduce noise, the strain values of each triangle must be treated in a filtering process. The paper shows the results of various filter techniques in the spatial and in the temporal domain as well as spatio-temporal filtering techniques applied to these data. The best results were obtained by a bilateral filter in the spatial domain and by a spatio-temporal EOF (empirical orthogonal function) filtering technique.
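The bilateral filter that performed best in the spatial domain can be sketched in 1D: each strain value is replaced by a weighted mean of its neighbours, with weights that fall off both with spatial distance and with difference in value, so noise is smoothed while sharp strain discontinuities (cracks) are preserved. The parameters and the strain profile below are illustrative, not taken from the paper.

```python
# 1D bilateral filter: Gaussian weights in both the spatial domain
# (sigma_s) and the value/range domain (sigma_r).

import math

def bilateral_1d(values, sigma_s=1.5, sigma_r=0.2, radius=3):
    out = []
    for i, vi in enumerate(values):
        num = den = 0.0
        for j in range(max(0, i - radius), min(len(values), i + radius + 1)):
            w = math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2)
                         - ((values[j] - vi) ** 2) / (2 * sigma_r ** 2))
            num += w * values[j]
            den += w
        out.append(num / den)
    return out

# Noisy strain profile with a crack-like step between index 4 and 5:
# the filter should smooth within each side but not blur the step.
strain = [0.02, -0.01, 0.01, 0.0, 0.02, 1.0, 1.02, 0.99, 1.01, 0.98]
smoothed = bilateral_1d(strain)
```

The range term is what distinguishes this from plain Gaussian smoothing: neighbours across the discontinuity receive near-zero weight, so the crack signature in the strain field survives filtering.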

  7. Communication about vaccinations in Italian websites: a quantitative analysis.

    PubMed

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Parents of babies and people who look for information about vaccination often visit anti-vaccine movement websites, or blogs by naturopathic physicians or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," with the function "OR", in order to identify the most frequently used websites. The two keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific companies, 32.2% (n = 48) consisted of a personal blog, and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. 37.6% reported the webmaster's name, 67.8% a webmaster e-mail address, 28.6% the date of the last update, and 46.6% the author's name. The quality score for government sites was higher on average than for anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, because it cannot be managed by private efforts alone but must be the result of synergy among Public Health, private and scientific associations, and social movements.

  8. Quantitative Analysis of Human Cancer Cell Extravasation Using Intravital Imaging.

    PubMed

    Willetts, Lian; Bond, David; Stoletov, Konstantin; Lewis, John D

    2016-01-01

    Metastasis, or the spread of cancer cells from a primary tumor to distant sites, is the leading cause of cancer-associated death. Metastasis is a complex multi-step process comprising invasion, intravasation, survival in circulation, extravasation, and formation of metastatic colonies. Currently, in vitro assays are limited in their ability to investigate these intricate processes and do not faithfully reflect metastasis as it occurs in vivo. Traditional in vivo models of metastasis are limited in their ability to visualize the seemingly sporadic behavior of where and when cancer cells spread (Reymond et al., Nat Rev Cancer 13:858-870, 2013). The avian embryo model of metastasis is a powerful platform to study many of the critical steps in the metastatic cascade, including the migration, extravasation, and invasion of human cancer cells in vivo (Sung et al., Nat Commun 6:7164, 2015; Leong et al., Cell Rep 8:1558-1570, 2014; Kain et al., Dev Dyn 243:216-28, 2014; Leong et al., Nat Protoc 5:1406-17, 2010; Zijlstra et al., Cancer Cell 13:221-234, 2008; Palmer et al., J Vis Exp 51:2815, 2011). The chicken chorioallantoic membrane (CAM) is a readily accessible and well-vascularized tissue that surrounds the developing embryo. When the chicken embryo is grown in a shell-less, ex ovo environment, the nearly transparent CAM provides an ideal environment for high-resolution fluorescence microscopy approaches. In this model, the embryonic chicken vasculature and labeled cancer cells can be visualized simultaneously to investigate specific steps in the metastatic cascade, including extravasation. When combined with the proper image analysis tools, the ex ovo chicken embryo model offers a cost-effective and high-throughput platform for the quantitative analysis of tumor cell metastasis in a physiologically relevant in vivo setting. Here we discuss detailed procedures to quantify cancer cell extravasation in the shell-less chicken embryo model with advanced fluorescence

  9. Quantitative petrographic analysis of Cretaceous sandstones from southwest Montana

    SciTech Connect

    Dyman, T.S.; Krystinik, K.B.; Takahashi, K.I.

    1986-05-01

    The Albian Blackleaf Formation and the Cenomanian lower Frontier Formation in southwest Montana lie within or east of the fold and thrust belt in the Cretaceous foreland basin complex. The petrography of these strata records a complex interaction between source-area tectonism, basin subsidence, and sedimentation patterns associated with a cyclic sequence of transgressions and regressions. Because the petrographic data set was large (127 thin sections) and difficult to interpret subjectively, statistical techniques were used to establish sample and variable relationships. Theta-mode cluster and correspondence analysis were used to determine the contributing effect (total variance) of key framework grains. Monocrystalline quartz, plagioclase, potassium feldspar, and sandstone-, limestone-, and volcanic-lithic grain content contribute most to the variation in the framework-grain population. Theta-mode cluster and correspondence analysis were used to identify six petrofacies. Lower Blackleaf petrofacies (I-III) contain abundant monocrystalline quartz (55-90%) and sedimentary lithic grains (10-50%), which are distributed throughout the study area. Petrofacies I-III are differentiated by variable monocrystalline quartz and sedimentary lithic grain content. Upper Blackleaf and lower Frontier petrofacies (IV-VI) exhibit highly variable sedimentary and volcanic lithic ratios, and contain less monocrystalline quartz (20-50%) than the lower Blackleaf petrofacies. Information from the quantitative analyses, combined with available paleocurrent data, indicates that Blackleaf and lower Frontier detritus was derived from variable source areas through time. Lower Blackleaf detritus was derived from Precambrian through Paleozoic sedimentary terranes to the west, north, and east, whereas upper Blackleaf and lower Frontier detritus was derived from both sedimentary and volcanic terranes to the south.

  10. Gibbs distribution analysis of temporal correlations structure in retina ganglion cells

    PubMed Central

    Vasquez, J. C.; Marre, O.; Palacios, A.G.; Berry, M.J.; Cessac, B.

    2012-01-01

    We present a method to estimate Gibbs distributions with spatio-temporal constraints on spike trains statistics. We apply this method to spike trains recorded from ganglion cells of the salamander retina, in response to natural movies. Our analysis, restricted to a few neurons, performs more accurately than pairwise synchronization models (Ising) or the 1-time step Markov models (Marre et al. (2009)) to describe the statistics of spatio-temporal spike patterns and emphasizes the role of higher order spatio-temporal interactions. PMID:22115900
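The baseline that such Gibbs/Ising analyses improve upon can be shown concretely: for binarized spike trains, compare the empirical probability of a joint firing pattern with the prediction of an independent (no-interaction) model; pairwise and spatio-temporal models then add terms for the residual correlation. The spike data below are synthetic.

```python
# Empirical joint-pattern probabilities vs the independent-neuron model.

def pattern_probs(spikes):
    """Empirical probability of each joint pattern across time bins."""
    counts = {}
    t = len(spikes[0])
    for k in range(t):
        pattern = tuple(train[k] for train in spikes)
        counts[pattern] = counts.get(pattern, 0) + 1
    return {p: c / t for p, c in counts.items()}

def independent_prob(spikes, pattern):
    """Pattern probability if the neurons fired independently."""
    t = len(spikes[0])
    prob = 1.0
    for train, bit in zip(spikes, pattern):
        rate = sum(train) / t
        prob *= rate if bit == 1 else (1.0 - rate)
    return prob

# Two synthetic neurons that tend to fire together.
n1 = [1, 1, 0, 0, 1, 1, 0, 0, 1, 1]
n2 = [1, 1, 0, 0, 1, 1, 0, 0, 1, 0]
empirical = pattern_probs([n1, n2])
p_obs = empirical[(1, 1)]                    # observed joint firing
p_ind = independent_prob([n1, n2], (1, 1))   # independent prediction
```

An excess of `p_obs` over `p_ind` is exactly the correlation structure that Ising-type and higher-order spatio-temporal Gibbs models are fitted to capture.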

  11. Fault detection in digital and analog circuits using an i(DD) temporal analysis technique

    NASA Technical Reports Server (NTRS)

    Beasley, J.; Magallanes, D.; Vridhagiri, A.; Ramamurthy, Hema; Deyong, Mark

    1993-01-01

    An i(sub DD) temporal analysis technique which is used to detect defects (faults) and fabrication variations in both digital and analog IC's by pulsing the power supply rails and analyzing the temporal data obtained from the resulting transient rail currents is presented. A simple bias voltage is required for all the inputs, to excite the defects. Data from hardware tests supporting this technique are presented.

  12. Temporal and Spatial Analysis of Monogenetic Volcanic Fields

    NASA Astrophysics Data System (ADS)

    Kiyosugi, Koji

    Achieving an understanding of the nature of monogenetic volcanic fields depends on identification of the spatial and temporal patterns of volcanism in these fields, and their relationships to structures mapped in the shallow crust and inferred in the deep crust and mantle through interpretation of geochemical, radiometric and geophysical data. We investigate the spatial and temporal distributions of volcanism in the Abu Monogenetic Volcano Group, Southwest Japan. E-W elongated volcano distribution, which is identified by a nonparametric kernel method, is found to be consistent with the spatial extent of P-wave velocity anomalies in the lower crust and upper mantle, supporting the idea that the spatial density map of volcanic vents reflects the geometry of a mantle diapir. Estimated basalt supply to the lower crust is constant. This observation and the spatial distribution of volcanic vents suggest stability of magma productivity and essentially constant two-dimensional size of the source mantle diapir. We mapped conduits, dike segments, and sills in the San Rafael sub-volcanic field, Utah, where the shallowest part of a Pliocene magmatic system is exceptionally well exposed. The distribution of conduits matches the major features of dike distribution, including development of clusters and distribution of outliers. The comparison of San Rafael conduit distribution and the distributions of volcanoes in several recently active volcanic fields supports the use of statistical models, such as nonparametric kernel methods, in probabilistic hazard assessment for distributed volcanism. We developed a new recurrence rate calculation method that uses a Monte Carlo procedure to better reflect and understand the impact of uncertainties of radiometric age determinations on uncertainty of recurrence rate estimates for volcanic activity in the Abu, Yucca Mountain Region, and Izu-Tobu volcanic fields. 
Results suggest that the recurrence rates of volcanic fields can change by more
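The Monte Carlo recurrence-rate idea described above can be sketched as follows: resample each radiometric age from a normal distribution with its reported analytical uncertainty, and recompute the recurrence rate for every draw. This is a minimal illustration with hypothetical ages, not the authors' actual procedure.

```python
import numpy as np

def recurrence_rate_mc(ages_ma, sigmas_ma, n_draws=10000, seed=0):
    """Monte Carlo estimate of volcanic recurrence rate (events/Myr).

    Each radiometric age is resampled from a normal distribution with its
    reported 1-sigma uncertainty; for every draw the recurrence rate is
    (N - 1) / (oldest - youngest resampled age)."""
    rng = np.random.default_rng(seed)
    ages = np.asarray(ages_ma, float)
    sigmas = np.asarray(sigmas_ma, float)
    draws = rng.normal(ages, sigmas, size=(n_draws, ages.size))
    rates = (ages.size - 1) / (draws.max(axis=1) - draws.min(axis=1))
    return np.percentile(rates, [2.5, 50, 97.5])

# Hypothetical eruption ages (Ma) with 1-sigma errors, for illustration only
lo, med, hi = recurrence_rate_mc([0.2, 0.5, 0.9, 1.4, 2.0], [0.05] * 5)
```

The percentile spread directly expresses how age-determination uncertainty propagates into recurrence-rate uncertainty.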

  13. Application of 3D Spatio-Temporal Data Modeling, Management, and Analysis in DB4GEO

    NASA Astrophysics Data System (ADS)

    Kuper, P. V.; Breunig, M.; Al-Doori, M.; Thomsen, A.

    2016-10-01

    Many of today's worldwide challenges such as climate change, water supply and transport systems in cities or movements of crowds need spatio-temporal data to be examined in detail. Thus, the number of examinations in 3D space dealing with geospatial objects moving in space and time or even changing their shapes in time will rapidly increase in the future. Prominent spatio-temporal applications are subsurface reservoir modeling, water supply after seawater desalination and the development of transport systems in mega cities. All of these applications generate large spatio-temporal data sets. However, the modeling, management and analysis of 3D geo-objects with changing shape and attributes in time is still a challenge for geospatial database architectures. In this article we describe the application of concepts for the modeling, management and analysis of 2.5D and 3D spatial plus 1D temporal objects implemented in DB4GeO, our service-oriented geospatial database architecture. An example application with spatio-temporal data of a landfill near the city of Osnabrück in Germany demonstrates the usage of the concepts. Finally, an outlook on our future research focusing on new applications with big data analysis in three spatial plus one temporal dimension in the United Arab Emirates, especially the Dubai area, is given.
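(No insert — see replace below.)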

  14. Global migration and the changing distribution of sickle haemoglobin: a quantitative study of temporal trends between 1960 and 2000

    PubMed Central

    Piel, Frédéric B; Tatem, Andrew J; Huang, Zhuojie; Gupta, Sunetra; Williams, Thomas N; Weatherall, David J

    2014-01-01

    Summary Background Changes in the geographical distribution of genetic disorders are often thought to happen slowly, especially when compared with infectious diseases. Whereas mutations, genetic drift, and natural selection take place over many generations, epidemics can spread through large populations within a few days or weeks. Nevertheless, population movements can interfere with these processes, and few studies have been done of their effect on genetic disorders. We aimed to investigate the effect of global migration on the distribution of the sickle-cell gene—the most common and clinically significant haemoglobin structural variant. Methods For each country, we extracted data from the World Bank's Global Bilateral Migration Database about international human migrations between 1960 and 2000. We combined this information with evidence-based estimates of national HbS allele frequencies, generated within a Bayesian geostatistical framework, to analyse temporal changes in the net numbers of migrants, and classified countries with an index summarising these temporal trends. Findings The number of international migrants increased from 92·6 million in 1960, to 165·2 million in 2000. The estimated global number of migrants with HbS increased from about 1·6 million in 1960, to 3·6 million in 2000. This increase was largely due to an increase in the number of migrants from countries with HbS allele frequencies higher than 10%, from 3·1 million in 1960, to 14·2 million in 2000. Additionally, the mean number of countries of origin for each destination country increased from 70 (SE 46) in 1960, to 98 (48) in 2000, showing an increasing diversity in the network of international migrations between countries. Our index of change map shows a patchy distribution of the magnitude of temporal changes, with the highest positive and negative values scattered across all continents. Interpretation Global human population movements have had a substantial effect on the
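The core bookkeeping behind such estimates — combining a bilateral migration matrix with origin-country allele frequencies — can be sketched simply. The study used Bayesian geostatistical frequency estimates; this toy version assumes Hardy-Weinberg proportions and entirely hypothetical flows and frequencies.

```python
def expected_hbs_migrants(flows, allele_freq):
    """Expected number of migrants carrying at least one HbS allele.

    flows: {(origin, destination): number_of_migrants}
    allele_freq: {origin: HbS allele frequency p in the origin country}
    Assumes Hardy-Weinberg proportions, so the probability that a migrant
    from a country with allele frequency p carries HbS is 1 - (1 - p)**2."""
    total = 0.0
    for (origin, _dest), n in flows.items():
        p = allele_freq[origin]
        total += n * (1.0 - (1.0 - p) ** 2)
    return total

# Hypothetical flows and allele frequencies, for illustration only
flows = {("A", "X"): 100000, ("B", "X"): 50000}
freqs = {"A": 0.12, "B": 0.02}
n_hbs = expected_hbs_migrants(flows, freqs)
```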

  15. Silicon photonic microring resonators for quantitative cytokine detection and T-cell secretion analysis.

    PubMed

    Luchansky, Matthew S; Bailey, Ryan C

    2010-03-01

    The ability to perform multiple simultaneous protein biomarker measurements in complex media with picomolar sensitivity presents a large challenge to disease diagnostics and fundamental biological studies. Silicon photonic microring resonators represent a promising platform for real-time detection of biomolecules on account of their spectral sensitivity toward surface binding events between a target and antibody-modified microrings. For all refractive index-based sensing schemes, the mass of bound analytes, in combination with other factors such as antibody affinity and surface density, contributes to the observed signal and measurement sensitivity. Therefore, proteins that are simultaneously low in abundance and have a lower molecular weight are often challenging to detect. By employing a more massive secondary antibody to amplify the signal arising from the initial binding event, it is possible to improve both the sensitivity and the specificity of protein assays, allowing for quantitative sensing in complex sample matrices. Herein, a sandwich assay is used to detect the 15.5 kDa human cytokine interleukin-2 (IL-2) at concentrations down to 100 pg/mL (6.5 pM) and to quantitate unknown solution concentrations over a dynamic range spanning 2.5 orders of magnitude. This same sandwich assay is then used to monitor the temporal secretion profile of IL-2 from Jurkat T lymphocytes in serum-containing cell culture media in the presence of the entire Jurkat secretome. The same temporal secretion analysis is performed in parallel using a commercial ELISA, revealing similar IL-2 concentration profiles but superior precision for the microring resonator sensing platform. Furthermore, we demonstrate the generality of the sandwich assay methodology on the microring resonator platform for the analysis of any biomolecular target for which two high-affinity antibodies exist by detecting the approximately 8 kDa cytokine interleukin-8 (IL-8) with a limit of detection and dynamic
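Quantitation against calibration standards, as in the sandwich assay above, is typically done with a four-parameter logistic (4PL) fit that is then inverted for unknowns. The sketch below uses entirely synthetic, noise-free standards and hypothetical parameter values; it is not the authors' calibration code.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(c, bottom, top, ec50, hill):
    """Four-parameter logistic, the usual immunoassay calibration model."""
    return bottom + (top - bottom) / (1.0 + (c / ec50) ** hill)

def invert_4pl(y, bottom, top, ec50, hill):
    """Map an observed sensor response back to concentration."""
    return ec50 * ((top - bottom) / (y - bottom) - 1.0) ** (1.0 / hill)

# Hypothetical calibration standards (pg/mL) and noise-free responses
conc = np.array([100.0, 300.0, 1000.0, 3000.0, 10000.0, 30000.0])
shift = four_pl(conc, 5.0, 400.0, 2000.0, -1.2)

popt, _ = curve_fit(four_pl, conc, shift, p0=[10.0, 500.0, 1000.0, -1.0],
                    bounds=([0.0, 0.0, 1.0, -5.0], [100.0, 1000.0, 1e5, -0.1]))
# Quantitate an "unknown" whose true concentration is 5000 pg/mL
unknown = invert_4pl(four_pl(5000.0, 5.0, 400.0, 2000.0, -1.2), *popt)
```

With noise-free data the fitted curve recovers the unknown concentration closely; real assay data would add replicate weighting.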

  16. Quantitative analysis of LISA pathfinder test-mass noise

    NASA Astrophysics Data System (ADS)

    Ferraioli, Luigi; Congedo, Giuseppe; Hueller, Mauro; Vitale, Stefano; Hewitson, Martin; Nofrarias, Miquel; Armano, Michele

    2011-12-01

    LISA Pathfinder (LPF) is a mission aiming to test the critical technology for the forthcoming space-based gravitational-wave detectors. The main scientific objective of the LPF mission is to demonstrate test masses free falling with residual accelerations below 3×10⁻¹⁴ m s⁻²/√Hz at 1 mHz. Reaching such an ambitious target will require a significant amount of system optimization and characterization, which will in turn require accurate and quantitative noise analysis procedures. In this paper, we discuss two main problems associated with the analysis of the data from LPF: i) excess noise detection and ii) noise parameter identification. The mission is focused on the low-frequency region ([0.1, 10] mHz) of the available signal spectrum. In such a region, the signal is dominated by the force noise acting on test masses. At the same time, the mission duration is limited to 90 days and typical data segments will be 24 hours in length. Considering those constraints, noise analysis is expected to deal with a limited amount of non-Gaussian data, since the spectrum statistics will be far from Gaussian and the lowest available frequency is limited by the data length. In this paper, we analyze the details of the expected statistics for spectral data and develop two suitable excess noise estimators. One is based on the statistical properties of the integrated spectrum, the other is based on the Kolmogorov-Smirnov test. The sensitivity of the estimators is discussed theoretically for independent data, then the algorithms are tested on LPF synthetic data. The test on realistic LPF data allows the effect of spectral data correlations on the efficiency of the different noise excess estimators to be highlighted. It also reveals the versatility of the Kolmogorov-Smirnov approach, which can be adapted to provide reasonable results on correlated data from a modified version of the standard equations for the inversion of the test statistic.
Closely related to excess noise detection, the
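The Kolmogorov-Smirnov idea can be illustrated on independent data: for Gaussian noise, each one-sided periodogram bin divided by its expected PSD level is approximately Exp(1)-distributed, so a KS test against the exponential law flags excess noise. This is a generic sketch on synthetic white noise, not the mission pipeline.

```python
import numpy as np
from scipy import stats

def excess_noise_ks(x, expected_psd, fs=1.0):
    """Kolmogorov-Smirnov excess-noise test on periodogram bins.

    For Gaussian noise, each one-sided periodogram bin divided by its
    expected PSD level is approximately Exp(1)-distributed; a small
    p-value flags a spectrum inconsistent with the assumed noise model."""
    n = len(x)
    psd = (np.abs(np.fft.rfft(x)) ** 2) * 2.0 / (fs * n)
    ratio = psd[1:-1] / expected_psd        # drop DC and Nyquist bins
    return stats.kstest(ratio, "expon")

rng = np.random.default_rng(42)
# Unit-variance white noise: one-sided PSD level is 2 * sigma^2 / fs = 2.0
stat, pval = excess_noise_ks(rng.normal(0.0, 1.0, 4096), expected_psd=2.0)
# Noise with twice the amplitude: clear excess relative to the same model
stat2, pval2 = excess_noise_ks(rng.normal(0.0, 2.0, 4096), expected_psd=2.0)
```

Correlated spectral estimates, as in windowed or averaged LPF spectra, would require the modified inversion of the test statistic mentioned in the abstract.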

  17. Bi-temporal analysis of landscape changes in the easternmost mediterranean deltas using binary and classified change information.

    PubMed

    Alphan, Hakan

    2013-03-01

    The aim of this study is (1) to quantify landscape changes in the easternmost Mediterranean deltas using a bi-temporal binary change detection approach and (2) to analyze relationships between conservation/management designations and various categories of change that indicate type, degree and severity of human impact. For this purpose, image differencing and ratioing were applied to Landsat TM images of 1984 and 2006. A total of 136 candidate change images including normalized difference vegetation index (NDVI) and principal component analysis (PCA) difference images were tested to understand the performance of bi-temporal pre-classification analysis procedures in the Mediterranean delta ecosystems. Results showed that visible image algebra provided higher accuracies than NDVI and PCA differencing. On the other hand, Band 5 differencing had one of the lowest change detection performances. Seven superclasses of change were identified using from/to change categories between the earlier and later dates. These classes were used to understand spatial character of anthropogenic impacts in the study area and derive qualitative and quantitative change information within and outside of the conservation/management areas. Change analysis indicated that natural site and wildlife reserve designations fell short of protecting sand dunes from agricultural expansion in the west. East of the study area, however, was exposed to least human impact owing to the fact that nature conservation status kept human interference at a minimum. Implications of these changes were discussed and solutions were proposed to deal with management problems leading to environmental change. PMID:23307139
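NDVI differencing with a standard-deviation threshold, one of the pre-classification techniques tested above, can be sketched in a few lines. The tiny arrays below are synthetic; the threshold multiplier k is chosen empirically in practice.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red + 1e-9)

def binary_change(ndvi_t1, ndvi_t2, k=1.0):
    """Bi-temporal binary change mask from NDVI differencing.

    Pixels whose NDVI difference departs from the scene mean by more than
    k standard deviations are flagged as changed (a common
    pre-classification thresholding rule)."""
    diff = ndvi_t2 - ndvi_t1
    mu, sd = diff.mean(), diff.std()
    return np.abs(diff - mu) > k * sd

# Tiny synthetic scene: one pixel loses its vegetation signal between dates
red1 = np.array([[0.1, 0.1], [0.1, 0.1]]); nir1 = np.array([[0.5, 0.5], [0.5, 0.5]])
red2 = np.array([[0.1, 0.1], [0.1, 0.4]]); nir2 = np.array([[0.5, 0.5], [0.5, 0.1]])
mask = binary_change(ndvi(red1, nir1), ndvi(red2, nir2))
```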

  18. Bi-Temporal Analysis of Landscape Changes in the Easternmost Mediterranean Deltas Using Binary and Classified Change Information

    NASA Astrophysics Data System (ADS)

    Alphan, Hakan

    2013-03-01

    The aim of this study is (1) to quantify landscape changes in the easternmost Mediterranean deltas using a bi-temporal binary change detection approach and (2) to analyze relationships between conservation/management designations and various categories of change that indicate type, degree and severity of human impact. For this purpose, image differencing and ratioing were applied to Landsat TM images of 1984 and 2006. A total of 136 candidate change images including normalized difference vegetation index (NDVI) and principal component analysis (PCA) difference images were tested to understand the performance of bi-temporal pre-classification analysis procedures in the Mediterranean delta ecosystems. Results showed that visible image algebra provided higher accuracies than NDVI and PCA differencing. On the other hand, Band 5 differencing had one of the lowest change detection performances. Seven superclasses of change were identified using from/to change categories between the earlier and later dates. These classes were used to understand spatial character of anthropogenic impacts in the study area and derive qualitative and quantitative change information within and outside of the conservation/management areas. Change analysis indicated that natural site and wildlife reserve designations fell short of protecting sand dunes from agricultural expansion in the west. East of the study area, however, was exposed to least human impact owing to the fact that nature conservation status kept human interference at a minimum. Implications of these changes were discussed and solutions were proposed to deal with management problems leading to environmental change.

  19. Column precipitation chromatography: an approach to quantitative analysis of eigencolloids.

    PubMed

    Breynaert, E; Maes, A

    2005-08-01

    A new column precipitation chromatography (CPC) technique, capable of quantitatively measuring technetium eigencolloids in aqueous solutions, is presented. The CPC technique is based on the destabilization and precipitation of eigencolloids by polycations in a confined matrix. Tc(IV) colloids can be quantitatively determined from their precipitation onto the CPC column (separation step) and their subsequent elution upon oxidation to pertechnetate by peroxide (elution step). A clean-bed particle removal model was used to explain the experimental results. PMID:16053321

  20. Quantitative, Image-Based Phenotyping Methods Provide Insight into Spatial and Temporal Dimensions of Plant Disease

    PubMed Central

    Fentress, Sarah J.; Sher, Joel W.; Berry, Jeffrey C.; Pretz, Chelsea

    2016-01-01

    Plant disease symptoms exhibit complex spatial and temporal patterns that are challenging to quantify. Image-based phenotyping approaches enable multidimensional characterization of host-microbe interactions and are well suited to capture spatial and temporal data that are key to understanding disease progression. We applied image-based methods to investigate cassava bacterial blight, which is caused by the pathogen Xanthomonas axonopodis pv. manihotis (Xam). We generated Xam strains in which individual predicted type III effector (T3E) genes were mutated and applied multiple imaging approaches to investigate the role of these proteins in bacterial virulence. Specifically, we quantified bacterial populations, water-soaking disease symptoms, and pathogen spread from the site of inoculation over time for strains with mutations in avrBs2, xopX, and xopK as compared to wild-type Xam. ∆avrBs2 and ∆xopX both showed reduced growth in planta and delayed spread through the vasculature system of cassava. ∆avrBs2 exhibited reduced water-soaking symptoms at the site of inoculation. In contrast, ∆xopK exhibited enhanced induction of disease symptoms at the site of inoculation but reduced spread through the vasculature. Our results highlight the importance of adopting a multipronged approach to plant disease phenotyping to more fully understand the roles of T3Es in virulence. Finally, we demonstrate that the approaches used in this study can be extended to many host-microbe systems and increase the dimensions of phenotype that can be explored. PMID:27443602

  1. Spatial and temporal analysis of fMRI data on word and sentence reading.

    PubMed

    Haller, Sven; Klarhoefer, Markus; Schwarzbach, Jens; Radue, Ernst W; Indefrey, Peter

    2007-10-01

    Written language comprehension at the word and the sentence level was analysed by the combination of spatial and temporal analysis of functional magnetic resonance imaging (fMRI). Spatial analysis was performed via general linear modelling (GLM). Concerning the temporal analysis, local differences in neurovascular coupling may confound a direct comparison of blood oxygenation level-dependent (BOLD) response estimates between regions. To avoid this problem, we parametrically varied linguistic task demands and compared only task-induced within-region BOLD response differences across areas. We reasoned that, in a hierarchical processing system, increasing task demands at lower processing levels induce delayed onset of higher-level processes in corresponding areas. The flow of activation is thus reflected in the size of task-induced delay increases. We estimated BOLD response delay and duration for each voxel and each participant by fitting a model function to the event-related average BOLD response. The GLM showed increasing activations with increasing linguistic demands dominantly in the left inferior frontal gyrus (IFG) and the left superior temporal gyrus (STG). The combination of spatial and temporal analysis allowed a functional differentiation of IFG subregions involved in written language comprehension. Ventral IFG region (BA 47) and STG subserve earlier processing stages than two dorsal IFG regions (BA 44 and 45). This is in accordance with the assumed early lexical semantic and late syntactic processing of these regions and illustrates the complementary information provided by spatial and temporal fMRI data analysis of the same data set.
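The delay/duration estimation described above amounts to fitting a parametric response model to the event-related average. A common choice is a shifted gamma-variate; the sketch below fits one to noise-free synthetic data and is only illustrative of the approach, with hypothetical parameter values.

```python
import numpy as np
from scipy.optimize import curve_fit

def bold_model(t, amp, delay, tau, alpha=3.0):
    """Shifted gamma-variate model of the event-related BOLD response:
    `delay` sets response onset and `tau` stretches its duration, so
    task-induced onset shifts can be compared within a region."""
    s = np.clip((t - delay) / tau, 0.0, None)
    return amp * s ** alpha * np.exp(-s)

t = np.arange(0.0, 20.0, 0.5)                    # peristimulus time (s)
y = bold_model(t, amp=1.0, delay=2.0, tau=1.5)   # noise-free synthetic average

popt, _ = curve_fit(bold_model, t, y, p0=[0.8, 1.0, 1.0])
amp_hat, delay_hat, tau_hat = popt
```

Comparing `delay_hat` across task-demand conditions within the same region sidesteps between-region differences in neurovascular coupling, as the abstract argues.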

  2. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close fly-bys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and there are four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). In these data, the aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundances of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  3. Descriptive Quantitative Analysis of Rearfoot Alignment Radiographic Parameters.

    PubMed

    Meyr, Andrew J; Wagoner, Matthew R

    2015-01-01

    Although the radiographic parameters of the transverse talocalcaneal angle (tTCA), calcaneocuboid angle (CCA), talar head uncovering (THU), calcaneal inclination angle (CIA), talar declination angle (TDA), lateral talar-first metatarsal angle (lTFA), and lateral talocalcaneal angle (lTCA) form the basis of the preoperative evaluation and procedure selection for pes planovalgus deformity, the so-called normal values of these measurements are not well-established. The objectives of the present study were to retrospectively evaluate the descriptive statistics of these radiographic parameters (tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA) in a large population, and, second, to determine an objective basis for defining "normal" versus "abnormal" measurements. As a secondary outcome, the relationship of these variables to the body mass index was assessed. Anteroposterior and lateral foot radiographs from 250 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated. The results revealed a mean measurement of 24.12°, 13.20°, 74.32%, 16.41°, 26.64°, 8.37°, and 43.41° for the tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA, respectively. These were generally in line with the reported historical normal values. Descriptive statistical analysis demonstrated that the tTCA, THU, and TDA met the standards to be considered normally distributed but that the CCA, CIA, lTFA, and lTCA demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, only the CIA (R = -0.2428) and lTCA (R = -0.2449) demonstrated substantial correlation with the body mass index. No differentiations in deformity progression were observed when the radiographic parameters were plotted against each other that could lead to a quantitative basis for defining "normal" versus "abnormal" measurements. PMID:26002682
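The descriptive workup used above — normality testing plus correlation with body mass index — is straightforward to reproduce. The data below are synthetic and the CIA-like parameter values are hypothetical; only the statistical recipe mirrors the study.

```python
import numpy as np
from scipy import stats

def describe_parameter(values, bmi):
    """Descriptive workup for one radiographic angle.

    Returns (mean, sd, normality_p, pearson_r): a normality p-value above
    0.05 supports treating the parameter as normally distributed, and r
    quantifies its correlation with body mass index."""
    values = np.asarray(values, float)
    _, p_norm = stats.normaltest(values)
    r, _ = stats.pearsonr(values, bmi)
    return values.mean(), values.std(ddof=1), p_norm, r

rng = np.random.default_rng(1)
bmi = rng.normal(28.0, 5.0, 250)
# Hypothetical CIA-like angle, weakly negatively tied to BMI
cia = 16.4 - 0.25 * (bmi - 28.0) + rng.normal(0.0, 3.0, 250)
mean, sd, p_norm, r = describe_parameter(cia, bmi)
```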

  4. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Astrophysics Data System (ADS)

    Cruikshank, D. P.; Dalle Ore, C. M.; Pendleton, Y. J.; Clark, R. N.

    2012-12-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close fly-bys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at ~3.28 μm (~3050 cm-1), and there are four blended bands of aliphatic -CH2- and -CH3 in the range ~3.36-3.52 μm (~2980-2840 cm-1). In these data, the aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph ~24; for Hyperion the value is ~12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundances of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 ~2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.
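Band-strength ratios like NAro:NAliph come from integrating continuum-removed absorbances over the two wavenumber windows. The sketch below builds a synthetic spectrum with a known 2:1 area ratio and recovers it; the windows echo the ~3050 cm⁻¹ aromatic and ~2980-2840 cm⁻¹ aliphatic regions, but the bands themselves are invented.

```python
import numpy as np

def band_ratio(wavenumber, absorbance, aromatic_win, aliphatic_win):
    """Ratio of integrated aromatic to aliphatic C-H band areas.

    Integrates a continuum-removed spectrum over two wavenumber windows,
    assuming a uniformly sampled wavenumber grid."""
    dx = wavenumber[1] - wavenumber[0]
    def area(win):
        m = (wavenumber >= win[0]) & (wavenumber <= win[1])
        return absorbance[m].sum() * dx
    return area(aromatic_win) / area(aliphatic_win)

# Synthetic spectrum: two Gaussian bands with a true area ratio of 2:1
wn = np.linspace(2800.0, 3150.0, 2000)
spec = (2.0 * np.exp(-0.5 * ((wn - 3050.0) / 8.0) ** 2)    # "aromatic"
        + 1.0 * np.exp(-0.5 * ((wn - 2920.0) / 8.0) ** 2)) # "aliphatic"
r = band_ratio(wn, spec, (3010.0, 3090.0), (2880.0, 2960.0))
```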

  5. Quantitative analysis of harmonic convergence in mosquito auditory interactions.

    PubMed

    Aldersley, Andrew; Champneys, Alan; Homer, Martin; Robert, Daniel

    2016-04-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the 'harmonic convergence' phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male-female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male-male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654
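The time-dependent wing beat signature described above rests on Hilbert spectral analysis: the instantaneous frequency is the derivative of the unwrapped phase of the analytic signal. The sketch below applies this to a synthetic drifting tone, not to real mosquito recordings.

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_frequency(x, fs):
    """Instantaneous frequency (Hz) of a narrowband signal via the
    analytic signal: the time derivative of the unwrapped Hilbert phase."""
    phase = np.unwrap(np.angle(hilbert(x)))
    return np.diff(phase) * fs / (2.0 * np.pi)

fs = 8000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
# Synthetic "wing beat": a 450 Hz tone drifting upward by 30 Hz over 1 s
x = np.sin(2.0 * np.pi * (450.0 * t + 15.0 * t ** 2))
f_inst = instantaneous_frequency(x, fs)
```

Tracking two such traces on independent channels is what allows convergence or divergence of a pair's wing beat frequencies to be quantified over time.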

  6. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    PubMed Central

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
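Transfer Entropy, the quantity underlying this analysis, is the conditional mutual information I(target_{t+1}; source_t | target_t). A crude plug-in estimator on discretized series suffices to show its directionality; the coupled series below are synthetic and unrelated to the glycolytic model itself.

```python
import numpy as np

def transfer_entropy(source, target, bins=2):
    """Plug-in transfer entropy TE(source -> target) in bits.

    Series are discretized at quantile boundaries; TE is estimated from
    joint frequencies. Crude, but shows directionality on long series."""
    edges = lambda z: np.quantile(z, np.linspace(0, 1, bins + 1)[1:-1])
    s = np.digitize(source, edges(source))
    x = np.digitize(target, edges(target))
    xf, xp, sp = x[1:], x[:-1], s[:-1]      # target future/past, source past
    te = 0.0
    for a in range(bins):
        for b in range(bins):
            for c in range(bins):
                p_abc = np.mean((xf == a) & (xp == b) & (sp == c))
                if p_abc == 0.0:
                    continue
                p_bc = np.mean((xp == b) & (sp == c))
                p_ab = np.mean((xf == a) & (xp == b))
                p_b = np.mean(xp == b)
                te += p_abc * np.log2(p_abc * p_b / (p_bc * p_ab))
    return te

rng = np.random.default_rng(7)
y = rng.normal(size=5001)
x = np.roll(y, 1) + 0.3 * rng.normal(size=5001)  # x follows y at lag 1
te_yx = transfer_entropy(y[1:], x[1:])           # clearly positive
te_xy = transfer_entropy(x[1:], y[1:])           # near zero
```

The asymmetry te_yx >> te_xy is exactly the kind of causal flow the study maps among glycolytic enzymes.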

  7. Quantitative analysis of mycoflora on commercial domestic fruits in Japan.

    PubMed

    Watanabe, Maiko; Tsutsumi, Fumiyuki; Konuma, Rumi; Lee, Ken-Ichi; Kawarada, Kensuke; Sugita-Konishi, Yoshiko; Kumagai, Susumu; Takatori, Kosuke; Konuma, Hirotaka; Hara-Kudo, Yukiko

    2011-09-01

    A comprehensive and quantitative analysis of the mycoflora on the surface of commercial fruit was performed. Nine kinds of fruits grown in Japan were tested. Overall fungal counts on the fruits ranged from 3.1 to 6.5 log CFU/g. The mean percentages of the total yeast counts were higher than those of molds in samples of apples, Japanese pears, and strawberries, ranging from 58.5 to 67.0%, and were lower than those of molds in samples of the other six fruits, ranging from 9.8 to 48.3%. Cladosporium was the most frequent fungus and was found in samples of all nine types of fruits, followed by Penicillium found in eight types of fruits. The fungi with the highest total counts in samples of the various fruits were Acremonium in cantaloupe melons (47.6% of the total fungal count), Aspergillus in grapes (32.2%), Aureobasidium in apples (21.3%), blueberries (63.6%), and peaches (33.6%), Cladosporium in strawberries (38.4%), Cryptococcus in Japanese pears (37.6%), Penicillium in mandarins (22.3%), and Sporobolomyces in lemons (26.9%). These results demonstrated that the mycoflora on the surfaces of these fruits mainly consists of common pre- and postharvest inhabitants of the plants or in the environment; fungi that produce mycotoxins or cause market diseases were not prominent in the mycoflora of healthy fruits. These findings suggest fruits should be handled carefully with consideration given to fungal contaminants, including nonpathogenic fungi, to control the quality of fruits and processed fruit products. PMID:21902918

  8. Quantitative analysis of harmonic convergence in mosquito auditory interactions

    PubMed Central

    Aldersley, Andrew; Champneys, Alan; Robert, Daniel

    2016-01-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the ‘harmonic convergence’ phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male–female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male–male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654

  9. Higher-order aggregate networks in the analysis of temporal networks: path structures and centralities

    NASA Astrophysics Data System (ADS)

    Scholtes, Ingo; Wider, Nicolas; Garas, Antonios

    2016-03-01

    Despite recent advances in the study of temporal networks, the analysis of time-stamped network data is still a fundamental challenge. In particular, recent studies have shown that correlations in the ordering of links crucially alter causal topologies of temporal networks, thus invalidating analyses based on static, time-aggregated representations of time-stamped data. These findings not only highlight an important dimension of complexity in temporal networks, but also call for new network-analytic methods suitable to analyze complex systems with time-varying topologies. Addressing this open challenge, here we introduce a novel framework for the study of path-based centralities in temporal networks. Studying betweenness, closeness and reach centrality, we first show that an application of these measures to time-aggregated, static representations of temporal networks yields misleading results about the actual importance of nodes. To overcome this problem, we define path-based centralities in higher-order aggregate networks, a recently proposed generalization of the commonly used static representation of time-stamped data. Using data on six empirical temporal networks, we show that the resulting higher-order measures better capture the true, temporal centralities of nodes. Our results demonstrate that higher-order aggregate networks constitute a powerful abstraction, with broad perspectives for the design of new, computationally efficient data mining techniques for time-stamped relational data.
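Why time-aggregation misleads is easy to demonstrate with time-respecting two-paths: a static projection counts a path a → b → c whenever both edges exist, regardless of link ordering. This is a minimal illustration of the concept, not the authors' framework.

```python
from collections import defaultdict

def two_paths(events, delta):
    """Enumerate time-respecting paths a -> b -> c in a temporal network.

    events: list of (u, v, t) time-stamped directed links. Links
    (a, b, t1) and (b, c, t2) form a causal two-path if 0 < t2 - t1 <= delta;
    the static projection ignores this ordering constraint entirely."""
    by_source = defaultdict(list)
    for u, v, t in events:
        by_source[u].append((v, t))
    paths = []
    for u, v, t1 in events:
        for w, t2 in by_source[v]:
            if 0 < t2 - t1 <= delta:
                paths.append((u, v, w, t1, t2))
    return paths

# b -> c fires *before* a -> b: the static path a -> b -> c exists, but no
# causal (time-respecting) path does.
causal_missing = two_paths([("b", "c", 1), ("a", "b", 2)], delta=5)
# With the ordering reversed, the causal path appears.
causal_present = two_paths([("a", "b", 1), ("b", "c", 2)], delta=5)
```

Higher-order aggregate networks generalize this by making such ordered link pairs the nodes of a second-order graph, on which standard centralities are then computed.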

  10. Automatic quantitative analysis of experimental primary and secondary retinal neurodegeneration: implications for optic neuropathies

    PubMed Central

    Davis, B M; Guo, L; Brenton, J; Langley, L; Normando, E M; Cordeiro, M F

    2016-01-01

    Secondary neurodegeneration is thought to play an important role in the pathology of neurodegenerative disease, which potential therapies may target. However, the quantitative assessment of the degree of secondary neurodegeneration is difficult. The present study describes a novel algorithm from which estimates of primary and secondary degeneration are computed using well-established rodent models of partial optic nerve transection (pONT) and ocular hypertension (OHT). Brn3a-labelled retinal ganglion cells (RGCs) were identified in whole-retinal mounts from which RGC density, nearest neighbour distances and regularity indices were determined. The spatial distribution and rate of RGC loss were assessed and the percentage of primary and secondary degeneration in each non-overlapping segment was calculated. Mean RGC number (82 592±681) and RGC density (1695±23.3 RGC/mm2) in naïve eyes were comparable with previous studies, with an average decline in RGC density of 71±17 and 23±5% over the time course of pONT and OHT models, respectively. Spatial analysis revealed greatest RGC loss in the superior and central retina in pONT, but significant RGC loss in the inferior retina from 3 days post model induction. In comparison, there was no significant difference between superior and inferior retina after OHT induction, and RGC loss occurred mainly along the superior/inferior axis (~30%) versus the nasal–temporal axis (~15%). Intriguingly, a significant loss of RGCs was also observed in contralateral eyes in experimental OHT. In conclusion, a novel algorithm to automatically segment Brn3a-labelled retinal whole-mounts into non-overlapping segments is described, which enables automated spatial and temporal segmentation of RGCs, revealing heterogeneity in the spatial distribution of primary and secondary degenerative processes. This method provides an attractive means to rapidly determine the efficacy of neuroprotective therapies with implications for any
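    The nearest-neighbour statistics mentioned above are commonly summarized by a regularity index, the mean nearest-neighbour distance divided by its standard deviation. A sketch under that common definition (synthetic points, not the study's data or exact algorithm):

```python
import numpy as np
from scipy.spatial import cKDTree

def regularity_index(points):
    """Nearest-neighbour regularity index: mean NND / SD of NND."""
    tree = cKDTree(points)
    # k=2 because the closest "neighbour" of each point is the point itself
    dists, _ = tree.query(points, k=2)
    nnd = dists[:, 1]
    return nnd.mean() / nnd.std()

# A spatially random (Poisson-like) cell mosaic gives a low index (~1.9);
# regular mosaics such as healthy RGC arrays score higher, and degeneration
# tends to push the index back toward the random value.
rng = np.random.default_rng(0)
random_field = rng.uniform(0, 1, size=(500, 2))
ri = regularity_index(random_field)
```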

  11. Analysis of SWOT spatial and temporal samplings over continents

    NASA Astrophysics Data System (ADS)

    Biancamaria, Sylvain; Lamy, Alain; Mognard, Nelly

    2014-05-01

    The future Surface Water and Ocean Topography (SWOT) satellite mission, collaboratively developed by NASA, CNES and CSA, is a joint oceanography/continental hydrology mission planned for launch in 2020. In June 2013, a new SWOT orbit was selected, with a 77.6° inclination, a 21-day repeat cycle and an 891 km altitude. The main satellite payload (a Ka-band SAR Interferometer) will provide 2D maps of water elevation, mask and slope over two swaths, each with a 50 km extent, separated by a 20 km nadir gap. Most of the studies concerning SWOT published since 2007 have considered a former orbit with a 78° inclination, a 22-day repeat cycle, a 970 km altitude and a 60 km extent for each swath. None of them has studied the newly selected orbit, and the impact of the 20 km nadir gap on the spatial coverage has not been much explored. The purpose of the work presented here is to investigate the spatial and temporal coverage given this new orbit and the actual swath extent (two 50 km swaths with the 20 km nadir gap in between) and to compare it with the former SWOT configuration. It is shown that the new configuration will have almost no impact on the computation of monthly averages; however, it will impact the spatial coverage. Because of the nadir gap, the orbit repeat period and the swath extent, 3.6% of the continental surfaces between 78°S and 78°N will never be observed by SWOT (versus 2.2% with the former SWOT configuration). The equatorial regions will be the most affected, as the uncovered area could reach ~14% locally, whereas it never exceeded 9% with the previous SWOT configuration.

  12. Quantitative PCR analysis of salivary pathogen burden in periodontitis.

    PubMed

    Salminen, Aino; Kopra, K A Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S; Sinisalo, Juha; Pussinen, Pirkko J

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39-4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51-4.52). The highest OR 3.59 (95% CI 1.94-6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T
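    Odds ratios like those reported above come from the logistic regression coefficient (OR = exp(β)); in the unadjusted case the same quantity can be read directly off a 2×2 exposure-outcome table. A sketch with invented counts (not the Parogene data), using Woolf's log-OR confidence interval:

```python
import math

# Hypothetical 2x2 table: high bacterial burden index (exposure) vs
# moderate-to-severe periodontitis (outcome). Counts are illustrative only.
exposed_cases, exposed_controls = 80, 40
unexposed_cases, unexposed_controls = 60, 90

odds_exposed = exposed_cases / exposed_controls        # 80/40 = 2.0
odds_unexposed = unexposed_cases / unexposed_controls  # 60/90 ~ 0.67
odds_ratio = odds_exposed / odds_unexposed             # 3.0

# 95% CI on the log scale (Woolf's method): SE = sqrt(sum of 1/cell counts)
se = math.sqrt(1 / 80 + 1 / 40 + 1 / 60 + 1 / 90)
ci = (math.exp(math.log(odds_ratio) - 1.96 * se),
      math.exp(math.log(odds_ratio) + 1.96 * se))
```

The study's adjusted ORs additionally condition on age, gender, diabetes, and tooth/implant counts, which requires the full regression rather than the table shortcut.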

  13. Large-scale temporal analysis of computer and information science

    NASA Astrophysics Data System (ADS)

    Soos, Sandor; Kampis, George; Gulyás, László

    2013-09-01

    The main aim of the project reported in this paper was twofold. One of the primary goals was to produce an extensive source of network data for bibliometric analyses of field dynamics in the case of Computer and Information Science. To this end, we rendered the raw material of the DBLP computer and infoscience bibliography into a comprehensive collection of dynamic network data, promptly available for further statistical analysis. The other goal was to demonstrate the value of our data source via its use in mapping Computer and Information Science (CIS). An analysis of the evolution of CIS was performed in terms of collaboration (co-authorship) network dynamics. Dynamic network analysis covered three quarters of the 20th century (76 years, from 1936 to date). Network evolution was described both at the macro and the meso level (in terms of community characteristics). Results show that the development of CIS followed what appears to be a universal pattern of growing into a "mature" discipline.

  14. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    ERIC Educational Resources Information Center

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
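    Isotope-dilution quantitation of the kind this exercise teaches reduces to a ratio calculation: the analyte amount equals the spiked amount of isotope-labeled standard times the measured peak-area ratio, divided by a response factor. A hedged sketch with hypothetical numbers (the function name and values are illustrative, not from the article):

```python
def isotope_dilution_conc(area_analyte, area_labeled, spike_amount_ug,
                          response_factor=1.0):
    """Analyte amount from the unlabeled/labeled peak-area ratio.

    Assumes the labeled internal standard behaves identically to the analyte
    through extraction and GC-MS, so workup losses cancel in the ratio.
    """
    return spike_amount_ug * (area_analyte / area_labeled) / response_factor

# Hypothetical run: 10 ug of caffeine-d3 spiked; measured area ratio 1.25
caffeine_ug = isotope_dilution_conc(area_analyte=125_000, area_labeled=100_000,
                                    spike_amount_ug=10.0)  # 12.5 ug
```

This cancellation of losses is why isotope dilution is more robust than external calibration for messy matrices like beverages.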

  15. Hydrological drought types in cold climates: quantitative analysis of causing factors and qualitative survey of impacts

    NASA Astrophysics Data System (ADS)

    Van Loon, A. F.; Ploum, S. W.; Parajka, J.; Fleig, A. K.; Garnier, E.; Laaha, G.; Van Lanen, H. A. J.

    2015-04-01

    For drought management and prediction, knowledge of causing factors and socio-economic impacts of hydrological droughts is crucial. Propagation of meteorological conditions in the hydrological cycle results in different hydrological drought types that require separate analysis. In addition to the existing hydrological drought typology, we here define two new drought types related to snow and ice. A snowmelt drought is a deficiency in the snowmelt discharge peak in spring in snow-influenced basins and a glaciermelt drought is a deficiency in the glaciermelt discharge peak in summer in glacierised basins. In 21 catchments in Austria and Norway we studied the meteorological conditions in the seasons preceding and at the time of snowmelt and glaciermelt drought events. Snowmelt droughts in Norway were mainly controlled by below-average winter precipitation, while in Austria both temperature and precipitation played a role. For glaciermelt droughts, the effect of below-average summer air temperature was dominant, both in Austria and Norway. Subsequently, we investigated the impacts of temperature-related drought types (i.e. snowmelt and glaciermelt drought, but also cold and warm snow season drought and rain-to-snow-season drought). In historical archives and drought databases for the US and Europe many impacts were found that can be attributed to these temperature-related hydrological drought types, mainly in the agriculture and electricity production (hydropower) sectors. However, drawing conclusions on the frequency of occurrence of different drought types from reported impacts is difficult, mainly because of reporting biases and the inevitably limited spatial and temporal scales of the information. Finally, this study shows that complete integration of quantitative analysis of causing factors and qualitative analysis of impacts of temperature-related droughts is not yet possible. Analysis of selected events, however, points out that it can be a promising research

  16. Integrated Analysis and Tools for Land Subsidence Surveying and Monitoring: a Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.

    2015-10-01

    This paper presents an integrated approach for land subsidence monitoring using measurements from different sensors. Eni S.p.A., the main Italian oil and gas company, constantly surveys the land with state-of-the-art and innovative techniques, so a method able to integrate the results is an important and timely topic. Measurements today come from many sensor platforms, and their integration is essential; combining the different data sources should be done in a way that exploits the best performance of each technique. An integrated analysis allows the interpretation of simultaneous temporal series of data coming from different sources and attempts to separate subsidence contributions. With this purpose, Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to combine all the different data collected in the surveys. This article presents some significant examples showing the potential of this tool in oil and gas activity: a hydrocarbon storage field, where the comparison between SAR and production volumes reveals a correlation between the two measures in a few steps; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers and assestimeters measure in the same area at the same time, giving the opportunity to analyse the data jointly. In the integrated analysis performed with PISAV, a mathematically rigorous study is not always possible, and a semi-quantitative approach is then the only method for interpreting results. As a result, in the first test case a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis offered several advantages in monitoring land subsidence: it permits a first qualitative "differentiation" of the natural and anthropic components of subsidence, and also gives more

  17. Temporal Land Cover Analysis for Net Ecosystem Improvement

    SciTech Connect

    Ke, Yinghai; Coleman, Andre M.; Diefenderfer, Heida L.

    2013-04-09

    We delineated 8 watersheds contributing to previously defined river reaches within the 1,468-km2 historical floodplain of the tidally influenced lower Columbia River and estuary. We assessed land-cover change at the watershed, reach, and restoration site scales by reclassifying remote-sensing data from the National Oceanic and Atmospheric Administration Coastal Change Analysis Program’s land cover/land change product into forest, wetland, and urban categories. The analysis showed a 198.3 km2 loss of forest cover during the first 6 years of the Columbia Estuary Ecosystem Restoration Program, 2001–2006. Total measured urbanization in the contributing watersheds of the estuary during the full 1996–2006 change analysis period was 48.4 km2. Trends in forest gain/loss and urbanization differed between watersheds. Wetland gains and losses were within the margin of error of the satellite imagery analysis. No significant land cover change was measured at restoration sites, although change was visible in aerial imagery; therefore, the 30-m land-cover product may not be appropriate for assessment of early-stage wetland restoration. These findings suggest that floodplain restoration sites in reaches downstream of watersheds with decreasing forest cover will be subject to increased sediment loads, and those downstream of urbanization will experience effects of increased impervious surfaces on hydrologic processes.

  18. Temporal geospatial analysis of secondary school students’ examination performance

    NASA Astrophysics Data System (ADS)

    Nik Abd Kadir, ND; Adnan, NA

    2016-06-01

    Malaysia's Ministry of Education has improved the organization of its data by establishing a geographical information system (GIS) school database. However, no further analysis has been done with geospatial analysis tools. Mapping has emerged as a communication tool and an effective way to publish digital and statistical data such as school performance results. The objective of this study is to analyse secondary school students' performance on the science and mathematics papers of the Sijil Pelajaran Malaysia Examination from 2010 to 2014 for Kelantan's state schools, with the aid of GIS software and geospatial analysis. School performance, expressed as school grade point average (GPA) from Grade A to Grade G, was interpolated and mapped, and query analysis using geospatial tools was carried out. This study will benefit the education sector in analysing student performance not only in Kelantan but across the whole of Malaysia, and publishing the results as maps supports better planning and decision making to prepare young Malaysians for the challenges of the education system and performance.

  19. Quantitative analysis of localized surface plasmons based on molecular probing.

    PubMed

    Deeb, Claire; Bachelot, Renaud; Plain, Jérôme; Baudrion, Anne-Laure; Jradi, Safi; Bouhelier, Alexandre; Soppera, Olivier; Jain, Prashant K; Huang, Libai; Ecoffet, Carole; Balan, Lavinia; Royer, Pascal

    2010-08-24

    We report on the quantitative characterization of the plasmonic optical near-field of a single silver nanoparticle. Our approach relies on nanoscale molecular molding of the confined electromagnetic field by photoactivated molecules. We were able to directly image the dipolar profile of the near-field distribution with a resolution better than 10 nm and to quantify the near-field depth and its enhancement factor. A single nanoparticle spectral signature was also assessed. This quantitative characterization constitutes a prerequisite for developing nanophotonic applications.

  20. Towards a temporal network analysis of interactive WiFi users

    NASA Astrophysics Data System (ADS)

    Zhang, Yan; Wang, Lin; Zhang, Yi-Qing; Li, Xiang

    2012-06-01

    Complex networks are used to depict topological features of complex systems. The structure of a network characterizes the interactions among elements of the system, and facilitates the study of many dynamical processes taking place on it. In previous investigations, the topological infrastructure underlying dynamical systems is simplified as a static and invariable skeleton. However, this assumption cannot cover the temporal features of many time-evolving networks, whose components are evolving and mutating. In this letter, utilizing the log data of WiFi users in a Chinese university campus, we infuse the temporal dimension into the construction of a dynamical human contact network. By quantitative comparison with the traditional aggregation approach, we find that the temporal contact network differs in many features, e.g., the reachability and the path length distribution. We conclude that the correlation between temporal path length and duration is not only determined by their definitions, but is also influenced by the micro-dynamical features of human activities under certain social circumstances. The time order of individuals' interaction events plays a critical role in understanding many dynamical processes via human close proximity interactions studied in this letter. In addition, our study provides a promising measure to identify potential superspreaders by distinguishing the nodes functioning as relay hubs. The first two authors contributed equally to this paper.

  1. Quantitative Analysis of Autophagy using Advanced 3D Fluorescence Microscopy

    PubMed Central

    Changou, Chun A.; Wolfson, Deanna L.; Ahluwalia, Balpreet Singh; Bold, Richard J.; Kung, Hsing-Jien; Chuang, Frank Y.S.

    2013-01-01

    Prostate cancer is the leading form of malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiency of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). 
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early stages of

  2. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-01-01

    Prostate cancer is the leading form of malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiency of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). 
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  3. A diagnostic programme for quantitative analysis of proteinuria.

    PubMed

    Hofmann, W; Guder, W G

    1989-09-01

    A spectrum of quantitative methods was adapted to the Kone Specific Analyser for the purpose of recognizing, quantifying and differentiating various forms of proteinuria. Total protein, IgG, albumin and alpha 1-microglobulin (measured by turbidimetry), N-acetyl-beta-D-glucosaminidase activity and creatinine (measured photometrically), were measured in undiluted urine; in addition alpha 1-microglobulin was measured in serum. Within and between run precision, accuracy and linearity of the turbidimetric methods were in good agreement with nephelometric procedures. All turbidimetric methods exhibited a correlation coefficient r greater than 0.98 when compared with the radial immunodiffusion procedure as reference method. Total protein measured turbidimetrically with the Kone Specific Analyser was in good agreement with the manual biuret procedure. The low detection limits and linearities allowed quantification of urine analytes from the lower range of normals up to ten times the upper limit of normals. The measured analytes exhibited stability in urine at pH 4-8 over at least seven days at 4-6 degrees C and -20 degrees C. Only IgG showed a significant loss (up to 30 percent), when measured after storage at -20 degrees C. Quantities per mol creatinine showed significantly lower intra-individual and inter-individual variability than quantities per liter. In 31 normal persons, the intraindividual variation was lowest for N-acetyl-beta-D-glucosaminidase activity (13%) and highest for total protein (33%), when measured in the second morning urine on 5 consecutive days. When related to creatinine, results obtained in the second morning urine showed no significant differences from those in 24 h urine, except for alpha 1-microglobulin which gave lower values in 24 h urines. The upper normal limits, calculated as the 95% ranges, were determined from 154 urines of 31 individuals. Nearly all analytes showed an asymmetric distribution. 
Because of a wide tailing of the upper limit

  4. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-05-03

    Prostate cancer is the leading form of malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiency of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). 
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  5. Tools for the quantitative analysis of sedimentation boundaries detected by fluorescence optical analytical ultracentrifugation.

    PubMed

    Zhao, Huaying; Casillas, Ernesto; Shroff, Hari; Patterson, George H; Schuck, Peter

    2013-01-01

    Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system.

  6. Statistical methods for temporal and space-time analysis of community composition data.

    PubMed

    Legendre, Pierre; Gauthier, Olivier

    2014-03-01

    This review focuses on the analysis of temporal beta diversity, which is the variation in community composition along time in a study area. Temporal beta diversity is measured by the variance of the multivariate community composition time series, and that variance can be partitioned using appropriate statistical methods. Some of these methods are classical, such as simple or canonical ordination, whereas others are recent, including the methods of temporal eigenfunction analysis developed for multiscale exploration (i.e. addressing several scales of variation) of univariate or multivariate response data, reviewed here, to our knowledge, for the first time. These methods are illustrated with ecological data from 13 years of benthic surveys in Chesapeake Bay, USA. The following methods are applied to the Chesapeake data: distance-based Moran's eigenvector maps, asymmetric eigenvector maps, scalogram, variation partitioning, multivariate correlogram, multivariate regression tree, and two-way MANOVA to study temporal and space-time variability. Local (temporal) contributions to beta diversity (LCBD indices) are computed and analysed graphically and by regression against environmental variables, and the role of species in determining the LCBD values is analysed by correlation analysis. A tutorial detailing the analyses in the R language is provided in an appendix. PMID:24430848
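    The variance measure and the LCBD indices mentioned above can be sketched numerically. The article's tutorial is in R; the following Python sketch (toy data) computes total beta diversity as the variance of the column-centred community matrix and LCBD as each survey's share of the total sum of squares, on the assumption that the data have already been suitably transformed (e.g. Hellinger) as is common in practice:

```python
import numpy as np

def beta_diversity_lcbd(Y):
    """Total beta diversity and local contributions (LCBD) from a
    surveys-by-species community matrix, variance-of-the-matrix style."""
    Y = np.asarray(Y, dtype=float)
    centred = Y - Y.mean(axis=0)        # centre each species column
    S = centred ** 2                     # squared deviations
    ss_total = S.sum()
    bd_total = ss_total / (Y.shape[0] - 1)  # total variance of the matrix
    lcbd = S.sum(axis=1) / ss_total          # one share per sampling time
    return bd_total, lcbd

# Toy time series: 4 surveys x 3 species (invented abundances)
Y = [[10, 0, 2], [8, 1, 3], [0, 9, 5], [1, 8, 4]]
bd, lcbd = beta_diversity_lcbd(Y)  # lcbd sums to 1 by construction
```

Surveys with large LCBD values are those whose composition departs most from the study-period average, which is what the review then regresses against environmental variables.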

  7. MOLD SPECIFIC QUANTITATIVE PCR: THE EMERGING STANDARD IN MOLD ANALYSIS

    EPA Science Inventory

    Today I will talk about the use of quantitative or Real time PCR for the standardized identification and quantification of molds. There are probably at least 100,000 species of molds or fungi. But there are actually about 100 typically found indoors. Some pose a threat to human...

  8. Teaching Quantitative Research Methods: A Quasi-Experimental Analysis.

    ERIC Educational Resources Information Center

    Bridges, George S.; Gillmore, Gerald M.; Pershing, Jana L.; Bates, Kristin A.

    1998-01-01

    Describes an experiment designed to introduce aspects of quantitative reasoning to a large, substantively-focused class in the social sciences. Reveals that participating students' abilities to interpret and manipulate empirical data increased significantly, independent of baseline SAT verbal and mathematics scores. Discusses implications for…

  9. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  10. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    ERIC Educational Resources Information Center

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  11. Analysis of the temporal properties in car accident time series

    NASA Astrophysics Data System (ADS)

    Telesca, Luciano; Lovallo, Michele

    2008-05-01

    In this paper we study the time-clustering behavior of sequences of car accidents, using data from a freely available database on the internet. The Allan Factor analysis, which is a well-suited method for investigating time-dynamical behaviors in point processes, reveals that the car accident sequences are characterized by a general time-scaling behavior with the presence of cyclic components. These results indicate that the time dynamics of the events are not Poissonian but long-range correlated, with periodicities ranging from 12 h to 1 year.
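
    The Allan Factor computation referred to above follows directly from its definition, AF(T) = <(N_{k+1} - N_k)^2> / (2<N_k>), where N_k is the event count in the k-th contiguous window of width T; AF(T) stays near 1 at all scales for a Poisson process, while power-law growth with T indicates time-scaling. The sketch below uses synthetic event data.

```python
import numpy as np

def allan_factor(event_times, T):
    """Allan Factor at counting window T for a sequence of event times."""
    t = np.sort(np.asarray(event_times, dtype=float))
    edges = np.arange(t[0], t[-1] + T, T)
    counts, _ = np.histogram(t, bins=edges)   # events per window of width T
    diffs = np.diff(counts)
    return (diffs ** 2).mean() / (2.0 * counts.mean())

# A homogeneous Poisson process gives AF(T) close to 1 at any scale
rng = np.random.default_rng(0)
events = np.cumsum(rng.exponential(scale=1.0, size=20000))
af = allan_factor(events, T=50.0)
print(af)
```

    Repeating this over a range of T values and fitting the slope of log AF(T) vs. log T yields the scaling exponent used to characterize clustering.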

  12. Ictal Spread of Medial Temporal Lobe Seizures With and Without Secondary Generalization: An Intracranial EEG Analysis

    PubMed Central

    Yoo, Ji Yeoun; Farooque, Pue; Chen, William; Youngblood, Mark W.; Zaveri, Hitten P.; Gerrard, Jason L.; Spencer, Dennis D.; Hirsch, Lawrence J.; Blumenfeld, Hal

    2013-01-01

    Summary Objective Secondary generalization of seizures has devastating consequences for patient safety and quality of life. The aim of this intracranial EEG (icEEG) study was to investigate the differences in onset and propagation patterns of temporal lobe seizures that remained focal vs. those with secondary generalization in order to better understand the mechanism of secondary generalization. Methods A total of 39 seizures were analyzed in 9 patients who met the following criteria: 1) icEEG-video monitoring with at least 1 secondarily generalized tonic clonic seizure (GTC), 2) pathologically proven hippocampal sclerosis, and 3) no seizures for at least 1 year after anteromedial temporal lobe resection. Seizures were classified as focal or secondary generalized by behavioral analysis of video. Onset and propagation patterns were compared by analysis of icEEG. Results We obtained data from 22 focal seizures without generalization (FS), and 17 GTC. Seizure onset patterns did not differ between FS and GTCs, but there were differences in later propagation. All seizures started with low voltage fast activity except 7 seizures in one patient (6 FS, 1 GTC), which started with sharply contoured theta activity. 15 of 39 seizures started from the hippocampus and 24 seizures (including 6 seizures in a patient without hippocampal contacts) started from other medial temporal lobe areas. We observed involvement or more prominent activation of the posterior-lateral temporal regions in GTCs prior to propagation to the other cortical regions, vs. FS which had no involvement or less prominent activation of the posterior lateral temporal cortex. Occipital contacts were not involved at the time of clinical secondary generalization. Significance The posterior-lateral temporal cortex may serve as an important “gateway” controlling propagation of medial temporal lobe seizures to other cortical regions. Identifying the mechanisms of secondary generalization of focal seizures may

  13. Quantitative and qualitative HPLC analysis of thermogenic weight loss products.

    PubMed

    Schaneberg, B T; Khan, I A

    2004-11-01

    An HPLC qualitative and quantitative method for seven analytes (caffeine, ephedrine, forskolin, icariin, pseudoephedrine, synephrine, and yohimbine) in thermogenic weight loss preparations available on the market is described in this paper. Within 45 min, the seven analytes were separated and detected in the acetonitrile:water (80:20) extract. The method uses a Waters XTerra RP18 (5 microm particle size) column as the stationary phase, a gradient mobile phase of water (5.0 mM SDS) and acetonitrile, and UV detection at 210 nm. The correlation coefficients for the calibration curves and the recovery rates ranged from 0.994 to 0.999 and from 97.45% to 101.05%, respectively. The qualitative and quantitative results are discussed. PMID:15587578
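
    The calibration and recovery figures quoted above come from routine linear-calibration arithmetic, sketched below; the peak areas and spiked-sample values here are invented for illustration, not taken from the paper.

```python
import numpy as np

# Fit a linear calibration curve of detector peak area vs. standard
# concentration, then express a spiked sample's back-calculated
# concentration as a percentage of the amount added (the "recovery").
conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])   # standards (ug/mL)
area = np.array([52.0, 101.0, 255.0, 498.0, 1003.0])  # peak areas (made up)
slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]                 # calibration correlation

spiked_area, spiked_conc = 248.0, 25.0            # hypothetical spike
recovery = 100.0 * ((spiked_area - intercept) / slope) / spiked_conc
print(round(r, 4), round(recovery, 2))            # r near 1, recovery ~100%
```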

  14. Quantitative sectioning and noise analysis for structured illumination microscopy

    PubMed Central

    Hagen, Nathan; Gao, Liang; Tkaczyk, Tomasz S.

    2011-01-01

    Structured illumination (SI) has long been regarded as a nonquantitative technique for obtaining sectioned microscopic images. Its lack of quantitative results has restricted the use of SI sectioning to qualitative imaging experiments, and has also limited researchers’ ability to compare SI against competing sectioning methods such as confocal microscopy. We show how to modify the standard SI sectioning algorithm to make the technique quantitative, and provide formulas for calculating the noise in the sectioned images. The results indicate that, for an illumination source providing the same spatially-integrated photon flux at the object plane, and for the same effective slice thicknesses, SI sectioning can provide higher SNR images than confocal microscopy for an equivalent setup when the modulation contrast exceeds about 0.09. PMID:22274364
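
    The classic three-phase SI demodulation that the quantitative treatment above builds on can be sketched as follows. This is a minimal illustration of the standard square-law algorithm of Neil, Juskaitis and Wilson; the paper's quantitative rescaling and noise formulas are not reproduced here.

```python
import numpy as np

def si_section(i1, i2, i3):
    """Sectioned image from three SI frames taken at grid phases 0, 2*pi/3
    and 4*pi/3; the out-of-focus background is common to all frames and
    cancels in the pairwise differences."""
    i1, i2, i3 = (np.asarray(x, dtype=float) for x in (i1, i2, i3))
    return np.sqrt((i1 - i2) ** 2 + (i1 - i3) ** 2 + (i2 - i3) ** 2) / np.sqrt(2)

# Toy check: a uniform out-of-focus background (100) contributes nothing,
# while the modulated in-focus signal (amplitude 10) survives demodulation.
phase = np.linspace(0, 2 * np.pi, 64)[None, :]
frames = [100.0 + 10 * np.cos(phase + k * 2 * np.pi / 3) for k in range(3)]
out = si_section(*frames)
print(out.mean())  # constant image, 1.5x the modulation amplitude
```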

  15. Quantitative architectural analysis: a new approach to cortical mapping.

    PubMed

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-11-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological specimens. To overcome this limitation, objective mapping procedures based on quantitative cytoarchitecture have been generated. As a result, new maps for various species including man were established. In our contribution, principles of quantitative cytoarchitecture and algorithm-based cortical mapping are described for a cytoarchitectural parcellation of the human auditory cortex. Defining cortical borders based on quantified changes in cortical lamination is the decisive step towards a novel, highly improved probabilistic brain atlas.
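
    The algorithmic core of such border detection can be illustrated with a small sketch: cortical-depth profiles are summarized as feature vectors, and a multivariate distance between adjacent blocks of profiles peaks where the laminar pattern changes. The feature values below are synthetic and the distance function is a generic Mahalanobis distance, not the authors' exact procedure.

```python
import numpy as np

def border_distance(block_a, block_b):
    """Mahalanobis distance between the mean feature vectors of two adjacent
    blocks of profiles (rows = profiles, columns = shape features), using
    the pooled within-block covariance. Peaks of this distance along the
    cortical ribbon flag candidate areal borders."""
    resid = np.vstack([block_a - block_a.mean(axis=0),
                       block_b - block_b.mean(axis=0)])
    cov = np.cov(resid.T)
    d = block_a.mean(axis=0) - block_b.mean(axis=0)
    return float(d @ np.linalg.pinv(cov) @ d)

rng = np.random.default_rng(0)
area1 = rng.normal(0.0, 1.0, size=(20, 5))   # profiles sampled in one area
area2 = rng.normal(3.0, 1.0, size=(20, 5))   # profiles across the border
within = border_distance(area1[:10], area1[10:])
across = border_distance(area1, area2)
print(within, across)  # the across-border distance is far larger
```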

  16. Assessing temporal variations in connectivity through suspended sediment hysteresis analysis

    NASA Astrophysics Data System (ADS)

    Sherriff, Sophie; Rowan, John; Fenton, Owen; Jordan, Phil; Melland, Alice; Mellander, Per-Erik; hUallacháin, Daire Ó.

    2016-04-01

    Connectivity provides a valuable concept for understanding catchment-scale sediment dynamics. In intensive agricultural catchments, land management through tillage, high livestock densities and extensive land drainage practices significantly change hydromorphological behaviour and alter sediment supply and downstream delivery. Analysis of suspended sediment-discharge hysteresis has offered insights into sediment dynamics but typically on a limited selection of events. Greater availability of continuous high-resolution discharge and turbidity data and qualitative hysteresis metrics enables assessment of sediment dynamics during more events and over time. This paper assesses the utility of this approach to explore seasonal variations in connectivity. Data were collected from three small (c. 10 km2) intensive agricultural catchments in Ireland with contrasting morphologies, soil types, land use patterns and management practices, and are broadly defined as low-permeability supporting grassland, moderate-permeability supporting arable and high-permeability supporting arable. Suspended sediment concentration (using calibrated turbidity measurements) and discharge data were collected at 10-min resolution from each catchment outlet and precipitation data were collected from a weather station within each catchment. Event databases (67-90 events per catchment) collated information on sediment export metrics, hysteresis category (e.g., clockwise, anti-clockwise, no hysteresis), numeric hysteresis index, and potential hydro-meteorological controls on sediment transport including precipitation amount, duration, intensity, stream flow and antecedent soil moisture and rainfall. Statistical analysis of potential controls on sediment export was undertaken using Pearson's correlation coefficient on separate hysteresis categories in each catchment. Sediment hysteresis fluctuations through time were subsequently assessed using the hysteresis index. Results showed the numeric
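
    A numeric hysteresis index of the kind used above can be sketched as follows; this is a simplified illustration (it assumes each limb is monotonic in discharge), not the exact index used in the study.

```python
import numpy as np

def hysteresis_index(q, c):
    """Event-scale hysteresis index: normalise discharge (q) and suspended
    sediment concentration (c) to [0, 1], split the event at peak discharge,
    and average the rising-minus-falling concentration difference at common
    discharge levels. Positive = clockwise (sediment peaks before flow)."""
    q = np.asarray(q, dtype=float)
    c = np.asarray(c, dtype=float)
    qn = (q - q.min()) / (q.max() - q.min())
    cn = (c - c.min()) / (c.max() - c.min())
    peak = int(np.argmax(qn))
    levels = np.linspace(0.1, 0.9, 9)
    c_rise = np.interp(levels, qn[:peak + 1], cn[:peak + 1])
    # the falling limb runs from high to low discharge, so reverse it
    c_fall = np.interp(levels, qn[peak:][::-1], cn[peak:][::-1])
    return float(np.mean(c_rise - c_fall))

# Clockwise event: concentration peaks before the discharge peak
q = np.array([1.0, 3.0, 6.0, 10.0, 8.0, 5.0, 2.0, 1.2])
c = np.array([5.0, 40.0, 60.0, 30.0, 15.0, 8.0, 5.0, 4.0])
hi = hysteresis_index(q, c)
print(hi)  # positive, indicating clockwise hysteresis
```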

  17. Quantitative analysis of the human T cell palmitome

    PubMed Central

    Morrison, Eliot; Kuropka, Benno; Kliche, Stefanie; Brügger, Britta; Krause, Eberhard; Freund, Christian

    2015-01-01

    Palmitoylation is a reversible post-translational modification used to inducibly compartmentalize proteins in cellular membranes, affecting the function of receptors and intracellular signaling proteins. The identification of protein “palmitomes” in several cell lines raises the question to what extent this modification is conserved in primary cells. Here we use primary T cells with acyl-biotin exchange and quantitative mass spectrometry to identify a pool of proteins previously unreported as palmitoylated in vivo. PMID:26111759

  18. Comprehensive objective maps of macromolecular conformations by quantitative SAXS analysis

    PubMed Central

    Hura, Greg L.; Budworth, Helen; Dyer, Kevin N.; Rambo, Robert P.; Hammel, Michal

    2013-01-01

    Comprehensive perspectives of macromolecular conformations are required to connect structure to biology. Here we present a small angle X-ray scattering (SAXS) Structural Similarity Map (SSM) and Volatility of Ratio (VR) metric providing comprehensive, quantitative and objective (superposition-independent) perspectives on solution state conformations. We validate VR and SSM utility on human MutSβ, a key ABC ATPase and chemotherapeutic target, by revealing MutSβ DNA sculpting and identifying multiple conformational states for biological activity. PMID:23624664

  19. Fluorescent microscopy approaches of quantitative soil microbial analysis

    NASA Astrophysics Data System (ADS)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    Classical fluorescent microscopy has been used over recent decades in various microbiological studies of terrestrial ecosystems. The method provides representative results and is simple to apply, which allows its use both as a routine part of large-scale research and in small laboratories. Furthermore, depending on the research targets, many modifications of the fluorescent microscopy method have been established. Combining and comparing several approaches offers an opportunity for quantitative estimation of the microbial community in soil. The first analytical part of the study was dedicated to estimating soil bacterial density by fluorescent microscopy over the course of several 30-day experiments. The purpose of the research was to estimate changes in the soil bacterial community in different soil horizons under aerobic and anaerobic conditions, with nutrients added in two experimental sets: cellulose and chitin. The nalidixic acid method for inhibiting DNA division in gram-negative bacteria was modified so that this bacterial group could be quantified by fluorescent microscopy. The established approach allowed estimation of 3-4 times more gram-negative bacterial cells in soil. The role of actinomycetes in the destruction of soil polymers is traditionally considered dominant compared to that of gram-negative bacteria. However, quantification of gram-negative bacteria in chernozem and peatland suggests that the classical view underestimates this bacterial group. Chitin introduction had no positive effect on gram-negative bacterial population density in chernozem, but this nutrient did produce fast growth dynamics during the first 3 days of the experiment under both aerobic and anaerobic conditions, confirming the chitinolytic activity of gram-negative bacteria in soil organic matter decomposition. In the next part of the research, the modified method for quantifying soil gram-negative bacteria was compared to fluorescent in situ

  20. Exploratory Analysis of Spatial-Temporal Patterns of Air Pollution in the City

    NASA Astrophysics Data System (ADS)

    Champendal, Alexandre; Kanevski, Mikhail; Huguenot, Pierre-Emmanuel; Golay, Jean

    2013-04-01

    Air pollution in the city is an important problem influencing the environment, the well-being of society, the economy, the management of urban zones, etc. The problem is extremely difficult due to the very complex distribution of pollution sources, the morphology of the city and the dispersion processes, leading to the multivariate nature of the phenomena and high local spatial-temporal variability. The task of understanding, modelling and predicting spatial-temporal patterns of air pollution in urban zones is an interesting and challenging topic with many research axes, from science-based modelling to geostatistics and data mining. The present research mainly deals with a comprehensive exploratory analysis of spatial-temporal air pollution data using statistical, geostatistical and machine learning tools. This analysis helps to 1) understand and model spatial-temporal correlations using variography; 2) explore the temporal evolution of the spatial correlation matrix; 3) analyse and visualize the interconnections between measurement stations using network science tools; 4) quantify the availability and predictability of structured patterns. The real data case study deals with spatial-temporal air pollution data of canton Geneva (2002-2011). Nitrogen dioxide (NO2) has caught our attention. It has effects on health (nitrogen dioxide can irritate the lungs), on plants (NO2 contributes to the phenomenon of acid rain and reduces plant growth, production and pesticide resistance) and on materials (nitrogen dioxide increases corrosion). Well-defined patterns of spatial-temporal correlations were detected. The analysis and visualization of the spatial correlation matrix for 91 stations were carried out using network science tools, and high levels of clustering were revealed. Moving Window Correlation Matrix and spatio-temporal variography methods were applied to define and explore the dynamics of our data. More than just
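
    The network-science step described above (stations as nodes, strong correlations as edges, clustering as the summary statistic) can be sketched as follows; the ten station time series here are synthetic stand-ins for the 91 Geneva stations.

```python
import numpy as np

# Build a station network from the correlation matrix of NO2-like time
# series: an edge wherever |r| exceeds a threshold, then summarize the
# graph with the mean clustering coefficient.
rng = np.random.default_rng(0)
common = rng.normal(size=365)                    # shared regional signal
series = np.array([common + 0.5 * rng.normal(size=365) for _ in range(10)])
r = np.corrcoef(series)                          # station correlation matrix
adj = (np.abs(r) > 0.7) & ~np.eye(len(r), dtype=bool)

def clustering_coefficient(adj):
    a = adj.astype(int)
    triangles = np.diag(a @ a @ a) / 2           # closed triangles per node
    deg = a.sum(axis=1)
    possible = deg * (deg - 1) / 2
    coeffs = np.divide(triangles, possible,
                       out=np.zeros(len(a)), where=possible > 0)
    return coeffs.mean()

cc = clustering_coefficient(adj)
print(cc)  # near 1: these stations form one densely connected cluster
```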

  1. Temporal analysis of social networks using three-way DEDICOM.

    SciTech Connect

    Bader, Brett William; Harshman, Richard A. (University of Western Ontario, London, Ontario, Canada); Kolda, Tamara Gibson

    2006-06-01

    DEDICOM is an algebraic model for analyzing intrinsically asymmetric relationships, such as the balance of trade among nations or the flow of information among organizations or individuals. It provides information on latent components in the data that can be regarded as ''properties'' or ''aspects'' of the objects, and it finds a few patterns that can be combined to describe many relationships among these components. When we apply this technique to adjacency matrices arising from directed graphs, we obtain a smaller graph that gives an idealized description of its patterns. Three-way DEDICOM is a higher-order extension of the model that has certain uniqueness properties. It allows for a third mode of the data, such as time, and permits the analysis of semantic graphs. We present an improved algorithm for computing three-way DEDICOM on sparse data and demonstrate it by applying it to the adjacency tensor of a semantic graph with time-labeled edges. Our application uses the Enron email corpus, from which we construct a semantic graph corresponding to email exchanges among Enron personnel over a series of 44 months. Meaningful patterns are recovered in which the representation of asymmetries adds insight into the social networks at Enron.
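
    The two-way DEDICOM model X ≈ A R Aᵀ, with A the loadings of objects on latent components and the small asymmetric matrix R their directed interactions, can be illustrated with a single least-squares step (taking A from the dominant singular subspace, as in common initializations). This sketch is not the paper's improved three-way algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 8, 3                          # 8 actors, 3 latent components
A0 = rng.normal(size=(n, k))         # ground-truth loadings
R0 = rng.normal(size=(k, k))         # asymmetric component interactions
X = A0 @ R0 @ A0.T                   # asymmetric relationship matrix

# One least-squares step under the DEDICOM model: take A as the dominant
# left singular subspace of X; for orthonormal A, the best R in the
# least-squares sense is A.T @ X @ A.
U, s, Vt = np.linalg.svd(X)
A = U[:, :k]
R = A.T @ X @ A                      # small asymmetric interaction matrix
X_hat = A @ R @ A.T

err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(err)  # ~ 0: an exactly rank-k asymmetric matrix is fully recovered
```

    On noisy real data, algorithms iterate updates of A and R (e.g. alternating least squares) rather than stopping after one projection.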

  2. Temporal Sequence of Hemispheric Network Activation during Semantic Processing: A Functional Network Connectivity Analysis

    ERIC Educational Resources Information Center

    Assaf, Michal; Jagannathan, Kanchana; Calhoun, Vince; Kraut, Michael; Hart, John, Jr.; Pearlson, Godfrey

    2009-01-01

    To explore the temporal sequence of, and the relationship between, the left and right hemispheres (LH and RH) during semantic memory (SM) processing we identified the neural networks involved in the performance of functional MRI semantic object retrieval task (SORT) using group independent component analysis (ICA) in 47 healthy individuals. SORT…

  3. Spectral and Temporal Analysis of Terrestrial Gamma-Ray Flashes

    NASA Astrophysics Data System (ADS)

    Cramer, E. S.; Dwyer, J. R.; Rassoul, H. K.; Fishman, G. J.; Bhat, N. P.; Briggs, M. S.; Chaplin, V. L.; Connaughton, V.; Fitzpatrick, G.

    2012-12-01

    Terrestrial Gamma-Ray Flashes (TGFs) have been well studied both theoretically and by space-based instruments. One of the main mysteries about TGFs is the variation in arrival times between the main "hard" pulse and the "soft" Compton tail. It is well known that TGFs may display symmetric and asymmetric time histories; however, little is known about what intrinsic property of the source allows it to behave the way it does. In this study, we use full Monte Carlo simulations, developed by Dwyer at Florida Tech, to characterize the behavior of the TGF timing with respect to atmospheric depth of the source and the TGF source location with respect to the GBM satellite footprint. The simulation generates the full relativistic runaway electron avalanches and propagates the resulting bremsstrahlung photons through the atmosphere and out to satellite altitude. In this analysis, we use the relativistic runaway electron avalanche model to show some of the clear distinctions between the low energy (< 300 keV) and high energy (> 300 keV) components of the TGF. The Gamma-Ray Burst Monitor aboard the Fermi Gamma-Ray Space Telescope has detected TGFs as short as 50 microseconds. Here, we model those results with the Monte Carlo by simulating different source geometries and heights. We also compare other models in the field to our full Monte Carlo results, and attempt to explain the differences. Therefore, the overall goal of this study is to obtain constraints on the parameter space of TGFs in order to learn something physical about individual events.

  4. Quantitative and dynamic analysis of PTEN phosphorylation by NMR.

    PubMed

    Cordier, Florence; Chaffotte, Alain; Wolff, Nicolas

    2015-05-01

    The dual lipid and protein phosphatase PTEN is a tumor suppressor controlling key biological processes, such as cell growth, proliferation and neuro-survival. Its activity and intracellular trafficking is finely regulated notably by multi-site phosphorylation of its C-terminal tail. The reversible and highly dynamic character of these regulatory events confers a temporal dimension to the cell for triggering crucial decisions. In this review, we describe how a recently developed time-resolved NMR spectroscopy approach unveils the dynamic establishment of the phosphorylation events of PTEN C-terminal tail controlled by CK2 and GSK3β kinases. Two cascades of reactions have been identified, in vitro and in extracts of human neuroblastoma cells. They are triggered independently on two nearby clusters of sites (S380-S385 and S361-S370) and occur on different timescales. In each cascade, the reactions follow an ordered model with a distributive kinetic mechanism. The vision of these cascades as two delay timers activating distinct or time-delayed regulatory responses gives a temporal dimension on PTEN regulation and is discussed in relation to the known functional roles of each cluster. PMID:25449899

  5. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    ERIC Educational Resources Information Center

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

  6. Particle concentration measurement of virus samples using electrospray differential mobility analysis and quantitative amino acid analysis.

    PubMed

    Cole, Kenneth D; Pease, Leonard F; Tsai, De-Hao; Singh, Tania; Lute, Scott; Brorson, Kurt A; Wang, Lili

    2009-07-24

    Virus reference materials are needed to develop and calibrate detection devices and instruments. We used electrospray differential mobility analysis (ES-DMA) and quantitative amino acid analysis (AAA) to determine the particle concentration of three small model viruses (bacteriophages MS2, PP7, and phiX174). The biological activity, purity, and aggregation of the virus samples were measured using plaque assays, denaturing gel electrophoresis, and size-exclusion chromatography. ES-DMA was developed to count the virus particles using gold nanoparticles as internal standards. ES-DMA additionally provides quantitative measurement of the size and extent of aggregation in the virus samples. Quantitative AAA was also used to determine the mass of the viral proteins in the pure virus samples. The samples were hydrolyzed and the masses of the well-recovered amino acids were used to calculate the equivalent concentration of viral particles in the samples. The concentration of the virus samples determined by ES-DMA was in good agreement with the concentration predicted by AAA for these purified samples. The advantages and limitations of ES-DMA and AAA to characterize virus reference materials are discussed.

  7. Temporal Variations of Water Productivity in Irrigated Corn: An Analysis of Factors Influencing Yield and Water Use across Central Nebraska

    PubMed Central

    Carr, Tony; Yang, Haishun; Ray, Chittaranjan

    2016-01-01

    Water Productivity (WP) of a crop defines the relationship between the economic or physical yield of the crop and its water use. With this concept it is possible to identify disproportionate water use or water-limited yield gaps and thereby support improvements in agricultural water management. However, too often important qualitative and quantitative environmental factors are not part of a WP analysis and therefore neglect the aspect of maintaining a sustainable agricultural system. In this study, we examine both the physical and economic WP in perspective with temporally changing environmental conditions. The physical WP analysis was performed by comparing simulated maximum attainable corn yields per unit of water using the crop model Hybrid-Maize with observed data from 2005 through 2013 from 108 farm plots in the Central Platte and the Tri Basin Natural Resource Districts of Nebraska. In order to expand the WP analysis on external factors influencing yields, a second model, Maize-N, was used to estimate optimal nitrogen (N)–fertilizer rate for specific fields in the study area. Finally, a vadose zone flow and transport model, HYDRUS-1D for simulating vertical nutrient transport in the soil, was used to estimate locations of nitrogen pulses in the soil profile. The comparison of simulated and observed data revealed that WP was not on an optimal level, mainly due to large amounts of irrigation used in the study area. The further analysis illustrated year-to-year variations of WP during the nine consecutive years, as well as the need to improve fertilizer management to favor WP and environmental quality. In addition, we addressed the negative influence of groundwater depletion on the economic WP through increasing pumping costs. In summary, this study demonstrated that involving temporal variations of WP as well as associated environmental and economic issues can represent a bigger picture of WP that can help to create incentives to sustainably improve


  10. Quantitative electroencephalography spectral analysis and topographic mapping in a rat model of middle cerebral artery occlusion.

    PubMed

    Lu, X C; Williams, A J; Tortella, F C

    2001-12-01

    Electroencephalography (EEG) has a long history in clinical evaluations of cerebrovascular disease. Distinct EEG abnormalities, such as increased slow delta activity, voltage depression and epileptiform discharge, have been identified in stroke patients. However, preclinical use of EEG analysis of cerebral ischaemia is less documented. We report a new rat model of EEG topographic mapping during permanent and transient middle cerebral artery occlusion. Ten EEG electrodes were implanted on the rat skull, symmetrically covering the cortical regions of two hemispheres. Monopolar EEG recordings were acquired from each animal at multiple time points during the initial 24 h, and again once daily for 7 days. Traditional EEG examinations, quantitative EEG (qEEG) spectral analysis and topographic EEG mapping were employed for comprehensive data analyses. Several distinct spatiotemporal EEG abnormalities were identified in the ischaemic rat brain. In the ipsilateral hemisphere, pronounced increase in delta activity was observed in each recorded area within 24 h of injury. While sharp waves and spike complexes dominated the parietal region, a nearly isoelectric EEG state was seen in the temporal region. After 48 h, spontaneous, albeit incomplete, recovery of EEG activities developed in all rats. Reperfusion appeared to promote delta and alpha recovery more efficiently. The contralateral EEG changes were also recorded in two phases: an acute moderate increase in delta activities with intermittent rhythmic activities, followed by a delayed and significant increase in beta activities across the hemisphere. The similarities of rat qEEG profiles identified in this study to that of stroke patients and the application of topographic mapping broaden our research technology for preclinical experimental studies of brain injury.
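
    The qEEG spectral quantities discussed above (e.g. increased delta activity after occlusion) reduce to band-limited power fractions, which can be sketched as follows; the trace is synthetic, and the plain periodogram here stands in for whatever spectral estimator the study used.

```python
import numpy as np

def band_powers(eeg, fs):
    """Relative power in the classical qEEG bands for one EEG channel,
    from a plain FFT periodogram."""
    n = len(eeg)
    f = np.fft.rfftfreq(n, 1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2
    bands = {"delta": (0.5, 4), "theta": (4, 8),
             "alpha": (8, 13), "beta": (13, 30)}
    total = psd[(f >= 0.5) & (f < 30)].sum()
    return {name: psd[(f >= lo) & (f < hi)].sum() / total
            for name, (lo, hi) in bands.items()}

# Synthetic "ischaemic" trace dominated by 2 Hz (delta) activity plus noise
fs = 250
t = np.arange(0, 30, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = 50 * np.sin(2 * np.pi * 2 * t) + 5 * rng.normal(size=t.size)
powers = band_powers(eeg, fs)
print(powers)  # the delta fraction dominates
```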

  11. Quantitative two-process analysis of avoidance conditioning in goldfish.

    PubMed

    Zhuikov, A Y; Couvillon, P A; Bitterman, M E

    1994-01-01

    The shuttlebox performance of goldfish was studied under standardized conditions in a variety of problems--with or without an avoidance contingency, a conditioned stimulus (CS)-termination contingency, and an escape contingency. The effects of CS-only, unconditioned stimulus (US)-only, and explicitly unpaired training were also examined. All the data could be simulated quantitatively with a version of O. H. Mowrer's (1947) 2-process theory expressed in 2 learning equations (1 classical, the other instrumental) and a performance equation. The good fit suggests that the theory is worth developing further with new experiments designed to challenge it.
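
    The structure of such a two-process simulation can be sketched as follows. The equations and parameters below are illustrative assumptions in the spirit of Mowrer's theory (classically conditioned fear motivates the response; fear reduction reinforces the instrumental avoidance habit), not the equations fitted by the authors.

```python
# Illustrative two-process model: fear is conditioned classically when the
# CS is followed by shock; once fear is high enough the fish responds, the
# habit is instrumentally reinforced by fear reduction, and fear partially
# extinguishes on avoided trials. All parameters are arbitrary assumptions.
alpha_c = 0.3   # classical (fear) learning rate
alpha_i = 0.2   # instrumental (avoidance) learning rate
fear, avoid = 0.0, 0.0
for trial in range(30):
    if fear > 0.4:                       # fear motivates an avoidance response
        avoid += alpha_i * (1 - avoid)   # habit reinforced by fear reduction
        fear *= 0.8                      # avoided US: fear partially extinguishes
    else:
        fear += alpha_c * (1 - fear)     # shock received: fear is conditioned
print(round(fear, 3), round(avoid, 3))   # fear hovers mid-range; habit near 1
```

    Even this toy version reproduces the classic pattern: a strong avoidance habit is maintained while conditioned fear stops growing and oscillates around the response threshold.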

  12. Design and performance analysis of high-order optical temporal differentiator with twin-core fiber

    NASA Astrophysics Data System (ADS)

    You, Haidong; Ning, Tigang; Li, Jing; Jian, Wei; Wen, Xiaodong; Pei, Li

    2013-08-01

    A simple and general approach for implementing an all-fiber high-order optical temporal differentiator based on twin-core fiber (TCF) is presented and demonstrated. Specifically, core 2 (or core 1) of the TCF is cut into N sections of equal length to achieve an N'th-order optical temporal differentiator, which can be regarded as N cascaded first-order TCF-based optical temporal differentiators. Our simulations show that the proposed approach can provide optical operation bandwidths in the several-THz regime, capable of accurately processing time features as short as subpicoseconds. Performance analysis results show good accuracy in calculating the high-order time derivative of the optical signal launched into core 2 (or core 1).
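
    The idealized behavior of an N'th-order temporal differentiator, transfer function H(ω) = (iω)^N, can be checked numerically in the Fourier domain; this sketch models only the ideal filter, not the TCF coupling physics.

```python
import numpy as np

def differentiate(signal, dt, order):
    """Apply the ideal N'th-order differentiator H(w) = (1j*w)**order in
    the Fourier domain."""
    w = 2 * np.pi * np.fft.fftfreq(signal.size, dt)
    return np.fft.ifft((1j * w) ** order * np.fft.fft(signal))

dt = 0.01
t = np.arange(-5, 5, dt)
pulse = np.exp(-t ** 2)                      # Gaussian input pulse envelope
d1 = differentiate(pulse, dt, order=1).real
err = np.max(np.abs(d1 - (-2 * t * pulse)))  # vs. the analytic derivative
print(err)  # tiny: the ideal filter differentiates the envelope exactly
```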

  13. Fractal dimension-bound spatio-temporal analysis of digital mammograms

    NASA Astrophysics Data System (ADS)

    Shanmugavadivu, P.; Sivakumar, V.; Sudhir, Rashmi

    2016-02-01

    A new Fractal Dimension-based diagnosis technique for change detection and time-series analysis of masses in temporal digital mammograms is presented in this paper. As digital mammograms are an established, reliable source for the prognosis of breast cancer, the demand for precise computer-aided detection techniques is constantly increasing. This formed the basis for the development of this method using Fractal geometry, an efficient mathematical approach that deals with self-similar and irregular geometric objects called fractals. This work comprises the detection of spatial masses using Fractal Hurst-bound enhancement and the segmentation of temporal masses using Fractal Thresholding. The consultant radiologist's assessment of mass lesions forms the baseline for comparison and validation of the detected masses. Further, this work performs temporal analysis of mass lesions detected from mammograms of the current and the respective prior view using the principle of Fractal Dimension. The precision of Fractal Dimension-based temporal texture analysis subsequently contributes to the characterization of malignant masses in digital mammograms.
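
A box-counting estimator illustrates the Fractal Dimension principle the method builds on. This sketch operates on a binary mask and is not the paper's Hurst-bound enhancement or fractal-thresholding pipeline.

```python
# Box-counting fractal dimension of a binary image mask (illustrative only).
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16)):
    counts = []
    n = mask.shape[0]
    for s in sizes:
        # count boxes of side s containing at least one foreground pixel
        trimmed = mask[:n - n % s, :n - n % s]
        boxed = trimmed.reshape(n // s, s, -1, s)
        counts.append(boxed.any(axis=(1, 3)).sum())
    # dimension = slope of log(count) versus log(1/size)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# sanity check: a filled square is 2-dimensional
square = np.zeros((128, 128), dtype=bool)
square[16:112, 16:112] = True
dim = box_counting_dimension(square)   # close to 2.0
```

For mass characterization one would compare such dimensions of lesion regions between the current and prior mammograms.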

  14. An Adaptive Organization Method of Geovideo Data for Spatio-Temporal Association Analysis

    NASA Astrophysics Data System (ADS)

    Wu, C.; Zhu, Q.; Zhang, Y. T.; Du, Z. Q.; Zhou, Y.; Xie, X.; He, F.

    2015-07-01

    Public security incidents have become increasingly challenging to address given their new features, including large-scale mobility, multi-stage dynamic evolution, and spatio-temporal concurrency and uncertainty in the complex urban environment, which require spatio-temporal association analysis among multiple regional video data sources for global cognition. However, existing video data organization methods, which view video as a property of a spatial object or a position in space, sever the spatio-temporal relationships among scattered video shots captured from multiple video channels, limit query functions to interactive retrieval between a camera and its video clips, and hinder the comprehensive management of event-related scattered video shots. GeoVideo, which maps video frames onto a geographic space, is a new approach to representing the geographic world; it promotes security monitoring from a spatial perspective and provides a highly feasible solution to this problem. This paper analyzes large-scale personnel mobility in public safety events and proposes a multi-level, event-related organization method for massive GeoVideo data based on spatio-temporal trajectories. This paper designs a unified object identifier (ID) structure to implicitly store the spatio-temporal relationships of scattered video clips and to support the distributed storage management of massive cases. Finally, the validity and feasibility of this method are demonstrated through suspect-tracking experiments.

  15. Quantitative analysis of laminin 5 gene expression in human keratinocytes.

    PubMed

    Akutsu, Nobuko; Amano, Satoshi; Nishiyama, Toshio

    2005-05-01

    To examine the expression of laminin 5 genes (LAMA3, LAMB3, and LAMC2) encoding the three polypeptide chains alpha3, beta3, and gamma2, respectively, in human keratinocytes, we developed novel quantitative polymerase chain reaction (PCR) methods utilizing Thermus aquaticus DNA polymerase, specific primers, and fluorescein-labeled probes with the ABI PRISM 7700 sequence detector system. Gene expression levels of LAMA3, LAMB3, and LAMC2 and glyceraldehyde-3-phosphate dehydrogenase were quantitated reproducibly and sensitively in the range from 1 x 10(2) to 1 x 10(8) gene copies. Basal gene expression level of LAMB3 was about one-tenth of that of LAMA3 or LAMC2 in human keratinocytes, although there was no clear difference among immunoprecipitated protein levels of alpha3, beta3, and gamma2 synthesized in radio-labeled keratinocytes. Human serum augmented gene expressions of LAMA3, LAMB3, and LAMC2 in human keratinocytes to almost the same extent, and this was associated with an increase of the laminin 5 protein content measured by a specific sandwich enzyme-linked immunosorbent assay. These results demonstrate that the absolute mRNA levels generated from the laminin 5 genes do not determine the translated protein levels of the laminin 5 chains in keratinocytes, and indicate that the expression of the laminin 5 genes may be controlled by common regulation mechanisms. PMID:15854126
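
Absolute quantification of this kind rests on a standard curve relating threshold cycle (Ct) to log10 copy number, which is then inverted for unknown samples. A minimal sketch with an idealized 10-fold dilution series; all Ct values are made up.

```python
# qPCR standard-curve sketch: Ct = m*log10(copies) + c, then invert.
import numpy as np

def fit_standard_curve(copies, ct):
    m, c = np.polyfit(np.log10(copies), ct, 1)
    return m, c

def copies_from_ct(ct, m, c):
    return 10 ** ((ct - c) / m)

# ideal dilution series: slope -3.32 corresponds to ~100% PCR efficiency
standards = np.array([1e2, 1e3, 1e4, 1e5, 1e6, 1e7, 1e8])
cts = np.array([33.40, 30.08, 26.76, 23.44, 20.12, 16.80, 13.48])
m, c = fit_standard_curve(standards, cts)
unknown = copies_from_ct(25.1, m, c)   # ~3.2e4 copies
```

The 10^2 to 10^8 copy range of the idealized series mirrors the dynamic range reported in the abstract.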

  16. Quantitative phenotypic analysis of multistress response in Zygosaccharomyces rouxii complex.

    PubMed

    Solieri, Lisa; Dakal, Tikam C; Bicciato, Silvio

    2014-06-01

    The Zygosaccharomyces rouxii complex comprises three yeast clusters sourced from sugar- and salt-rich environments: haploid Zygosaccharomyces rouxii, diploid Zygosaccharomyces sapae and allodiploid/aneuploid strains of uncertain taxonomic affiliation. These yeasts have been characterized with respect to gene copy number variation, karyotype variability and changes in ploidy, but their functional diversity in stress responses has not yet been explored. Here, we quantitatively analysed the variation in stress response among seven strains of the Z. rouxii complex by modelling growth variables via model-based and model-free fitting methods. Based on the spline fit as the most reliable modelling method, we resolved different interstrain responses to 15 environmental perturbations. Compared with Z. rouxii CBS 732(T) and Z. sapae strains ABT301(T) and ABT601, allodiploid strain ATCC 42981 and aneuploid strains CBS 4837 and CBS 4838 displayed higher multistress resistance and better performance in glycerol respiration, even in the presence of copper. A μ-based logarithmic phenotypic index highlighted that ABT601 is a slow-growing strain insensitive to stress, whereas ABT301(T) grows fast on rich medium and is sensitive to suboptimal conditions. Overall, the differences in stress response could imply different adaptation mechanisms to sugar- and salt-rich niches. The phenotypic profiles obtained provide quantitative insights for elucidating the adaptive mechanisms to stress in halo- and osmotolerant Zygosaccharomyces yeasts.
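
The model-free side of such growth modelling can be illustrated by estimating the maximum specific growth rate μ as the steepest slope of ln(OD) over time. The synthetic logistic data below are an assumption, not the paper's measurements.

```python
# Model-free estimate of mu_max from an OD time series (illustrative data).
import numpy as np

def max_growth_rate(t, od):
    """mu_max = steepest slope of ln(OD) versus time."""
    return np.gradient(np.log(od), t).max()

# synthetic logistic growth: carrying capacity 2.0, mu = 0.5 per hour
t = np.linspace(0, 30, 301)                      # hours
od = 2.0 / (1.0 + 199.0 * np.exp(-0.5 * t))      # inoculum OD = 0.01
mu_hat = max_growth_rate(t, od)                  # recovers ~0.5
```

A stress-induced drop in the estimated μ relative to an unstressed control is the kind of quantity a μ-based phenotypic index aggregates across perturbations.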

  17. Qualitative and quantitative analysis of volatile constituents from latrines.

    PubMed

    Lin, Jianming; Aoll, Jackline; Niclass, Yvan; Velazco, Maria Inés; Wünsche, Laurent; Pika, Jana; Starkenmann, Christian

    2013-07-16

    More than 2.5 billion people defecate in the open. The increased commitment of private and public organizations to improving this situation is driving the research and development of new technologies for toilets and latrines. Although key technical aspects are considered by researchers when designing new technologies for developing countries, the basic aspect of offending malodors from human waste is often neglected. With the objective of contributing to technical solutions that are acceptable to global consumers, we investigated the chemical composition of latrine malodors sampled in Africa and India. Field latrines in four countries were evaluated olfactively, and the odors were qualitatively and quantitatively characterized with three analytical techniques. Sulfur compounds, including H2S, methyl mercaptan, and dimethyl mono-, di- and trisulfide, are important in the sewage-like odors of pit latrines under anaerobic conditions. Under aerobic conditions, in Nairobi for example, paracresol and indole reached concentrations of 89 and 65 μg/g, respectively, which, along with short-chain fatty acids such as butyric acid (13 mg/g), explained the strong rancid, manure-like and farmyard odor. This work represents the first qualitative and quantitative study of volatile compounds sampled from seven pit latrines in a variety of geographic, technical, and economic contexts, in addition to three single stools from India and a pit latrine model system.

  18. Quantitative analysis of radiation-induced changes in sperm morphology

    SciTech Connect

    Young, I.T.; Gledhill, B.L.; Lake, S.; Wyrobek, A.J.

    1982-09-01

    When developing spermatogenic cells are exposed to radiation, chemical carcinogens or mutagens, the transformation in the morphology of the mature sperm can be used to determine the severity of the exposure. In this study five groups of mice with three mice per group received testicular doses of X irradiation at dosage levels ranging from 0 rad to 120 rad. A random sample of 100 mature sperm per mouse was analyzed five weeks later for the quantitative morphologic transformation as a function of dosage level. The cells were stained with gallocyanin chrome alum (GCA) so that only the DNA in the sperm head was visible. The ACUity quantitative microscopy system at Lawrence Livermore National Laboratory was used to scan the sperm at a sampling density of 16 points per linear micrometer and with 256 brightness levels per point. The contour of each cell was extracted using conventional thresholding techniques on the high-contrast images. For each contour a variety of shape features was then computed to characterize the morphology of that cell. Using the control group and the distribution of their shape features to establish the variability of a normal sperm population, the 95% limits on normal morphology were established. Using only four shape features, a doubling dose of approximately 39 rad was determined. That is, at 39 rad exposure the percentage of abnormal cells was twice that occurring in the control population. This compared to a doubling dose of approximately 70 rad obtained from a concurrent visual procedure.
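
The doubling-dose calculation reduces to fitting the dose response of the abnormal fraction and solving for the dose at which it reaches twice the fitted control level. A sketch with hypothetical data points (the study's raw percentages are not reproduced here):

```python
# Doubling dose from a linear dose-response fit (hypothetical data).
import numpy as np

def doubling_dose(doses, pct_abnormal):
    slope, intercept = np.polyfit(doses, pct_abnormal, 1)
    # response doubles the fitted control level when slope*d equals the intercept
    return intercept / slope

doses = np.array([0.0, 30.0, 60.0, 90.0, 120.0])   # rad (hypothetical)
pct = np.array([2.0, 3.4, 5.1, 6.5, 8.2])          # % abnormal (hypothetical)
dd = doubling_dose(doses, pct)                     # ~38 rad for these data
```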

  19. QUANTITATIVE CT ANALYSIS, AIRFLOW OBSTRUCTION AND LUNG CANCER IN THE PITTSBURGH LUNG SCREENING STUDY

    PubMed Central

    Wilson, David O; Leader, Joseph K; Fuhrman, Carl R; Reilly, John J; Sciurba, Frank C.; Weissfeld, Joel L

    2011-01-01

    Background To study the relationship between emphysema, airflow obstruction and lung cancer in a high risk population we performed quantitative analysis of screening computed tomography (CT) scans. Methods Subjects completed questionnaires, spirometry and low-dose helical chest CT. Analyses compared cases and controls according to automated quantitative analysis of lung parenchyma and airways measures. Results Our case-control study of 117 matched pairs of lung cancer cases and controls did not reveal any airway or lung parenchymal findings on quantitative analysis of screening CT scans that were associated with increased lung cancer risk. Airway measures including wall area %, lumen perimeter, lumen area and average wall HU, and parenchymal measures including lung fraction < −910 Hounsfield Units (HU), were not statistically different between cases and controls. Conclusions The relationship between visual assessment of emphysema and increased lung cancer risk could not be verified by quantitative analysis of low-dose screening CT scans in a high risk tobacco exposed population. PMID:21610523

  20. Analysis of Artifacts Suggests DGGE Should Not Be Used For Quantitative Diversity Analysis

    PubMed Central

    Neilson, Julia W.; Jordan, Fiona L.; Maier, Raina M.

    2014-01-01

    PCR-denaturing gradient gel electrophoresis (PCR-DGGE) is widely used in microbial ecology for the analysis of comparative community structure. However, artifacts generated during PCR-DGGE of mixed-template communities impede the application of this technique to quantitative analysis of community diversity. The objective of the current study was to employ an artificial bacterial community to document and analyze artifacts associated with multiband signatures and preferential template amplification, and to highlight their impacts on the use of this technique for quantitative diversity analysis. Six bacterial species (three Betaproteobacteria, two Alphaproteobacteria, and one Firmicutes) were amplified individually and in combinations with primers targeting the V7/V8 region of the 16S rRNA gene. Two of the six isolates produced multiband profiles, demonstrating that band number does not correlate directly with α-diversity. Analysis of the multiple bands from one of these isolates confirmed that both bands had identical sequences, which led to the hypothesis that the multiband pattern resulted from two distinct structural conformations of the same amplicon. In addition, consistent preferential amplification was demonstrated following pairwise amplifications of the six isolates. DGGE and real-time PCR analysis identified primer mismatch and PCR inhibition due to 16S rDNA secondary structure as the most probable causes of preferential amplification patterns. Reproducible DGGE community profiles generated in this study confirm that PCR-DGGE provides an excellent high-throughput tool for comparative community structure analysis, but that method-specific artifacts preclude its use for accurate comparative diversity analysis.

  1. Bridging the gaps for global sustainable development: a quantitative analysis.

    PubMed

    Udo, Victor E; Jansson, Peter Mark

    2009-09-01

    Global human progress occurs in a complex web of interactions between society, technology and the environment, driven by governance and infrastructure management capacity among nations. In our globalizing world, this complex web of interactions over the last 200 years has resulted in a chronic widening of economic and political gaps between the haves and the have-nots, with consequential global cultural and ecosystem challenges. At the bottom of these challenges is the issue of resource limitations on our finite planet with an increasing population. The problem is further compounded by pleasure-driven and poverty-driven ecological depletion and pollution by the haves and the have-nots, respectively. This paper explores these challenges quantitatively as global sustainable development (SD) in order to assess the gaps that need to be bridged. Although there has been significant rhetoric on SD, with many qualitative definitions offered, very few quantitative definitions of SD exist. The few that do exist tend to measure SD in terms of social, energy, economic and environmental dimensions. In our research, we used several human survival, development, and progress variables to create an aggregate SD parameter that describes the capacity of nations in three dimensions: social sustainability, environmental sustainability and technological sustainability. Using our proposed quantitative definition of SD and data from relatively reputable secondary sources, 132 nations were ranked and compared. Our comparisons indicate a global hierarchy of needs among nations similar to Maslow's at the individual level. As in Maslow's hierarchy of needs, nations that are struggling to survive are less concerned with environmental sustainability than advanced and stable nations. Nations such as the United States, Canada, Finland, Norway and others have higher SD capacity, and thus are higher on their hierarchy of needs, than nations such as Nigeria, Vietnam, Mexico and others.

  2. Mining biomedical time series by combining structural analysis and temporal abstractions.

    PubMed

    Bellazzi, R; Magni, P; Larizza, C; De Nicolao, G; Riva, A; Stefanelli, M

    1998-01-01

    This paper describes the combination of Structural Time Series analysis and Temporal Abstractions for the interpretation of data coming from home monitoring of diabetic patients. Blood glucose data are analyzed by a novel Bayesian technique for time series analysis. The results obtained are post-processed using Temporal Abstractions in order to extract knowledge that can be exploited "at the point of use" by physicians. The proposed data analysis procedure can be viewed as a Knowledge Discovery in Databases process applied to time-varying data. The work described here is part of a Web-based telemedicine system for the management of Insulin Dependent Diabetes Mellitus patients, called T-IDDM.
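
A minimal example of a "state" temporal abstraction of the kind applied to such blood glucose series: readings are mapped to qualitative labels, and runs of equal labels are merged into episodes. The thresholds and sample data are assumptions, not T-IDDM's actual rules.

```python
# State temporal abstraction: glucose values -> labeled episodes.
def state_abstraction(values, low=70, high=180):
    """Return (label, start_index, end_index) episodes, indices inclusive."""
    labels = ["LOW" if v < low else "HIGH" if v > high else "NORMAL"
              for v in values]
    episodes, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            episodes.append((labels[start], start, i - 1))
            start = i
    return episodes

bg = [65, 68, 110, 150, 200, 210, 120]          # mg/dL, made-up readings
eps = state_abstraction(bg)
# [('LOW', 0, 1), ('NORMAL', 2, 3), ('HIGH', 4, 5), ('NORMAL', 6, 6)]
```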

  3. Temporal Binding and Segmentation in Visual Search: A Computational Neuroscience Analysis.

    PubMed

    Mavritsaki, Eirini; Humphreys, Glyn

    2016-10-01

    Human visual search operates not only over space but also over time, as old items remain in the visual field and new items appear. Preview search (where one set of distractors appears before the onset of a second set) has been used as a paradigm to study search over time and space [Watson, D. G., & Humphreys, G. W. Visual marking: Prioritizing selection for new objects by top-down attentional inhibition of old objects. Psychological Review, 104, 90-122, 1997], with participants showing efficient search when old distractors can be ignored and new targets prioritized. The benefits of preview search are lost, however, if a temporal gap is introduced between a first presentation of the old items and the re-presentation of all the items in the search display [Kunar, M. A., Humphreys, G. W., & Smith, K. J. History matters: The preview benefit in search is not onset capture. Psychological Science, 14, 181-185, 2003a], consistent with the old items being bound by temporal onset to the new stimuli. This effect of temporal binding can be eliminated if the old items reappear briefly before the new items, indicating also a role for the memory of the old items. Here we simulate these effects of temporal coding in search using the spiking search over time and space model [Mavritsaki, E., Heinke, D., Allen, H., Deco, G., & Humphreys, G. W. Bridging the gap between physiology and behavior: Evidence from the sSoTS model of human visual attention. Psychological Review, 118, 3-41, 2011]. We show that a form of temporal binding by new onsets has to be introduced to the model to simulate the effects of a temporal gap, but that effects of the memory of the old item can stem from continued neural suppression across a temporal gap. We also show that the model can capture the effects of brain lesion on preview search under the different temporal conditions. 
The study provides a proof-of-principle analysis that neural suppression and temporal binding can be sufficient to account for human

  4. Modeling of X-Ray Fluorescence for Quantitative Analysis

    NASA Astrophysics Data System (ADS)

    Zarkadas, Charalambos

    2010-03-01

    Quantitative XRF algorithms involve mathematical procedures intended to solve a set of equations expressing the total fluorescence intensity of selected X-ray element lines emitted after sample irradiation by a photon source. These equations [1] were derived under the assumptions of a parallel exciting beam and a perfectly flat and uniform sample, and have since been extended to describe composite cases such as multilayered samples and samples exhibiting particle-size effects. State-of-the-art algorithms include most of the physical processes that can contribute to the measured fluorescence signal and make use of evaluated databases for the Fundamental Parameters included in the calculations. The accuracy of the results obtained depends to a great extent on the completeness of the model used to describe the X-ray fluorescence intensities and on the compliance of the actual experimental conditions with the basic assumptions under which the mathematical formulas were derived.

  5. Quantitative analysis of CT scans of ceramic candle filters

    SciTech Connect

    Ferer, M.V.; Smith, D.H.

    1996-12-31

    Candle filters are being developed to remove coal ash and other fine particles (<15 μm) from hot (ca. 1000 K) gas streams. In the present work, a color scanner was used to digitize hard-copy CT X-ray images of cylindrical SiC filters, and linear regressions converted the scanned (color) data to a filter density for each pixel. These data, with the aid of the density of SiC, gave a filter porosity for each pixel. Radial averages, density-density correlation functions, and other statistical analyses were performed on the density data. The CT images also detected the presence and depth of cracks that developed during usage of the filters. The quantitative data promise to be a very useful addition to the color images.
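
The pixel-wise conversion chain described above (scanned color value, to density via linear regression, to porosity against fully dense SiC) can be sketched as follows; the regression coefficients are hypothetical stand-ins for the fitted values.

```python
# Pixel color -> density -> porosity, relative to fully dense SiC.
import numpy as np

RHO_SIC = 3.21  # g/cm^3, density of fully dense SiC

def porosity_map(scan, a=0.012, b=0.05):
    """scan: 2-D array of digitized color values; returns porosity per pixel."""
    density = a * scan + b          # hypothetical calibration regression
    return 1.0 - density / RHO_SIC

scan = np.array([[180.0, 200.0],
                 [160.0, 210.0]])  # made-up scanner values
phi = porosity_map(scan)           # porosities in (0, 1)
```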

  6. Quantitative Analysis of Matrine in Liquid Crystalline Nanoparticles by HPLC

    PubMed Central

    Peng, Xinsheng; Hu, Min; Ling, Yahao; Tian, Yuan; Zhou, Yanxing; Zhou, Yanfang

    2014-01-01

    A reversed-phase high-performance liquid chromatographic method has been developed to quantitatively determine matrine in liquid crystalline nanoparticles. The chromatographic method uses an isocratic system. The mobile phase was composed of methanol-PBS (pH 6.8)-triethylamine (50:50:0.1%) at a flow rate of 1 mL/min, with an SPD-20A UV/vis detector set to a detection wavelength of 220 nm. The response for matrine is linear in the range of 1.6 to 200.0 μg/mL. The regression equation is y = 10706x − 2959 (R2 = 1.0). The average recovery is 101.7% (RSD = 2.22%, n = 9). This method provides a simple and accurate strategy for determining matrine in liquid crystalline nanoparticles. PMID:24834359
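
Inverting the reported calibration line y = 10706x − 2959 (x in μg/mL) gives the concentration corresponding to a measured peak area; the peak area below is made up for illustration.

```python
# Back-calculate matrine concentration from the reported calibration line.
def concentration_from_area(peak_area, slope=10706.0, intercept=-2959.0):
    """Invert the linear calibration y = slope*x + intercept."""
    return (peak_area - intercept) / slope

area = 1_067_641.0                    # hypothetical detector response
conc = concentration_from_area(area)  # 100.0 ug/mL for this area
```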

  7. Cross-bridge model of muscle contraction. Quantitative analysis.

    PubMed Central

    Eisenberg, E; Hill, T L; Chen, Y

    1980-01-01

    We recently presented, in a qualitative manner, a cross-bridge model of muscle contraction which was based on a biochemical kinetic cycle for the actomyosin ATPase activity. This cross-bridge model consisted of two cross-bridge states detached from actin and two cross-bridge states attached to actin. In the present paper, we attempt to fit this model quantitatively to both biochemical and physiological data. We find that the resulting complete cross-bridge model is able to account reasonably well for both the isometric transient data observed when a muscle is subjected to a sudden change in length and for the relationship between the velocity of muscle contraction in vivo and the actomyosin ATPase activity in vitro. This model also illustrates the interrelationship between biochemical and physiological data necessary for the development of a complete cross-bridge model of muscle contraction. PMID:6455168

  8. Analysis of spatial-temporal gene expression patterns reveals dynamics and regionalization in developing mouse brain

    PubMed Central

    Chou, Shen-Ju; Wang, Chindi; Sintupisut, Nardnisa; Niou, Zhen-Xian; Lin, Chih-Hsu; Li, Ker-Chau; Yeang, Chen-Hsiang

    2016-01-01

    The Allen Brain Atlas (ABA) provides a valuable resource of spatial/temporal gene expression in mammalian brains. Despite the rich information extracted from this database, current analyses suffer from several limitations. First, most studies are either gene-centric or region-centric, and thus are inadequate to capture the superposition of multiple spatial-temporal patterns. Second, standard tools of expression analysis such as matrix factorization can capture those patterns but do not explicitly incorporate spatial dependency. To overcome those limitations, we proposed a computational method to detect recurrent patterns in the spatial-temporal gene expression data of developing mouse brains. We demonstrated that regional distinctions in brain development could be revealed by localized gene expression patterns. The patterns expressed in the forebrain, the medullary and pontomedullary regions, and the basal ganglia are enriched with genes involved in forebrain development, locomotory behavior, and dopamine metabolism, respectively. In addition, the timing of global gene expression patterns reflects the general trends of molecular events in mouse brain development. Furthermore, we validated the functional implications of the inferred patterns by showing that genes sharing similar spatial-temporal expression patterns with Lhx2 exhibited differential expression in the embryonic forebrains of Lhx2 mutant mice. These analysis outcomes confirm the utility of recurrent expression patterns in studying brain development. PMID:26786896

  9. Quantitative error analysis for computer assisted navigation: a feasibility study

    PubMed Central

    Güler, Ö.; Perwög, M.; Kral, F.; Schwarm, F.; Bárdosi, Z. R.; Göbel, G.; Freysinger, W.

    2013-01-01

    Purpose The benefit of computer-assisted navigation depends on the registration process, in which patient features are correlated to some preoperative imagery. The operator-induced uncertainty in localizing patient features, the User Localization Error (ULE), is unknown and most likely dominates the application accuracy. This initial feasibility study aims at providing first data for the ULE with a research navigation system. Methods Active optical navigation was done in CT images of a plastic skull, an anatomic specimen (both with implanted fiducials) and a volunteer with anatomical landmarks exclusively. Each object was registered ten times with 3, 5, 7, and 9 registration points. Measurements were taken at 10 targets (anatomic specimen and volunteer) and 11 targets (plastic skull). The active NDI Polaris system was used under ideal working conditions (tracking accuracy 0.23 mm root mean square (RMS); probe tip calibration 0.18 mm RMS). Variances of tracking along the principal directions were measured as 0.18 mm2, 0.32 mm2, and 0.42 mm2. The ULE was calculated from the predicted application accuracy with isotropic and anisotropic models and from the experimental variances, respectively. Results The ULE was determined from the variances as 0.45 mm (plastic skull), 0.60 mm (anatomic specimen), and 4.96 mm (volunteer). The predicted application accuracy did not yield consistent values for the ULE. Conclusions Quantitative data on application accuracy could be tested against prediction models with isotropic and anisotropic noise models and revealed some discrepancies. This could potentially be due to the facts that navigation and one prediction model wrongly assume isotropic noise (tracking is anisotropic), while the anisotropic noise prediction model assumes an anisotropic registration strategy (registration is isotropic in typical navigation systems). The ULE data are presumably the first quantitative values for the precision of localizing anatomical landmarks and implanted fiducials.

  10. Annotating spatio-temporal datasets for meaningful analysis in the Web

    NASA Astrophysics Data System (ADS)

    Stasch, Christoph; Pebesma, Edzer; Scheider, Simon

    2014-05-01

    More and more environmental datasets that vary in space and time are available on the Web. This brings the advantage that data can be used for purposes other than those originally foreseen, but also the danger that users may apply inappropriate analysis procedures because they lack important assumptions made during the data collection process. In order to guide users towards a meaningful (statistical) analysis of spatio-temporal datasets available on the Web, we developed a Higher-Order-Logic formalism that captures some relevant assumptions in our previous work [1]. It allows meaningful spatial prediction and aggregation to be proven in a semi-automated fashion. In this poster presentation, we present a concept for annotating spatio-temporal datasets available on the Web with concepts defined in our formalism. To this end, we have defined a subset of the formalism as a Web Ontology Language (OWL) pattern. It captures the distinction between the different spatio-temporal variable types, i.e. point patterns, fields, lattices and trajectories, which in turn determine whether a particular dataset can be interpolated or aggregated in a meaningful way using a certain procedure. The actual annotations that link spatio-temporal datasets with the concepts in the ontology pattern are provided as Linked Data. In order to allow data producers to add the annotations to their datasets, we have implemented a Web portal that uses a triple store at the backend to store the annotations and to make them available in the Linked Data cloud. Furthermore, we have implemented functions in the statistical environment R to retrieve the RDF annotations and, based on these annotations, to support a stronger typing of spatio-temporal datatypes, guiding towards a meaningful analysis in R. [1] Stasch, C., Scheider, S., Pebesma, E., Kuhn, W. (2014): "Meaningful spatial prediction and aggregation", Environmental Modelling & Software, 51, 149-165.

  11. Temporal Transcriptomic Analysis as Desulfovibrio vulgaris Hildenborough Transitions into Stationary Phase during Electron Donor Depletion

    PubMed Central

    Clark, M. E.; He, Q.; He, Z.; Huang, K. H.; Alm, E. J.; Wan, X.-F.; Hazen, T. C.; Arkin, A. P.; Wall, J. D.; Zhou, J.-Z.; Fields, M. W.

    2006-01-01

    Desulfovibrio vulgaris was cultivated in a defined medium, and biomass was sampled for approximately 70 h to characterize the shifts in gene expression as cells transitioned from the exponential to the stationary phase during electron donor depletion. In addition to temporal transcriptomics, total protein, carbohydrate, lactate, acetate, and sulfate levels were measured. The microarray data were examined for statistically significant expression changes, hierarchical cluster analysis, and promoter element prediction and were validated by quantitative PCR. As the cells transitioned from the exponential phase to the stationary phase, a majority of the down-expressed genes were involved in translation and transcription, and this trend continued at the remaining times. There were general increases in relative expression for intracellular trafficking and secretion, ion transport, and coenzyme metabolism as the cells entered the stationary phase. As expected, the DNA replication machinery was down-expressed, and the expression of genes involved in DNA repair increased during the stationary phase. Genes involved in amino acid acquisition, carbohydrate metabolism, energy production, and cell envelope biogenesis did not exhibit uniform transcriptional responses. Interestingly, most phage-related genes were up-expressed at the onset of the stationary phase. This result suggested that nutrient depletion may affect community dynamics and DNA transfer mechanisms of sulfate-reducing bacteria via the phage cycle. The putative feoAB system (in addition to other presumptive iron metabolism genes) was significantly up-expressed, and this suggested the possible importance of Fe2+ acquisition under metal-reducing conditions. The expression of a large subset of carbohydrate-related genes was altered, and the total cellular carbohydrate levels declined during the growth phase transition. Interestingly, the D. vulgaris genome does not contain a putative rpoS gene, a common attribute of

  12. Temporal transcriptomic analysis of Desulfovibrio vulgaris Hildenborough transition into stationary-phase growth during electron donor depletion

    SciTech Connect

    Clark, M.E.; He, Q.; He, Z.; Huang, K.H.; Alm, E.J.; Wan, X.-F.; Hazen, T.C.; Arkin, A.P.; Wall, J.D.; Zhou, J.-Z.; Fields, M.W.

    2006-08-01

    Desulfovibrio vulgaris was cultivated in a defined medium, and biomass was sampled for approximately 70 h to characterize the shifts in gene expression as cells transitioned from the exponential to the stationary phase during electron donor depletion. In addition to temporal transcriptomics, total protein, carbohydrate, lactate, acetate, and sulfate levels were measured. The microarray data were examined for statistically significant expression changes, hierarchical cluster analysis, and promoter element prediction and were validated by quantitative PCR. As the cells transitioned from the exponential phase to the stationary phase, a majority of the down-expressed genes were involved in translation and transcription, and this trend continued at the remaining times. There were general increases in relative expression for intracellular trafficking and secretion, ion transport, and coenzyme metabolism as the cells entered the stationary phase. As expected, the DNA replication machinery was down-expressed, and the expression of genes involved in DNA repair increased during the stationary phase. Genes involved in amino acid acquisition, carbohydrate metabolism, energy production, and cell envelope biogenesis did not exhibit uniform transcriptional responses. Interestingly, most phage-related genes were up-expressed at the onset of the stationary phase. This result suggested that nutrient depletion may affect community dynamics and DNA transfer mechanisms of sulfate-reducing bacteria via the phage cycle. The putative feoAB system (in addition to other presumptive iron metabolism genes) was significantly up-expressed, and this suggested the possible importance of Fe2+ acquisition under metal-reducing conditions. The expression of a large subset of carbohydrate-related genes was altered, and the total cellular carbohydrate levels declined during the growth phase transition. Interestingly, the D. vulgaris genome does not contain a putative rpoS gene, a common attribute

  13. Neural correlates of viewing paintings: evidence from a quantitative meta-analysis of functional magnetic resonance imaging data.

    PubMed

    Vartanian, Oshin; Skov, Martin

    2014-06-01

    Many studies involving functional magnetic resonance imaging (fMRI) have exposed participants to paintings under varying task demands. To isolate neural systems that are activated reliably across fMRI studies in response to viewing paintings regardless of variation in task demands, a quantitative meta-analysis of fifteen experiments using the activation likelihood estimation (ALE) method was conducted. As predicted, viewing paintings was correlated with activation in a distributed system including the occipital lobes, temporal lobe structures in the ventral stream involved in object (fusiform gyrus) and scene (parahippocampal gyrus) perception, and the anterior insula, a key structure in the experience of emotion. We also observed activation in the posterior cingulate cortex bilaterally, part of the brain's default network. These results suggest that viewing paintings engages not only systems involved in visual representation and object recognition, but also structures underlying emotions and internalized cognitions.
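    The ALE combination step named in the abstract reduces to a simple voxelwise union of probabilities: each experiment's reported foci are first blurred into a modeled-activation (MA) probability map, and the ALE score is the probability that at least one experiment activates the voxel. A minimal sketch of that combination step (the smoothing into MA maps is omitted, and the toy values are illustrative):

```python
import numpy as np

def ale_map(ma_maps):
    """Combine per-experiment modeled-activation (MA) probability maps
    into an ALE map: the voxelwise probability that at least one
    experiment activates the voxel (union of independent probabilities)."""
    ma = np.asarray(ma_maps, dtype=float)   # shape: (n_experiments, n_voxels)
    return 1.0 - np.prod(1.0 - ma, axis=0)

# two experiments each giving a voxel probability 0.5 -> ALE 0.75
print(ale_map([[0.5], [0.5]]))
```

Statistical significance is then assessed against a null distribution of randomly relocated foci, which this sketch does not cover.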

  14. [Research progress of quantitative analysis for respiratory sinus arrhythmia].

    PubMed

    Sun, Congcong; Zhang, Zhengbo; Wang, Buqing; Liu, Hongyun; Ang, Qing; Wang, Weidong

    2011-12-01

    Respiratory sinus arrhythmia (RSA) is the fluctuation of heart rate associated with breathing. It has increasingly been used as a noninvasive index of cardiac vagal tone in psychophysiological research. Its analysis is often influenced or distorted by respiratory parameters, posture, movement, etc. This paper reviews five methods of quantification: the root mean square of successive differences (RMSSD), peak-valley RSA (pvRSA), cosinor fitting, spectral analysis, and joint time-frequency analysis (JTFA). Paced breathing, analysis of covariance, the residual method, and msRSA per liter of tidal volume are also covered as strategies for adjusting the measurement and analysis of RSA. Finally, some prospects for solving open problems in RSA research are given.
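    Of the five quantification methods listed, RMSSD is the simplest to state concretely: the root mean square of the differences between adjacent RR intervals. A minimal sketch (the sample RR series is illustrative):

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences of RR intervals (ms).
    Larger values indicate stronger beat-to-beat, vagally mediated
    variability such as respiratory sinus arrhythmia."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diffs = np.diff(rr)                      # successive RR differences
    return float(np.sqrt(np.mean(diffs ** 2)))

# breathing-modulated RR series -> nonzero RMSSD (sqrt(750) ≈ 27.4 ms here)
print(rmssd([800, 810, 790, 820, 780]))
```

A perfectly steady heart rate gives RMSSD of zero; respiratory modulation of the RR series raises it.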

  15. Technical note: Sampling and processing of mesocosm sediment trap material for quantitative biogeochemical analysis

    NASA Astrophysics Data System (ADS)

    Boxhammer, Tim; Bach, Lennart T.; Czerny, Jan; Riebesell, Ulf

    2016-05-01

    Sediment traps are the most common tool to investigate vertical particle flux in the marine realm. However, the spatial and temporal decoupling between particle formation in the surface ocean and particle collection in sediment traps at depth often handicaps reconciliation of production and sedimentation even within the euphotic zone. Pelagic mesocosms are restricted to the surface ocean, but have the advantage of being closed systems and are therefore ideally suited to studying how processes in natural plankton communities influence particle formation and settling in the ocean's surface. We therefore developed a protocol for efficient sample recovery and processing of quantitatively collected pelagic mesocosm sediment trap samples for biogeochemical analysis. Sedimented material was recovered by pumping it under gentle vacuum through a silicon tube to the sea surface. The particulate matter of these samples was subsequently separated from bulk seawater by passive settling, centrifugation or flocculation with ferric chloride, and we discuss the advantages and efficiencies of each approach. After concentration, samples were freeze-dried and ground with an easy-to-adapt procedure using standard lab equipment. Grain size of the finely ground samples ranged from fine to coarse silt (2-63 µm), which guarantees the homogeneity required for representative subsampling, a widespread problem in sediment trap research. Subsamples of the ground material were perfectly suitable for a variety of biogeochemical measurements, and even at very low particle fluxes we were able to get a detailed insight into various parameters characterizing the sinking particles. The methods and recommendations described here are a key improvement for sediment trap applications in mesocosms, as they facilitate the processing of large amounts of samples and allow for high-quality biogeochemical flux data.

  16. Fast multidimensional ensemble empirical mode decomposition for the analysis of big spatio-temporal datasets.

    PubMed

    Wu, Zhaohua; Feng, Jiaxin; Qiao, Fangli; Tan, Zhe-Min

    2016-04-13

    In this big data era, it is more urgent than ever to solve two major issues: (i) fast data transmission methods that can facilitate access to data from non-local sources and (ii) fast and efficient data analysis methods that can reveal the key information from the available data for particular purposes. Although approaches in different fields to address these two questions may differ significantly, the common part must involve data compression techniques and a fast algorithm. This paper introduces the recently developed adaptive and spatio-temporally local analysis method, namely the fast multidimensional ensemble empirical mode decomposition (MEEMD), for the analysis of a large spatio-temporal dataset. The original MEEMD uses ensemble empirical mode decomposition to decompose time series at each spatial grid and then pieces together the temporal-spatial evolution of climate variability and change on naturally separated timescales, which is computationally expensive. By taking advantage of the high efficiency of the expression using principal component analysis/empirical orthogonal function analysis for spatio-temporally coherent data, we design a lossy compression method for climate data to facilitate its non-local transmission. We also explain the basic principles behind the fast MEEMD through decomposing principal components instead of original grid-wise time series to speed up computation of MEEMD. Using a typical climate dataset as an example, we demonstrate that our newly designed methods can (i) compress data with a compression rate of one to two orders of magnitude; and (ii) speed up the MEEMD algorithm by one to two orders of magnitude. PMID:26953173
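    The compression idea described above, representing a spatio-temporally coherent field by a few leading principal components/EOFs, can be sketched with a truncated SVD (function names are illustrative, not from the paper):

```python
import numpy as np

def eof_compress(field, n_modes):
    """Lossy compression of a (time, space) field: keep only the leading
    EOF modes. Returns the pieces needed for reconstruction."""
    mean = field.mean(axis=0)
    u, s, vt = np.linalg.svd(field - mean, full_matrices=False)
    pcs = u[:, :n_modes] * s[:n_modes]   # principal component time series
    eofs = vt[:n_modes]                  # spatial patterns (EOFs)
    return pcs, eofs, mean

def eof_reconstruct(pcs, eofs, mean):
    """Approximate the original field from the retained modes."""
    return pcs @ eofs + mean
```

The fast MEEMD then decomposes only the few retained PC time series instead of every grid point's time series, which is where the reported one-to-two-order speed-up comes from.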

  17. Spatio-temporal analysis of the relationship between WNV dissemination and environmental variables in Indianapolis, USA

    PubMed Central

    Liu, Hua; Weng, Qihao; Gaines, David

    2008-01-01

    Background This study developed a multi-temporal analysis on the relationship between West Nile Virus (WNV) dissemination and environmental variables by using an integrated approach of remote sensing, GIS, and statistical techniques. WNV mosquito cases for seven months (April-October) of each of the six years (2002-2007) were collected in Indianapolis, USA. Epidemic curves were plotted to identify the temporal outbreaks of WNV. Spatio-temporal analysis and k-means cluster analysis were further applied to determine the high-risk areas. Finally, the relationship between environmental variables and WNV outbreaks was examined using discriminant analysis. Results The results show that the WNV epidemic curve reached its peak in August for all years in the study area except in 2007, when the peak was reached in July. WNV dissemination started from the central longitudinal corridor of the city and spread out to the east and west. Different years and seasons had different high-risk areas, but the southwest and southeast corners show the highest risk for WNV infection due to their high percentages of agriculture and water sources. Conclusion Major environmental factors contributing to the outbreak of WNV in Indianapolis were the percentages of agriculture and water, total length of streams, and total size of wetlands. This study provides important information for urban public health prevention and management. It also contributes to the optimization of mosquito control and arrangement of future sampling efforts. PMID:19094221

  18. Spatial and Temporal Dust Source Variability in Northern China Identified Using Advanced Remote Sensing Analysis

    NASA Technical Reports Server (NTRS)

    Taramelli, A.; Pasqui, M.; Barbour, J.; Kirschbaum, D.; Bottai, L.; Busillo, C.; Calastrini, F.; Guarnieri, F.; Small, C.

    2013-01-01

    The aim of this research is to provide a detailed characterization of spatial patterns and temporal trends in the regional and local dust source areas within the desert of the Alashan Prefecture (Inner Mongolia, China). This problem was approached through multi-scale remote sensing analysis of vegetation changes. The primary requirements for this regional analysis are high spatial and spectral resolution data, accurate spectral calibration and good temporal resolution with a suitable temporal baseline. Landsat analysis and field validation, along with the low-spatial-resolution classifications from MODIS and AVHRR, are combined to provide a reliable characterization of the different potential dust-producing sources. Intra-annual and inter-annual Normalized Difference Vegetation Index (NDVI) trends, used for land cover discrimination and mapping of potential dust sources with MODIS and AVHRR at the larger scale, are enhanced by Landsat Spectral Mixture Analysis (SMA). The combined methodology determines the extent to which Landsat can distinguish important soil types, in order to better understand how soil reflectance behaves at seasonal and inter-annual timescales. As a final result, the soil surface properties mapped using SMA are representative of the responses of the different land and soil covers previously identified by the NDVI trends. The results could be used in dust emission models, even though they do not reflect aggregate formation, soil stability or particle coatings, which have been shown to be critical for accurately representing dust sources over different regional and local emitting areas.
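    The SMA step referenced above is, at its core, linear unmixing: each pixel spectrum is modeled as a linear combination of endmember spectra, and the per-pixel fractions are recovered by least squares. A minimal sketch (the endmember spectra below are made up for illustration):

```python
import numpy as np

def unmix(pixel, endmembers):
    """Linear spectral mixture analysis: solve pixel ≈ fractions @ endmembers
    for the endmember fractions. endmembers: (n_endmembers, n_bands)."""
    fractions, *_ = np.linalg.lstsq(endmembers.T, pixel, rcond=None)
    return fractions

# e.g. a pixel that is 30% bare soil and 70% vegetation (made-up spectra)
soil = np.array([0.30, 0.35, 0.40])
veg = np.array([0.05, 0.08, 0.50])
pixel = 0.3 * soil + 0.7 * veg
print(unmix(pixel, np.vstack([soil, veg])))   # ≈ [0.3, 0.7]
```

Operational SMA additionally constrains fractions to be non-negative and sum to one; the unconstrained least-squares version above shows only the basic model.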

  19. Quantitative analysis of numerical solvers for oscillatory biomolecular system models

    PubMed Central

    Quo, Chang F; Wang, May D

    2008-01-01

    Background This article provides guidelines for selecting optimal numerical solvers for biomolecular system models. Because various parameters of the same system could have drastically different ranges, from 10⁻¹⁵ to 10¹⁰, the ODEs can be stiff and ill-conditioned, resulting in non-unique, non-existing, or non-reproducible modeling solutions. Previous studies have not examined in depth how to best select numerical solvers for biomolecular system models, which makes it difficult to experimentally validate the modeling results. To address this problem, we have chosen one of the well-known stiff initial value problems with limit cycle behavior as a test-bed system model. Solving this model, we have illustrated that different answers may result from different numerical solvers. We use MATLAB numerical solvers because they are optimized and widely used by the modeling community. We have also conducted a systematic study of numerical solver performances by using qualitative and quantitative measures such as convergence, accuracy, and computational cost (i.e. in terms of function evaluation, partial derivative, LU decomposition, and "take-off" points). The results show that the modeling solutions can be drastically different using different numerical solvers. Thus, it is important to intelligently select numerical solvers when solving biomolecular system models. Results The classic Belousov-Zhabotinskii (BZ) reaction is described by the Oregonator model and is used as a case study. We report two guidelines in selecting optimal numerical solver(s) for stiff, complex oscillatory systems: (i) for problems with unknown parameters, ode45 is the optimal choice regardless of the relative error tolerance; (ii) for known stiff problems, both ode113 and ode15s are good choices under strict relative tolerance conditions. Conclusions For any given biomolecular model, by building a library of numerical solvers with a quantitative performance assessment metric, we show that it is possible
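    The stiffness issue behind these solver recommendations (MATLAB's ode45 is explicit; ode15s is implicit and suited to stiff problems) can be illustrated outside MATLAB by comparing explicit and implicit Euler on a simple stiff test equation. This is a sketch: the test ODE and step sizes are illustrative, not the Oregonator itself.

```python
import numpy as np

# Stiff test ODE: y' = -50 (y - cos t), y(0) = 0. Explicit Euler is only
# stable for h < 2/50 = 0.04; implicit (backward) Euler is stable for any h.
def explicit_euler(h, t_end=1.0):
    t, y = 0.0, 0.0
    while t < t_end - 1e-12:
        y += h * (-50.0 * (y - np.cos(t)))
        t += h
    return y

def implicit_euler(h, t_end=1.0):
    t, y = 0.0, 0.0
    while t < t_end - 1e-12:
        t += h
        # backward Euler step; the implicit equation is linear, solved directly
        y = (y + h * 50.0 * np.cos(t)) / (1.0 + 50.0 * h)
    return y

print(explicit_euler(0.1))   # h above the stability limit: the solution blows up
print(implicit_euler(0.1))   # same h, implicit method: stays near the true solution
```

This is the same trade-off the abstract quantifies for ode45 versus ode15s: explicit methods pay for stiffness with tiny steps (or instability), while implicit methods pay per step to solve the implicit update.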

  20. Spatio-Temporal Distribution Characteristics and Trajectory Similarity Analysis of Tuberculosis in Beijing, China.

    PubMed

    Li, Lan; Xi, Yuliang; Ren, Fu

    2016-03-01

    Tuberculosis (TB) is an infectious disease with one of the highest reported incidences in China. The detection of the spatio-temporal distribution characteristics of TB is indicative of its prevention and control conditions. Trajectory similarity analysis detects variations and loopholes in prevention and provides urban public health officials and related decision makers more information for the allocation of public health resources and the formulation of prioritized health-related policies. This study analysed the spatio-temporal distribution characteristics of TB from 2009 to 2014 by utilizing spatial statistics, spatial autocorrelation analysis, and space-time scan statistics. Spatial statistics measured the TB incidence rate (TB patients per 100,000 residents) at the district level to determine its spatio-temporal distribution and to identify characteristics of change. Spatial autocorrelation analysis was used to detect global and local spatial autocorrelations across the study area. Purely spatial, purely temporal and space-time scan statistics were used to identify purely spatial, purely temporal and spatio-temporal clusters of TB at the district level. The other objective of this study was to compare the trajectory similarities between the incidence rates of TB and new smear-positive (NSP) TB patients in the resident population (NSPRP)/new smear-positive TB patients in the TB patient population (NSPTBP)/retreated smear-positive (RSP) TB patients in the resident population (RSPRP)/retreated smear-positive TB patients in the TB patient population (RSPTBP) to detect variations and loopholes in TB prevention and control among the districts in Beijing. The incidence rates in Beijing exhibited a gradual decrease from 2009 to 2014. Although global spatial autocorrelation was not detected overall across all of the districts of Beijing, individual districts did show evidence of local spatial autocorrelation: Chaoyang and Daxing were Low-Low districts over the six
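    The global spatial autocorrelation test referred to in such studies is typically Moran's I, computed over district-level rates with a spatial weights matrix. A minimal sketch (the toy districts and adjacency below are illustrative, not Beijing's):

```python
import numpy as np

def morans_i(rates, weights):
    """Global Moran's I: > 0 spatially clustered, ≈ 0 random, < 0 dispersed.
    weights is an n x n spatial weights (e.g. binary adjacency) matrix."""
    x = np.asarray(rates, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()                               # deviations from the mean rate
    n = len(x)
    return (n / w.sum()) * (w * np.outer(z, z)).sum() / (z ** 2).sum()

# four districts on a ring with alternating high/low rates: perfectly dispersed
ring = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]])
print(morans_i([120, 40, 120, 40], ring))   # -> -1.0
```

Local Moran statistics (the Low-Low classification used for Chaoyang and Daxing) decompose the same cross-product district by district instead of summing it globally.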

  3. Quantitative analysis of TALE-DNA interactions suggests polarity effects.

    PubMed

    Meckler, Joshua F; Bhakta, Mital S; Kim, Moon-Soo; Ovadia, Robert; Habrian, Chris H; Zykovich, Artem; Yu, Abigail; Lockwood, Sarah H; Morbitzer, Robert; Elsäesser, Janett; Lahaye, Thomas; Segal, David J; Baldwin, Enoch P

    2013-04-01

    Transcription activator-like effectors (TALEs) have revolutionized the field of genome engineering. We present here a systematic assessment of TALE DNA recognition, using quantitative electrophoretic mobility shift assays and reporter gene activation assays. Within TALE proteins, tandem 34-amino acid repeats recognize one base pair each and direct sequence-specific DNA binding through repeat variable di-residues (RVDs). We found that RVD choice can affect affinity by four orders of magnitude, with the relative RVD contribution in the order NG > HD ≈ NN ≫ NI > NK. The NN repeat preferred the base G over A, whereas the NK repeat bound G with 10³-fold lower affinity. We compared AvrBs3, a naturally occurring TALE that recognizes its target using some atypical RVD-base combinations, with a designed TALE that precisely matches 'standard' RVDs with the target bases. This comparison revealed unexpected differences in sensitivity to substitutions of the invariant 5'-T. Another surprising observation was that base mismatches at the 5' end of the target site had more disruptive effects on affinity than those at the 3' end, particularly in designed TALEs. These results provide evidence that TALE-DNA recognition exhibits a hitherto undescribed polarity effect, in which the N-terminal repeats contribute more to affinity than C-terminal ones.
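    Affinities from quantitative EMSA titrations such as these are commonly extracted by fitting a single-site binding isotherm to the bound fraction versus protein concentration. A sketch under that standard model (the grid-search fit and all numerical values are illustrative, not from the paper):

```python
import numpy as np

def fraction_bound(protein, kd):
    """Single-site binding isotherm: fraction of DNA bound at a given
    (excess) protein concentration, in the same units as Kd."""
    protein = np.asarray(protein, dtype=float)
    return protein / (kd + protein)

def fit_kd(protein, bound, kd_grid=None):
    """Least-squares Kd estimate over a log-spaced grid (simple and robust
    for data spanning many orders of magnitude, as in these assays)."""
    if kd_grid is None:
        kd_grid = np.logspace(-12, -3, 2000)   # 1 pM .. 1 mM
    residuals = [np.sum((fraction_bound(protein, kd) - bound) ** 2)
                 for kd in kd_grid]
    return kd_grid[int(np.argmin(residuals))]
```

Half-maximal binding occurs at protein concentration equal to Kd, which is how a 10³-fold affinity difference between RVDs shows up as a 10³-fold shift in the titration midpoint.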

  5. On the in vivo action of erythropoietin: a quantitative analysis.

    PubMed

    Papayannopoulou, T; Finch, C A

    1972-05-01

    The composite response of the erythron to exogenous erythropoietin has been studied in normal, splenectomized, and polycythemic mice. After stimulation the normal animal doubled its marrow nucleated red cells by the 3rd day with little further change by the 5th. Nucleated red cells within the spleen began to increase sharply on the 2nd day and, by the 5th, exceeded those in the marrow. The total nucleated erythroid response represented a fourfold increase. Reticulocytes lagged behind the expansion of the nucleated red cell mass, but by the 5th day the original ratio was re-established. Hemoglobin synthesis was increased, but the ratio of hemoglobin synthesized in nucleated red cells and reticulocytes was basically unchanged. Early displacement of marrow reticulocytes into circulation and the production of a larger red cell also occurred. No evidence of a change in the number of erythroid mitoses was found; only a slight decrease in the average cell cycle time was demonstrated. Thus, whereas erythropoietin stimulation induced several changes in erythropoiesis, the increased number of cells entering into the maturing pool appeared to be of greatest quantitative significance. Splenectomy reduced the proliferative response of the erythron over 5 days stimulation to three-fourths that found in the normal animal. This difference, also reflected in a proportionately lower reticulocyte response and increment in circulating red cell mass, suggests that erythropoiesis within the mouse marrow is spatially or otherwise restricted and that the spleen provided a supplemental area of erythroid expansion.

  7. Quantitative Analysis of Synaptic Release at the Photoreceptor Synapse

    PubMed Central

    Duncan, Gabriel; Rabl, Katalin; Gemp, Ian; Heidelberger, Ruth; Thoreson, Wallace B.

    2010-01-01

    Exocytosis from the rod photoreceptor is stimulated by submicromolar Ca2+ and exhibits an unusually shallow dependence on presynaptic Ca2+. To provide a quantitative description of the photoreceptor Ca2+ sensor for exocytosis, we tested a family of conventional and allosteric computational models describing the final Ca2+-binding steps leading to exocytosis. Simulations were fit to two measures of release, evoked by flash-photolysis of caged Ca2+: exocytotic capacitance changes from individual rods and postsynaptic currents of second-order neurons. The best simulations supported the occupancy of only two Ca2+ binding sites on the rod Ca2+ sensor rather than the typical four or five. For most models, the on-rates for Ca2+ binding and maximal fusion rate were comparable to those of other neurons. However, the off-rates for Ca2+ unbinding were unexpectedly slow. In addition to contributing to the high-affinity of the photoreceptor Ca2+ sensor, slow Ca2+ unbinding may support the fusion of vesicles located at a distance from Ca2+ channels. In addition, partial sensor occupancy due to slow unbinding may contribute to the linearization of the first synapse in vision. PMID:20483317
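    In conventional sequential-binding models of the kind tested here, n independent Ca2+ sites must be occupied before fusion proceeds; at equilibrium the fusion rate reduces to a simple occupancy expression. A sketch using the two-site occupancy the abstract reports, with purely illustrative rate constants:

```python
def fusion_rate(ca, k_on, k_off, k_max, n_sites=2):
    """Steady-state fusion rate for a sensor with n independent Ca2+ sites:
    each site is occupied with probability p = ca*k_on / (ca*k_on + k_off),
    and fusion proceeds at k_max once all sites are filled."""
    p = ca * k_on / (ca * k_on + k_off)   # equilibrium occupancy per site
    return k_max * p ** n_sites
```

Two points from the abstract fall out of this form: a small exponent (two sites rather than four or five) gives the shallow Ca2+ dependence, and a slow k_off keeps occupancy, and hence the fusion rate, high at submicromolar Ca2+.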

  8. Quantitative Analysis of CME Deflections in the Corona

    NASA Astrophysics Data System (ADS)

    Gui, Bin; Shen, Chenglong; Wang, Yuming; Ye, Pinzhong; Liu, Jiajia; Wang, Shui; Zhao, Xuepu

    2011-07-01

    In this paper, ten CME events viewed by the STEREO twin spacecraft are analyzed to study the deflections of CMEs during their propagation in the corona. Based on the three-dimensional information of the CMEs derived by the graduated cylindrical shell (GCS) model (Thernisien, Howard, and Vourlidas in Astrophys. J. 652, 1305, 2006), it is found that the propagation directions of eight CMEs had changed. By applying the theoretical method proposed by Shen et al. ( Solar Phys. 269, 389, 2011) to all the CMEs, we found that the deflections are consistent, in strength and direction, with the gradient of the magnetic energy density. There is a positive correlation between the deflection rate and the strength of the magnetic energy density gradient and a weak anti-correlation between the deflection rate and the CME speed. Our results suggest that the deflections of CMEs are mainly controlled by the background magnetic field and can be quantitatively described by the magnetic energy density gradient (MEDG) model.

  9. Quantitative proteomic analysis of amphotericin B resistance in Leishmania infantum

    PubMed Central

    Brotherton, Marie-Christine; Bourassa, Sylvie; Légaré, Danielle; Poirier, Guy G.; Droit, Arnaud; Ouellette, Marc

    2014-01-01

    Amphotericin B (AmB) in its liposomal form is now considered either a first- or second-line treatment against Leishmania infections in different parts of the world. Few cases of AmB resistance have been reported, and resistance mechanisms toward AmB are still poorly understood. This paper reports a large-scale comparative proteomic study in the context of AmB resistance. Quantitative proteomics using stable isotope labeling of amino acids in cell culture (SILAC) was used to better characterize cytoplasmic and membrane-enriched (ME) proteomes of the in vitro generated Leishmania infantum AmB resistant mutant AmB1000.1. In total, 97 individual proteins were found as differentially expressed between the mutant and its parental sensitive strain (WT). More than half of these proteins were either metabolic enzymes or involved in transcription or translation processes. Key energetic pathways such as glycolysis and the TCA cycle were up-regulated in the mutant. Interestingly, many proteins involved in reactive oxygen species (ROS) scavenging and heat-shock proteins were also up-regulated in the resistant mutant. This work provides a basis for further investigations to understand the roles of proteins differentially expressed in relation with AmB resistance. PMID:25057462

  10. Space-to-Ground Communication for Columbus: A Quantitative Analysis.

    PubMed

    Uhlig, Thomas; Mannel, Thurid; Fortunato, Antonio; Illmer, Norbert

    2015-01-01

    The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of this team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centres. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers to efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions.

  11. Quantitative Proteome Analysis of Leishmania donovani under Spermidine Starvation

    PubMed Central

    Singh, Shalini; Dubey, Vikash Kumar

    2016-01-01

    We have earlier reported antileishmanial activity of hypericin through spermidine starvation. In the current report, we have used a label-free proteome quantitation approach to identify differentially modulated proteins after hypericin treatment. A total of 141 proteins were found to be differentially regulated, with ANOVA P values less than 0.05, in hypericin-treated Leishmania promastigotes. The differentially modulated proteins have been broadly classified under nine major categories. An increase in ribosomal protein S7 suggests repression of translation. Inhibition of proteins related to the ubiquitin proteasome system, an RNA binding protein and a translation initiation factor also suggests altered translation. We also observed increased expression of Hsp 90, Hsp 83-1 and stress inducible protein 1, and a significantly decreased level of cyclophilin. These stress-related proteins could be a cellular response of the parasite to hypericin-induced cellular stress. Altered expression of proteins involved in metabolism, biosynthesis and replication of nucleic acids, flagellar movement and signalling was also observed, indicating defects in these pathways. The data were analyzed rigorously to get further insight into hypericin-induced parasitic death. PMID:27123864

  12. Analysis of quantitative trait loci for behavioral laterality in mice.

    PubMed Central

    Roubertoux, Pierre L; Le Roy, Isabelle; Tordjman, Sylvie; Cherfou, Améziane; Migliore-Samour, Danièle

    2003-01-01

    Laterality is believed to have genetic components, as has been deduced from family studies in humans and responses to artificial selection in mice, but these genetic components are unknown and the underlying physiological mechanisms are still a subject of dispute. We measured direction of laterality (preferential use of left or right paws) and degree of laterality (absolute difference between the use of left and right paws) in C57BL/6ByJ (B) and NZB/BlNJ (N) mice and in their F(1) and F(2) intercrosses. Measurements were taken of both forepaws and hind paws. Quantitative trait loci (QTL) did not emerge for direction but did for degree of laterality. One QTL for forepaw (LOD score = 5.6) and the second QTL for hind paw (LOD score = 7.2) were both located on chromosome 4 and their peaks were within the same confidence interval. A QTL for plasma luteinizing hormone concentration was also found in the confidence interval of these two QTL. These results suggest that the physiological mechanisms underlying degree of laterality react to gonadal steroids. PMID:12663540

  13. Space-to-Ground Communication for Columbus: A Quantitative Analysis.

    PubMed

    Uhlig, Thomas; Mannel, Thurid; Fortunato, Antonio; Illmer, Norbert

    2015-01-01

    The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of such team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centres. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers to efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions. PMID:26290898

  14. Quantitative analysis of task selection for brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.

  15. Quantitative genetic analysis of salicylic acid perception in Arabidopsis.

    PubMed

    Dobón, Albor; Canet, Juan Vicente; Perales, Lorena; Tornero, Pablo

    2011-10-01

Salicylic acid (SA) is a phytohormone required for full resistance against some pathogens in Arabidopsis, and NPR1 (Non-Expressor of Pathogenesis Related Genes 1) is the only gene described to date with a strong effect on SA-induced resistance. There may be additional components of SA perception that escape the traditional mutagenesis approach. An alternative is to search the natural variation of Arabidopsis. Different methods of analyzing the variation between ecotypes were tried, and measuring the growth of a virulent isolate of Pseudomonas syringae after exogenous application of SA was found to be the most effective. Two ecotypes, Edi-0 and Stw-0, were crossed and their F2 progeny studied. There are two significant quantitative trait loci (QTLs) in this population, and there is one QTL in each of the existing mapping populations Col-4 × Laer-0 and Laer-0 × No-0. They have different characteristics: while one QTL is only detectable at low concentrations of SA, another acts after the point of crosstalk with methyl jasmonate signalling. Three of the QTLs have candidates described in SA perception, such as NPR1, its interactors, and a calmodulin-binding protein.

  16. Quantitative analysis of pheromone-binding protein specificity

    PubMed Central

    Katti, S.; Lokhande, N.; González, D.; Cassill, A.; Renthal, R.

    2012-01-01

Many pheromones have very low water solubility, posing experimental difficulties for quantitative binding measurements. A new method is presented for determining thermodynamically valid dissociation constants for ligands binding to odorant-binding proteins (OBPs), using β-cyclodextrin as a solubilizer and transfer agent. The method is applied to LUSH, a Drosophila OBP that binds the pheromone 11-cis vaccenyl acetate (cVA). Refolding of LUSH expressed in E. coli was assessed by measuring N-phenyl-1-naphthylamine (NPN) binding and Förster resonance energy transfer between LUSH tryptophan 123 (W123) and NPN. Binding of cVA was measured from quenching of W123 fluorescence as a function of cVA concentration. The equilibrium constant for transfer of cVA between β-cyclodextrin and LUSH was determined from a linked-equilibria model. This constant, multiplied by the β-cyclodextrin-cVA dissociation constant, gives the LUSH-cVA dissociation constant: ~100 nM. It was also found that other ligands quench W123 fluorescence. The LUSH-ligand dissociation constants were determined to be ~200 nM for the silk moth pheromone bombykol and ~90 nM for methyl oleate. The results indicate that the ligand-binding cavity of LUSH can accommodate a variety of ligands with strong binding interactions. Implications of this for the pheromone receptor model proposed by Laughlin et al. (Cell 133: 1255–65, 2008) are discussed. PMID:23121132
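    The linked-equilibria arithmetic described above reduces to a one-line product once the two constants are measured. A minimal sketch, using illustrative numbers rather than the paper's data:

```python
# Linked-equilibria estimate of a protein-ligand dissociation constant,
# as described in the abstract: Kd(protein-ligand) = K_transfer * Kd(CD-ligand).
# The numeric inputs below are illustrative assumptions, not measured values.

def linked_kd(k_transfer: float, kd_cd_ligand: float) -> float:
    """Dissociation constant of the protein-ligand complex, in the same
    units as kd_cd_ligand (here molar)."""
    return k_transfer * kd_cd_ligand

# Hypothetical values: a transfer constant of 0.01 (dimensionless) and a
# cyclodextrin-ligand Kd of 10 uM give a protein-ligand Kd of 100 nM,
# the order of magnitude reported for LUSH-cVA.
kd = linked_kd(0.01, 10e-6)
print(f"estimated Kd = {kd*1e9:.0f} nM")
```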

  17. Quantitative analysis of virus and plasmid trafficking in cells

    NASA Astrophysics Data System (ADS)

    Lagache, Thibault; Dauty, Emmanuel; Holcman, David

    2009-01-01

Intracellular transport of DNA carriers is a fundamental step of gene delivery. By combining theoretical and numerical approaches, we study here the trafficking of single and multiple viruses and DNA particles through the cell cytoplasm to a small nuclear pore. We present a physical model to account for certain aspects of cellular organization, starting with the observation that a viral trajectory consists of epochs of pure diffusion and epochs of active transport along microtubules. We define a general degradation rate to describe the limitations imposed on the delivery of plasmid or viral particles to a nuclear pore by various types of direct and indirect hydrolysis activity inside the cytoplasm. By replacing the switching dynamics with a single steady-state stochastic description, we obtain estimates for the probability and the mean time for the first of many particles to travel from the cell membrane to a small nuclear pore. Computational simulations confirm that our model can be used to analyze and interpret viral trajectories and to estimate quantitatively the success of nuclear delivery.
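    The ingredients named above (diffusive epochs, active transport epochs, a degradation rate, first arrival at the nucleus) can be illustrated with a crude 1-D Monte Carlo. This is my construction with assumed parameters, not the authors' model:

```python
import random

# Crude 1-D sketch: a particle alternates between diffusive steps and
# directed transport toward the nucleus at x = 0, starting from the
# membrane at x = L, and can be degraded at constant rate k anywhere
# in the cytoplasm. All parameter values are illustrative assumptions.

def simulate(L=10.0, D=1.0, v=2.0, k=0.05, dt=0.01, p_active=0.3, rng=None):
    """Return (arrived, time) for a single particle."""
    rng = rng or random.Random()
    x, t = L, 0.0
    while x > 0.0:
        if rng.random() < k * dt:          # degradation event
            return False, t
        if rng.random() < p_active:        # active-transport epoch
            x -= v * dt
        else:                              # pure-diffusion epoch
            x += rng.gauss(0.0, (2 * D * dt) ** 0.5)
        x = min(x, L)                      # reflecting cell membrane
        t += dt
    return True, t

rng = random.Random(0)
runs = [simulate(rng=rng) for _ in range(500)]
n_ok = sum(ok for ok, _ in runs)
p_success = n_ok / len(runs)
mean_time = sum(t for ok, t in runs if ok) / max(1, n_ok)
print(p_success, mean_time)
```

With many particles, the probability that at least one arrives before degradation rises as 1 - (1 - p_success)^N, which is the quantity the paper's steady-state description estimates analytically.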

  18. Space-to-Ground Communication for Columbus: A Quantitative Analysis

    PubMed Central

    Uhlig, Thomas; Mannel, Thurid; Fortunato, Antonio; Illmer, Norbert

    2015-01-01

    The astronauts on board the International Space Station (ISS) are only the most visible part of a much larger team engaged around the clock in the performance of science and technical activities in space. The bulk of such team is scattered around the globe in five major Mission Control Centers (MCCs), as well as in a number of smaller payload operations centres. Communication between the crew in space and the flight controllers at those locations is an essential element and one of the key drivers to efficient space operations. Such communication can be carried out in different forms, depending on available technical assets and the selected operational approach for the activity at hand. This paper focuses on operational voice communication and provides a quantitative overview of the balance achieved in the Columbus program between collaborative space/ground operations and autonomous on-board activity execution. An interpretation of the current situation is provided, together with a description of potential future approaches for deep space exploration missions. PMID:26290898

  19. Depression in Parkinson's disease: a quantitative and qualitative analysis.

    PubMed Central

    Gotham, A M; Brown, R G; Marsden, C D

    1986-01-01

    Depression is a common feature of Parkinson's disease, a fact of both clinical and theoretical significance. Assessment of depression in Parkinson's disease is complicated by overlapping symptomatology in the two conditions, making global assessments based on observer or self-ratings of doubtful validity. The present study aimed to provide both a quantitative and qualitative description of the nature of the depressive changes found in Parkinson's disease as compared with normal elderly subjects and arthritis patients. As with previous studies, the patients with Parkinson's disease scored significantly higher than normal controls on various self-ratings of depression and anxiety but, in this study, did not differ from those with arthritis. Qualitatively, both the Parkinson's disease and the arthritis groups had depression characterised by pessimism and hopelessness, decreased motivation and drive, and increased concern with health. In contrast, the negative affective feelings of guilt, self-blame and worthlessness were absent in both patient groups. This pattern of depression was significantly associated with severity of illness and functional disability. However, these factors account for only a modest proportion of the variability in test scores. Probable unexplored factors are individual differences in coping style and availability of support. PMID:3701347

  20. Quantitative proteomic analysis of the Salmonella-lettuce interaction.

    PubMed

    Zhang, Yuping; Nandakumar, Renu; Bartelt-Hunt, Shannon L; Snow, Daniel D; Hodges, Laurie; Li, Xu

    2014-11-01

Human pathogens can internalize into food crops through root and surface uptake and persist inside crop plants. The goal of this study was to elucidate the global modulation of bacterial and plant protein expression after Salmonella internalizes into lettuce. A quantitative proteomic approach was used to analyse the protein expression of Salmonella enterica serovar Infantis and lettuce cultivar Green Salad Bowl 24 h after infiltrating S. Infantis into lettuce leaves. Among the 50 differentially expressed proteins identified by comparing internalized S. Infantis against S. Infantis grown in Luria broth, proteins involved in glycolysis were down-regulated, while one protein involved in ascorbate uptake was up-regulated. Stress response proteins, especially antioxidant proteins, were up-regulated. The modulation in protein expression suggested that internalized S. Infantis might utilize ascorbate as a carbon source and require multiple stress response proteins to cope with stresses encountered in plants. On the other hand, among the 20 differentially expressed lettuce proteins, proteins involved in defense response to bacteria were up-regulated. Moreover, the secreted effector PipB2 of S. Infantis and R proteins of lettuce were induced after bacterial internalization into lettuce leaves, indicating that the human pathogen S. Infantis triggered the defense mechanisms of lettuce that normally respond to plant pathogens.

  1. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1990-01-01

In the study of the dynamics and kinematics of the human body, a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video-based motion analysis systems to emerge as a cost-effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video-based Ariel Performance Analysis System to develop data on shirt-sleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. The system is described.

  2. An Introduction to Error Analysis for Quantitative Chemistry

    ERIC Educational Resources Information Center

    Neman, R. L.

    1972-01-01

    Describes two formulas for calculating errors due to instrument limitations which are usually found in gravimetric volumetric analysis and indicates their possible applications to other fields of science. (CC)
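    The record above does not reproduce the article's two formulas. As a stand-in, here is a sketch of the standard propagation rules for instrument-limited uncertainties in gravimetric/volumetric work (the function names and numbers are illustrative, not from the article):

```python
# Standard propagation-of-error rules for instrument limitations:
# absolute uncertainties add for a mass taken by difference, and
# relative uncertainties add in quadrature for a derived quotient.
# These are generic textbook formulas, not the article's own.

def mass_by_difference_error(e_tare: float, e_gross: float) -> float:
    """Worst-case absolute uncertainty of m = gross - tare."""
    return e_tare + e_gross

def relative_error_quotient(*relative_errors: float) -> float:
    """Relative uncertainty of a product or quotient, in quadrature."""
    return sum(e * e for e in relative_errors) ** 0.5

# Example: +/-0.1 mg per weighing on a 250.0 mg sample, and
# +/-0.02 mL on a 25.00 mL delivered volume.
e_mass = mass_by_difference_error(0.1, 0.1)               # 0.2 mg
rel = relative_error_quotient(e_mass / 250.0, 0.02 / 25.00)
print(f"relative error ~ {rel*100:.2f}%")
```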

  3. Temporal discounting: basic research and the analysis of socially important behavior.

    PubMed Central

    Critchfield, T S; Kollins, S H

    2001-01-01

    Recent basic research on human temporal discounting is reviewed to illustrate procedures, summarize key findings, and draw parallels with both nonhuman animal research and conceptual writings on self-control. Lessons derived from this research are then applied to the challenge of analyzing socially important behaviors such as drug abuse, eating and exercise, and impulsiveness associated with attention deficit hyperactivity disorder. Attending to the broader temporal context in which behavior occurs may aid in the analysis of socially important behavior. Applying this perspective to the study of behavior in natural environments also highlights the importance of combining methodological flexibility with conceptual rigor to promote the extension of applied behavior analysis to a broader array of socially important behaviors. PMID:11317983
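    Studies in this literature typically quantify discounting with Mazur's hyperbolic equation, V = A / (1 + kD), where k indexes how steeply a delayed reward loses value. A minimal sketch (the parameter values are illustrative):

```python
# Mazur's hyperbolic discounting model, the standard quantitative form
# in the temporal-discounting literature: the present value V of an
# amount A delayed by D is V = A / (1 + k*D), with k the discount rate.

def discounted_value(amount: float, delay: float, k: float) -> float:
    return amount / (1.0 + k * delay)

# A steeper discounter (larger k) devalues the same delayed reward more,
# the pattern associated with impulsive choice:
patient = discounted_value(100.0, 30.0, k=0.01)
impulsive = discounted_value(100.0, 30.0, k=0.10)
print(patient, impulsive)
```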

  4. The cortical analysis of speech-specific temporal structure revealed by responses to sound quilts

    PubMed Central

    Overath, Tobias; McDermott, Josh H; Zarate, Jean Mary; Poeppel, David

    2016-01-01

    Speech contains temporal structure that the brain must analyze to enable linguistic processing. To investigate the neural basis of this analysis, we used sound quilts, stimuli constructed by shuffling segments of a natural sound, approximately preserving its properties on short timescales while disrupting them on longer scales. We generated quilts from foreign speech to eliminate language cues and manipulated the extent of natural acoustic structure by varying the segment length. Using functional magnetic resonance imaging, we identified bilateral regions of the superior temporal sulcus (STS) whose responses varied with segment length. This effect was absent in primary auditory cortex and did not occur for quilts made from other natural sounds or acoustically matched synthetic sounds, suggesting tuning to speech-specific spectrotemporal structure. When examined parametrically, the STS response increased with segment length up to ~500 ms. Our results identify a locus of speech analysis in human auditory cortex that is distinct from lexical, semantic or syntactic processes. PMID:25984889
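    The quilting idea above can be sketched in a few lines: cut the signal into equal segments and shuffle their order, preserving structure within segments while disrupting it across them. (The published procedure also matches segment boundaries to reduce splicing artifacts; that refinement is omitted in this sketch.)

```python
import random

# Minimal "sound quilt" constructor: shuffle equal-length segments of a
# signal so that short-timescale structure (within a segment) survives
# while longer-timescale structure (across segments) is disrupted.

def make_quilt(signal, segment_len, seed=0):
    n = len(signal) - len(signal) % segment_len        # drop the remainder
    segments = [signal[i:i + segment_len] for i in range(0, n, segment_len)]
    random.Random(seed).shuffle(segments)
    return [s for seg in segments for s in seg]

samples = list(range(12))             # stand-in for audio samples
quilt = make_quilt(samples, segment_len=3)
print(quilt)
```

Varying `segment_len` is the experimental manipulation: longer segments preserve more natural acoustic structure, which is what drove the STS response up to ~500 ms segments.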

  5. The cortical analysis of speech-specific temporal structure revealed by responses to sound quilts.

    PubMed

    Overath, Tobias; McDermott, Josh H; Zarate, Jean Mary; Poeppel, David

    2015-06-01

    Speech contains temporal structure that the brain must analyze to enable linguistic processing. To investigate the neural basis of this analysis, we used sound quilts, stimuli constructed by shuffling segments of a natural sound, approximately preserving its properties on short timescales while disrupting them on longer scales. We generated quilts from foreign speech to eliminate language cues and manipulated the extent of natural acoustic structure by varying the segment length. Using functional magnetic resonance imaging, we identified bilateral regions of the superior temporal sulcus (STS) whose responses varied with segment length. This effect was absent in primary auditory cortex and did not occur for quilts made from other natural sounds or acoustically matched synthetic sounds, suggesting tuning to speech-specific spectrotemporal structure. When examined parametrically, the STS response increased with segment length up to ∼500 ms. Our results identify a locus of speech analysis in human auditory cortex that is distinct from lexical, semantic or syntactic processes. PMID:25984889

  6. Flash Floods: A Spatial and Temporal Analysis A Case Study of the Flash Floods in Southwestern Missouri

    NASA Astrophysics Data System (ADS)

    Pollak, D. A.

    2009-12-01

Floods are the leading cause of weather-related fatalities in the United States and the second most common cause of weather-related death worldwide. Of all floods, flash floods are found to be the most deadly, with an average global mortality rate of 3.62%. This study performed a spatial and temporal analysis of flash flood risk using Geographic Information Systems (GIS) to analyze an event that occurred in southwestern Missouri in March 2008. The goals of the study were to identify the different impacts of the disaster as indicated by flood reports; to examine the social and natural factors that account for the spatial and temporal distribution and severity of the impacts; and to compare the results with a previous study of a flash flood event in France. In the Missouri case, the analysis confirmed that small catchments react faster than large catchments and to smaller amounts of rainfall. Seventy-seven percent of incidents (water rescues, fatalities, flooded homes, and flooded roads) occurred in catchments smaller than 200 km2. The study also showed that a majority of the severe impacts (water rescues, fatalities, and flooded homes) occurred after the rainfall had tapered off, perhaps indicating that people erroneously perceive that the danger has passed once the rain has stopped. When looking at water rescues and fatalities, this study found that fewer incidents occurred in medium-sized catchments (50-450 km2) than in either large (>450 km2) or small catchments (<50 km2). This is similar to the French study, which found that no fatalities occurred in medium-sized catchments. Using GIS to examine flash flood risk is a step toward systematically comparing human vulnerability with catchment size. The project integrated qualitative and quantitative data in GIS and displayed the time series before, during, and after each reported flood incident, such as the one that occurred in catchment 5.
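    The catchment-size comparison above amounts to binning incidents by drainage area. A minimal sketch using the abstract's thresholds (<50, 50-450, >450 km2); the incident records themselves are made up for illustration:

```python
# Bin incident counts by catchment size, using the size classes from
# the study (<50 km2 small, 50-450 km2 medium, >450 km2 large).
# The example incident areas are invented, not the study's data.

def bin_by_catchment(incident_areas_km2, small=50, large=450):
    counts = {"small": 0, "medium": 0, "large": 0}
    for area in incident_areas_km2:
        if area < small:
            counts["small"] += 1
        elif area <= large:
            counts["medium"] += 1
        else:
            counts["large"] += 1
    return counts

counts = bin_by_catchment([12, 30, 85, 200, 460, 900, 45])
print(counts)
```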

  7. China's water resources vulnerability: A spatio-temporal analysis during 2003-2013

    NASA Astrophysics Data System (ADS)

    Cai, J.; Varis, O.; Yin, H.

    2015-12-01

The present highly serious situation of China's water environment and aquatic ecosystems has arisen in the context of its stunning socioeconomic development over the past several decades. An analysis with high spatio-temporal resolution of the vulnerability assessment of water resources (VAWR) in China is therefore urgently needed. However, to our knowledge, the temporal dimension of VAWR has not yet been addressed. Consequently, we performed, for the first time, a comprehensive spatio-temporal analysis of China's water resources vulnerability (WRV), using a composite index approach built on an array of aspects highlighting the key challenges that China's water resources system faces today. During our study period of 2003-2013, the political weight of China's integrated water resources management increased continuously. It is therefore essential, in light of the historical socioeconomic changes shaped by water-environment policy making and implementation, to reveal China's WRV in order to pinpoint the key challenges to the healthy functioning of its water resources system. The water resources system in the North and Central Coast appeared more vulnerable than that in Western China. China's water use efficiency grew substantially over the study period, as did water supply and sanitation coverage. In contrast, water pollution worsened remarkably in most parts of China, as did water scarcity and shortage in the most stressed parts of the country. This spatio-temporal analysis implies that the key challenges to China's water resources system are rooted not only in the geographical mismatch between socioeconomic development (e.g. water demand) and water resources endowments (e.g. water resources availability), but also in the intertwinement between socioeconomic development and national strategic policy making.

  8. Spatial-Temporal Epidemiology of Tuberculosis in Mainland China: An Analysis Based on Bayesian Theory

    PubMed Central

    Cao, Kai; Yang, Kun; Wang, Chao; Guo, Jin; Tao, Lixin; Liu, Qingrong; Gehendra, Mahara; Zhang, Yingjie; Guo, Xiuhua

    2016-01-01

    Objective: To explore the spatial-temporal interaction effect within a Bayesian framework and to probe the ecological influential factors for tuberculosis. Methods: Six different statistical models containing parameters of time, space, spatial-temporal interaction and their combination were constructed based on a Bayesian framework. The optimum model was selected according to the deviance information criterion (DIC) value. Coefficients of climate variables were then estimated using the best fitting model. Results: The model containing spatial-temporal interaction parameter was the best fitting one, with the smallest DIC value (−4,508,660). Ecological analysis results showed the relative risks (RRs) of average temperature, rainfall, wind speed, humidity, and air pressure were 1.00324 (95% CI, 1.00150–1.00550), 1.01010 (95% CI, 1.01007–1.01013), 0.83518 (95% CI, 0.93732–0.96138), 0.97496 (95% CI, 0.97181–1.01386), and 1.01007 (95% CI, 1.01003–1.01011), respectively. Conclusions: The spatial-temporal interaction was statistically meaningful and the prevalence of tuberculosis was influenced by the time and space interaction effect. Average temperature, rainfall, wind speed, and air pressure influenced tuberculosis. Average humidity had no influence on tuberculosis. PMID:27164117
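    The relative risks reported above are conventionally recovered from log-linear regression coefficients as RR = exp(beta), with a 95% interval of exp(beta ± 1.96·SE). A sketch of that back-transformation (the coefficient and standard error below are illustrative, not the paper's values):

```python
import math

# Back-transform a log-linear regression coefficient into a relative
# risk (RR) with a 95% confidence interval: RR = exp(beta),
# CI = exp(beta +/- 1.96*SE). Inputs here are illustrative assumptions.

def rr_with_ci(beta: float, se: float):
    lo, hi = beta - 1.96 * se, beta + 1.96 * se
    return math.exp(beta), (math.exp(lo), math.exp(hi))

rr, (ci_lo, ci_hi) = rr_with_ci(beta=0.01005, se=0.00102)
print(f"RR = {rr:.5f} (95% CI {ci_lo:.5f}-{ci_hi:.5f})")
```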

  9. EEG source imaging with spatio-temporal tomographic nonnegative independent component analysis.

    PubMed

    Valdés-Sosa, Pedro A; Vega-Hernández, Mayrim; Sánchez-Bornot, José Miguel; Martínez-Montes, Eduardo; Bobes, María Antonieta

    2009-06-01

This article describes a spatio-temporal EEG/MEG source imaging (ESI) method that extracts a parsimonious set of "atoms" or components, each the outer product of a spatial and a temporal signature. The estimated sources are localized as smooth, minimally overlapping patches of cortical activation, obtained by constraining the spatial signatures to be nonnegative (NN), orthogonal, sparse, and smooth, in effect integrating ESI with NN-ICA. This constitutes a generalization of earlier work by this group on the use of multiple penalties for ESI. A multiplicative update algorithm is derived that is stable and fast, converging to near the optimal solution within seconds. This procedure, spatio-temporal tomographic NN ICA (STTONNICA), is equally able to recover superficial or deep sources without additional weighting constraints, as tested with simulations. STTONNICA analysis of ERPs to familiar and unfamiliar faces yields an occipital-fusiform atom activated by all faces and a more frontal atom that is active only with familiar faces. The temporal signatures are at present unconstrained but can be required to be smooth, complex, or to follow a multivariate autoregressive model.
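    The core decomposition (nonnegative spatial signatures times temporal signatures, fit by multiplicative updates) can be illustrated with plain Lee-Seung nonnegative matrix factorization; STTONNICA's additional orthogonality, sparsity, and smoothness penalties are omitted in this sketch:

```python
import numpy as np

# Plain NMF with Lee-Seung multiplicative updates, X ~ W @ H, as a
# simplified stand-in for the nonnegativity-constrained decomposition
# described above (columns of W ~ spatial signatures, rows of H ~
# temporal signatures). The synthetic data below are illustrative.

def nmf(X, rank, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.random((X.shape[0], rank)) + 0.1
    H = rng.random((rank, X.shape[1])) + 0.1
    eps = 1e-12
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)   # multiplicative updates
        W *= (X @ H.T) / (W @ H @ H.T + eps)   # keep factors nonnegative
    return W, H

# Two nonnegative "sources" mixed into a 6-channel, 50-sample record:
t = np.linspace(0, 1, 50)
S = np.vstack([np.sin(2 * np.pi * 3 * t) ** 2, np.cos(2 * np.pi * 5 * t) ** 2])
A = np.random.default_rng(1).random((6, 2))
X = A @ S
W, H = nmf(X, rank=2)
err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.3f}")
```

The multiplicative form guarantees the factors stay nonnegative at every iteration, which is why the same update style carries over when extra penalties are added.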

  10. Whole-Genome Analysis of Temporal Gene Expression during Foregut Development

    PubMed Central

    2004-01-01

    We have investigated the cis-regulatory network that mediates temporal gene expression during organogenesis. Previous studies demonstrated that the organ selector gene pha-4/FoxA is critical to establish the onset of transcription of Caenorhabditis elegans foregut (pharynx) genes. Here, we discover additional cis-regulatory elements that function in combination with PHA-4. We use a computational approach to identify candidate cis-regulatory sites for genes activated either early or late during pharyngeal development. Analysis of natural or synthetic promoters reveals that six of these sites function in vivo. The newly discovered temporal elements, together with predicted PHA-4 sites, account for the onset of expression of roughly half of the pharyngeal genes examined. Moreover, combinations of temporal elements and PHA-4 sites can be used in genome-wide searches to predict pharyngeal genes, with more than 85% accuracy for their onset of expression. These findings suggest a regulatory code for temporal gene expression during foregut development and provide a means to predict gene expression patterns based solely on genomic sequence. PMID:15492775

  11. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.

  12. Software applications toward quantitative metabolic flux analysis and modeling.

    PubMed

    Dandekar, Thomas; Fieselmann, Astrid; Majeed, Saman; Ahmed, Zeeshan

    2014-01-01

Metabolites and their pathways are central for adaptation and survival. Metabolic modeling elucidates in silico all the possible flux pathways (flux balance analysis, FBA) and predicts the actual fluxes under a given situation; further refinement of these models is possible by including experimental isotopologue data. In this review, we first introduce the key theoretical concepts and the different analysis steps in the modeling process, before comparing flux calculation and metabolite analysis programs such as C13, BioOpt, COBRA toolbox, Metatool, efmtool, FiatFlux, ReMatch, VANTED, iMAT and YANA. Their respective strengths and limitations are discussed and compared with alternative software. While data analysis of metabolites, calculation of metabolic fluxes and pathways, and their condition-specific changes are all possible, we highlight the considerations that need to be taken into account before deciding on specific software. Current challenges in the field include the computation of large-scale networks (in elementary mode analysis), regulatory interactions and detailed kinetics, and these are discussed in the light of powerful new approaches.
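    FBA itself is a linear program: maximize an objective flux subject to the steady-state constraint S·v = 0 and flux bounds. A toy example on a three-reaction chain of my own construction (not taken from any of the tools reviewed):

```python
import numpy as np
from scipy.optimize import linprog

# Flux balance analysis in miniature: maximize a biomass-like flux v3
# subject to steady state S @ v = 0 and flux bounds. Toy network
# (invented for illustration): uptake -> A (v1), A -> B (v2),
# B -> biomass (v3); one internal metabolite per row of S.

S = np.array([
    [1, -1,  0],   # metabolite A: produced by v1, consumed by v2
    [0,  1, -1],   # metabolite B: produced by v2, consumed by v3
])
bounds = [(0, 10), (0, 10), (0, 10)]   # all fluxes irreversible, capped
c = [0, 0, -1]                          # linprog minimizes, so use -v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal biomass flux:", res.x[2])
```

Real FBA tools solve exactly this problem, only with genome-scale stoichiometric matrices (thousands of reactions) instead of a three-reaction chain.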

  13. Quantitative analysis of nailfold capillary morphology in patients with fibromyalgia

    PubMed Central

    Choi, Dug-Hyun

    2015-01-01

Background/Aims Nailfold capillaroscopy (NFC) has been used to examine morphological and functional microcirculation changes in connective tissue diseases. It has been demonstrated that NFC patterns reflect abnormal microvascular dynamics, which may play a role in fibromyalgia (FM) syndrome. The aim of this study was to determine NFC patterns in FM, and their association with clinical features of FM. Methods A total of 67 patients with FM, and 30 age- and sex-matched healthy controls, were included. Nailfold capillary patterns were quantitatively analyzed using computerized NFC. The parameters of interest were as follows: number of capillaries within the central 3 mm, deletion score, apical limb width, capillary width, and capillary dimension. Capillary dimension was determined by calculating the number of capillaries using Adobe Photoshop version 7.0. Results FM patients had a lower number of capillaries and higher deletion scores on NFC compared to healthy controls (17.3 ± 1.7 vs. 21.8 ± 2.9, p < 0.05; 2.2 ± 0.9 vs. 0.7 ± 0.6, p < 0.05, respectively). Both apical limb width (µm) and capillary width (µm) were significantly decreased in FM patients (1.1 ± 0.2 vs. 3.7 ± 0.6; 5.4 ± 0.5 vs. 7.5 ± 1.4, respectively), indicating that FM patients have abnormally decreased digital capillary diameter and density. Interestingly, there was no difference in capillary dimension between the two groups, suggesting that the length or tortuosity of capillaries in FM patients is increased to compensate for diminished microcirculation. Conclusions FM patients had altered capillary density and diameter in the digits. Diminished microcirculation on NFC may alter capillary density and increase tortuosity. PMID:26161020

  14. Quantitative analysis of wrist electrodermal activity during sleep

    PubMed Central

    Sano, Akane; Picard, Rosalind W.; Stickgold, Robert

    2015-01-01

We present the first quantitative characterization of electrodermal activity (EDA) patterns on the wrists of healthy adults during sleep using dry electrodes. We compare the new results on the wrist to prior findings on palmar or finger EDA by characterizing data measured from 80 nights of sleep consisting of 9 nights of wrist and palm EDA from 9 healthy adults sleeping at home, 56 nights of wrist and palm EDA from one healthy adult sleeping at home, and 15 nights of wrist EDA from 15 healthy adults in a sleep laboratory, with the latter compared to concurrent polysomnography. While high-frequency patterns of EDA called “storms” were identified by eye in the 1960s, we systematically compare thresholds for automatically detecting EDA peaks and establish criteria for EDA storms. We found that more than 80% of EDA peaks occurred in non-REM sleep, specifically during slow-wave sleep (SWS) and non-REM stage 2 sleep (NREM2). Also, EDA amplitude is higher in SWS than in other sleep stages. Longer EDA storms were more likely in the first two quarters of sleep and during SWS and NREM2. We also found from the home studies (65 nights) that EDA levels were higher and the skin conductance peaks were larger and more frequent when measured on the wrist than when measured on the palm. These high-frequency, high-amplitude EDA peaks were sometimes associated with higher skin temperature, but more work on neurological and other EDA elicitors is needed to elucidate their complete behavior. PMID:25286449
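    The two-step analysis described (threshold-based peak detection, then storm criteria) can be sketched simply. The threshold and window values below are illustrative placeholders, not the paper's calibrated criteria:

```python
# Simplified version of the analysis described above: detect skin-
# conductance peaks by thresholding the sample-to-sample rise, then
# flag "storms" as windows containing at least `min_peaks` peaks.
# Threshold, window, and the synthetic trace are illustrative only.

def detect_peaks(eda, rise_threshold=0.05):
    """Indices where the signal rises by more than rise_threshold
    between consecutive samples (a crude peak-onset detector)."""
    return [i for i in range(1, len(eda)) if eda[i] - eda[i - 1] > rise_threshold]

def find_storms(peaks, window=60, min_peaks=3):
    """Return (start, end) sample windows holding >= min_peaks peaks."""
    storms = []
    for i, p in enumerate(peaks):
        in_win = [q for q in peaks[i:] if q < p + window]
        if len(in_win) >= min_peaks:
            storms.append((p, p + window))
    return storms

# Synthetic trace: flat baseline with a burst of three abrupt rises.
eda = [1.0] * 100
for j in (40, 50, 60):
    eda[j] = eda[j - 1] + 0.2
peaks = detect_peaks(eda)
storms = find_storms(peaks)
print(peaks, storms)
```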

  15. Laser speckle imaging of rat retinal blood flow with hybrid temporal and spatial analysis method

    NASA Astrophysics Data System (ADS)

    Cheng, Haiying; Yan, Yumei; Duong, Timothy Q.

    2009-02-01

    Noninvasive monitoring of blood flow (BF) in the retinal circulation can reveal the progression and treatment response of ocular disorders such as diabetic retinopathy, age-related macular degeneration, and glaucoma. A non-invasive, direct BF measurement technique with high spatio-temporal resolution is needed for retinal imaging, and laser speckle imaging (LSI) is such a method. Currently, there are two analysis methods for LSI: spatial statistics LSI (SS-LSI) and temporal statistics LSI (TS-LSI). Comparing the two, SS-LSI has a higher signal-to-noise ratio (SNR), while TS-LSI is less susceptible to artifacts from stationary speckle. We propose a hybrid temporal and spatial analysis method (HTS-LSI) to measure retinal blood flow. A gas challenge experiment was performed and the images were analyzed by HTS-LSI. Results showed that HTS-LSI not only removes the stationary speckle but also increases the SNR. Under 100% O2, retinal BF decreased by 20-30%, consistent with results observed with the laser Doppler technique. As retinal blood flow is a critical physiological parameter whose perturbation has been implicated in the early stages of many retinal diseases, HTS-LSI should be an efficient method for early detection of retinal disease.
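The two statistics being hybridized can be illustrated with a minimal sketch, assuming the standard speckle-contrast definition K = σ/μ taken over a spatial window of one frame (SS) versus over a temporal stack at each pixel (TS); this is not the authors' code:

```python
import numpy as np

# Assumed standard speckle-contrast statistics: K = std/mean, computed either
# spatially (within one frame) or temporally (across a stack of frames).

def spatial_contrast(frame, w=5):
    """K_s over non-overlapping w x w windows of a single frame."""
    h, ww = (frame.shape[0] // w) * w, (frame.shape[1] // w) * w
    blocks = frame[:h, :ww].reshape(h // w, w, ww // w, w).swapaxes(1, 2)
    return blocks.std(axis=(2, 3)) / blocks.mean(axis=(2, 3))

def temporal_contrast(stack):
    """K_t pixel-wise over the time axis of a frame stack."""
    return stack.std(axis=0) / stack.mean(axis=0)

rng = np.random.default_rng(0)
stack = rng.gamma(shape=4.0, scale=1.0, size=(32, 20, 20))  # synthetic speckle
Ks = spatial_contrast(stack[0])
Kt = temporal_contrast(stack)
# Lower contrast corresponds to faster flow (stronger blurring of the speckle);
# a hybrid method combines both statistics to suppress stationary speckle.
```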

  16. A geomatic methodology for spatio-temporal analysis of climatologic variables and water related diseases

    NASA Astrophysics Data System (ADS)

    Quentin, E.; Gómez Albores, M. A.; Díaz Delgado, C.

    2009-04-01

    The main objective of this research is to propose, by way of geomatic developments, an integrated tool to analyze and model the spatio-temporal pattern of human diseases related to environmental conditions, in particular those linked to water resources. The geomatic developments follow four generic steps: requirements analysis, conceptual modeling, geomatic modeling, and implementation (in the Idrisi GIS software). A first development consists of preprocessing water, population, and health data in order to facilitate the conversion and validation of tabular data into the structure required for spatio-temporal analysis. Three parallel developments follow: water balance, demographic state and evolution, and epidemiological measures (morbidity and mortality rates, disease burden). The new geomatic modules in their current state have been tested on various regions of the Mexican Republic (Lerma watershed, Chiapas state), focusing on diarrhea and vector-borne diseases (dengue and malaria) and considering records over the last decade: a yearly as well as seasonal spreading trend can be observed, in correlation with precipitation and temperature data. From an ecohealth perspective, the geomatic approach proves particularly appropriate, since one of its purposes is the integration of the various spatial themes implied in the study problem, environmental as well as anthropogenic. Through powerful spatial analysis functions, it permits the detection of spatial trends which, combined with the temporal evolution, can be of particular use, for example in a climate change context, provided sufficiently valid historical data can be obtained.

  17. Joint modality fusion and temporal context exploitation for semantic video analysis

    NASA Astrophysics Data System (ADS)

    Papadopoulos, Georgios Th; Mezaris, Vasileios; Kompatsiaris, Ioannis; Strintzis, Michael G.

    2011-12-01

    In this paper, a multi-modal context-aware approach to semantic video analysis is presented. Overall, the examined video sequence is initially segmented into shots and for every resulting shot appropriate color, motion and audio features are extracted. Then, Hidden Markov Models (HMMs) are employed for performing an initial association of each shot with the semantic classes that are of interest separately for each modality. Subsequently, a graphical modeling-based approach is proposed for jointly performing modality fusion and temporal context exploitation. Novelties of this work include the combined use of contextual information and multi-modal fusion, and the development of a new representation for providing motion distribution information to HMMs. Specifically, an integrated Bayesian Network is introduced for simultaneously performing information fusion of the individual modality analysis results and exploitation of temporal context, contrary to the usual practice of performing each task separately. Contextual information is in the form of temporal relations among the supported classes. Additionally, a new computationally efficient method for providing motion energy distribution-related information to HMMs, which supports the incorporation of motion characteristics from previous frames to the currently examined one, is presented. The final outcome of this overall video analysis framework is the association of a semantic class with every shot. Experimental results as well as comparative evaluation from the application of the proposed approach to four datasets belonging to the domains of tennis, news and volleyball broadcast video are presented.
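The fusion idea, per-modality class scores combined with a temporal-context term, can be caricatured in a few lines (a toy product-of-scores sketch with invented class names, not the paper's integrated Bayesian network):

```python
# Toy sketch of joint modality fusion plus temporal context (my own
# simplification): per-modality HMM scores for each class are multiplied
# together with the probability of the class following the previous shot's
# class, then normalized.

def fuse(modality_scores, transition, prev_class):
    """modality_scores: {modality: {class: score}}; transition: {(prev, c): p}."""
    classes = next(iter(modality_scores.values())).keys()
    fused = {}
    for c in classes:
        p = transition.get((prev_class, c), 1e-6)   # temporal-context term
        for scores in modality_scores.values():      # modality fusion term
            p *= scores[c]
        fused[c] = p
    z = sum(fused.values())
    return {c: p / z for c, p in fused.items()}

scores = {"color": {"rally": 0.7, "break": 0.3},     # hypothetical HMM outputs
          "audio": {"rally": 0.4, "break": 0.6}}
trans = {("rally", "rally"): 0.8, ("rally", "break"): 0.2}
fused = fuse(scores, trans, prev_class="rally")
```

The paper's contribution is doing both steps jointly in one graphical model rather than as the sequential product shown here.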

  18. Balancing Yin and Yang: Teaching and Learning Qualitative Data Analysis Within an Undergraduate Quantitative Data Analysis Course.

    ERIC Educational Resources Information Center

    Clark, Roger; Lang, Angela

    2002-01-01

    Describes an undergraduate sociology course that taught qualitative and quantitative data analysis. Focuses on two students and how they dealt with and overcame anxiety issues, subsequently achieving higher levels of learning and new learning strategies. (KDR)

  19. Quantitative Immunofluorescence Analysis of Nucleolus-Associated Chromatin.

    PubMed

    Dillinger, Stefan; Németh, Attila

    2016-01-01

    The nuclear distribution of eu- and heterochromatin is nonrandom, heterogeneous, and dynamic, which is mirrored by specific spatiotemporal arrangements of histone posttranslational modifications (PTMs). Here we describe a semiautomated method for the analysis of histone PTM localization patterns within the mammalian nucleus using confocal laser scanning microscope images of fixed, immunofluorescence stained cells as data source. The ImageJ-based process includes segmentation of the nucleus, as well as measurements of total fluorescence intensity, the heterogeneity of the staining, and the frequency of the brightest pixels in the region of interest (ROI). In the presented image analysis pipeline, the perinucleolar chromatin is selected as the primary ROI, and the nuclear periphery as the secondary ROI.
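The per-ROI measurements listed above can be sketched as follows (a minimal Python stand-in for the ImageJ pipeline; function names, the CV as heterogeneity measure, and the 99th-percentile brightness cutoff are my own assumptions):

```python
import numpy as np

# Hypothetical sketch of the described per-ROI measurements: total intensity,
# staining heterogeneity (coefficient of variation), and the fraction of the
# image's brightest pixels falling inside the ROI.

def roi_stats(image, roi_mask, bright_percentile=99):
    vals = image[roi_mask]
    total = float(vals.sum())
    cv = float(vals.std() / vals.mean())              # heterogeneity of staining
    thresh = np.percentile(image, bright_percentile)
    bright = image >= thresh
    bright_fraction = float((bright & roi_mask).sum() / max(bright.sum(), 1))
    return {"total": total, "cv": cv, "bright_fraction": bright_fraction}

img = np.zeros((10, 10))
img[2:5, 2:5] = 100.0                      # bright perinucleolar region
mask = np.zeros_like(img, dtype=bool)
mask[2:5, 2:5] = True                      # primary ROI
stats = roi_stats(img, mask)
```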

  20. Physically-based quantitative analysis of soil erosion induced by heavy rainfall on steep slopes

    NASA Astrophysics Data System (ADS)

    Della Sala, Maria; Cuomo, Sabatino; Novità, Antonio

    2014-05-01

    Heavy rainstorms cause either shallow landslides or superficial soil erosion on steep hillslopes covered by coarse unsaturated soils (Cascini et al., 2013), even over large areas (Cuomo and Della Sala, 2013a). The triggering stage of both phenomena is related to ground infiltration, runoff and overland flow (Cuomo and Della Sala, 2013), which are key processes to be investigated. In addition, the mobilization of solid particles deserves proper physically-based modeling whenever a quantitative estimate of solid-particle discharge at the outlet of a mountain basin is required. In this work, approaches for superficial soil erosion analysis are first reviewed; then a relevant case study of two medium-sized mountain basins affected by flow-like phenomena with severe consequences (Cascini et al., 2009) is presented, which motivates a parametric numerical analysis with a physically-based model carried out for a wide class of soil properties and rainfall scenarios (Cuomo et al., 2013b). The results show that the peak discharge of water and solid particles driven by overland flow depends on rainfall intensity, while the volumetric solid concentration within the washout is related to the morphometric features of the whole mountain basin. Furthermore, soil suction emerges as a key factor in the spatial-temporal evolution of infiltration and runoff in the basin, also affecting the discharge of water and solid particles at the outlet. Based on these insights, selected cases are analyzed with the aim of providing a wide class of possible slope erosion scenarios. It is shown that, for the same cumulative rainfall, the sequence of high- and low-intensity rainfall events strongly affects the time-discharge curve at the outlet of the basin without significant variation of the maximum volumetric solid concentration. References Cascini, L., Cuomo, S., Ferlisi, S., Sorbino, G. (2009). Detection of mechanisms for destructive landslides in Campania region

  1. Concentration Analysis: A Quantitative Assessment of Student States.

    ERIC Educational Resources Information Center

    Bao, Lei; Redish, Edward F.

    Multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. However, traditional analysis often relies solely on scores (number of students giving the correct answer). This ignores what can be significant and important information: the distribution…

  2. Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.

    USGS Publications Warehouse

    Clark, R.N.; Roush, T.L.

    1984-01-01

    Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented.-from Authors

  3. Concentration Analysis: A Quantitative Assessment of Student States.

    ERIC Educational Resources Information Center

    Bao, Lei; Redish, Edward F.

    2001-01-01

    Explains that multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. Introduces a new method, concentration analysis, to measure how students' responses on multiple-choice questions are distributed. (Contains 18 references.) (Author/YDS)

  4. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

  5. EXPLoRA-web: linkage analysis of quantitative trait loci using bulk segregant analysis.

    PubMed

    Pulido-Tamayo, Sergio; Duitama, Jorge; Marchal, Kathleen

    2016-07-01

    Identification of genomic regions associated with a phenotype of interest is a fundamental step toward solving questions in biology and improving industrial research. Bulk segregant analysis (BSA) combined with high-throughput sequencing is a technique to efficiently identify these genomic regions associated with a trait of interest. However, distinguishing true from spuriously linked genomic regions and accurately delineating the genomic positions of these truly linked regions requires the use of complex statistical models currently implemented in software tools that are generally difficult to operate for non-expert users. To facilitate the exploration and analysis of data generated by bulked segregant analysis, we present EXPLoRA-web, a web service wrapped around our previously published algorithm EXPLoRA, which exploits linkage disequilibrium to increase the power and accuracy of quantitative trait loci identification in BSA analysis. EXPLoRA-web provides a user friendly interface that enables easy data upload and parallel processing of different parameter configurations. Results are provided graphically and as BED file and/or text file and the input is expected in widely used formats, enabling straightforward BSA data analysis. The web server is available at http://bioinformatics.intec.ugent.be/explora-web/.
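The underlying bulk segregant signal that EXPLoRA models can be illustrated with a toy delta-SNP-index sketch; this is only the generic BSA intuition (allele-frequency difference between bulks, smoothed across linked neighboring SNPs), not EXPLoRA's statistical model:

```python
# Generic BSA intuition (not EXPLoRA's algorithm): at each SNP, compare the
# alternative-allele frequency in the high-phenotype bulk against the low
# bulk; a sustained difference across neighboring SNPs (reflecting linkage
# disequilibrium) marks a putative QTL region.

def snp_index(alt_reads, total_reads):
    """Fraction of reads carrying the alternative allele at each SNP."""
    return [a / t for a, t in zip(alt_reads, total_reads)]

def delta_index(high, low, window=3):
    """Sliding-window mean of the per-SNP index difference between bulks."""
    d = [h - l for h, l in zip(high, low)]
    half = window // 2
    out = []
    for i in range(len(d)):
        seg = d[max(0, i - half):i + half + 1]
        out.append(sum(seg) / len(seg))
    return out

high = snp_index([9, 8, 9, 2, 1], [10] * 5)   # high-phenotype bulk (toy counts)
low = snp_index([1, 2, 1, 2, 1], [10] * 5)    # low-phenotype bulk
smoothed = delta_index(high, low)             # plateau marks a putative QTL
```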

  6. EXPLoRA-web: linkage analysis of quantitative trait loci using bulk segregant analysis

    PubMed Central

    Pulido-Tamayo, Sergio; Duitama, Jorge; Marchal, Kathleen

    2016-01-01

    Identification of genomic regions associated with a phenotype of interest is a fundamental step toward solving questions in biology and improving industrial research. Bulk segregant analysis (BSA) combined with high-throughput sequencing is a technique to efficiently identify these genomic regions associated with a trait of interest. However, distinguishing true from spuriously linked genomic regions and accurately delineating the genomic positions of these truly linked regions requires the use of complex statistical models currently implemented in software tools that are generally difficult to operate for non-expert users. To facilitate the exploration and analysis of data generated by bulked segregant analysis, we present EXPLoRA-web, a web service wrapped around our previously published algorithm EXPLoRA, which exploits linkage disequilibrium to increase the power and accuracy of quantitative trait loci identification in BSA analysis. EXPLoRA-web provides a user friendly interface that enables easy data upload and parallel processing of different parameter configurations. Results are provided graphically and as BED file and/or text file and the input is expected in widely used formats, enabling straightforward BSA data analysis. The web server is available at http://bioinformatics.intec.ugent.be/explora-web/. PMID:27105844

  8. Biomolecular computation with molecular beacons for quantitative analysis of target nucleic acids.

    PubMed

    Lim, Hee-Woong; Lee, Seung Hwan; Yang, Kyung-Ae; Yoo, Suk-In; Park, Tai Hyun; Zhang, Byoung-Tak

    2013-01-01

    Molecular beacons are efficient and useful tools for quantitative detection of specific target nucleic acids. Thanks to their simple protocol, molecular beacons have great potential as substrates for biomolecular computing. Here we present a molecular beacon-based biomolecular computing method for quantitative detection and analysis of target nucleic acids. Whereas the conventional quantitative assays using fluorescent dyes have been designed for single target detection or multiplexed detection, the proposed method enables us not only to detect multiple targets but also to compute their quantitative information by weighted-sum of the targets. The detection and computation are performed on a molecular level simultaneously, and the outputs are detected as fluorescence signals. Experimental results show the feasibility and effectiveness of our weighted detection and linear combination method using molecular beacons. Our method can serve as a primitive operation of molecular pattern analysis, and we demonstrate successful binary classifications of molecular patterns made of synthetic oligonucleotide DNA molecules.
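The weighted-sum readout can be mimicked in silico with toy numbers of my own (real beacons encode the weights physically, e.g. via probe concentration, and read the sum out as fluorescence):

```python
# Conceptual sketch of weighted detection and linear combination with
# molecular beacons (toy weights/concentrations, not the paper's data).

def weighted_fluorescence(concentrations, weights):
    """Molecular analog of a linear combination: signal = sum(w_i * c_i)."""
    return sum(w * c for w, c in zip(weights, concentrations))

def classify(pattern, weights, threshold):
    """Binary pattern classification by thresholding the summed signal."""
    return 1 if weighted_fluorescence(pattern, weights) >= threshold else 0

weights = [2.0, 0.1, 0.5]                     # hypothetical per-target weights
positive = classify([1.0, 0.2, 0.4], weights, threshold=1.0)   # signal 2.22
negative = classify([0.1, 1.0, 0.2], weights, threshold=1.0)   # signal 0.40
```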

  9. Quantitative analysis of cell-free DNA in ovarian cancer

    PubMed Central

    SHAO, XUEFENG; He, YAN; JI, MIN; CHEN, XIAOFANG; QI, JING; SHI, WEI; HAO, TIANBO; JU, SHAOQING

    2015-01-01

    The aim of the present study was to investigate the association between cell-free DNA (cf-DNA) levels and clinicopathological characteristics of patients with ovarian cancer using a branched DNA (bDNA) technique, and to determine the value of quantitative cf-DNA detection in assisting with the diagnosis of ovarian cancer. Serum specimens were collected from 36 patients with ovarian cancer on days 1, 3 and 7 following surgery, and additional serum samples were also collected from 22 benign ovarian tumor cases, and 19 healthy, non-cancerous ovaries. bDNA techniques were used to detect serum cf-DNA concentrations. All data were analyzed using SPSS version 18.0. The cf-DNA levels were significantly increased in the ovarian cancer group compared with those of the benign ovarian tumor group and healthy ovarian group (P<0.01). Furthermore, cf-DNA levels were significantly increased in stage III and IV ovarian cancer compared with those of stages I and II (P<0.01). In addition, cf-DNA levels were significantly increased on the first day post-surgery (P<0.01), and subsequently demonstrated a gradual decrease. In the ovarian cancer group, the area under the receiver operating characteristic curve of cf-DNA and the sensitivity were 0.917 and 88.9%, respectively, which was higher than those of cancer antigen 125 (0.724, 75%) and human epididymis protein 4 (0.743, 80.6%). There was a correlation between the levels of serum cf-DNA and the occurrence and development of ovarian cancer in the patients evaluated. bDNA techniques possessed higher sensitivity and specificity than other methods for the detection of serum cf-DNA in patients exhibiting ovarian cancer, and bDNA techniques are more useful for detecting cf-DNA than other factors. Thus, the present study demonstrated the potential value for the use of bDNA as an adjuvant diagnostic method for ovarian cancer. PMID:26788153
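The reported area under the ROC curve corresponds to the rank-based (Mann-Whitney) estimator, which is easy to sketch; values below are synthetic, not the study's data:

```python
# Rank-based AUC: the probability that a randomly chosen case has a higher
# marker value than a randomly chosen control (ties count as 0.5).

def auc(cases, controls):
    wins = 0.0
    for x in cases:
        for y in controls:
            wins += 1.0 if x > y else (0.5 if x == y else 0.0)
    return wins / (len(cases) * len(controls))

cases = [8.1, 7.4, 9.0, 6.8]       # e.g. serum cf-DNA in cancer (synthetic)
controls = [4.2, 5.0, 6.8, 3.9]    # benign/healthy sera (synthetic)
value = auc(cases, controls)       # 1.0 = perfect separation, 0.5 = chance
```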

  10. On the quantitative analysis and evaluation of magnetic hysteresis data

    NASA Astrophysics Data System (ADS)

    Jackson, Mike; Solheid, Peter

    2010-04-01

    Magnetic hysteresis data are centrally important in pure and applied rock magnetism, but to date, no objective quantitative methods have been developed for assessment of data quality and of the uncertainty in parameters calculated from imperfect data. We propose several initial steps toward such assessment, using loop symmetry as an important key. With a few notable exceptions (e.g., related to field cooling and exchange bias), magnetic hysteresis loops possess a high degree of inversion symmetry (M(H) = -M(-H)). This property enables us to treat the upper and lower half-loops as replicate measurements for quantification of random noise, drift, and offsets. This, in turn, makes it possible to evaluate the statistical significance of nonlinearity, either in the high-field region (due to nonsaturation of the ferromagnetic moment) or over the complete range of applied fields (due to nonnegligible contribution of ferromagnetic phases to the total magnetic signal). It also allows us to quantify the significance of fitting errors for model loops constructed from analytical basis functions. When a statistically significant high-field nonlinearity is found, magnetic parameters must be calculated by approach-to-saturation fitting, e.g., by a model of the form M(H) = M_s + χ_HF·H + α·H^β. This nonlinear high-field inverse modeling problem is strongly ill-conditioned, resulting in large and strongly covariant uncertainties in the fitted parameters, which we characterize through bootstrap analyses. For a variety of materials, including ferrihydrite and mid-ocean ridge basalts, measured in applied fields up to about 1.5 T, we find that the calculated value of the exponent β is extremely sensitive to small differences in the data or in the method of processing and that the overall uncertainty exceeds the range of physically reasonable values. The "unknowability" of β is accompanied by relatively large uncertainties in the other parameters, which can be characterized, if not
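The approach-to-saturation fit M(H) = M_s + χ_HF·H + α·H^β can be sketched by exploiting the fact that, for fixed β, the model is linear in the remaining parameters; the grid search over β below is my own simplification (the authors characterize uncertainties via bootstrap on top of such fits):

```python
import numpy as np

# Sketch (assumed implementation): grid-search the exponent beta; for each
# candidate, M(H) = Ms + chi_HF*H + alpha*H**beta is linear in (Ms, chi, alpha)
# and solvable by least squares. Keep the beta with the smallest residual.

def fit_fixed_beta(H, M, beta):
    A = np.column_stack([np.ones_like(H), H, H ** beta])
    coef, *_ = np.linalg.lstsq(A, M, rcond=None)
    rss = float(np.sum((A @ coef - M) ** 2))
    return coef, rss

def fit_approach_to_saturation(H, M, betas=np.linspace(-2.0, -0.5, 31)):
    best = min((fit_fixed_beta(H, M, b) + (b,) for b in betas),
               key=lambda t: t[1])
    (Ms, chi, alpha), _rss, beta = best
    return float(Ms), float(chi), float(alpha), float(beta)

H = np.linspace(0.5, 1.5, 40)              # applied field (T), synthetic
M = 1.0 + 0.02 * H + 0.1 * H ** -1.0       # synthetic upper branch, beta = -1
Ms, chi, alpha, beta = fit_approach_to_saturation(H, M)
# The ill-conditioning discussed above shows up as near-identical residuals
# across a broad range of beta values once noise is added.
```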

  11. Multi-temporal MRI carpal bone volumes analysis by principal axes registration

    NASA Astrophysics Data System (ADS)

    Ferretti, Roberta; Dellepiane, Silvana

    2016-03-01

    In this paper, a principal axes registration technique is presented, with a relevant application to segmented volumes. The purpose of the proposed registration is to compare multi-temporal volumes of carpal bones from Magnetic Resonance Imaging (MRI) acquisitions. Starting from the second-order moment matrix, the eigenvectors are calculated to allow rotation of the volumes with respect to reference axes. The volumes are then spatially translated so that they overlap. A quantitative evaluation of the results is carried out by computing classical indices from the confusion matrix, which provide similarity measures between volumes of the same organ extracted from MRI acquisitions performed at different times. Within the medical field, the use of registration to compare multi-temporal images is of great interest, since it provides the physician with a tool for visual monitoring of disease evolution. The segmentation method used herein is based on graph theory and is a robust, unsupervised, parameter-independent method. Patients affected by rheumatic diseases have been considered.
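The registration steps (second-order moment matrix, eigenvectors, rotation, translation) can be sketched on point clouds as follows; this is an assumed implementation, and note that eigenvector sign ambiguity would need resolving in a real pipeline:

```python
import numpy as np

# Sketch of principal-axes registration: align centroids, then rotate so the
# eigenvectors of the second-order moment (covariance) matrix coincide.

def principal_axes(points):
    c = points.mean(axis=0)
    cov = np.cov((points - c).T)            # second-order moment matrix
    w, v = np.linalg.eigh(cov)              # ascending eigenvalues
    return c, v[:, ::-1]                    # axes ordered major -> minor

def register(moving, fixed):
    cm, Rm = principal_axes(moving)
    cf, Rf = principal_axes(fixed)
    R = Rf @ Rm.T                           # rotate moving axes onto fixed axes
    return (moving - cm) @ R.T + cf

rng = np.random.default_rng(1)
fixed = rng.normal(size=(500, 3)) * np.array([5.0, 2.0, 1.0])  # elongated cloud
theta = 0.4
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
moving = fixed @ Rz.T + np.array([10.0, -3.0, 2.0])  # rotated + translated copy
aligned = register(moving, fixed)
# After registration the centroids and covariance structure coincide
# (up to the sign ambiguity of the eigenvectors).
```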

  12. Probabilistic reliability analysis, quantitative safety goals, and nuclear licensing in the United Kingdom.

    PubMed

    Cannell, W

    1987-09-01

    Although unpublicized, the use of quantitative safety goals and probabilistic reliability analysis for licensing nuclear reactors has become a reality in the United Kingdom. This conclusion results from an examination of the process leading to the licensing of the Sizewell B PWR in England. The licensing process for this reactor has substantial implications for nuclear safety standards in Britain, and is examined in the context of the growing trend towards quantitative safety goals in the United States. PMID:3685540

  13. Concentration analysis: A quantitative assessment of student states

    NASA Astrophysics Data System (ADS)

    Bao, Lei; Redish, Edward F.

    2001-07-01

    Multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. However, traditional analysis often relies solely on scores (number of students giving the correct answer). This ignores what can be significant and important information: the distribution of wrong answers given by the class. In this paper we introduce a new method, concentration analysis, to measure how students' responses on multiple-choice questions are distributed. This information can be used to study if the students have common incorrect models or if the question is effective in detecting student models. When combined with information obtained from qualitative research, the method allows us to identify cleanly what FCI results are telling us about student knowledge.
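The concentration measure can be sketched as below; the formula is the one commonly attributed to Bao and Redish and should be verified against the paper before reuse:

```python
import math

# Concentration factor (form as commonly cited for Bao & Redish; verify
# against the paper): for m answer choices and response counts n_1..n_m over
# N students, C = sqrt(m)/(sqrt(m)-1) * (sqrt(sum n_i^2)/N - 1/sqrt(m)).
# C = 0 for uniformly spread responses; C = 1 when all pick the same choice.

def concentration(counts):
    m, N = len(counts), sum(counts)
    norm = math.sqrt(sum(n * n for n in counts)) / N
    return math.sqrt(m) / (math.sqrt(m) - 1) * (norm - 1 / math.sqrt(m))

c_all = concentration([25, 0, 0, 0, 0])    # whole class picks one answer
c_unif = concentration([5, 5, 5, 5, 5])    # responses spread uniformly
```

High concentration on a wrong answer is the signature of a common incorrect model, which is exactly what the method is designed to surface.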

  15. Integrated quantitative fractal polarimetric analysis of monolayer lung cancer cells

    NASA Astrophysics Data System (ADS)

    Shrestha, Suman; Zhang, Lin; Quang, Tri; Farrahi, Tannaz; Narayan, Chaya; Deshpande, Aditi; Na, Ying; Blinzler, Adam; Ma, Junyu; Liu, Bo; Giakos, George C.

    2014-05-01

    Digital diagnostic pathology has become one of the most valuable and convenient technological advancements in recent years. It allows us to acquire, store, and analyze pathological information from images of histological and immunohistochemical glass slides, which are scanned to create digital slides. In this study, efficient fractal, wavelet-based polarimetric techniques for histological analysis of monolayer lung cancer cells are introduced, and different monolayer cancer cell lines are studied. The outcome of this study indicates that applying fractal, wavelet polarimetric principles to the analysis of squamous carcinoma and adenocarcinoma cell lines may prove extremely useful in discriminating between healthy and lung cancer cells, as well as in differentiating among different lung cancer cell types.

  16. Development of spatial-temporal ventilation heterogeneity and probability analysis tools for hyperpolarized 3He magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Choy, S.; Ahmed, H.; Wheatley, A.; McCormack, D. G.; Parraga, G.

    2010-03-01

    We developed image analysis tools to evaluate spatial and temporal 3He magnetic resonance imaging (MRI) ventilation in asthma and cystic fibrosis. We also developed temporal ventilation probability maps to provide a way to describe and quantify ventilation heterogeneity over time, as a way to test respiratory exacerbations or treatment predictions and to provide a discrete probability measurement of 3He ventilation defect persistence.
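A temporal ventilation probability map of the kind described can be sketched as follows, assuming the straightforward definition (per-voxel fraction of time points in which the voxel is ventilated); this is not the authors' stated formula:

```python
import numpy as np

# Assumed definition of a temporal ventilation probability map: given
# co-registered binary ventilation masks from repeated 3He MRI scans, the
# per-voxel probability is the fraction of time points showing ventilation.

masks = np.array([[[1, 1, 0],
                   [0, 1, 0]],
                  [[1, 0, 0],
                   [0, 1, 1]],
                  [[1, 1, 0],
                   [0, 1, 0]]])               # 3 time points, 2x3 toy voxels
prob_map = masks.mean(axis=0)                 # ventilation probability per voxel
persistent_defect = prob_map == 0.0           # never ventilated across scans
```

Thresholding `prob_map` gives a discrete measure of defect persistence of the sort the abstract describes.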

  17. Watershed Planning within a Quantitative Scenario Analysis Framework.

    PubMed

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-01-01

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
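The landscape-based cumulative-effects modeling step reduces to multiple linear regression of an aquatic condition score on land-use stressor gradients; a generic sketch follows (predictor names and coefficients are illustrative, not the study's variables):

```python
import numpy as np

# Generic sketch of the MLR cumulative-effects step: fit condition ~ stressors
# by ordinary least squares, then reuse the fitted model under hypothetical
# future land-use scenarios.

rng = np.random.default_rng(7)
n = 60
mining = rng.uniform(0, 1, n)              # fraction of watershed mined (toy)
residential = rng.uniform(0, 1, n)         # fraction residential development
condition = 80 - 30 * mining - 15 * residential + rng.normal(0, 2, n)

X = np.column_stack([np.ones(n), mining, residential])
beta, *_ = np.linalg.lstsq(X, condition, rcond=None)
predicted = X @ beta                       # scenario analysis: substitute
                                           # future land-use values into X
```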

  19. Quantitative Analysis of PMLA Nanoconjugate Components after Backbone Cleavage

    PubMed Central

    Ding, Hui; Patil, Rameshwar; Portilla-Arias, Jose; Black, Keith L.; Ljubimova, Julia Y.; Holler, Eggehard

    2015-01-01

    Multifunctional polymer nanoconjugates containing multiple components show great promise in cancer therapy, but in most cases complete analysis of each component is difficult. Polymalic acid (PMLA) based nanoconjugates have demonstrated successful brain and breast cancer treatment. They consist of multiple components including targeting antibodies, Morpholino antisense oligonucleotides (AONs), and endosome escape moieties. The component analysis of PMLA nanoconjugates is extremely difficult using conventional spectrometry and HPLC methods. Taking advantage of the polyester nature of PMLA, which can be cleaved by ammonium hydroxide, we describe a method to analyze the antibody and AON content of nanoconjugates simultaneously using SEC-HPLC by selectively cleaving the PMLA backbone. The selected cleavage conditions degrade only PMLA, without affecting the integrity and biological activity of the antibody. Although the amount of antibody could also be determined using the bicinchoninic acid (BCA) method, our selective cleavage method gives more reliable results and is more powerful. Our approach provides a new direction for the component analysis of polymer nanoconjugates and nanoparticles. PMID:25894227

  20. Field-Based High-Throughput Plant Phenotyping Reveals the Temporal Patterns of Quantitative Trait Loci Associated with Stress-Responsive Traits in Cotton

    PubMed Central

    Pauli, Duke; Andrade-Sanchez, Pedro; Carmo-Silva, A. Elizabete; Gazave, Elodie; French, Andrew N.; Heun, John; Hunsaker, Douglas J.; Lipka, Alexander E.; Setter, Tim L.; Strand, Robert J.; Thorp, Kelly R.; Wang, Sam; White, Jeffrey W.; Gore, Michael A.

    2016-01-01

    The application of high-throughput plant phenotyping (HTPP) to continuously study plant populations under relevant growing conditions creates the possibility to more efficiently dissect the genetic basis of dynamic adaptive traits. Toward this end, we employed a field-based HTPP system that deployed sets of sensors to simultaneously measure canopy temperature, reflectance, and height on a cotton (Gossypium hirsutum L.) recombinant inbred line mapping population. The evaluation trials were conducted under well-watered and water-limited conditions in a replicated field experiment at a hot, arid location in central Arizona, with trait measurements taken at different times on multiple days across 2010–2012. Canopy temperature, normalized difference vegetation index (NDVI), height, and leaf area index (LAI) displayed moderate-to-high broad-sense heritabilities, as well as varied interactions among genotypes with water regime and time of day. Distinct temporal patterns of quantitative trait loci (QTL) expression were mostly observed for canopy temperature and NDVI, and varied across plant developmental stages. In addition, the strength of correlation between HTPP canopy traits and agronomic traits, such as lint yield, displayed a time-dependent relationship. We also found that the genomic positions of some QTL controlling HTPP canopy traits were shared with those of QTL identified for agronomic and physiological traits. This work demonstrates the novel use of a field-based HTPP system to study the genetic basis of stress-adaptive traits in cotton, and these results have the potential to facilitate the development of stress-resilient cotton cultivars. PMID:26818078

  1. An In Vivo Reporter to Quantitatively and Temporally Analyze the Effects of CDK4/6 Inhibitor-Based Therapies in Melanoma.

    PubMed

    Teh, Jessica L F; Purwin, Timothy J; Greenawalt, Evan J; Chervoneva, Inna; Goldberg, Allison; Davies, Michael A; Aplin, Andrew E

    2016-09-15

    Aberrant cell-cycle progression is a hallmark feature of cancer cells. Cyclin-dependent kinases 4 and 6 (CDK4/6) drive progression through the G1 stage of the cell cycle, at least in part, by inactivating the tumor suppressor, retinoblastoma. CDK4/6 are targetable and the selective CDK4/6 inhibitor, palbociclib, was recently FDA approved for the treatment of estrogen receptor-positive, HER2-negative advanced breast cancer. In cutaneous melanoma, driver mutations in NRAS and BRAF promote CDK4/6 activation, suggesting that inhibitors such as palbociclib are likely to provide therapeutic benefit in combination with BRAF inhibitors and/or MEK inhibitors that are FDA-approved. However, the determinants of the response to CDK4/6 inhibitors alone and in combination with other targeted inhibitors are poorly defined. Furthermore, in vivo systems to quantitatively and temporally measure the efficacy of CDK4/6 inhibitors and determine the extent that CDK activity is reactivated during acquired resistance are lacking. Here, we describe the heterogeneous effects of CDK4/6 inhibitors, the expression of antiapoptotic proteins that associate with response to CDK4/6 and MEK inhibitors, and the development of a luciferase-based reporter system to determine the effects of CDK4/6 inhibitors alone and in combination with MEK inhibitors in melanoma xenografts. These findings are likely to inform on-going and future clinical trials utilizing CDK4/6 inhibitors in cutaneous melanoma. Cancer Res; 76(18); 5455-66. ©2016 AACR. PMID:27488531

  2. The Quantitative Analysis of an Analgesic Tablet: An NMR Experiment for the Instrumental Analysis Course

    NASA Astrophysics Data System (ADS)

    Schmedake, Thomas A.; Welch, Lawrence E.

    1996-11-01

    A quantitative analysis experiment is outlined that uses 13C NMR. Initial work utilizes a known compound (acenaphthene) to assess the type of NMR experiment necessary to achieve a proportional response from all of the carbons in the compound. Both gated decoupling and inverse gated decoupling routines with a variety of delay times are inspected, in addition to investigation of paramagnetic additives in conjunction with inverse gated decoupling. Once the experiments with the known compound have illuminated the merits of the differing strategies for obtaining a proportional carbon response, a quantitative assessment of an unknown analgesic tablet is undertaken. The amounts of the two major components of the tablet, acetaminophen and aspirin, are determined following addition of an internal standard to the mixture. The carbon resonances emanating from each compound can be identified using spectra of the pure analgesic components and internal standard. Knowing the concentration of the internal standard and assuming a proportional response from all carbons in the sample allows calculation of the amount of both analytes in the analgesic tablets. Data from an initial laboratory trial are presented that illustrate the accuracy of the procedure.
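
    The final calculation the experiment calls for can be sketched as follows. The peak areas, the internal-standard amount, and the choice of acetaminophen as the analyte are hypothetical numbers for illustration, not data from the paper:

```python
def quantify_by_internal_standard(area_analyte, n_carbons_analyte,
                                  area_std, n_carbons_std,
                                  mmol_std, molar_mass_analyte):
    """Amount of analyte (mg) from integrated 13C peak areas, assuming
    every carbon gives a proportional (fully relaxed, NOE-free) response."""
    per_c_analyte = area_analyte / n_carbons_analyte  # response per carbon
    per_c_std = area_std / n_carbons_std
    mmol_analyte = mmol_std * per_c_analyte / per_c_std
    return mmol_analyte * molar_mass_analyte          # mg, since mmol * g/mol

# Hypothetical run: acetaminophen (8 carbons, 151.16 g/mol) against
# 0.50 mmol of an internal standard that contributes 6 carbons.
mg = quantify_by_internal_standard(
    area_analyte=240.0, n_carbons_analyte=8,
    area_std=180.0, n_carbons_std=6,
    mmol_std=0.50, molar_mass_analyte=151.16)
# With these invented areas both species respond 30 area units per
# carbon, so 0.50 mmol of analyte is recovered (75.58 mg).
```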

  3. Quantitative analysis of nitrocellulose and pulp in gunpowder by using thermogravimetric analysis/FTIR

    NASA Astrophysics Data System (ADS)

    Johnson, David J.; Compton, David A.

    1989-12-01

    Thermogravimetric Analysis (TGA) has routinely been used to quantitatively determine the presence of a specific component within a material by direct measurement from the weight loss profile. This technique works well when it is known that the detected weight loss was caused only by that component. If more than one material evolves during a single weight loss it is impossible to quantify the contribution of each individual component by using stand-alone TGA. However, by coupling an FT-IR to the TGA one may assign evolved gases to a detected weight loss and potentially isolate each individual material. Although a number of gases may evolve during one weight loss, the judicious selection of "Specific Gas Profiles" (SGPs) may allow the experimentalist to isolate each gas. The SGP is a measure of IR absorbance within specific frequency regions as a function of time. Through the use of standards, integration of these profiles allows the operator to quantitate the various components in an unknown. Data from this research will show that nitrocellulose and pulp content in gunpowder samples may be measured using the TGA/FT-IR technique.

  4. Development of an apnea detection algorithm based on temporal analysis of thoracic respiratory effort signal

    NASA Astrophysics Data System (ADS)

    Dell'Aquila, C. R.; Cañadas, G. E.; Correa, L. S.; Laciar, E.

    2016-04-01

    This work describes the design of an algorithm for detecting apnea episodes based on analysis of the thoracic respiratory effort signal. Inspiration and expiration times, and the amplitude range of each respiratory cycle, were evaluated. For the range analysis, the standard deviation was computed over temporal windows of the respiratory signal. Performance was validated on 8 records of the Apnea-ECG database, which includes annotations of apnea episodes. The results are: sensitivity (Se) 73%, specificity (Sp) 83%. These values could be improved by eliminating artifacts from the signal records.
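
    A minimal version of the windowed standard-deviation rule might look like the sketch below. The window length, threshold, and synthetic signal are illustrative assumptions, not the authors' parameters:

```python
import numpy as np

def detect_apnea(effort, fs, win_s=10.0, rel_thresh=0.25):
    """Flag windows of a thoracic respiratory effort signal whose
    amplitude (std over the window) collapses below a fraction of the
    whole-record std, a simple proxy for an apnea episode."""
    win = int(win_s * fs)
    baseline = np.std(effort)
    flags = []
    for i in range(len(effort) // win):
        seg = effort[i * win:(i + 1) * win]
        flags.append(np.std(seg) < rel_thresh * baseline)
    return np.array(flags)

# Synthetic record: 60 s of breathing at 0.25 Hz with a flat 20 s gap
# (seconds 20-40) standing in for an apnea episode.
fs = 10
t = np.arange(0, 60, 1 / fs)
sig = np.sin(2 * np.pi * 0.25 * t)
sig[200:400] = 0.05 * np.random.default_rng(0).standard_normal(200)
flags = detect_apnea(sig, fs)  # windows 2 and 3 (the gap) are flagged
```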

  5. Using principal component analysis to monitor spatial and temporal changes in water quality.

    PubMed

    Bengraïne, Karim; Marhaba, Taha F

    2003-06-27

    Chemical, biological and physical data monitored at 12 locations along the Passaic River, New Jersey, during the year 1998 are analyzed. Principal component analysis (PCA) was used: (i) to extract the factors associated with the hydrochemistry variability; (ii) to obtain the spatial and temporal changes in the water quality. Solute content, temperature, nutrients and organics were the main patterns extracted. The spatial analysis isolated two stations showing a possible point or non-point source of pollution. This study shows the importance of environmental monitoring associated with simple but powerful statistics to better understand a complex water system. PMID:12835021
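
    The PCA step can be sketched with plain numpy on an invented monitoring matrix; the variable names and station data below are hypothetical, not the Passaic River measurements:

```python
import numpy as np

# Hypothetical monitoring matrix: rows = station-month samples,
# columns = water-quality variables. Conductivity and chloride are
# generated to co-vary, mimicking a common "solute content" factor.
rng = np.random.default_rng(42)
solute = rng.normal(0, 1, 48)
data = np.column_stack([
    solute + 0.1 * rng.normal(size=48),  # conductivity
    rng.normal(size=48),                 # temperature
    solute + 0.1 * rng.normal(size=48),  # chloride (tracks solutes)
    rng.normal(size=48),                 # nitrate
])

# PCA via SVD of the standardized matrix.
Z = (data - data.mean(0)) / data.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / (s**2).sum()  # variance fraction per component
loadings = Vt                    # rows = components, cols = variables
scores = U * s                   # sample scores for spatial/temporal maps
```

    The first component loads on the two correlated solute variables, which is how patterns such as "solute content" are extracted; plotting the scores by station and month gives the spatial and temporal changes.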

  6. A Quantitative Analysis of the Extrinsic and Intrinsic Turnover Factors of Relational Database Support Professionals

    ERIC Educational Resources Information Center

    Takusi, Gabriel Samuto

    2010-01-01

    This quantitative analysis explored the intrinsic and extrinsic turnover factors of relational database support specialists. Two hundred and nine relational database support specialists were surveyed for this research. The research was conducted based on Hackman and Oldham's (1980) Job Diagnostic Survey. Regression analysis and a univariate ANOVA…

  7. Integrating Data Analysis (IDA): Working with Sociology Departments to Address the Quantitative Literacy Gap

    ERIC Educational Resources Information Center

    Howery, Carla B.; Rodriguez, Havidan

    2006-01-01

    The NSF-funded Integrating Data Analysis (IDA) Project undertaken by the American Sociological Association (ASA) and the Social Science Data Analysis Network sought to close the quantitative literacy gap for sociology majors. Working with twelve departments, the project built on lessons learned from ASA's Minority Opportunities through School…

  8. A fully automated method for quantitative cerebral hemodynamic analysis using DSC-MRI.

    PubMed

    Bjørnerud, Atle; Emblem, Kyrre E

    2010-05-01

    Dynamic susceptibility contrast (DSC)-based perfusion analysis from MR images has become an established method for analysis of cerebral blood volume (CBV) in glioma patients. To date, little emphasis has, however, been placed on quantitative perfusion analysis of these patients, mainly due to the associated increased technical complexity and lack of sufficient stability in a clinical setting. The aim of our study was to develop a fully automated analysis framework for quantitative DSC-based perfusion analysis. The method presented here generates quantitative hemodynamic maps without user interaction, combined with automatic segmentation of normal-appearing cerebral tissue. Validation of 101 patients with confirmed glioma after surgery gave mean values for CBF, CBV, and MTT, extracted automatically from normal-appearing whole-brain white and gray matter, in good agreement with literature values. The measured age- and gender-related variations in the same parameters were also in agreement with those in the literature. Several established analysis methods were compared and the resulting perfusion metrics depended significantly on method and parameter choice. In conclusion, we present an accurate, fast, and automatic quantitative perfusion analysis method where all analysis steps are based on raw DSC data only. PMID:20087370
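
    As one hedged illustration of what a DSC perfusion step computes, a relative CBV can be taken as the area under the tissue concentration curve normalized by the area under the arterial input function (AIF). The gamma-variate curves below are synthetic, and this is a textbook simplification, not the authors' full automated pipeline:

```python
import numpy as np

t = np.linspace(0.0, 60.0, 241)  # seconds, 0.25 s sampling

def gamma_variate(t, t0, amp, alpha, beta):
    """Classic gamma-variate bolus shape used to model DSC curves."""
    c = np.zeros_like(t)
    m = t > t0
    c[m] = amp * (t[m] - t0)**alpha * np.exp(-(t[m] - t0) / beta)
    return c

aif = gamma_variate(t, 10.0, 1.0, 3.0, 1.5)      # arterial input function
tissue = gamma_variate(t, 12.0, 0.08, 3.0, 2.5)  # tissue response

# Relative CBV: area under tissue curve over area under AIF
# (uniform time grid, so sample sums stand in for the integrals).
rel_cbv = tissue.sum() / aif.sum()
```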

  9. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    ERIC Educational Resources Information Center

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…

  10. Digitally Enhanced Thin-Layer Chromatography: An Inexpensive, New Technique for Qualitative and Quantitative Analysis

    ERIC Educational Resources Information Center

    Hess, Amber Victoria Irish

    2007-01-01

    A study conducted shows that if digital photography is combined with regular thin-layer chromatography (TLC), it could perform highly improved qualitative analysis as well as make accurate quantitative analysis possible for a much lower cost than commercial equipment. The findings suggest that digitally enhanced TLC (DE-TLC) is low-cost and easy…

  11. A Quantitative Content Analysis of Mercer University MEd, EdS, and Doctoral Theses

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Gaiek, Lura S.; White, Torian A.; Slappey, Lisa A.; Chastain, Andrea; Harris, Rose Prejean

    2010-01-01

    Quantitative content analysis of a body of research not only helps budding researchers understand the culture, language, and expectations of scholarship, it helps identify deficiencies and inform policy and practice. Because of these benefits, an analysis of a census of 980 Mercer University MEd, EdS, and doctoral theses was conducted. Each thesis…

  12. Revisiting the quantitative features of surface-assisted laser desorption/ionization mass spectrometric analysis.

    PubMed

    Wu, Ching-Yi; Lee, Kai-Chieh; Kuo, Yen-Ling; Chen, Yu-Chie

    2016-10-28

    Surface-assisted laser desorption/ionization (SALDI) coupled with mass spectrometry (MS) is frequently used to analyse small organics owing to its clean background. Inorganic materials can be used as energy absorbers and the transfer medium to facilitate the desorption/ionization of analytes; thus, they are used as SALDI-assisting materials. Many studies have demonstrated the usefulness of SALDI-MS in quantitative analysis of small organics. However, some characteristics occurring in SALDI-MS require certain attention to ensure the reliability of the quantitative analysis results. The appearance of a coffee-ring effect in SALDI sample preparation is the primary factor that can affect quantitative SALDI-MS analysis results. However, to the best of our knowledge, there are no reports relating to quantitative SALDI-MS analysis that discuss or consider this effect. In this study, the coffee-ring effect is discussed using nanoparticles and nanostructured substrates as SALDI-assisting materials to show how this effect influences SALDI-MS analysis results. Potential solutions for overcoming the existing problems are also suggested. This article is part of the themed issue 'Quantitative mass spectrometry'. PMID:27644973

  14. Temporal analysis of laser beam propagation in the atmosphere using computer-generated long phase screens.

    PubMed

    Dios, Federico; Recolons, Jaume; Rodríguez, Alejandro; Batet, Oscar

    2008-02-01

    Temporal analysis of the irradiance at the detector plane is intended as the first step in the study of the mean fade time in a free optical communication system. In the present work this analysis has been performed for a Gaussian laser beam propagating in the atmospheric turbulence by means of computer simulation. To this end, we have adapted a previously known numerical method to the generation of long phase screens. The screens are displaced in a transverse direction as the wave is propagated, in order to simulate the wind effect. The amplitude of the temporal covariance and its power spectrum have been obtained at the optical axis, at the beam centroid and at a certain distance from these two points. Results have been worked out for weak, moderate and strong turbulence regimes and when possible they have been compared with theoretical models. These results show a significant contribution of beam wander to the temporal behaviour of the irradiance, even in the case of weak turbulence. We have also found that the spectral bandwidth of the covariance is hardly dependent on the Rytov variance.

  15. Some remarks on the quantitative analysis of behavior.

    PubMed

    Marr, M J

    1989-01-01

    This paper discusses similarities between the mathematization of operant behavior and the early history of the most mathematical of sciences, physics. Galileo explored the properties of motion without dealing with the causes of motion, focusing on changes in motion. Newton's dynamics were concerned with the action of forces as causes of change. Skinner's rationale for using rate to describe behavior derived from an interest in changes in rate. Reinforcement has played the role of force in the dynamics of behavior. Behavioral momentum and maximization have received mathematical formulations in behavior analysis. Yet to be worked out are the relations between molar and molecular formulations of behavioral theory. PMID:22478028

  18. Spatio-temporal registration in multiplane MRI acquisitions for 3D colon motility analysis

    NASA Astrophysics Data System (ADS)

    Kutter, Oliver; Kirchhoff, Sonja; Berkovich, Marina; Reiser, Maximilian; Navab, Nassir

    2008-03-01

    In this paper we present a novel method for analyzing and visualizing dynamic peristaltic motion of the colon in 3D from two series of differently oriented 2D MRI images. To this end, we have defined an MRI examination protocol, and introduced methods for spatio-temporal alignment of the two MRI image series into a common reference. This represents the main contribution of this paper, which enables the 3D analysis of peristaltic motion. The objective is to provide a detailed insight into this complex motion, aiding in the diagnosis and characterization of colon motion disorders. We have applied the proposed spatio-temporal method on Cine MRI data sets of healthy volunteers. The results have been inspected and validated by an expert radiologist. Segmentation and cylindrical approximation of the colon results in a 4D visualization of the peristaltic motion.

  19. Mass Spectrometric Analysis of Spatio-Temporal Dynamics of Crustacean Neuropeptides

    PubMed Central

    OuYang, Chuanzi; Liang, Zhidan; Li, Lingjun

    2014-01-01

    Neuropeptides represent one of the largest classes of signaling molecules used by nervous systems to regulate a wide range of physiological processes. Over the past several years, mass spectrometry (MS)-based strategies have revolutionized the discovery of neuropeptides in numerous model organisms, especially in decapod crustaceans. Here, we focus our discussion on recent advances in the use of MS-based techniques to map neuropeptides in the spatial domain and to monitor their dynamic changes in the temporal domain. These MS-enabled investigations provide valuable information about the distribution, secretion and potential function of neuropeptides with high molecular specificity and sensitivity. In situ MS imaging and in vivo microdialysis are highlighted as key technologies for probing spatio-temporal dynamics of neuropeptides in the crustacean nervous system. This review summarizes the latest advancements in MS-based methodologies for neuropeptide analysis, including typical workflows and sample preparation strategies as well as major neuropeptide families discovered in decapod crustaceans. PMID:25448012

  20. Quantitative proteomic analysis of cold-responsive proteins in rice.

    PubMed

    Neilson, Karlie A; Mariani, Michael; Haynes, Paul A

    2011-05-01

    Rice is susceptible to cold stress and with a future of climatic instability we will be unable to produce enough rice to satisfy increasing demand. A thorough understanding of the molecular responses to thermal stress is imperative for engineering cultivars, which have greater resistance to low temperature stress. In this study we investigated the proteomic response of rice seedlings to 48, 72 and 96 h of cold stress at 12-14°C. The use of both label-free and iTRAQ approaches in the analysis of global protein expression enabled us to assess the complementarity of the two techniques for use in plant proteomics. The approaches yielded a similar biological response to cold stress despite a disparity in proteins identified. The label-free approach identified 236 cold-responsive proteins compared to 85 in iTRAQ results, with only 24 proteins in common. Functional analysis revealed differential expression of proteins involved in transport, photosynthesis, generation of precursor metabolites and energy; and, more specifically, histones and vitamin B biosynthetic proteins were observed to be affected by cold stress. PMID:21433000

  1. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
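
    The HU-based composition measurement can be sketched as simple range classification of voxels inside a muscle mask. The HU windows below are illustrative assumptions, not the exact values used in the studies reviewed:

```python
import numpy as np

# Illustrative HU windows for the tissue classes named in the review.
HU_RANGES = {
    "fat": (-200, -10),
    "loose_connective_or_atrophic": (-9, 30),
    "normal_muscle": (31, 100),
}

def composition(hu_values):
    """Percent of masked voxels falling in each HU window."""
    total = len(hu_values)
    out = {}
    for name, (lo, hi) in HU_RANGES.items():
        n = np.count_nonzero((hu_values >= lo) & (hu_values <= hi))
        out[name] = 100.0 * n / total
    return out

# Synthetic voxel sample: 20% fat-range, 30% intermediate, 50% muscle.
hu = np.concatenate([np.full(20, -80), np.full(30, 10), np.full(50, 55)])
pct = composition(hu)
```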

  2. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed as detailed and as accurately as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as a 3D curve along front, back and the thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness that will fit most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).

  3. Fast multidimensional ensemble empirical mode decomposition for the analysis of big spatio-temporal datasets

    PubMed Central

    Wu, Zhaohua; Feng, Jiaxin; Qiao, Fangli; Tan, Zhe-Min

    2016-01-01

    In this big data era, it is more urgent than ever to solve two major issues: (i) fast data transmission methods that can facilitate access to data from non-local sources and (ii) fast and efficient data analysis methods that can reveal the key information from the available data for particular purposes. Although approaches in different fields to address these two questions may differ significantly, the common part must involve data compression techniques and a fast algorithm. This paper introduces the recently developed adaptive and spatio-temporally local analysis method, namely the fast multidimensional ensemble empirical mode decomposition (MEEMD), for the analysis of a large spatio-temporal dataset. The original MEEMD uses ensemble empirical mode decomposition to decompose time series at each spatial grid and then pieces together the temporal–spatial evolution of climate variability and change on naturally separated timescales, which is computationally expensive. By taking advantage of the high efficiency of the expression using principal component analysis/empirical orthogonal function analysis for spatio-temporally coherent data, we design a lossy compression method for climate data to facilitate its non-local transmission. We also explain the basic principles behind the fast MEEMD through decomposing principal components instead of original grid-wise time series to speed up computation of MEEMD. Using a typical climate dataset as an example, we demonstrate that our newly designed methods can (i) compress data with a compression rate of one to two orders of magnitude; and (ii) speed up the MEEMD algorithm by one to two orders of magnitude. PMID:26953173
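
    The compression idea can be sketched with a plain SVD on a synthetic field. The grid sizes, the two coherent modes, and the noise level below are invented; only the leading principal components and spatial patterns need to be stored or transmitted:

```python
import numpy as np

rng = np.random.default_rng(1)
nt, nx = 200, 500  # time steps x grid points

# Synthetic spatio-temporally coherent field: two modes plus weak noise.
t = np.arange(nt)
field = (np.outer(np.sin(2 * np.pi * t / 50), rng.normal(size=nx))
         + np.outer(np.cos(2 * np.pi * t / 120), rng.normal(size=nx))
         + 0.01 * rng.normal(size=(nt, nx)))

# Keep only the k leading PCs (time series) and EOFs (spatial patterns).
U, s, Vt = np.linalg.svd(field, full_matrices=False)
k = 2
pcs, eofs = U[:, :k] * s[:k], Vt[:k]
reconstructed = pcs @ eofs

ratio = (nt * nx) / (k * (nt + nx))  # storage compression factor
err = np.linalg.norm(field - reconstructed) / np.linalg.norm(field)
```

    Decomposing the k principal-component time series with EEMD, instead of all nx grid-point series, is what gives the fast MEEMD its speedup.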

  4. [Quantitative analysis of seven phenolic acids in eight Yinqiao Jiedu serial preparations by quantitative analysis of multi-components with single-marker].

    PubMed

    Wang, Jun-jun; Zhang, Li; Guo, Qing; Kou, Jun-ping; Yu, Bo-yang; Gu, Dan-hua

    2015-04-01

    The study aims to develop a unified method to determine seven phenolic acids (neochlorogenic acid, chlorogenic acid, 4-caffeoylquinic acid, caffeic acid, isochlorogenic acid B, isochlorogenic acid A and isochlorogenic acid C) contained in honeysuckle flower, the monarch drug of all eight Yinqiao Jiedu serial preparations, using quantitative analysis of multi-components by single-marker (QAMS). Firstly, chlorogenic acid was used as a reference to obtain the average relative correction factors (RCFs) of the other phenolic acids in ratios to the reference; columns and instruments from different companies were used to validate the durability of the achieved RCFs at different levels of standard solutions; and honeysuckle flower extract was used as the reference substance to fix the positions of chromatographic peaks. Secondly, the contents of the seven phenolic acids in eight different Yinqiao Jiedu serial preparation samples were calculated based on the validated RCFs. Finally, the quantitative results were compared between QAMS and the external standard (ES) method. The results showed that the durability of the achieved RCFs is good (RSD 0.80%-2.56%), and there are no differences between the quantitative results of QAMS and ES (relative average deviation < 0.93%). QAMS can thus be successfully used for the quantitative control of honeysuckle flower principally prescribed in Yinqiao Jiedu serial preparations. PMID:26223132
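
    The RCF bookkeeping at the heart of QAMS can be sketched directly: one calibration run fixes each analyte's response relative to the single marker, after which the marker alone quantifies every analyte. All peak areas and concentrations below are invented:

```python
def rcf(area_ref, conc_ref, area_k, conc_k):
    """Relative correction factor from a calibration run:
    RCF_k = (A_ref / C_ref) / (A_k / C_k)."""
    return (area_ref / conc_ref) / (area_k / conc_k)

def conc_from_marker(area_k, rcf_k, area_ref, conc_ref):
    """Concentration of analyte k in a sample using only the marker
    standard: C_k = A_k * RCF_k / (A_ref / C_ref)."""
    return area_k * rcf_k / (area_ref / conc_ref)

# Calibration with chlorogenic acid (marker) and, e.g., caffeic acid.
f = rcf(area_ref=1200.0, conc_ref=10.0, area_k=900.0, conc_k=10.0)

# Sample run: recover the caffeic acid concentration from its area and
# the marker's area/concentration in the same run.
c = conc_from_marker(area_k=450.0, rcf_k=f, area_ref=600.0, conc_ref=5.0)
```

    With these invented numbers the analyte responds 90 area units per concentration unit, so an area of 450 correctly recovers a concentration of 5.0 without an analyte-specific standard.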

  5. Quantitative Analysis of Photoactivated Localization Microscopy (PALM) Datasets Using Pair-correlation Analysis

    PubMed Central

    Sengupta, Prabuddha; Lippincott-Schwartz, Jennifer

    2013-01-01

    Pointillistic approach based super-resolution techniques, such as photoactivated localization microscopy (PALM), involve multiple cycles of sequential activation, imaging and precise localization of single fluorescent molecules. A super-resolution image, having nanoscopic structural information, is then constructed by compiling all the image sequences. Because the final image resolution is determined by the localization precision of detected single molecules and their density, accurate image reconstruction requires imaging of biological structures labeled with fluorescent molecules at high density. In such image datasets, stochastic variations in photon emission and intervening dark states lead to uncertainties in identification of single molecules. This, in turn, prevents the proper utilization of the wealth of information on molecular distribution and quantity. A recent strategy for overcoming this problem is pair-correlation analysis applied to PALM. Using rigorous statistical algorithms to estimate the number of detected proteins, this approach allows the spatial organization of molecules to be quantitatively described. PMID:22447653

  6. Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.

    PubMed

    He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan

    2009-07-01

    Facial paralysis is the loss of voluntary muscle movement of one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on the local binary patterns (LBPs) on the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. Support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrates its accuracy and efficiency.
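
    Two building blocks of this framework, a basic single-scale LBP code and the resistor-average distance RAD(p, q) = 1 / (1/KL(p||q) + 1/KL(q||p)), can be sketched on toy data. The multiresolution extension and the block-processing schemes of the paper are omitted, and the histograms below are invented:

```python
import numpy as np

def lbp_code(patch):
    """LBP of the centre pixel of a 3x3 patch, neighbours read
    clockwise from the top-left corner."""
    c = patch[1, 1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2),
             (2, 2), (2, 1), (2, 0), (1, 0)]
    return sum(int(patch[i, j] >= c) << b for b, (i, j) in enumerate(order))

def kl(p, q):
    """Kullback-Leibler divergence (q assumed nonzero where p > 0)."""
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def rad(p, q):
    """Resistor-average distance between two LBP histograms."""
    return 1.0 / (1.0 / kl(p, q) + 1.0 / kl(q, p))

patch = np.array([[5, 9, 1], [7, 6, 3], [8, 2, 4]])
code = lbp_code(patch)  # 8-bit pattern code for the centre pixel

# Toy LBP histograms for the two sides of the face.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])
symmetry = rad(p, q)  # larger value = less symmetric movement
```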

  7. Quantitative real-time single particle analysis of virions.

    PubMed

    Heider, Susanne; Metzner, Christoph

    2014-08-01

    Providing information about single virus particles has for a long time been mainly the domain of electron microscopy. More recently, technologies have been developed, or adapted from other fields such as nanotechnology, to allow for the real-time quantification of physical virion particles, while supplying additional information such as particle diameter concomitantly. These technologies have progressed to the stage of commercialization, increasing the speed of viral titer measurements from hours to minutes, thus providing a significant advantage for many aspects of virology research and biotechnology applications. Additional advantages lie in the broad spectrum of virus species that may be measured and the possibility to determine the ratio of infectious to total particles. A series of disadvantages remain associated with these technologies, such as a low specificity for viral particles. In this review we will discuss these technologies by comparing four systems for real-time single virus particle analysis and quantification.

  8. Quantitative Analysis of Spectral Impacts on Silicon Photodiode Radiometers: Preprint

    SciTech Connect

    Myers, D. R.

    2011-04-01

    Inexpensive broadband pyranometers with silicon photodiode detectors have a non-uniform spectral response over the spectral range of 300-1100 nm. The response region includes only about 70% to 75% of the total energy in the terrestrial solar spectral distribution from 300 nm to 4000 nm. The solar spectrum constantly changes with solar position and atmospheric conditions. Relative spectral distributions of diffuse hemispherical irradiance sky radiation and total global hemispherical irradiance are drastically different. This analysis convolves a typical photodiode response with SMARTS 2.9.5 spectral model spectra for different sites and atmospheric conditions. Differences in solar component spectra lead to differences on the order of 2% in global hemispherical and 5% or more in diffuse hemispherical irradiances from silicon radiometers. The result is that errors of more than 7% can occur in the computation of direct normal irradiance from global hemispherical irradiance and diffuse hemispherical irradiance using these radiometers.
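The spectral-coverage argument can be reproduced with a toy calculation. The abstract's actual numbers come from SMARTS 2.9.5 model spectra convolved with a measured photodiode response, neither of which is used here; instead, an idealized flat 300-1100 nm response applied to a Planck-shaped spectrum yields a coverage fraction close to the quoted 70-75% figure:

```python
import numpy as np

wl = np.linspace(300.0, 4000.0, 4000)   # wavelength grid, nm
dl = wl[1] - wl[0]

def toy_solar(wl_nm):
    """Planck-shaped toy spectrum at ~5800 K (arbitrary vertical scale)."""
    wl_m = wl_nm * 1e-9
    h, c, k, T = 6.626e-34, 2.998e8, 1.381e-23, 5800.0
    return (1.0 / wl_m ** 5) / (np.exp(h * c / (wl_m * k * T)) - 1.0)

spectrum = toy_solar(wl)
# crude photodiode response: flat inside 300-1100 nm, zero elsewhere
response = ((wl >= 300.0) & (wl <= 1100.0)).astype(float)

total = (spectrum * dl).sum()                # broadband energy, 300-4000 nm
sensed = (spectrum * response * dl).sum()    # energy the detector responds to
coverage = sensed / total                    # fraction the photodiode "sees"
```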

  9. Quantitative analysis by mid-infrared spectrometry in food and agro-industrial fields

    NASA Astrophysics Data System (ADS)

    Dupuy, Nathalie; Huvenne, J. P.; Sombret, B.; Legrand, P.

    1993-03-01

    Thanks to the Fourier transform, infrared spectroscopy can now serve as a state-of-the-art technique in quality-control laboratories, given its precision and the time it saves compared with traditional analysis methods such as HPLC chromatography. Moreover, the growing number of new mathematical regression methods, such as Partial Least Squares (PLS) regression, allows multicomponent quantitative analysis of mixtures. Nevertheless, the efficiency of infrared spectrometry as a quantitative analysis method often depends on the choice of an adequate presentation for the sample. In this document, we demonstrate several techniques, such as diffuse reflectance and Attenuated Total Reflectance (ATR), which can be chosen according to the various physical states of the mixtures. The quantitative analysis of real samples from the food industry enables us to estimate the method's precision. For instance, the analysis of the three main components (glucose, fructose, and maltose) in glucose syrups can be done (using ATR) with a precision in the region of 3%, whereas the time required to obtain an analysis report is about 5 minutes. Finally, multicomponent quantitative analysis is quite feasible by mid-IR spectroscopy.

  10. Interlake production established using quantitative hydrocarbon well-log analysis

    SciTech Connect

    Lancaster, J.; Atkinson, A.

    1988-07-01

    Production was established in a new pay zone of the basal Interlake Formation adjacent to production in Midway field in Williams County, North Dakota. Hydrocarbon saturation, which was computed using hydrocarbon well-log (mud-log) data, and computed permeability encouraged the operator to run casing and test this zone. By use of drilling rig parameters, drilling mud properties, hydrocarbon-show data from the mud log, drilled rock and porosity descriptions, and wireline log porosity, this new technique computes oil saturation (percent of porosity) and permeability to the invading filtrate, using the Darcy equation. The Leonardo Fee well was drilled to test the Devonian Duperow, the Silurian upper Interlake, and the Ordovician Red River. The upper two objectives were penetrated downdip from Midway production and there were no hydrocarbon shows. It was determined that the Red River was tight, based on sample examination by well site personnel. The basal Interlake, however, liberated hydrocarbon shows that were analyzed by this new technology. The results of this evaluation accurately predicted this well would be a commercial success when placed in production. Where geophysical log analysis might be questionable, this new evaluation technique may provide answers to anticipated oil saturation and producibility. The encouraging results of hydrocarbon saturation and permeability, produced by this technique, may be largely responsible for the well being in production today.
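The permeability computation mentioned above rests on Darcy's law; the article's full mud-log workflow is not reproduced here, but the underlying relation, in the CGS-style units that define the darcy, can be sketched as:

```python
def darcy_permeability(q_cm3_s, mu_cp, length_cm, area_cm2, dp_atm):
    """Permeability in darcys from Darcy's law, k = q * mu * L / (A * dp).

    q  : flow rate (cm^3/s)        mu : fluid viscosity (cP)
    L  : flow length (cm)          A  : cross-sectional area (cm^2)
    dp : pressure drop (atm)
    """
    return (q_cm3_s * mu_cp * length_cm) / (area_cm2 * dp_atm)

# by definition: 1 cm^3/s of a 1 cP fluid through a 1 cm^2 x 1 cm
# sample under a 1 atm pressure drop corresponds to 1 darcy
k = darcy_permeability(1.0, 1.0, 1.0, 1.0, 1.0)
```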

  11. Quantitative analysis of American woodcock nest and brood habitat

    USGS Publications Warehouse

    Bourgeois, A.; Keppie, Daniel M.; Owen, Ray B.

    1977-01-01

    Sixteen nest and 19 brood sites of American woodcock (Philohela minor) were examined in northern lower Michigan between 15 April and 15 June 1974 to determine the habitat structure associated with these sites. Woodcock hens utilized young, second-growth forest stands that were similar in species composition for both nesting and brood rearing. A multivariate discriminant function analysis revealed a significant (P < 0.05) difference, however, in habitat structure. Nest habitat was characterized by lower tree density (2176 trees/ha) and basal area (8.6 m²/ha), by proximity to forest openings (7 m), and by being situated on dry, relatively well-drained sites. In contrast, woodcock broods were located in sites that had nearly twice the tree density (3934 trees/ha) and basal area (16.5 m²/ha), were located over twice as far from forest openings (18 m), and generally occurred on damp sites near (8 m) standing water. The importance of these habitat features to the species and possible management implications are discussed.

  12. Mechanistic insights from a quantitative analysis of pollen tube guidance

    PubMed Central

    2010-01-01

    Background Plant biologists have long speculated about the mechanisms that guide pollen tubes to ovules. Although there is now evidence that ovules emit a diffusible attractant, little is known about how this attractant mediates interactions between the pollen tube and the ovules. Results We employ a semi-in vitro assay, in which ovules dissected from Arabidopsis thaliana are arranged around a cut style on artificial medium, to elucidate how ovules release the attractant and how pollen tubes respond to it. Analysis of microscopy images of the semi-in vitro system shows that pollen tubes are more attracted to ovules that are incubated on the medium for longer times before pollen tubes emerge from the cut style. The responses of tubes are consistent with their sensing a gradient of an attractant at 100-150 μm, farther than previously reported. Our microscopy images also show that pollen tubes slow their growth near the micropyles of functional ovules with a spatial range that depends on ovule incubation time. Conclusions We propose a stochastic model that captures these dynamics. In the model, a pollen tube senses a difference in the fraction of receptors bound to an attractant and changes its direction of growth in response; the attractant is continuously released from ovules and spreads isotropically on the medium. The model suggests that the observed slowing greatly enhances the ability of pollen tubes to successfully target ovules. The relation of the results to guidance in vivo is discussed. PMID:20170550

  13. Quantitative analysis of phenol oxidase activity in insect hemolymph.

    PubMed

    Sorrentino, Richard Paul; Small, Chiyedza N; Govind, Shubha

    2002-04-01

    We describe a simple, inexpensive, and robust protocol for the quantification of phenol oxidase activity in insect hemolymph. Discrete volumes of hemolymph from Drosophila melanogaster larvae are applied to pieces of filter paper soaked in an L-3,4-dihydroxyphenylalanine (L-DOPA) solution. Phenol oxidase present in the samples catalyzes melanin synthesis from the L-DOPA precursor, resulting in the appearance of a roughly circular melanized spot on the filter paper. The filter paper is then scanned and analyzed with image-processing software. Each pixel in an image is assigned a grayscale value. The mean of the grayscale values for a circular region of pixels at the center of the image of each spot is used to compute a melanization index (MI) value; the computation is based on a comparison to an external standard (India ink). Numerical MI values for control and experimental larvae can then be pooled and subjected to statistical analysis. This protocol was used to evaluate phenol oxidase activity in larvae of different backgrounds: wild-type, lozenge, hopscotch(Tumorous-lethal) (which induces the formation of large melanotic tumors), and the body-color mutants ebony and yellow. Our results demonstrate that this assay is sensitive enough for use in genetic screens with D. melanogaster and could conceivably be used for evaluation of MI from hemolymph of other insects.
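The image-analysis step can be sketched as follows. The exact scaling against the India-ink standard is an assumption (the abstract does not give the formula); here the index is simply the spot's mean darkness as a fraction of the ink's darkness:

```python
import numpy as np

def melanization_index(image, center, radius, ink_value):
    """Mean grayscale of a circular region at the spot center, scaled
    against an external standard (the grayscale of India ink).

    image: 2D array of grayscale values (0 = black, 255 = white)
    """
    yy, xx = np.indices(image.shape)
    cy, cx = center
    mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
    spot_mean = image[mask].mean()
    # darker spot (more melanin) -> higher index; ink defines the full scale
    return (255.0 - spot_mean) / (255.0 - ink_value)

# synthetic scan: white filter paper with a gray melanized spot in the middle
img = np.full((100, 100), 255.0)
yy, xx = np.indices(img.shape)
img[(yy - 50) ** 2 + (xx - 50) ** 2 <= 20 ** 2] = 120.0   # spot pixels
mi = melanization_index(img, (50, 50), 10, ink_value=0.0)
```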

  14. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
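The prediction sub-task can be illustrated with a numpy-only k-Nearest Neighbours sketch on synthetic two-component "spectra" (the paper's actual cocaine-mixture dataset is not reproduced here): the concentration of an unseen mixture is estimated as the mean concentration of the k training spectra closest to it in spectral space.

```python
import numpy as np

def knn_predict(train_spectra, train_conc, query, k=3):
    """Predict concentration as the mean over the k training spectra
    nearest to the query (Euclidean distance in spectral space)."""
    d = np.linalg.norm(train_spectra - query, axis=1)
    nearest = np.argsort(d)[:k]
    return train_conc[nearest].mean()

rng = np.random.default_rng(1)
# toy data: mixture spectrum = c*A + (1-c)*B plus noise, for two
# synthetic component bands A and B
x = np.linspace(0.0, 1.0, 50)
A = np.exp(-((x - 0.3) ** 2) / 0.002)    # component band near 0.3
B = np.exp(-((x - 0.7) ** 2) / 0.002)    # component band near 0.7
conc = rng.uniform(0.0, 1.0, 40)         # known training concentrations
spectra = conc[:, None] * A + (1.0 - conc)[:, None] * B
spectra += rng.normal(0.0, 0.01, spectra.shape)

query = 0.25 * A + 0.75 * B              # unseen mixture, true c = 0.25
pred = knn_predict(spectra, conc, query, k=3)
```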

  15. Quantitative analysis of a transportable matter-wave gravimeter

    NASA Astrophysics Data System (ADS)

    Desruelle, B.; Le Moigne, N.; Bonvalot, S.; Menoret, V.; Vermeulen, P.; Merlet, S.

    2015-12-01

    This paper summarizes the latest results obtained with our second-generation Absolute Quantum Gravimeter (AQG). This instrument relies on advanced matter-wave interferometry techniques, which allow us to precisely characterize the vertical acceleration experienced by a cloud of cold atoms over a free fall of 10 cm. A significant research effort was conducted over recent months to optimize the instrument's sensitivity as well as its rejection of ground vibrations, and we will present the technological solutions that were selected to meet our objectives. We will then present a detailed review of the characterizations performed with this instrument. These data show a very satisfactory sensitivity of the AQG (2 μGal standard deviation after 1000 s of data integration) and very robust behavior against ground vibrations. We will also present a detailed analysis of the long-term behavior of the instrument. These results clearly demonstrate the high potential of matter-wave gravimeters for high-performance absolute gravity measurements. Finally, we will discuss the research activities we are conducting to develop a field version of this instrument.

  16. Direct Quantitative Analysis of Arsenic in Coal Fly Ash

    PubMed Central

    Hartuti, Sri; Kambara, Shinji; Takeyama, Akihiro; Kumabe, Kazuhiro; Moritomi, Hiroshi

    2012-01-01

    A rapid, simple method based on graphite furnace atomic absorption spectrometry is described for the direct determination of arsenic in coal fly ash. Solid samples were directly introduced into the atomizer without preliminary treatment. The direct analysis method was not always free of spectral matrix interference, but the stabilization of arsenic by adding palladium nitrate (chemical modifier) and the optimization of the parameters in the furnace program (temperature, rate of temperature increase, hold time, and argon gas flow) gave good results for the total arsenic determination. The optimal furnace program was determined by analyzing different concentrations of a reference material (NIST1633b), which showed the best linearity for calibration. The optimized parameters for the furnace programs for the ashing and atomization steps were as follows: temperatures of 500–1200 and 2150°C, heating rates of 100 and 500°C s−1, hold times of 90 and 7 s, and medium then maximum and medium argon gas flows, respectively. The calibration plots were linear with a correlation coefficient of 0.9699. This method was validated using arsenic-containing raw coal samples in accordance with the requirements of the mass balance calculation; the distribution rate of As in the fly ashes ranged from 101 to 119%. PMID:23251836
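The linear-calibration step underlying the reported correlation coefficient can be sketched as follows; the concentration-absorbance values below are illustrative, not NIST1633b measurements:

```python
import numpy as np

# toy calibration: absorbance vs. arsenic concentration for a dilution
# series of a reference material (illustrative values)
conc = np.array([0.0, 10.0, 20.0, 40.0, 80.0])        # ng/mL
absorbance = np.array([0.002, 0.051, 0.098, 0.205, 0.396])

slope, intercept = np.polyfit(conc, absorbance, 1)    # calibration line
r = np.corrcoef(conc, absorbance)[0, 1]               # correlation coefficient

def as_concentration(a):
    """Invert the calibration line to read a sample concentration."""
    return (a - intercept) / slope

sample = as_concentration(0.150)   # unknown read off the line, ~30 ng/mL
```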

  17. Quantitative real-time single particle analysis of virions

    SciTech Connect

    Heider, Susanne; Metzner, Christoph

    2014-08-15

    Providing information about single virus particles has for a long time been mainly the domain of electron microscopy. More recently, technologies have been developed—or adapted from other fields, such as nanotechnology—to allow for the real-time quantification of physical virion particles, while supplying additional information such as particle diameter concomitantly. These technologies have progressed to the stage of commercialization increasing the speed of viral titer measurements from hours to minutes, thus providing a significant advantage for many aspects of virology research and biotechnology applications. Additional advantages lie in the broad spectrum of virus species that may be measured and the possibility to determine the ratio of infectious to total particles. A series of disadvantages remain associated with these technologies, such as a low specificity for viral particles. In this review we will discuss these technologies by comparing four systems for real-time single virus particle analysis and quantification. - Highlights: • We introduce four methods for virus particle-based quantification of viruses. • They allow for quantification of a wide range of samples in under an hour time. • The additional measurement of size and zeta potential is possible for some.

  18. Quantitative Analysis of the Microstructure of Auxetic Foams

    SciTech Connect

    Gaspar, N.; Smith, C.W.; Miller, E.A.; Seidler, G.T.; Evans, K.E.

    2008-07-28

    The auxetic foams first produced by Lakes have been modelled in a variety of ways, each model trying to reproduce some observed feature of the microscale of the foams. Such features include bent or broken ribs or inverted angles between ribs. These models can reproduce the Poisson's ratio or Poisson's function of auxetic foam if the model parameters are carefully chosen. However, these model parameters may not actually reflect the internal structure of the foams. A major difficulty is that measurement of parameters such as lengths and angles is not straightforward within a 3-d sample. In this work a sample of auxetic foam has been imaged by 3-d X-ray computed tomography. The resulting image is translated to a form that emphasises the geometrical structure of connected ribs. These connected-rib data are suitably analysed to describe both the microstructural construction of auxetic foams and the statistical spread of structure, that is, the heterogeneity of an auxetic foam. From the analysis of the microstructure, observations are made about the requirements for microstructural models and comparisons made to previous existing models. From the statistical data, measures of heterogeneity are made that will help with future modelling that includes the heterogeneous aspect of auxetic foams.

  19. Segmentation and learning in the quantitative analysis of microscopy images

    NASA Astrophysics Data System (ADS)

    Ruggiero, Christy; Ross, Amy; Porter, Reid

    2015-02-01

    In material science and bio-medical domains the quantity and quality of microscopy images is rapidly increasing and there is a great need to automatically detect, delineate and quantify particles, grains, cells, neurons and other functional "objects" within these images. These are challenging problems for image processing because of the variability in object appearance that inevitably arises in real world image acquisition and analysis. One of the most promising (and practical) ways to address these challenges is interactive image segmentation. These algorithms are designed to incorporate input from a human operator to tailor the segmentation method to the image at hand. Interactive image segmentation is now a key tool in a wide range of applications in microscopy and elsewhere. Historically, interactive image segmentation algorithms have tailored segmentation on an image-by-image basis, and information derived from operator input is not transferred between images. But recently there has been increasing interest to use machine learning in segmentation to provide interactive tools that accumulate and learn from the operator input over longer periods of time. These new learning algorithms reduce the need for operator input over time, and can potentially provide a more dynamic balance between customization and automation for different applications. This paper reviews the state of the art in this area, provides a unified view of these algorithms, and compares the segmentation performance of various design choices.

  20. The Quantitative Analysis of the Rotational Spectrum of Ncncs

    NASA Astrophysics Data System (ADS)

    Winnewisser, Manfred; Winnewisser, Brenda P.; Medvedev, Ivan R.; De Lucia, Frank C.; Ross, Stephen C.; Koput, Jacek

    2009-06-01

    The analysis of the rotational data that were the basis of our two previous publications about NCNCS as an example of quantum monodromy has been completed, and the data have been extended to include the 6th excited state of the quasilinear bending mode. This talk will present the results of fitting the data with the GSRB Hamiltonian, which provides structural and potential parameters. Ab initio calculations contributed some parameters that could not be determined from the data. The predicted variation of the expectation value of ρ, which is the complement of the CNC angle, and of the electric dipole transition moment, upon rovibrational excitation indicates the mapping of monodromy in the potential function into these properties of the molecule. B. P. Winnewisser, M. Winnewisser, I. R. Medvedev, M. Behnke, F. C. De Lucia, S. C. Ross, and J. Koput, Phys. Rev. Lett. 95, 243002 (2005). M. Winnewisser, B. P. Winnewisser, I. R. Medvedev, F. C. De Lucia, S. C. Ross, and L. M. Bates, J. Mol. Struct. 798, 1-26 (2006).