Science.gov

Sample records for quantitative temporal analysis

  1. Temporal Kinetics and Quantitative Analysis of Cryptococcus neoformans Nonlytic Exocytosis

    PubMed Central

    Stukes, Sabriya A.; Cohen, Hillel W.

    2014-01-01

    Cryptococcus neoformans is a facultative intracellular pathogen and the causative agent of cryptococcosis, a disease that is often fatal to those with compromised immune systems. C. neoformans has the capacity to escape phagocytic cells through a process known as nonlytic exocytosis whereby the cryptococcal cell is released from the macrophage into the extracellular environment, leaving both the host and pathogen alive. Little is known about the mechanism behind nonlytic exocytosis, but there is evidence that both the fungal and host cells contribute to the process. In this study, we used time-lapse movies of C. neoformans-infected macrophages to delineate the kinetics and quantitative aspects of nonlytic exocytosis. We analyzed approximately 800 macrophages containing intracellular C. neoformans and identified 163 nonlytic exocytosis events that were further characterized into three subcategories: type I (complete emptying of macrophage), type II (partial emptying of macrophage), and type III (cell-to-cell transfer). The majority of type I and II events occurred after several hours of intracellular residence, whereas type III events occurred significantly (P < 0.001) earlier in the course of macrophage infection. Our results show that nonlytic exocytosis is a morphologically and temporally diverse process that occurs relatively rapidly in the course of macrophage infection. PMID:24595144

  2. Quantitative analysis of structural neuroimaging of mesial temporal lobe epilepsy

    PubMed Central

    Memarian, Negar; Thompson, Paul M; Engel, Jerome; Staba, Richard J

    2013-01-01

    Mesial temporal lobe epilepsy (MTLE) is the most common of the surgically remediable drug-resistant epilepsies. MRI is the primary diagnostic tool to detect anatomical abnormalities and, when combined with EEG, can more accurately identify an epileptogenic lesion, which is often hippocampal sclerosis in cases of MTLE. As structural imaging technology has advanced the surgical treatment of MTLE and other lesional epilepsies, so too have the analysis techniques that are used to measure different structural attributes of the brain. These techniques, which are reviewed here and have been used chiefly in basic research of epilepsy and in studies of MTLE, have identified different types and the extent of anatomical abnormalities that can extend beyond the affected hippocampus. These results suggest that structural imaging and sophisticated imaging analysis could provide important information to identify networks capable of generating spontaneous seizures and ultimately help guide surgical therapy that improves postsurgical seizure-freedom outcomes. PMID:24319498

  3. Quantitative and qualitative analysis of semantic verbal fluency in patients with temporal lobe epilepsy.

    PubMed

    Jaimes-Bautista, A G; Rodríguez-Camacho, M; Martínez-Juárez, I E; Rodríguez-Agudelo, Y

    2017-08-29

Patients with temporal lobe epilepsy (TLE) perform poorly on semantic verbal fluency (SVF) tasks. Completing these tasks successfully involves multiple cognitive processes simultaneously. Therefore, quantitative analysis of SVF (number of correct words in one minute), conducted in most studies, has been found to be insufficient to identify the cognitive dysfunction underlying SVF difficulties in TLE. To determine whether a sample of patients with TLE had SVF difficulties compared with a control group (CG), and to identify the cognitive components associated with SVF difficulties using quantitative and qualitative analysis. SVF was evaluated in 25 patients with TLE and 24 healthy controls; the semantic verbal fluency test included 5 semantic categories: animals, fruits, occupations, countries, and verbs. All 5 categories were analysed quantitatively (number of correct words per minute and interval of execution: 0-15, 16-30, 31-45, and 46-60 seconds); the categories animals and fruits were also analysed qualitatively (clusters, cluster size, switches, perseverations, and intrusions). Patients generated fewer words for all categories and intervals and fewer clusters and switches for animals and fruits than the CG (P<.01). Differences between groups were not significant in terms of cluster size and number of intrusions and perseverations (P>.05). Our results suggest an association between SVF difficulties in TLE and difficulty activating semantic networks, impaired strategic search, and poor cognitive flexibility. Attention, inhibition, and working memory are preserved in these patients. Copyright © 2017 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  4. Quantitative analysis of Neanderthal temporal bone morphology using three-dimensional geometric morphometrics.

    PubMed

    Harvati, Katerina

    2003-04-01

The temporal bone is the location of several traits thought to differentiate Neanderthals from modern humans, including some proposed Neanderthal-derived traits. Most of these, however, are difficult to measure and are usually described qualitatively. This study applied the techniques of geometric morphometrics to the complex morphology of the temporal bone, in order to quantify the differences observed between Neanderthal and modern human anatomy. Two hundred and seventy modern human crania were measured, representing 9 populations of 30 individuals each, and spanning the extremes of the modern human geographical range. Twelve Neanderthal specimens, as well as Reilingen, Kabwe, Skhul 5, Qafzeh 9, and 4 Late Paleolithic European specimens, were included in the fossil sample. The data were collected in the form of three-dimensional (3-D) landmark coordinates, and specimen configurations were superimposed using generalized Procrustes analysis. The fitted coordinates were then analyzed by an array of multivariate statistical methods, including principal components analysis, canonical variates analysis, and Mahalanobis D². The temporal bone landmark analysis was very successful in separating Neanderthals from modern humans. Neanderthals were separated from modern humans in both the principal components and canonical variates analyses. They were much further in Mahalanobis distances from all modern human populations than any two modern human groups were from each other. Most of the previously described temporal bone traits contributed to this separation. Copyright 2003 Wiley-Liss, Inc.
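
The pipeline this abstract describes, generalized Procrustes superimposition of 3-D landmark coordinates followed by principal components analysis, can be sketched in a few lines of NumPy. This is a minimal illustration on synthetic landmarks, not the study's actual specimens or landmark set:

```python
import numpy as np

def align(shape, ref):
    # Center and scale to unit centroid size, then rotate onto the reference
    # configuration (orthogonal Procrustes via SVD).
    a = shape - shape.mean(axis=0)
    a = a / np.linalg.norm(a)
    b = ref - ref.mean(axis=0)
    b = b / np.linalg.norm(b)
    u, _, vt = np.linalg.svd(a.T @ b)
    return a @ (u @ vt)

def gpa(shapes, iters=5):
    # Generalized Procrustes analysis: iteratively align every shape to the
    # evolving mean configuration.
    ref = shapes[0]
    for _ in range(iters):
        aligned = np.array([align(s, ref) for s in shapes])
        ref = aligned.mean(axis=0)
    return aligned

rng = np.random.default_rng(0)
base = rng.normal(size=(10, 3))                       # 10 synthetic 3-D landmarks
shapes = np.array([base + rng.normal(scale=0.05, size=base.shape)
                   for _ in range(20)])               # 20 synthetic "specimens"
fitted = gpa(shapes)
X = fitted.reshape(len(fitted), -1)
X = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(X, full_matrices=False)
pc_scores = X @ vt[:2].T                              # scores on the first two PCs
```

The fitted coordinates could then feed canonical variates analysis or Mahalanobis distances exactly as in the paper.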

  5. [18F]FDG-PET reveals temporal hypometabolism in patients with temporal lobe epilepsy even when quantitative MRI and histopathological analysis show only mild hippocampal damage.

    PubMed

    Lamusuo, S; Jutila, L; Ylinen, A; Kälviäinen, R; Mervaala, E; Haaparanta, M; Jääskeläinen, S; Partanen, K; Vapalahti, M; Rinne, J

    2001-06-01

The relationship between reduced glucose metabolism in positron emission tomography with fludeoxyglucose F 18 ([(18)F]FDG-PET) and hippocampal damage (HD) in patients with temporal lobe epilepsy is still unclear. To determine whether the presence and severity of HD verified by quantitative magnetic resonance imaging (QMRI) and histopathological analysis affect the degree of hypometabolism. Sixteen patients with drug-resistant temporal lobe epilepsy underwent [(18)F]FDG-PET and QMRI (hippocampal volumetry and T2 relaxometry) before surgery. Histopathological analysis of the hippocampus included measurements of neuronal loss, proliferation of glial cells, and mossy fiber sprouting. The asymmetry in glucose metabolism described the degree of hypometabolism. Temporal hypometabolism was not related to severity of HD as measured by QMRI or histopathological analysis. The degree of hypometabolism did not differ in patients with mild, moderate, or severe HD. In addition, [(18)F]FDG-PET revealed significant temporal hypometabolism even though hippocampal QMRI findings were normal or showed only mild HD. Thus, glucose consumption was reduced over and above the histopathological changes. [(18)F]FDG-PET is sensitive for localizing the epileptogenic region in patients with temporal lobe epilepsy. However, it is insensitive to the severity of HD.

  6. Quantitative Spatial and Temporal Analysis of Fluorescein Angiography Dynamics in the Eye

    PubMed Central

    Hui, Flora; Nguyen, Christine T. O.; Bedggood, Phillip A.; He, Zheng; Fish, Rebecca L.; Gurrell, Rachel; Vingrys, Algis J.; Bui, Bang V.

    2014-01-01

Purpose We describe a novel approach to analyzing fluorescein angiography to investigate fluorescein flow dynamics in the rat posterior retina, as well as to identify abnormal areas following laser photocoagulation. Methods Experiments were undertaken in adult Long Evans rats. Using a rodent retinal camera, videos were acquired at 30 frames per second for 30 seconds following intravenous introduction of sodium fluorescein in a group of control animals (n = 14). Videos were image registered and analyzed using principal components analysis across all pixels in the field. This returns fluorescence intensity profiles from which the half-rise (time to 50% brightness), the half-fall (time for 50% decay), and the offset (plateau level of fluorescence) were extracted. We applied this analysis to video fluorescein angiography data collected 30 minutes following laser photocoagulation in a separate group of rats (n = 7). Results Pixel-by-pixel analysis of video angiography clearly delineates differences in the temporal profiles of arteries, veins and capillaries in the posterior retina. We find no difference in half-rise, half-fall or offset amongst the four quadrants (inferior, nasal, superior, temporal). We also found little difference with eccentricity. By expressing the parameters at each pixel as a function of the number of standard deviations from the average of the entire field, we could clearly identify the spatial extent of the laser injury. Conclusions This simple registration and analysis provides a way to monitor the size of a vascular injury, to highlight areas of subtle vascular leakage, and to quantify vascular dynamics not possible using current fluorescein angiography approaches. It can be applied in both laboratory and clinical settings for in vivo dynamic fluorescent imaging of vasculature. PMID:25365578
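
The per-pixel profile parameters described here (half-rise, half-fall, plateau offset) are simple to extract once an intensity trace is available. A sketch under the assumption of a single rise to peak followed by decay toward a plateau; the trace below is synthetic, not actual angiography data:

```python
import numpy as np

def profile_params(trace, t):
    # Half-rise: first time the trace reaches 50% of its peak value.
    # Half-fall: first time after the peak it decays halfway back to the offset.
    # Offset: plateau level, taken here as the final sample.
    peak = int(trace.argmax())
    offset = float(trace[-1])
    rise_idx = int(np.argmax(trace[:peak + 1] >= 0.5 * trace[peak]))
    half_level = offset + 0.5 * (trace[peak] - offset)
    fall_idx = peak + int(np.argmax(trace[peak:] <= half_level))
    return t[rise_idx], t[fall_idx], offset

t = np.linspace(0, 30, 900)                      # 30 s recorded at 30 frames/s
bolus = 0.8 * np.exp(-((t - 4.0) ** 2) / 4.0)    # transient bolus component
plateau = 0.2 / (1.0 + np.exp(-(t - 2.0)))       # slow plateau component
trace = bolus + plateau
half_rise, half_fall, offset = profile_params(trace, t)
```

Running this over every registered pixel, then z-scoring each parameter against the field average, reproduces the kind of lesion map the authors describe.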

  7. Quantitative temporal proteomic analysis of human embryonic stem cell differentiation into oligodendrocyte progenitor cells

    PubMed Central

    Chaerkady, Raghothama; Letzen, Brian; Renuse, Santosh; Sahasrabuddhe, Nandini A.; Kumar, Praveen; All, Angelo H.; Thakor, Nitish V.; Delanghe, Bernard; Gearhart, John D.; Pandey, Akhilesh; Kerr, Candace L.

    2013-01-01

Oligodendrocytes (OLs) are glial cells of the central nervous system which produce myelin. Cultured OLs provide immense therapeutic opportunities for treating a variety of neurological conditions. One of the most promising sources for such therapies is human embryonic stem cells (ESCs), which also provide a model for studying human oligodendrocyte development. For these purposes, an investigation of proteome-level changes is critical for understanding the process of OL differentiation. In this report, an iTRAQ-based quantitative proteomic approach was used to study multiple steps during oligodendrocyte differentiation, including neural precursors (NPCs), glial precursors (GPCs), and oligodendrocyte progenitors (OPCs), compared to undifferentiated embryonic stem cells. Using a 1% false discovery rate cutoff, ~3,145 proteins were quantified and several demonstrated progressive stage-specific expression. Proteins such as TF, NCAM1, APOE, and WNT5A showed increased expression from the NPC to OPC stage. Several proteins with demonstrated or suspected roles in OL maturation were also found to be upregulated in OPCs, including FABP4, THBS1, BMP1, CRYAB, TF, TNC, COL3A1, TGFBI and EPB41L3. Thus, by providing the first extensive proteomic profiling of human embryonic stem cell differentiation into oligodendrocyte progenitor cells, this study provides many novel proteins that are potentially involved in OL development. PMID:21770034

  8. Quantitative kinetic analysis of lung nodules by temporal subtraction technique in dynamic chest radiography with a flat panel detector

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Yuichiro; Kodera, Yoshie; Tanaka, Rie; Sanada, Shigeru

    2007-03-01

Early detection and treatment of lung cancer is one of the most effective means of reducing cancer mortality; chest X-ray radiography has been widely used as a screening examination or health checkup. A new examination method, enabled by computer analysis and the flat panel detector (FPD), extends chest X-ray radiography to capture respiratory kinetics, making functional evaluation of respiratory motion in the chest available; its introduction into clinical practice is expected in the future. In this study, we developed a computer analysis algorithm for detecting lung nodules and evaluating their quantitative kinetics. Breathing chest radiographs obtained with a modified FPD were converted into four static feature images by sequential temporal subtraction processing, morphologic enhancement processing, kinetic visualization processing, and lung region detection processing, after a breath synchronization step based on diaphragmatic analysis of vector movement. An artificial neural network analyzing the resulting density patterns detected the true nodules in these static images and drew their kinetic tracks. Evaluation of algorithm performance and clinical effectiveness with 7 normal patients and simulated nodules showed sufficient detection capability and kinetic imaging function, with no statistically significant difference. Our technique can quantitatively evaluate the kinetic range of nodules, and is effective in detecting a nodule on a breathing chest radiograph. Moreover, the application of this technique is expected to extend computer-aided diagnosis systems and facilitate the development of an automatic planning system for radiation therapy.

  9. A semi-quantitative GeLC-MS analysis of temporal proteome expression in the emerging nosocomial pathogen Ochrobactrum anthropi

    PubMed Central

    Graham, Robert Leslie James; Sharma, Mohit K; Ternan, Nigel G; Weatherly, D Brent; Tarleton, Rick L; McMullan, Geoff

    2007-01-01

    Background The α-Proteobacteria are capable of interaction with eukaryotic cells, with some members, such as Ochrobactrum anthropi, capable of acting as human pathogens. O. anthropi has been the cause of a growing number of hospital-acquired infections; however, little is known about its growth, physiology and metabolism. We used proteomics to investigate how protein expression of this organism changes with time during growth. Results This first gel-based liquid chromatography-mass spectrometry (GeLC-MS) temporal proteomic analysis of O. anthropi led to the positive identification of 131 proteins. These were functionally classified and physiochemically characterized. Utilizing the emPAI protocol to estimate protein abundance, we assigned molar concentrations to all proteins, and thus were able to identify 19 with significant changes in their expression. Pathway reconstruction led to the identification of a variety of central metabolic pathways, including nucleotide biosynthesis, fatty acid anabolism, glycolysis, TCA cycle and amino acid metabolism. In late phase growth we identified a number of gene products under the control of the oxyR regulon, which is induced in response to oxidative stress and whose protein products have been linked with pathogen survival in response to host immunity reactions. Conclusion This study identified distinct proteomic profiles associated with specific growth points for O. anthropi, while the use of emPAI allowed semi-quantitative analyses of protein expression. It was possible to reconstruct central metabolic pathways and infer unique functional and adaptive processes associated with specific growth phases, thereby resulting in a deeper understanding of the physiology and metabolism of this emerging pathogenic bacterium. PMID:17567905
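
The emPAI abundance estimate used in this study is straightforward to compute: emPAI = 10^(observed peptides / observable peptides) − 1, with relative molar content obtained by normalizing over all identified proteins (Ishihama et al.'s formulation). The peptide counts below are hypothetical, purely to show the arithmetic:

```python
def empai(n_observed, n_observable):
    # Exponentially modified protein abundance index:
    # 10^(observed peptides / observable peptides) - 1
    return 10.0 ** (n_observed / n_observable) - 1.0

# hypothetical (observed, observable) peptide counts for three proteins
counts = {"protA": (4, 10), "protB": (8, 10), "protC": (2, 20)}
scores = {name: empai(obs, obsable) for name, (obs, obsable) in counts.items()}

# molar percentage: each protein's emPAI as a share of the summed emPAI
total = sum(scores.values())
mol_percent = {name: 100.0 * s / total for name, s in scores.items()}
```

Comparing such mol% values across growth-phase samples is what allows the semi-quantitative expression-change calls described above.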

  10. Quantitative kinetic analysis of lung nodules using the temporal subtraction technique in dynamic chest radiographies performed with a flat panel detector.

    PubMed

    Tsuchiya, Yuichiro; Kodera, Yoshie; Tanaka, Rie; Sanada, Shigeru

    2009-04-01

Early detection and treatment of lung cancer is one of the most effective means of reducing cancer mortality, and to this end, chest X-ray radiography has been widely used as a screening method. A related technique based on the development of computer analysis and a flat panel detector (FPD) has enabled the functional evaluation of respiratory kinetics in the chest and is expected to be introduced into clinical practice in the near future. In this study, we developed a computer analysis algorithm to detect lung nodules and to evaluate quantitative kinetics. Breathing chest radiographs obtained by modified FPD and breath synchronization utilizing diaphragmatic analysis of vector movement were converted into four static images by sequential temporal subtraction processing, morphological enhancement processing, kinetic visualization processing, and lung region detection processing. An artificial neural network analyzed these density patterns to detect the true nodules and draw their kinetic tracks. Evaluation of both algorithm performance and clinical effectiveness with seven normal patients and simulated nodules showed sufficient detection capability and kinetic imaging function, with no significant differences. Our technique can quantitatively evaluate the kinetic range of nodules and is effective in detecting a nodule on a breathing chest radiograph. Moreover, the application of this technique is expected to extend computer-aided diagnosis systems and facilitate the development of an automatic planning system for radiation therapy.
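
Sequential temporal subtraction, the core image operation in both reports of this study, means differencing consecutive frames so that structures moving with respiration stand out while static anatomy cancels. A minimal synthetic illustration (the frame stack and patch are invented, not clinical data):

```python
import numpy as np

def temporal_subtraction(frames):
    # Difference each frame from its predecessor; moving or brightening
    # structures produce large absolute differences, static anatomy gives ~0.
    diffs = np.diff(frames.astype(np.int32), axis=0)
    motion_map = np.abs(diffs).max(axis=0)    # peak frame-to-frame change per pixel
    return diffs, motion_map

frames = np.full((8, 64, 64), 100, dtype=np.int32)           # static background
frames[:, 30:34, 30:34] += np.arange(8)[:, None, None] * 5   # brightening "nodule"
diffs, motion = temporal_subtraction(frames)
```

In the paper this map is one of four static feature images handed to the neural network; here it simply localizes the moving patch.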

  11. Z-score-based semi-quantitative analysis of the volume of the temporal horn of the lateral ventricle on brain CT images.

    PubMed

    Takahashi, Noriyuki; Kinoshita, Toshibumi; Ohmura, Tomomi; Lee, Yongbum; Matsuyama, Eri; Toyoshima, Hideto; Tsai, Du-Yih

    2016-01-01

The volume of the temporal horn of the lateral ventricle (THLV) on brain computed tomography (CT) images is important for neurologic diagnosis. Our purpose in this study was to develop a z-score-based semi-quantitative analysis for estimating the THLV volume by using voxel-based morphometry. The THLV volume was estimated by use of a z-score mapping method that consisted of four main steps: anatomic standardization, construction of a normal reference database, calculation of the z score, and calculation of the mean z score in a volume of interest (VOI). A mean z score of the CT value obtained from a VOI around the THLV was used as an index of the THLV volume. CT scans from 50 subjects were evaluated. For evaluation of the accuracy of this method for estimating the THLV volume, the THLV volume was determined manually by neuroradiologists (serving as the reference volume). A mean z score was calculated from the VOI for each THLV of the 50 subjects by use of the proposed method. The accuracy of the method was evaluated by use of the relationship between the mean z score and the reference volume. The quadratic polynomial regression equation demonstrated a statistically significant correlation between the mean z score and the reference volume of the THLV (R² = 0.94; P < 0.0001). In 92 of 100 THLVs (92%), the 95% prediction interval of the regional mean z score captured the reference volume of the THLV. The z-score-based semi-quantitative analysis thus has the potential to quantitatively estimate the THLV volume on CT images.
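
The z-score mapping step reduces to normalizing each voxel against a reference database and averaging inside the VOI. A toy version with synthetic "scans" (the 16³ volumes, intensity values, and VOI placement are illustrative only, not the study's data):

```python
import numpy as np

def zscore_map(scan, ref_mean, ref_std):
    # Voxel-wise z score against the normal reference database.
    return (scan - ref_mean) / ref_std

def voi_mean_z(zmap, voi_mask):
    # Mean z score inside the volume of interest.
    return float(zmap[voi_mask].mean())

rng = np.random.default_rng(2)
normals = rng.normal(40.0, 3.0, size=(50, 16, 16, 16))   # 50 synthetic normal scans
ref_mean, ref_std = normals.mean(axis=0), normals.std(axis=0)

scan = rng.normal(40.0, 3.0, size=(16, 16, 16))
scan[4:8, 4:8, 4:8] -= 25.0            # low-density region mimicking an enlarged THLV
voi = np.zeros(scan.shape, dtype=bool)
voi[4:8, 4:8, 4:8] = True
z = zscore_map(scan, ref_mean, ref_std)
mean_z = voi_mean_z(z, voi)            # strongly negative inside the VOI
```

A larger (more CSF-filled, lower-density) temporal horn drives the mean z more negative, which is why the paper can regress reference volume on mean z.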

  12. Quantitative temporal analysis of 99mTc p-isopropyl-iminodiacetic acid (PIPIDA) as a measure of hepatic function in health and disease

    SciTech Connect

    Joshi, S.N.; George, E.A.; Perrillo, R.P.

    1981-01-01

Excretory liver function was analyzed in 10 healthy volunteers and 28 subjects with acute or chronic liver injury following intravenous administration of technetium-99m p-isopropyl iminodiacetic acid (99mTc-PIPIDA). Hepatobiliary transit of this agent was quantitated at 5-min intervals for a total of 60 min. Indices of total liver activity, liver cell uptake, liver parenchymal clearance, and bile duct clearance of 99mTc-PIPIDA were calculated from time-activity curves over the heart, liver, extrahepatic bile ducts, and gallbladder. Seven subjects with acute viral hepatitis, 15 with extrahepatic biliary obstruction, and 6 with intrahepatic cholestasis were evaluated. Compared with healthy volunteers, a significant (P < 0.0001) reduction in total liver activity and liver parenchymal clearance was demonstrated in all patient groups. Major resolution in all liver-derived indices, particularly total liver activity, occurred during convalescence from hepatitis and after biliary drainage. Nonmeasurable bile duct clearance always indicated a diagnosis of extrahepatic obstruction in cholestatic subjects, and this index normalized in subjects following biliary drainage. Whereas visual assessment of 99mTc-PIPIDA scans provided limited but useful information about the functional status of the liver, quantitative temporal analysis proved to be a much more effective technique.
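
Indices like these are derived from time-activity curves sampled every 5 minutes for 60 minutes. A sketch with a synthetic liver curve; the uptake/washout time constants and the two indices shown (time to peak, fractional clearance from peak) are invented for illustration, not the paper's exact definitions:

```python
import numpy as np

t = np.arange(0, 65, 5)                                      # minutes, 5-min sampling
liver = 100.0 * (1 - np.exp(-t / 8.0)) * np.exp(-t / 40.0)   # synthetic activity curve

peak_idx = int(liver.argmax())
time_to_peak = int(t[peak_idx])          # uptake index: time to peak liver activity
# clearance index: fractional fall from peak activity by the end of the study
clearance = (liver[peak_idx] - liver[-1]) / liver[peak_idx]
```

Analogous ratios over heart, bile duct, and gallbladder regions give the remaining indices from the abstract.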

  13. Temporal mapping and analysis

    NASA Technical Reports Server (NTRS)

    O'Hara, Charles G. (Inventor); Shrestha, Bijay (Inventor); Vijayaraj, Veeraraghavan (Inventor); Mali, Preeti (Inventor)

    2011-01-01

A compositing process for selecting spatial data collected over a period of time, creating temporal data cubes from the spatial data, and processing and/or analyzing the data using temporal mapping algebra functions. In some embodiments, a masked cube is created from the temporal data cubes, and a composite is computed from the masked cube using temporal mapping algebra.
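
The masked-cube compositing described in this patent abstract maps naturally onto NumPy masked arrays: stack co-registered scenes along a time axis, mask invalid pixels, and reduce over time. The array contents and the "values below 2 are no-data" rule are purely illustrative:

```python
import numpy as np

# (time, y, x) cube of three co-registered 2x2 scenes
scenes = np.array([[[1., 5.], [2., 9.]],
                   [[7., 0.], [4., 3.]],
                   [[6., 8.], [1., 2.]]])
invalid = scenes < 2                            # pretend values below 2 are no-data
cube = np.ma.masked_array(scenes, mask=invalid)
composite = cube.max(axis=0).filled(np.nan)     # per-pixel maximum-value composite
```

Any temporal map-algebra reduction (mean, median, latest-valid) slots in where `max` is used here.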

  14. Dynamic contrast-enhanced MRI of the prostate: An intraindividual assessment of the effect of temporal resolution on qualitative detection and quantitative analysis of histopathologically proven prostate cancer.

    PubMed

    Ream, Justin M; Doshi, Ankur M; Dunst, Diane; Parikh, Nainesh; Kong, Max X; Babb, James S; Taneja, Samir S; Rosenkrantz, Andrew B

    2017-05-01

To assess the effects of temporal resolution (RT) in dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) on qualitative tumor detection and quantitative pharmacokinetic parameters in prostate cancer. This retrospective Institutional Review Board (IRB)-approved study included 58 men (64 ± 7 years). They underwent 3T prostate MRI showing dominant peripheral zone (PZ) tumors (24 with Gleason ≥ 4 + 3), prior to prostatectomy. Continuously acquired DCE utilizing GRASP (Golden-angle RAdial Sparse Parallel) was retrospectively reconstructed at RT of 1.4 sec, 3.7 sec, 6.0 sec, 9.7 sec, and 14.9 sec. A reader placed volumes-of-interest on dominant tumors and benign PZ, generating quantitative pharmacokinetic parameters (Ktrans, ve) at each RT. Two blinded readers assessed each RT for lesion presence, location, conspicuity, and reader confidence on a 5-point scale. Data were assessed by mixed-model analysis of variance (ANOVA), generalized estimating equation (GEE), and receiver operating characteristic (ROC) analysis. RT did not affect sensitivity (R1all: 69.0%-72.4%, all Padj = 1.000; R1GS≥4+3: 83.3%-91.7%, all Padj = 1.000; R2all: 60.3%-69.0%, all Padj = 1.000; R2GS≥4+3: 58.3%-79.2%, all Padj = 1.000). R1 reported greater conspicuity of GS ≥ 4 + 3 tumors at RT of 1.4 sec vs. 14.9 sec (4.29 ± 1.23 vs. 3.46 ± 1.44; Padj = 0.029). No other tumor conspicuity pairwise comparison reached significance (R1all: 2.98-3.43, all Padj ≥ 0.205; R2all: 2.57-3.19, all Padj ≥ 0.059; R1GS≥4+3: 3.46-4.29, all other Padj ≥ 0.156; R2GS≥4+3: 2.92-3.71, all Padj ≥ 0.439). There was no effect of RT on reader confidence (R1all: 3.17-3.34, all Padj = 1.000; R2all: 2.83-3.19, all Padj ≥ 0.801; R1GS≥4+3: 3.79-4.21, all Padj = 1.000; R2GS≥4+3: 3.13-3.79, all Padj = 1.000). Ktrans and ve of tumor and benign tissue did not differ across RT (all
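
The pharmacokinetic parameters reported here (Ktrans, ve) come from fitting a model to the DCE concentration curves; the standard Tofts model, discretized as a convolution, is a common choice and can be sketched as follows. The arterial input function shape and the parameter values are invented for illustration, not taken from this study:

```python
import numpy as np

def tofts(t, cp, ktrans, ve):
    # Standard Tofts model:
    #   Ct(t) = Ktrans * integral_0^t cp(tau) * exp(-kep * (t - tau)) dtau,
    # with kep = Ktrans / ve, discretized here as a convolution on a uniform grid.
    dt = t[1] - t[0]
    kernel = np.exp(-(ktrans / ve) * t)
    return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

t = np.arange(0, 300, 1.4)                    # seconds; 1.4-s frames as in the study
cp = 5.0 * (t / 10.0) * np.exp(-t / 60.0)     # hypothetical bolus-shaped AIF (mM)
ct = tofts(t, cp, ktrans=0.25 / 60.0, ve=0.3) # Ktrans = 0.25 /min, ve = 0.3 (assumed)
```

Fitting `ktrans` and `ve` to a measured `ct` (e.g. by least squares) at each reconstructed temporal resolution is the quantitative comparison the abstract describes.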

  15. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto an object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which it is necessary to know and control critical properties of products via quantitative chemical analyses. In addition to chemical composition, the technique is also used to determine such physical properties as densities and strengths.

  16. Time of flight mass spectrometry for quantitative data analysis in fast transient studies using a Temporal Analysis of Products (TAP) reactor.

    PubMed

    Goguet, Alexandre; Hardacre, Christopher; Maguire, Noleen; Morgan, Kevin; Shekhtman, Sergiy O; Thompson, Steve P

    2011-01-07

A time-of-flight (ToF) mass spectrometer suitable, in terms of sensitivity, detector response, and time resolution, for application in fast transient Temporal Analysis of Products (TAP) kinetic catalyst characterization is reported. Technical difficulties associated with such an application, as well as the solutions implemented as adaptations of the ToF apparatus, are discussed. The performance of the ToF was validated, and the full linearity of the detector over its entire dynamic range was explored to ensure its applicability for TAP use. The reported TAP-ToF setup is the first system to achieve a level of sensitivity that allows monitoring of the full 0-200 AMU range simultaneously with sub-millisecond time resolution. In this new setup, the high sensitivity allows the use of low-intensity pulses, ensuring that transport through the reactor occurs in the Knudsen diffusion regime and that the data can therefore be fully analysed using the reported theoretical TAP models and data processing.

  17. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  19. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  20. A Temporal View of Soft Tissue Quantitative Ultrasound

    NASA Astrophysics Data System (ADS)

    O'Brien, William D.

    The objective of soft tissue quantitative ultrasound (QUS) is to improve diagnostic ultrasound imaging capabilities via quantitative outcomes. Over the past three or so decades, there have been an increasing number of QUS successes. A temporal view moves us back in history almost six decades when techniques and theoretical developments were in their earliest stages that impacted modern QUS successes. The earliest theoretical developments and techniques some six decades ago can be attributed to Lev Chernov, Philip Morse, Herman Feshbach, Uno Ingard, John Wild and Jack Reid. Later, Floyd Dunn developed important views as to how connective tissue affected the interaction between ultrasound and soft tissue. Then, as the theory of wave propagation in soft tissues with random inhomogeneities was extended and applied by Fred Lizzi, Jim Zagzebski and Mike Insana (and their colleagues), contemporary QUS successes started to emerge.

  1. Temporal detection and analysis of guideline interactions.

    PubMed

    Anselma, Luca; Piovesan, Luca; Terenziani, Paolo

    2017-02-01

Clinical practice guidelines (CPGs) are assuming a major role in the medical area, to ensure the quality of medical care by supporting physicians with evidence-based information about interventions in the treatment of single pathologies. The treatment of patients affected by multiple diseases (comorbid patients) is one of the main challenges for modern healthcare. It requires the development of new methodologies to support physicians in the treatment of interactions between CPGs. Several approaches have started to face this challenging problem. However, they suffer from a substantial limitation: they do not take into account the temporal dimension. Indeed, practically speaking, interactions occur in time. For instance, the effects of two actions taken from different guidelines may potentially conflict, but practical conflicts happen only if the times of execution of those actions are such that their effects overlap in time. We aim at devising a methodology to detect and analyse interactions between CPGs that considers the temporal dimension. In this paper, we first extend our previous ontological model to deal with the fact that actions, goals, effects and interactions occur in time, and to model both qualitative and quantitative temporal constraints between them. Then, we identify different application scenarios and, for each of them, propose different types of facilities for user physicians, useful to support the temporal detection of interactions. We provide a modular approach in which different Artificial Intelligence temporal reasoning techniques, based on temporal constraint propagation, are widely exploited to provide users with such facilities. We applied our methodology to two cases of comorbidities, using simplified versions of CPGs.
We propose an innovative approach to the detection and analysis of interactions between CPGs considering different sources of temporal information (CPGs, ontological knowledge and execution logs), which is the first
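The abstract's key observation, that a potential interaction becomes a practical conflict only when the effects of the two actions overlap in time, reduces to an interval-overlap check. A minimal sketch (the class, effect names and durations below are hypothetical, and this collapses the paper's constraint-propagation machinery to a single test):

```python
from dataclasses import dataclass

@dataclass
class Effect:
    """An action's effect, assumed to hold over [start, end] (hours)."""
    name: str
    start: float
    end: float

def practical_conflict(a: Effect, b: Effect) -> bool:
    # Two potentially interacting effects conflict in practice
    # only if their time intervals overlap.
    return a.start <= b.end and b.start <= a.end

# Hypothetical effects drawn from two different guidelines
anticoagulation = Effect("anticoagulant effect", 0.0, 12.0)
antiplatelet = Effect("antiplatelet effect", 10.0, 16.0)
overlap = practical_conflict(anticoagulation, antiplatelet)  # intervals share [10, 12]
```

In the paper's setting the intervals themselves are not fixed numbers but are derived by propagating qualitative and quantitative temporal constraints; the overlap test above is the final step of that pipeline.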

  2. Quantitative analysis of land use and land cover changes from the multi-temporal remote sensing data in the Bosten Lake Basin, Chinese Tian Shan

    NASA Astrophysics Data System (ADS)

    Maimaiti, A.; Wang, L. M.; Yan, F.; Zhang, J.; Ma, Y. X.

    2017-07-01

    In this paper, the spatial changes of land use and land cover (LUCC) in the Bosten Lake Basin from 1985 to 2015 were analyzed, based on 3S technology and four periods of remote sensing images (1985, 1998, 2008 and 2015). The driving forces of the LUCC were analyzed quantitatively using the principal component analysis (PCA) method. The results showed a wide change of LUCC in the Bosten Lake Basin over the past 30 years. During this period, natural grassland decreased at a rate of 62.6 km2/a, while cultivated land and residential land increased at a rate of 28.9 km2/a. The rapid expansion of cultivated land was the result of reclamation of natural grassland and unused land. Meanwhile, other land use types changed only slightly. The PCA results indicated that the LUCC was the result of the interaction of human activity (social and economic factors) and natural environmental change (climate change). Therefore, it is necessary to consistently improve the natural environment of the Bosten Lake Basin. This study can provide a solid theoretical and practical basis for the sustainable development of the Bosten Lake Basin.

  3. Semi-quantitative analysis of perfusion of Brodmann areas in the differential diagnosis of cognitive impairment in Alzheimer's disease, fronto-temporal dementia and mild cognitive impairment.

    PubMed

    Tranfaglia, Cristina; Palumbo, Barbara; Siepi, Donatella; Sinzinger, Helmut; Parnetti, Lucilla

    2009-01-01

    Different perfusion defects reflect neurological damage characteristic of different kinds of dementia. Our aim was to investigate the role of technetium-99m hexamethyl-propylenamine-oxime ((99m)Tc-HMPAO) brain single photon emission tomography (SPET) with semiquantitative analysis of Brodmann areas in patients with Alzheimer's disease (AD), frontotemporal dementia (FTD) and mild cognitive impairment (MCI). We studied 75 patients: 25 with AD (NINCDS-ADRDA criteria), 25 with FTD (Lund and Manchester criteria) and 25 with MCI (EADC criteria). After i.v. injection of 740 MBq of (99m)Tc-HMPAO, each patient underwent brain SPET. A software application was used that maps the SPET brain image to a stereotaxic atlas (Talairach), providing an affine co-registration by blocks of data defined in the Talairach space. A normal database was built by calculating, voxel by voxel, the mean and standard deviation of the measured values. Functional SPET data of 3D regions of interest (ROI) from predefined Brodmann area templates were compared with those of a database of healthy subjects of the same age and gender. Mean values obtained in the Brodmann area ROI were evaluated in the different groups of patients. Our results showed that different Brodmann areas were significantly impaired in the different categories of dementia subjects. Both areas 37 (temporal gyrus) and 39 (angular gyrus) of AD patients (mean+/-SD: 37L= -1.6+/-1.0; 37R= -1.5+/-1.1; 39L= -2.3+/-1.3; 39R= -1.9+/-1.2) showed significant hypoperfusion (P<0.05) versus MCI (37L= -0.9+/-0.7; 37R= -1.1+/-0.9; 39L= -1.4+/-1.1; 39R= -1.6+/-1.6) and FTD (37L= -1.1+/-0.8; 37R= -1.0+/-0.9; 39L= -1.4+/-1.0; 39R= -1.2+/-1.2) subjects. AD patients also showed significantly (P<0.01) decreased perfusion in area 40 (supramarginal gyrus) (40L= -2.6+/-1.0; 40R= -2.3+/-1.1) with respect to MCI patients (40L= -1.8+/-0.9; 40R= -1.7+/-1.2). 
Finally, FTD patients

  4. Quantitative methods for stochastic high frequency spatio-temporal and non-linear analysis: Assessing health effects of exposure to extreme ambient temperature

    NASA Astrophysics Data System (ADS)

    Liss, Alexander

    Extreme weather events, such as heat waves and cold spells, cause substantial excess mortality and morbidity in the vulnerable elderly population and cost billions of dollars. The accurate and reliable assessment of the adverse effects of extreme weather events on human health is crucial for environmental scientists, economists, and public health officials to ensure proper protection of vulnerable populations and efficient allocation of scarce resources. However, the methodology for the analysis of large national databases is yet to be developed. The overarching objective of this dissertation is to examine the effect of extreme weather on the elderly population of the Conterminous US (ConUS) with respect to seasonality in temperature in different climatic regions, utilizing heterogeneous data of high frequency and high spatio-temporal resolution. To achieve these goals the author: 1) incorporated dissimilar stochastic high-frequency big data streams and distinct data types into an integrated database for use in analytical and decision support frameworks; 2) created an automated climate regionalization system based on remote sensing and machine learning to define climate regions for the ConUS; 3) systematically surveyed the current state of the art and identified existing gaps in the scientific knowledge; 4) assessed the dose-response relationship of exposure to temperature extremes on human health in relatively homogeneous climate regions using different statistical models, such as parametric and non-parametric, contemporaneous and asynchronous, applied to the same data; 5) assessed seasonal peak timing and synchronization delay of the exposure and the disease within the framework of contemporaneous high-frequency harmonic time series analysis and modification of the effect by the regional climate; 6) modeled, using a hyperbolic functional form, the non-linear properties of the effect of exposure to extreme temperature on human health. The proposed climate

  5. Quantitative Techniques in Volumetric Analysis

    NASA Astrophysics Data System (ADS)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of it, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; use of a volumetric transfer pipet; a complete acid-base titration; and hand technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray; a robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to ensure quantitative transfer, are often an automated part of an instrumental process that must be understood by the

  6. Quantitative analysis of PET studies.

    PubMed

    Weber, Wolfgang A

    2010-09-01

    Quantitative analysis can be included relatively easily in clinical PET-imaging protocols, but in order to obtain meaningful quantitative results one needs to follow a standardized protocol for image acquisition and data analysis. Important factors to consider are the calibration of the PET scanner, the radiotracer uptake time and the approach for definition of regions of interests. Using such standardized acquisition protocols quantitative parameters of tumor metabolism or receptor status can be derived from tracer kinetic analysis and simplified approaches such as calculation of standardized uptake values (SUVs).
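The standardized uptake value mentioned above has a standard body-weight-normalized definition: tissue activity concentration divided by injected dose per unit body weight. A minimal sketch (the function name and unit choices are illustrative, assuming a tissue density of ~1 g/mL):

```python
def suv_bw(tissue_kbq_per_ml: float, injected_dose_mbq: float,
           body_weight_kg: float) -> float:
    """Body-weight-normalized SUV.

    Assumes tissue density ~1 g/mL, so kBq/mL is treated as kBq/g.
    """
    conc_mbq_per_g = tissue_kbq_per_ml / 1000.0                 # kBq/g -> MBq/g
    dose_per_g = injected_dose_mbq / (body_weight_kg * 1000.0)  # MBq per g of body
    return conc_mbq_per_g / dose_per_g

# e.g. 26.43 kBq/mL in tissue, 370 MBq injected, 70 kg patient -> SUV of about 5
```

The standardization the abstract calls for (scanner calibration, fixed uptake time, consistent ROI definition) matters precisely because every term in this ratio must be measured the same way across scans for SUVs to be comparable.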

  7. A quantitative method for the evaluation of three-dimensional structure of temporal bone pneumatization

    PubMed Central

    Hill, Cheryl A.; Richtsmeier, Joan T.

    2010-01-01

    Temporal bone pneumatization has been included in lists of characters used in phylogenetic analyses of human evolution. While studies suggest that the extent of pneumatization has decreased over the course of human evolution, little is known about the processes underlying these changes or their significance. In short, reasons for the observed reduction and the potential reorganization within pneumatized spaces are unknown. Technological constraints have restricted previous analyses of temporal bone pneumatization in extant and fossil species to qualitative observations of its extent. In this paper, we introduce a novel application of quantitative methods developed for the study of trabecular bone to the analysis of pneumatized spaces of the temporal bone. This method utilizes high-resolution X-ray computed tomography (HRXCT) images and quantitative software to estimate three-dimensional parameters (bone volume fractions, anisotropy, and trabecular thickness) of bone structure within defined units of pneumatized spaces. We apply this approach in an analysis of temporal bones of diverse but related primate species, Gorilla gorilla, Pan troglodytes, Homo sapiens, and Papio hamadryas anubis, to illustrate the potential of these methods. In demonstrating the utility of these methods, we show that there are interspecific differences in the bone structure of pneumatized spaces, perhaps reflecting changes in the localized growth dynamics, location of muscle attachments, encephalization, or basicranial flexion. PMID:18715622

  8. Quantitative analysis in megageomorphology

    NASA Technical Reports Server (NTRS)

    Mayer, L.

    1985-01-01

    Megageomorphology is the study of regional topographic features and their relations to independent geomorphic variables that operate at the regional scale. These independent variables can be classified as either tectonic or climatic in nature. Quantitative megageomorphology stresses the causal relations between plate tectonic factors and landscape features, or correlations between climatic factors and geomorphic processes. In addition, the cumulative effects of tectonics and climate on landscape evolution, which operate simultaneously in a complex system of energy transfer, are of interest. Regional topographic differentiation, say between continents and ocean floors, is largely the result of the different densities and density contrasts within the oceanic and continental lithosphere and their isostatic consequences. Regional tectonic processes that alter these lithospheric characteristics include rifting, collision, subduction, transpression and transtension.

  9. Quantitative iTRAQ-based secretome analysis reveals species-specific and temporal shifts in carbon utilization strategies among manganese(II)-oxidizing Ascomycete fungi

    DOE PAGES

    Zeiner, Carolyn A.; Purvine, Samuel O.; Zink, Erika M.; ...

    2017-07-01

    Fungi generate a wide range of extracellular hydrolytic and oxidative enzymes and reactive metabolites, collectively known as the secretome, that synergistically drive plant litter decomposition in the environment. While secretome studies of model organisms have greatly expanded our knowledge of these enzymes, few have extended secretome characterization to environmental isolates, particularly filamentous Ascomycetes, or directly compared temporal patterns of enzyme utilization among diverse species. Thus, the mechanisms of carbon (C) degradation by many ubiquitous soil fungi remain poorly understood. Here we use a combination of iTRAQ proteomics and extracellular enzyme activity assays to compare the protein composition of the secretomes of four manganese(II)-oxidizing Ascomycete fungi over a three-week time course. We demonstrate that the fungi exhibit striking differences in the regulation of extracellular lignocellulose-degrading enzymes among species and over time, revealing species-specific and temporal shifts in C utilization strategies as they degrade the same substrate. Specifically, our findings suggest that Alternaria alternata SRC1lrK2f and Paraconiothyrium sporulosum AP3s5-JAC2a employ sequential enzyme secretion patterns concomitant with decreasing resource availability. Stagonospora sp. SRC1lsM3a preferentially degrades proteinaceous substrate before switching to carbohydrates, and Pyrenochaeta sp. DS3sAY3a utilizes primarily peptidases to aggressively attack carbon sources in a concentrated burst. In conclusion, this work highlights the diversity of operative metabolic strategies among understudied yet ubiquitous cellulose-degrading Ascomycetes, enhancing our understanding of their contribution to C turnover in the environment.

  10. Quantitative Characterization of Tissue Microstructure with Temporal Diffusion Spectroscopy

    PubMed Central

    Xu, Junzhong; Does, Mark D.; Gore, John C.

    2009-01-01

    The signals recorded by diffusion-weighted magnetic resonance imaging (DWI) are dependent on the micro-structural properties of biological tissues, so it is possible to obtain quantitative structural information non-invasively from such measurements. Oscillating gradient spin echo (OGSE) methods have the ability to probe the behavior of water diffusion over different time scales and the potential to detect variations in intracellular structure. To assist in the interpretation of OGSE data, analytical expressions have been derived for diffusion-weighted signals with OGSE methods for restricted diffusion in some typical structures, including parallel planes, cylinders and spheres, using the theory of temporal diffusion spectroscopy. These analytical predictions have been confirmed with computer simulations. These expressions suggest how OGSE signals from biological tissues should be analyzed to characterize tissue microstructure, including how to estimate cell nuclear sizes. This approach provides a model to interpret diffusion data obtained from OGSE measurements that can be used for applications such as monitoring tumor response to treatment in vivo. PMID:19616979

  11. Neurochemical phenotype of corticocortical connections in the macaque monkey: quantitative analysis of a subset of neurofilament protein-immunoreactive projection neurons in frontal, parietal, temporal, and cingulate cortices

    NASA Technical Reports Server (NTRS)

    Hof, P. R.; Nimchinsky, E. A.; Morrison, J. H.; Bloom, F. E. (Principal Investigator)

    1995-01-01

    The neurochemical characteristics of the neuronal subsets that furnish different types of corticocortical connections have been only partially determined. In recent years, several cytoskeletal proteins have emerged as reliable markers to distinguish subsets of pyramidal neurons in the cerebral cortex of primates. In particular, previous studies using an antibody to nonphosphorylated neurofilament protein (SMI-32) have revealed a consistent degree of regional and laminar specificity in the distribution of a subpopulation of pyramidal cells in the primate cerebral cortex. The density of neurofilament protein-immunoreactive neurons was shown to vary across corticocortical pathways in macaque monkeys. In the present study, we have used the antibody SMI-32 to examine further and to quantify the distribution of a subset of corticocortically projecting neurons in a series of long ipsilateral corticocortical pathways in comparison to short corticocortical, commissural, and limbic connections. The results demonstrate that the long association pathways interconnecting the frontal, parietal, and temporal neocortex have a high representation of neurofilament protein-enriched pyramidal neurons (45-90%), whereas short corticocortical, callosal, and limbic pathways are characterized by much lower numbers of such neurons (4-35%). These data suggest that different types of corticocortical connections have differential representation of highly specific neuronal subsets that share common neurochemical characteristics, thereby determining regional and laminar cortical patterns of morphological and molecular heterogeneity. These differences in neuronal neurochemical phenotype among corticocortical circuits may have considerable influence on cortical processing and may be directly related to the type of integrative function subserved by each cortical pathway. Finally, it is worth noting that neurofilament protein-immunoreactive neurons are dramatically affected in the course of

  13. 3D-catFISH: a system for automated quantitative three-dimensional compartmental analysis of temporal gene transcription activity imaged by fluorescence in situ hybridization.

    PubMed

    Chawla, Monica K; Lin, Gang; Olson, Kathy; Vazdarjanova, Almira; Burke, Sara N; McNaughton, Bruce L; Worley, Paul F; Guzowski, John F; Roysam, Badrinath; Barnes, Carol A

    2004-10-15

    Fluorescence in situ hybridization (FISH) of neural activity-regulated, immediate-early gene (IEG) expression provides a method of functional brain imaging with cellular resolution. This enables the identification, in one brain, of which specific principal neurons were active during each of two distinct behavioral epochs. The unprecedented potential of this differential method for large-scale analysis of functional neural circuits is limited, however, by the time-intensive nature of manual image analysis. A comprehensive software tool for processing three-dimensional, multi-spectral confocal image stacks is described that supports the automation of this analysis. Nuclei counterstained with conventional DNA dyes and FISH signals indicating the sub-cellular distribution of specific IEG RNA species are imaged using different spectral channels. The DNA channel data are segmented into individual nuclei by a three-dimensional multi-step algorithm that corrects for depth-dependent attenuation, non-isotropic voxels, and imaging noise. Intra-nuclear and cytoplasmic FISH signals are associated spatially with the nuclear segmentation results to generate a detailed tabular/database and graphical representation. Here we present a comprehensive validation of data generated by the automated software against manual quantification by human experts on hippocampal and parietal cortical regions (96.5% concordance with multi-expert consensus). The high degree of reliability and accuracy suggests that the software will generalize well to multiple brain areas and eventually to large-scale brain analysis.

  14. Software for quantitative trait analysis.

    PubMed

    Almasy, Laura; Warren, Diane M

    2005-09-01

    This paper provides a brief overview of software currently available for the genetic analysis of quantitative traits in humans. Programs that implement variance components, Markov Chain Monte Carlo (MCMC), Haseman-Elston (H-E) and penetrance model-based linkage analyses are discussed, as are programs for measured genotype association analyses and quantitative trait transmission disequilibrium tests. The software compared includes LINKAGE, FASTLINK, PAP, SOLAR, SEGPATH, ACT, Mx, MERLIN, GENEHUNTER, Loki, Mendel, SAGE, QTDT and FBAT. Where possible, the paper provides URLs for acquiring these programs through the internet, details of the platforms for which the software is available and the types of analyses performed.

  16. A Novel Method of Quantitative Anterior Chamber Depth Estimation Using Temporal Perpendicular Digital Photography

    PubMed Central

    Zamir, Ehud; Kong, George Y.X.; Kowalski, Tanya; Coote, Michael; Ang, Ghee Soon

    2016-01-01

    Purpose We hypothesize that: (1) Anterior chamber depth (ACD) is correlated with the relative anteroposterior position of the pupillary image, as viewed from the temporal side. (2) Such a correlation may be used as a simple quantitative tool for estimation of ACD. Methods Two hundred sixty-six phakic eyes had lateral digital photographs taken from the temporal side, perpendicular to the visual axis, and underwent optical biometry (Nidek AL scanner). The relative anteroposterior position of the pupillary image was expressed as the ratio between: (1) the lateral photographic temporal limbus to pupil distance ("E") and (2) the lateral photographic temporal limbus to cornea distance ("Z"). In the first chronological half of patients (Correlation Series), the E:Z ratio (EZR) was correlated with optical biometric ACD. The correlation equation was then used to predict ACD in the second half of patients (Prediction Series), and the predictions were compared to their biometric ACD for agreement analysis. Results A strong linear correlation was found between EZR and ACD, R = −0.91, R2 = 0.81. Bland-Altman analysis showed good agreement between ACD predicted using this method and the optical biometric ACD. The mean error was −0.013 mm (range −0.377 to 0.336 mm), standard deviation 0.166 mm. The 95% limits of agreement were ±0.33 mm. Conclusions Lateral digital photography and EZR calculation is a novel method to quantitatively estimate ACD, requiring minimal equipment and training. Translational Relevance EZR may be employed in screening for angle closure glaucoma. It may also be helpful in outpatient medical clinic settings, where doctors need to judge the safety of topical or systemic pupil-dilating medications versus their risk of triggering acute angle closure glaucoma. Similarly, non-ophthalmologists may use it to estimate the likelihood of acute angle closure glaucoma in emergency presentations. PMID:27540496

  17. Temporal Instabilities in Amblyopic Perception: A Quantitative Approach.

    PubMed

    Thiel, Aylin; Iftime, Adrian

    2016-04-01

    The purpose of this study is to quantify the temporal characteristics of spatial misperceptions in human amblyopia. Twenty-two adult participants with strabismic, anisometropic, or mixed amblyopia were asked to describe their subjective percept of static geometrical patterns of different spatial frequencies and shapes, as seen with their non-dominant eye. We generated digital reconstructions of their perception (static images or movies) that were subsequently validated by the subjects in consecutive matching sessions. We calculated the Shannon entropy variation in time for each recorded movie, as a measure of temporal instability. Nineteen of the 22 subjects perceived temporal instabilities, which can be broadly classified into two categories. We found that the average frequency of the perceived temporal instabilities is ∼1 Hz. Stimuli with higher spatial frequencies more often yielded temporally unstable percepts, and at higher frequencies. We suggest that the type and amount of temporal instabilities in amblyopic vision are correlated with the etiology and with the spatial frequency of the stimulus.
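The measure described above, Shannon entropy computed frame by frame and tracked over a reconstructed movie, can be sketched along these lines (a sketch assuming intensity values normalized to [0, 1]; the function names and bin count are arbitrary choices, not the study's):

```python
import numpy as np

def frame_entropy(frame: np.ndarray, bins: int = 256) -> float:
    """Shannon entropy (bits) of a frame's intensity histogram."""
    hist, _ = np.histogram(frame, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]                      # treat 0 * log(0) as 0
    return float(-(p * np.log2(p)).sum())

def entropy_variation(movie: np.ndarray) -> tuple[np.ndarray, float]:
    """Per-frame entropies and their standard deviation over time,
    for a movie array shaped (frames, height, width)."""
    h = np.array([frame_entropy(f) for f in movie])
    return h, float(h.std())
```

A perfectly stable percept yields a flat entropy trace (standard deviation 0), while a temporally unstable percept produces an oscillating trace whose dominant frequency can then be estimated, e.g. from its Fourier spectrum.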

  18. Quantitative evaluation of medial temporal lobe morphology in children with febrile status epilepticus: Results of the FEBSTAT study

    PubMed Central

    McClelland, Andrew C.; Gomes, William A.; Shinnar, Shlomo; Hesdorffer, Dale C; Bagiella, Emilia; Lewis, Darrell V; Bello, Jacqueline A; Chan, Stephen; MacFall, James; Chen, Min; Pellock, John M; Nordli, Douglas R; Frank, L. Mathew; Moshe, Solomon; Shinnar, Ruth C; Sun, Shumei

    2016-01-01

    Purpose The pathogenesis of febrile status epilepticus is poorly understood, but prior studies suggest an association with temporal lobe abnormalities including hippocampal malrotation. We employ a quantitative morphometric method to assess the association between temporal lobe morphology and febrile status epilepticus. Methods Brain MR imaging was obtained in children presenting with febrile status epilepticus and control subjects as part of the FEBSTAT study. Medial temporal lobe morphologic parameters were measured manually, including distance of the hippocampus from midline, hippocampal height:width ratio, hippocampal angle, collateral sulcus angle, and width of the temporal horn. Results Temporal lobe morphologic parameters are correlated with the presence of visual hippocampal malrotation; the strongest association is with left temporal horn width (p < 0.001, Adjusted OR 10.59). Multiple morphologic parameters correlate with febrile status epilepticus, encompassing both right and left sides. This association is statistically strongest in the right temporal lobe, whereas hippocampal malrotation is almost exclusively left-sided in this cohort. The association between temporal lobe measurements and febrile status epilepticus persists when the analysis is restricted to cases with visually normal imaging without hippocampal malrotation or other visually apparent abnormality. Conclusions Several component morphological features of hippocampal malrotation are independently associated with febrile status epilepticus, even when complete hippocampal malrotation is absent. Unexpectedly, this association predominantly involves the right temporal lobe. These findings suggest that a spectrum of bilateral temporal lobe anomalies is associated with febrile status epilepticus in children. Hippocampal malrotation may represent a visually apparent subset of this spectrum. PMID:27633809

  19. Data from selective harvests underestimate temporal trends in quantitative traits.

    PubMed

    Pelletier, Fanie; Festa-Bianchet, Marco; Jorgenson, Jon T

    2012-10-23

    Human harvests can select against phenotypes favoured by natural selection, and natural resource managers should evaluate possible artificial selection on wild populations. Because the required genetic data are extremely difficult to gather, however, managers typically rely on harvested animals to document temporal trends. It is usually unknown whether these data are unbiased. We explore our ability to detect a decline in horn size of bighorn sheep (Ovis canadensis) by comparing harvested males with all males in a population where evolutionary changes owing to trophy hunting were previously reported. Hunting records underestimated the temporal decline, partly because of an increasing proportion of rams that could not be harvested because their horns were smaller than the threshold set by hunting regulations. If harvests are selective, temporal trends measured from harvest records will underestimate the magnitude of changes in wild populations.
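The bias the authors describe, harvest records under a size threshold understating a true phenotypic decline, is easy to reproduce in a small Monte Carlo simulation (all numbers below are invented for illustration, not bighorn data):

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(30)
true_slope = -0.5        # assumed true decline in horn size per year
threshold = 85.0         # assumed legal harvest threshold

pop_means, harvest_means = [], []
for t in years:
    horns = rng.normal(100.0 + true_slope * t, 8.0, 500)
    pop_means.append(horns.mean())
    # Only rams above the threshold can be harvested, truncating the sample.
    harvest_means.append(horns[horns >= threshold].mean())

pop_trend = np.polyfit(years, pop_means, 1)[0]          # close to the true -0.5
harvest_trend = np.polyfit(years, harvest_means, 1)[0]  # shallower (less negative)
```

As the population mean approaches the threshold, an increasing proportion of rams is excluded from harvest, so the harvested subsample is truncated from below and its trend is flatter than the population's, which is exactly the mechanism the abstract identifies.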

  20. Temporal Expression-based Analysis of Metabolism

    PubMed Central

    Segrè, Daniel

    2012-01-01

    Metabolic flux is frequently rerouted through cellular metabolism in response to dynamic changes in the intra- and extra-cellular environment. Capturing the mechanisms underlying these metabolic transitions in quantitative and predictive models is a prominent challenge in systems biology. Progress in this regard has been made by integrating high-throughput gene expression data into genome-scale stoichiometric models of metabolism. Here, we extend previous approaches to perform a Temporal Expression-based Analysis of Metabolism (TEAM). We apply TEAM to understanding the complex metabolic dynamics of the respiratorily versatile bacterium Shewanella oneidensis grown under aerobic, lactate-limited conditions. TEAM predicts temporal metabolic flux distributions using time-series gene expression data. Increased predictive power is achieved by supplementing these data with a large reference compendium of gene expression, which allows us to take into account the unique character of the distribution of expression of each individual gene. We further propose a straightforward method for studying the sensitivity of TEAM to changes in its fundamental free threshold parameter θ, and reveal that discrete zones of distinct metabolic behavior arise as this parameter is changed. By comparing the qualitative characteristics of these zones to additional experimental data, we are able to constrain the range of θ to a small, well-defined interval. In parallel, the sensitivity analysis reveals the inherently difficult nature of dynamic metabolic flux modeling: small errors early in the simulation propagate to relatively large changes later in the simulation. We expect that handling such “history-dependent” sensitivities will be a major challenge in the future development of dynamic metabolic-modeling techniques. PMID:23209390

  1. Sobering up: A Quantitative Review of Temporal Declines in Expectations

    ERIC Educational Resources Information Center

    Sweeny, Kate; Krizan, Zlatan

    2013-01-01

    Although people's outlook on the future tends to be characterized by hope and optimism, over time this outlook often becomes more dire. We review multiple theoretical accounts of this tendency to "sober up" as feedback about outcomes draws near, and we explicate factors critical to promoting these temporal declines in expectations. We then…

  3. Anatomic and Quantitative Temporal Bone CT for Preoperative Assessment of Branchio-Oto-Renal Syndrome.

    PubMed

    Ginat, D T; Ferro, L; Gluth, M B

    2016-12-01

    We describe the temporal bone computed tomography (CT) findings of an unusual case of branchio-oto-renal syndrome with ectopic ossicles that are partially located in the middle cranial fossa. We also describe quantitative temporal bone CT assessment pertaining to cochlear implantation in the setting of anomalous cochlear anatomy associated with this syndrome.

  4. Quantitative analysis of glycated proteins.

    PubMed

    Priego-Capote, Feliciano; Ramírez-Boo, María; Finamore, Francesco; Gluck, Florent; Sanchez, Jean-Charles

    2014-02-07

    The proposed protocol presents a comprehensive approach for large-scale qualitative and quantitative analysis of glycated proteins (GP) in complex biological samples including biological fluids and cell lysates such as plasma and red blood cells. The method, named glycation isotopic labeling (GIL), is based on the differential labeling of proteins with isotopic [(13)C6]-glucose, which supports quantitation of the resulting glycated peptides after enzymatic digestion with endoproteinase Glu-C. The key principle of the GIL approach is the detection of doublet signals for each glycated peptide in MS precursor scanning (glycated peptide with in vivo [(12)C6]- and in vitro [(13)C6]-glucose). The mass shift of the doublet signals is +6, +3 or +2 Da depending on the peptide charge state and the number of glycation sites. The intensity ratio between doublet signals generates quantitative information of glycated proteins that can be related to the glycemic state of the studied samples. Tandem mass spectrometry with high-energy collisional dissociation (HCD-MS2) and data-dependent methods with collision-induced dissociation (CID-MS3 neutral loss scan) are used for qualitative analysis.
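
    The charge-dependent doublet spacings quoted above follow directly from the 13C/12C mass difference; a minimal calculation using standard atomic masses:

```python
M_12C, M_13C = 12.0, 13.003355   # atomic masses in Da

def doublet_spacing(n_sites, charge):
    """m/z gap between the in vivo [12C6]- and in vitro [13C6]-glucose forms:
    each glycation site contributes six carbon substitutions."""
    return n_sites * 6 * (M_13C - M_12C) / charge

# One glycation site: ~6 Da at z=1, ~3 at z=2, ~2 at z=3, as in the abstract
```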

  5. Bioimaging for quantitative phenotype analysis.

    PubMed

    Chen, Weiyang; Xia, Xian; Huang, Yi; Chen, Xingwei; Han, Jing-Dong J

    2016-06-01

    With the development of bio-imaging techniques, an increasing number of studies apply these techniques to generate a myriad of image data. Applications range from quantification of cellular, tissue, organismal and behavioral phenotypes of model organisms, to human facial phenotypes. The bio-imaging approaches to automatically detect, quantify, and profile phenotypic changes related to specific biological questions open new doors to studying phenotype-genotype associations and to precisely evaluating molecular changes associated with quantitative phenotypes. Here, we review major applications of bioimage-based quantitative phenotype analysis. Specifically, we describe the biological questions and experimental needs addressable by these analyses, computational techniques and tools that are available in these contexts, and the new perspectives on phenotype-genotype association uncovered by such analyses.

  6. Quantitative analysis of retinal OCT.

    PubMed

    Sonka, Milan; Abràmoff, Michael D

    2016-10-01

    Clinical acceptance of 3-D OCT retinal imaging brought rapid development of quantitative 3-D analysis of retinal layers, vasculature, and retinal lesions, and facilitated new research in retinal diseases. One of the cornerstones of many such analyses is segmentation and thickness quantification of retinal layers and the choroid, with an inherently 3-D simultaneous multi-layer LOGISMOS (Layered Optimal Graph Image Segmentation for Multiple Objects and Surfaces) segmentation approach being extremely well suited for the task. Once retinal layers are segmented, regional thickness, brightness, or texture-based indices of individual layers can be easily determined and thus contribute to our understanding of retinal or optic nerve head (ONH) disease processes and can be employed for determination of disease status, treatment responses, visual function, etc. Out of many applications, examples provided in this paper focus on image-guided therapy and outcome prediction in age-related macular degeneration and on assessing visual function from retinal layer structure in glaucoma.

  7. Quantitative Evaluation of Medial Temporal Lobe Morphology in Children with Febrile Status Epilepticus: Results of the FEBSTAT Study.

    PubMed

    McClelland, A C; Gomes, W A; Shinnar, S; Hesdorffer, D C; Bagiella, E; Lewis, D V; Bello, J A; Chan, S; MacFall, J; Chen, M; Pellock, J M; Nordli, D R; Frank, L M; Moshé, S L; Shinnar, R C; Sun, S

    2016-12-01

    The pathogenesis of febrile status epilepticus is poorly understood, but prior studies have suggested an association with temporal lobe abnormalities, including hippocampal malrotation. We used a quantitative morphometric method to assess the association between temporal lobe morphology and febrile status epilepticus. Brain MR imaging was performed in children presenting with febrile status epilepticus and control subjects as part of the Consequences of Prolonged Febrile Seizures in Childhood study. Medial temporal lobe morphologic parameters were measured manually, including the distance of the hippocampus from the midline, hippocampal height:width ratio, hippocampal angle, collateral sulcus angle, and width of the temporal horn. Temporal lobe morphologic parameters were correlated with the presence of visual hippocampal malrotation; the strongest association was with left temporal horn width (P < .001; adjusted OR, 10.59). Multiple morphologic parameters correlated with febrile status epilepticus, encompassing both the right and left sides. This association was statistically strongest in the right temporal lobe, whereas hippocampal malrotation was almost exclusively left-sided in this cohort. The association between temporal lobe measurements and febrile status epilepticus persisted when the analysis was restricted to cases with visually normal imaging findings without hippocampal malrotation or other visually apparent abnormalities. Several component morphologic features of hippocampal malrotation are independently associated with febrile status epilepticus, even when complete hippocampal malrotation is absent. Unexpectedly, this association predominantly involves the right temporal lobe. These findings suggest that a spectrum of bilateral temporal lobe anomalies are associated with febrile status epilepticus in children. Hippocampal malrotation may represent a visually apparent subset of this spectrum. © 2016 by American Journal of Neuroradiology.

  8. Quantitative analysis of endogenous compounds.

    PubMed

    Thakare, Rhishikesh; Chhonker, Yashpal S; Gautam, Nagsen; Alamoudi, Jawaher Abdullah; Alnouti, Yazen

    2016-09-05

    Accurate quantitative analysis of endogenous analytes is essential for several clinical and non-clinical applications. LC-MS/MS is the technique of choice for quantitative analyses. Absolute quantification by LC/MS requires preparing standard curves in the same matrix as the study samples so that the matrix effect and the extraction efficiency for analytes are the same in both the standard and study samples. However, by definition, analyte-free biological matrices do not exist for endogenous compounds. To address the lack of blank matrices for the quantification of endogenous compounds by LC-MS/MS, four approaches are used: standard addition, background subtraction, surrogate matrix, and surrogate analyte. This review article presents an overview of these approaches, cites and summarizes their applications, and compares their advantages and disadvantages. In addition, we discuss in detail the validation requirements and compatibility with FDA guidelines to ensure method reliability in quantifying endogenous compounds. The standard addition, background subtraction, and surrogate analyte approaches allow the use of the same matrix for the calibration curve as the one to be analyzed in the test samples. However, in the surrogate matrix approach, various matrices such as artificial, stripped, and neat matrices are used as surrogate matrices for the actual matrix of study samples. For the surrogate analyte approach, it is required to demonstrate similarity in matrix effect and recovery between surrogate and authentic endogenous analytes. Similarly, for the surrogate matrix approach, it is required to demonstrate similar matrix effect and extraction recovery in both the surrogate and original matrices. All these methods represent indirect approaches to quantifying endogenous compounds, and regardless of which approach is followed, it has to be shown that none of the validation criteria have been compromised due to the indirect analyses.
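
    The standard addition approach mentioned above can be sketched as a regression whose x-intercept gives the endogenous concentration (illustrative numbers, not data from the article):

```python
import numpy as np

# Spike known analyte amounts into the study matrix itself, fit signal vs.
# added concentration, and read the endogenous level off the x-intercept.
added = np.array([0.0, 5.0, 10.0, 20.0])     # spiked concentration, ng/mL
signal = np.array([4.0, 6.0, 8.0, 12.0])     # measured instrument response
slope, intercept = np.polyfit(added, signal, 1)
endogenous = intercept / slope               # x-intercept magnitude: 10 ng/mL
```

    Because the curve is built in the actual sample matrix, matrix effect and extraction efficiency are identical for calibration and study samples, which is the method's main appeal.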

  9. Visual Constraints for the Perception of Quantitative Depth from Temporal Interocular Unmatched Features

    PubMed Central

    Ni, Rui; Chen, Lin; Andersen, George J.

    2010-01-01

    Previous research (Brooks & Gillam, 2006) has found that temporal interocular unmatched (IOUM) features generate a perception of subjective contours and can result in a perception of quantitative depth. In the present study we examine in detail the factors important for quantitative depth perception from IOUM features. In Experiments 1 and 2 observers were shown temporal IOUM features based on three dots that disappeared behind an implicit surface. Subjects reported a perception of a subjective surface and were able to perceive qualitative depth. In Experiments 3 and 4 metrical depth was perceived when binocular disparity features were added to the display. These results suggest that quantitative depth from IOUM information is perceived when binocular matched information is present in regions adjacent to the surface. In addition, the perceived depth of the subjective surface decreased with an increase in the width of the subjective surface suggesting a limitation in the propagation of quantitative depth to surface regions where qualitative depth information is available. PMID:20493899

  10. Quantitative spatiotemporal image analysis of fluorescein angiography in age-related macular degeneration

    NASA Astrophysics Data System (ADS)

    Berger, Jeffrey W.

    1998-06-01

    Interpretation and analysis of retinal angiographic studies has been largely qualitative. Quantitative analysis of pathologic fundus features will facilitate interpretation and potentiate clinical studies where precise image metrology is vital. Fluorescein angiography studies of patients with age- related macular degeneration were digitized. Sequential temporal images were spatially-registered with polynomial warping algorithms, allowing for the construction of a three- dimensional (two spatial and one temporal) angiogram vector. Temporal profiles through spatially-registered, temporally- sequential pixels were computed. Characteristic temporal profiles for fundus background, retinal vasculature, retinal pigment epithelial atrophy, and choroidal neovascular (CNV) membranes were observed, allowing for pixel assignment and fundus feature quantitation. Segmentation and quantitation of fundus features including geographic atrophy and CNV is facilitated by spatio-temporal image analysis.
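
    The core data structure described here, spatially registered frames stacked along time so that temporal profiles are per-pixel slices, can be sketched with toy data:

```python
import numpy as np

# After spatial registration, sequential frames form a (T, H, W) array and
# the temporal profile through any registered pixel is a slice along axis 0.
# Toy data: one pixel brightens over time, as a leaking CNV lesion might.
frames = np.zeros((5, 4, 4))
for t in range(5):
    frames[t, 2, 2] = 10.0 * t

profile = frames[:, 2, 2]    # temporal intensity profile at pixel (2, 2)
```

    Classifying pixels then reduces to comparing each profile against the characteristic temporal shapes (background, vessel, atrophy, CNV) described in the abstract.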

  11. Auditory spectro-temporal pattern analysis

    NASA Astrophysics Data System (ADS)

    Hall, Joseph W.

    1992-05-01

    The long-term aim of this project is a better understanding of auditory processes that use across-frequency or across-ear temporal envelope difference cues to aid performance. Specific areas of investigation include comodulation masking release (CMR), the masking-level difference (MLD), temporal resolution, and the processing of amplitude and frequency modulation. The goal of the first proposed experiment is to examine the possible relation between CMR and auditory phenomena related to auditory scene analysis; the goal of the second experiment is to examine the possible relation between CMR and the MLD for narrowband noise maskers; the goal of the third experiment is to determine the extent to which across-frequency correlation of temporal envelope may influence gap detection for wideband stimuli; the goal of the fourth experiment is to determine whether masking release can be derived from cues based upon across-frequency coherence of frequency modulation; and the goal of the fifth experiment is to examine a modulation masking phenomenon related to frequency modulation. The tasks will be signal detection in masking noise, temporal gap detection, and the detection of frequency modulation. Testing will be performed in a sound-treated room, using a 3AFC adaptive procedure.
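
    The abstract names a 3AFC adaptive procedure without specifying the rule; a common choice is a 2-down/1-up staircase, sketched here with hypothetical level and step values:

```python
def staircase(responses, start=60.0, step=2.0):
    """Track the stimulus level under a 2-down/1-up adaptive rule.

    responses: iterable of bools (True = correct trial).
    Returns the starting level followed by the level after each trial.
    """
    level, run, levels = start, 0, [start]
    for correct in responses:
        if correct:
            run += 1
            if run == 2:     # two consecutive correct -> make the task harder
                level -= step
                run = 0
        else:                # any error -> make the task easier
            level += step
            run = 0
        levels.append(level)
    return levels
```

    A 2-down/1-up rule converges toward the level yielding roughly 70.7% correct; averaging the last several reversal levels gives the threshold estimate.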

  12. Quantitative analysis of glycoprotein glycans.

    PubMed

    Orlando, Ron

    2013-01-01

    The ability to quantitatively determine changes in the N- and O-linked glycans is an essential component of comparative glycomics. Multiple strategies are available by which this can be accomplished, including both label-free approaches and isotopic labeling strategies. The focus of this chapter is to describe each of these approaches while providing insight into their strengths and weaknesses, so that glycomic investigators can make an educated choice of the strategy that is best suited for their particular application.

  13. Issues in Quantitative Analysis of Ultraviolet Imager (UV) Data: Airglow

    NASA Technical Reports Server (NTRS)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  15. Quantitative Analysis of Face Symmetry.

    PubMed

    Tamir, Abraham

    2015-06-01

    The major objective of this article was to report quantitatively the degree of human face symmetry for images taken from the Internet. From the original image of a certain person that appears in the center of each triplet, 2 symmetric combinations were constructed that are based on the left part of the image and its mirror image (left-left) and on the right part of the image and its mirror image (right-right). By applying computer software that enables determination of the length, surface area, and perimeter of any geometric shape, the following measurements were obtained for each triplet: face perimeter and area; distance between the pupils; mouth length; its perimeter and area; nose length and face length, usually below the ears; as well as the area and perimeter of the pupils. Then, for each of the above measurements, the value C, which characterizes the degree of symmetry of the real image with respect to the combinations right-right and left-left, was calculated. C appears on the right-hand side below each image. A high value of C indicates low symmetry, and as the value decreases, the symmetry increases. The magnitude on the left relates to the pupils and compares the difference between the area and perimeter of the 2 pupils. The major conclusion is that the human face is asymmetric to some degree; the degree of asymmetry is reported quantitatively under each portrait.
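
    The left-left and right-right combinations described above can be sketched for a greyscale image array (a minimal illustration, not the software used in the article):

```python
import numpy as np

def composites(img):
    """Build left-left and right-right mirror composites of a greyscale image
    with an even number of columns."""
    _, w = img.shape
    left, right = img[:, :w // 2], img[:, w // 2:]
    ll = np.hstack([left, left[:, ::-1]])      # left half + its mirror image
    rr = np.hstack([right[:, ::-1], right])    # mirror of right half + right half
    return ll, rr

# A 1x4 toy "image" makes the construction easy to check by eye:
ll, rr = composites(np.array([[1, 2, 3, 4]]))
```

    Symmetry indices like C can then be formed by comparing measurements (areas, perimeters, distances) between the original image and each composite.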

  16. Visual constraints for the perception of quantitative depth from temporal interocular unmatched features.

    PubMed

    Ni, Rui; Chen, Lin; Andersen, George J

    2010-07-21

    Previous research (Brooks & Gillam, 2006) has found that temporal interocular unmatched (IOUM) features generate a perception of subjective contours and can result in a perception of quantitative depth. In the present study we examine in detail the factors important for quantitative depth perception from IOUM features. In Experiments 1 and 2 observers were shown temporal IOUM features based on three dots that disappeared behind an implicit surface. Subjects reported a perception of a subjective surface and were able to perceive qualitative depth. In Experiments 3 and 4 metrical depth was perceived when binocular disparity features were added to the display. These results suggest that quantitative depth from IOUM information is perceived when binocular matched information is present in regions adjacent to the surface. In addition, the perceived depth of the subjective surface decreased with an increase in the width of the subjective surface suggesting a limitation in the propagation of quantitative depth to surface regions where qualitative depth information is available. Copyright 2010 Elsevier Ltd. All rights reserved.

  17. Quantitative analysis of qualitative images

    NASA Astrophysics Data System (ADS)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence that demonstrates artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  18. Sensitivity analysis in quantitative microbial risk assessment.

    PubMed

    Zwieterin, M H; van Gerwen, S J

    2000-07-15

    The occurrence of foodborne disease remains a widespread problem in both the developing and the developed world. A systematic and quantitative evaluation of food safety is important to control the risk of foodborne diseases. World-wide, many initiatives are being taken to develop quantitative risk analysis. However, the quantitative evaluation of food safety in all its aspects is very complex, especially since in many cases specific parameter values are not available. Often many variables have large statistical variability while the quantitative effect of various phenomena is unknown. Therefore, sensitivity analysis can be a useful tool to determine the main risk-determining phenomena, as well as the aspects that mainly determine the inaccuracy in the risk estimate. This paper presents three stages of sensitivity analysis. First, deterministic analysis selects the most relevant determinants for risk. A second, worst-case analysis prevents exceptional but relevant cases from being overlooked; it finds relevant process steps in worst-case situations and shows how variations in factors affect risk. The third, stochastic analysis studies how variations in factors affect the variability of risk estimates. Care must be taken that the assumptions made as well as the results are clearly communicated. Stochastic risk estimates are, like deterministic ones, just as good (or bad) as the available data, and the stochastic analysis must not be used to mask lack of information. Sensitivity analysis is a valuable tool in quantitative risk assessment by determining critical aspects and effects of variations.
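
    The first, deterministic stage can be sketched as one-at-a-time perturbation of a toy log10-scale risk model (the factors and values are hypothetical, not from the paper):

```python
def log10_risk(factors):
    """Toy microbial risk on a log10 scale: initial load plus growth
    minus inactivation during processing."""
    return factors["initial_load"] + factors["growth"] - factors["inactivation"]

base = {"initial_load": 2.0, "growth": 3.0, "inactivation": 6.0}

# One-at-a-time: perturb each factor by 10% and record the change in risk.
effects = {}
for name in base:
    perturbed = dict(base, **{name: base[name] * 1.1})
    effects[name] = abs(log10_risk(perturbed) - log10_risk(base))

ranking = sorted(effects, key=effects.get, reverse=True)
# The largest factor dominates the ranking here: inactivation
```

    Worst-case and stochastic stages would replace the single perturbation with extreme values and with sampled distributions, respectively.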

  19. Time series analysis of temporal networks

    NASA Astrophysics Data System (ADS)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without the knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper, we show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them, and, using a standard forecast model of time series, predict the properties of a temporal network at a later time instance. To this aim, we consider eight properties such as number of active nodes, average degree, and clustering coefficient, and apply our prediction framework on them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks.
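
    The core idea, mapping network snapshots to a property time series and forecasting it, can be sketched with toy data; a least-squares AR(1) fit stands in for the paper's ARIMA model:

```python
import numpy as np

# Toy temporal network: one edge set per time step over 4 nodes.
snapshots = [
    {(0, 1), (1, 2)},
    {(0, 1), (1, 2), (2, 3)},
    {(0, 1), (1, 2), (2, 3), (3, 0)},
    {(0, 1), (2, 3), (3, 0)},
]
n_nodes = 4

# Map each snapshot to a scalar property: average degree = 2|E| / |V|.
avg_deg = np.array([2 * len(edges) / n_nodes for edges in snapshots])

# Minimal AR(1) fit by least squares (no intercept), then a one-step forecast.
x, y = avg_deg[:-1], avg_deg[1:]
phi = (x @ y) / (x @ x)
forecast = phi * avg_deg[-1]
```

    The same mapping applies to any of the eight properties mentioned above; only the per-snapshot function changes.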

  20. Runtime Analysis of Linear Temporal Logic Specifications

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Havelund, Klaus

    2001-01-01

    This report presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL-to-Büchi-automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
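
    For a single property, a finite-trace check can also be written directly, without constructing an automaton; a sketch for G(p -> F q), "every request is eventually granted", with hypothetical state labels (this illustrates finite-trace LTL semantics, not the JPaX translation algorithm):

```python
def globally_leads_to(trace, p, q):
    """Finite-trace check of G(p -> F q): every state satisfying p must be
    followed (or accompanied) by a state satisfying q before the trace ends."""
    pending = False
    for state in trace:
        if p(state):
            pending = True       # an obligation is now open
        if q(state):
            pending = False      # the obligation is discharged
    return not pending           # no obligation may remain open at trace end

trace = ["req", "idle", "grant", "req", "grant"]
ok = globally_leads_to(trace, lambda s: s == "req", lambda s: s == "grant")
```

    An automaton-based observer generalizes this pattern: the translated automaton tracks such open obligations as states and rejects any finite trace that ends with one unresolved.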

  2. Quantitative MRI in refractory temporal lobe epilepsy: relationship with surgical outcomes

    PubMed Central

    Bonilha, Leonardo

    2015-01-01

    Medically intractable temporal lobe epilepsy (TLE) remains a serious health problem. Across treatment centers, up to 40% of patients with TLE will continue to experience persistent postoperative seizures at 2-year follow-up. It is unknown why such a large number of patients continue to experience seizures despite being suitable candidates for resective surgery. Preoperative quantitative MRI techniques may provide useful information on why some patients continue to experience disabling seizures, and may have the potential to develop prognostic markers of surgical outcome. In this article, we provide an overview of how quantitative MRI morphometric and diffusion tensor imaging (DTI) data have improved the understanding of brain structural alterations in patients with refractory TLE. We subsequently review the studies that have applied quantitative structural imaging techniques to identify the neuroanatomical factors that are most strongly related to a poor postoperative prognosis. In summary, quantitative imaging studies strongly suggest that TLE is a disorder affecting a network of neurobiological systems, characterized by multiple and inter-related limbic and extra-limbic network abnormalities. The relationship between brain alterations and postoperative outcome are less consistent, but there is emerging evidence suggesting that seizures are less likely to remit with surgery when presurgical abnormalities are observed in the connectivity supporting brain regions serving as network nodes located outside the resected temporal lobe. Future work, possibly harnessing the potential from multimodal imaging approaches, may further elucidate the etiology of persistent postoperative seizures in patients with refractory TLE. Furthermore, quantitative imaging techniques may be explored to provide individualized measures of postoperative seizure freedom outcome. PMID:25853080

  3. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed.
    Program summary
    Title of program: HAWGC
    Catalogue identifier: ADXG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computers: Mobile Intel Pentium III, AMD Duron
    Installations: No installation necessary; executable file together with necessary files for LabVIEW Run-time engine
    Operating systems or monitors under which the program has been tested: Windows ME/2000/XP
    Programming language used: LabVIEW 7.0
    Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
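
    The greyscale conversion and histogram moments computed by the program can be sketched in a few lines (the program's default conversion coefficients are not given in the abstract; Rec. 601 luma weights are assumed here):

```python
import numpy as np

def to_grey(rgb, weights=(0.299, 0.587, 0.114)):
    """RGB -> greyscale with selectable conversion coefficients."""
    return np.asarray(rgb, dtype=float) @ np.asarray(weights)

def histogram_stats(values):
    """Mean, standard deviation, skewness and kurtosis of pixel brightness."""
    v = np.asarray(values, dtype=float).ravel()
    mu, sd = v.mean(), v.std()
    return {"mean": mu, "std": sd,
            "skew": ((v - mu) ** 3).mean() / sd ** 3,
            "kurtosis": ((v - mu) ** 4).mean() / sd ** 4}
```

    Thresholding (e.g., background removal) amounts to masking `values` before calling `histogram_stats`, which is how ROI and threshold selections feed the same statistics.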

  4. Quantitative multivariate analysis of dynamic multicellular morphogenic trajectories.

    PubMed

    White, Douglas E; Sylvester, Jonathan B; Levario, Thomas J; Lu, Hang; Streelman, J Todd; McDevitt, Todd C; Kemp, Melissa L

    2015-07-01

    Interrogating fundamental cell biology principles that govern tissue morphogenesis is critical to better understanding of developmental biology and engineering novel multicellular systems. Recently, functional micro-tissues derived from pluripotent embryonic stem cell (ESC) aggregates have provided novel platforms for experimental investigation; however, elucidating the factors directing emergent spatial phenotypic patterns remains a significant challenge. Computational modelling techniques offer a unique complementary approach to probe mechanisms regulating morphogenic processes and provide a wealth of spatio-temporal data, but quantitative analysis of simulations and comparison to experimental data is extremely difficult. Quantitative descriptions of spatial phenomena across multiple systems and scales would enable unprecedented comparisons of computational simulations with experimental systems, thereby leveraging the inherent power of computational methods to interrogate the mechanisms governing emergent properties of multicellular biology. To address these challenges, we developed a portable pattern recognition pipeline consisting of: the conversion of cellular images into networks, extraction of novel features via network analysis, and generation of morphogenic trajectories. This novel methodology enabled the quantitative description of morphogenic pattern trajectories that could be compared across diverse systems: computational modelling of multicellular structures, differentiation of stem cell aggregates, and gastrulation of cichlid fish. Moreover, this method identified novel spatio-temporal features associated with different stages of embryo gastrulation, and elucidated a complex paracrine mechanism capable of explaining spatiotemporal pattern kinetic differences in ESC aggregates of different sizes.
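
    The first pipeline step, converting cellular images into networks and extracting a feature, can be sketched by connecting cell centroids below a distance threshold (an illustration of the idea, not the paper's actual feature set):

```python
import numpy as np

def mean_degree(centroids, radius):
    """Connect cell centroids closer than `radius`; return the mean node degree,
    a simple network feature of the spatial pattern."""
    pts = np.asarray(centroids, dtype=float)
    # Pairwise Euclidean distances via broadcasting
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    adj = (d < radius) & ~np.eye(len(pts), dtype=bool)   # no self-loops
    return adj.sum(axis=1).mean()

cells = [(0.0, 0.0), (1.0, 0.0), (5.0, 0.0)]   # toy centroid coordinates
feature = mean_degree(cells, radius=2.0)        # only the first two cells touch
```

    Tracking such features over time yields the morphogenic trajectories the pipeline compares across simulations, ESC aggregates, and embryos.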

  5. Falcon: A Temporal Visual Analysis System

    SciTech Connect

    Steed, Chad A.

    2016-09-05

    Flexible visual exploration of long, high-resolution time series from multiple sensor streams is a challenge in several domains. Falcon is a visual analytics approach that helps researchers acquire a deep understanding of patterns in log and imagery data. Falcon allows users to interactively explore large, time-oriented data sets from multiple linked perspectives, providing overviews, detailed views, and unique segmented time series visualizations with multiple levels of detail. These capabilities are applicable to the analysis of any quantitative time series.

  6. Temporal analysis of multispectral scanner data.

    NASA Technical Reports Server (NTRS)

    Richardson, A. J.; Wiegand, C. L.; Torline, R. J.

    1973-01-01

    Multispectral scanner reflectance data were sampled for bare soil, cotton, sorghum, corn, and citrus at four dates during a growing season (April, May, June, and July 1969) to develop a time-dependent signature for crop and soil discrimination. Discrimination tests were conducted for single-date and multidate formats using training and test data sets. For classifications containing several crops, the multidate or temporal approach improved discrimination compared with the single-date approach. The multidate approach also preserved recognition accuracy better in going from training fields to test fields than the single-date analysis. The spectral distinctiveness of bare soil versus vegetation resulted in essentially equal discrimination using single-date versus multidate data for those two categories.

  8. Mobile app-based quantitative scanometric analysis.

    PubMed

    Wong, Jessica X H; Liu, Frank S F; Yu, Hua-Zhong

    2014-12-16

    The feasibility of using smartphones and other mobile devices as the detection platform for quantitative scanometric assays is demonstrated. The different scanning modes (color, grayscale, black/white) and grayscale conversion protocols (average, weighted average/luminosity, and software specific) have been compared in determining the optical darkness ratio (ODR) values, a conventional quantitation measure for scanometric assays. A mobile app was developed to image and analyze scanometric assays, as demonstrated by paper-printed tests and a biotin-streptavidin assay on a plastic substrate. Primarily for ODR analysis, the app has been shown to perform as well as a traditional desktop scanner, confirming that smartphones (and other mobile devices) promise to be a practical platform for accurate, quantitative chemical analysis and medical diagnostics.
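The grayscale conversion and ODR computation described above can be sketched as follows. The Rec. 601 luminosity weights are standard, but the precise ODR definition used here (spot darkness over background darkness) is an illustrative assumption, not necessarily the paper's exact formula:

```python
def luminosity(r, g, b):
    # Weighted-average ("luminosity") grayscale conversion, Rec. 601 weights
    return 0.299 * r + 0.587 * g + 0.114 * b

def optical_darkness_ratio(spot_rgb, background_rgb):
    """Illustrative ODR: darkness (255 - gray level) of the assay spot
    relative to the darkness of a non-white local background patch."""
    spot_darkness = 255.0 - luminosity(*spot_rgb)
    background_darkness = 255.0 - luminosity(*background_rgb)
    return spot_darkness / background_darkness
```

A dark spot on a light background yields an ODR well above 1, while a blank spot whose gray level matches the background approaches 1.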

  9. Quantitative WDS analysis using electron probe microanalyzer

    SciTech Connect

    Ul-Hamid, Anwar (E-mail: anwar@kfupm.edu.sa); Tawancy, Hani M.; Mohammed, Abdul-Rashid I.; Al-Jaroudi, Said S.; Abbas, Nureddin M.

    2006-04-15

    In this paper, the procedure for conducting quantitative elemental analysis by the ZAF correction method using wavelength dispersive X-ray spectroscopy (WDS) in an electron probe microanalyzer (EPMA) is elaborated. Analysis of a thermal barrier coating (TBC) system formed on a Ni-based single crystal superalloy is presented as an example to illustrate the analysis of samples consisting of a large number of major and minor elements. The analysis was performed using known standards and measured peak-to-background intensity ratios. The procedure for using separate sets of acquisition conditions for major and minor element analysis is explained and its importance is stressed.
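At its core, standards-based ZAF quantitation compares a characteristic X-ray intensity in the sample against the same line in a known standard and corrects the ratio for matrix effects. A first-order sketch, where the multiplicative Z, A, and F factors are hypothetical inputs (real EPMA software computes them iteratively from the evolving composition estimate):

```python
def concentration_zaf(i_sample, i_standard, c_standard, z, a, f):
    """Estimate an element's concentration from WDS intensities.

    k-ratio = sample peak intensity / standard peak intensity; the
    atomic number (Z), absorption (A), and fluorescence (F) corrections
    are then applied to the standard's known concentration.
    """
    k_ratio = i_sample / i_standard
    return k_ratio * c_standard * z * a * f
```

In practice the ZAF factors depend on the full composition, so the calculation is iterated until the estimated concentrations converge.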

  10. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  12. Quantitative analysis of arm movement smoothness

    NASA Astrophysics Data System (ADS)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

    The paper deals with the quantitative smoothness analysis of motion data. We investigated values of movement units, fluidity, and jerk for the healthy and the paralyzed arm of patients with hemiparesis after stroke while the patients performed a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.
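Jerk, the third time derivative of position, is a common smoothness measure: smoother reaching produces lower mean squared jerk. A minimal finite-difference sketch on a uniformly sampled 1-D trajectory (the paper's exact smoothness measures may be defined differently):

```python
def finite_difference(samples, dt):
    # First-order forward difference approximation of the time derivative
    return [(b - a) / dt for a, b in zip(samples, samples[1:])]

def mean_squared_jerk(positions, dt):
    """Mean squared jerk of a 1-D position trace sampled every dt seconds;
    requires at least four samples to form one jerk value."""
    velocity = finite_difference(positions, dt)
    acceleration = finite_difference(velocity, dt)
    jerk = finite_difference(acceleration, dt)
    return sum(j * j for j in jerk) / len(jerk)
```

A constant-velocity trace scores exactly zero, while an oscillating (tremorous) trace scores high, which is the contrast a healthy-versus-paralyzed arm comparison exploits.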

  13. Biomechanical cell analysis using quantitative phase imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wax, Adam; Park, Han Sang; Eldridge, William J.

    2016-03-01

    Quantitative phase imaging provides nanometer scale sensitivity and has been previously used to study spectral and temporal characteristics of individual cells in vitro, especially red blood cells. Here we extend this work to study the mechanical responses of individual cells due to the influence of external stimuli. Cell stiffness may be characterized by analyzing the inherent thermal fluctuations of cells but by applying external stimuli, additional information can be obtained. The time dependent response of cells due to external shear stress is examined with high speed quantitative phase imaging and found to exhibit characteristics that relate to their stiffness. However, analysis beyond the cellular scale also reveals internal organization of the cell and its modulation due to pathologic processes such as carcinogenesis. Further studies with microfluidic platforms point the way for using this approach in high throughput assays.

  14. A quantitative approach to scar analysis.

    PubMed

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-02-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology.
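Box counting is the standard way to estimate the fractal dimension used in this kind of scar analysis: count occupied grid boxes at several box sizes and fit the slope of log N(s) against log(1/s). A sketch for a binarized image given as a set of foreground pixel coordinates (the paper's confocal pipeline and its lacunarity computation are not reproduced here):

```python
import math

def box_count(pixels, size):
    # Number of size x size grid boxes containing at least one foreground pixel
    return len({(x // size, y // size) for x, y in pixels})

def fractal_dimension(pixels, sizes=(1, 2, 4, 8)):
    """Least-squares slope of log N(s) against log(1/s) over the box sizes."""
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(box_count(pixels, s)) for s in sizes]
    n = len(sizes)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den
```

A filled 2-D region gives a dimension near 2 and a straight line near 1; scar and unwounded dermis differ in where their collagen networks fall on this scale.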

  15. A Quantitative Approach to Scar Analysis

    PubMed Central

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-01-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794

  16. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  17. Quantitative ADF STEM: acquisition, analysis and interpretation

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2016-01-01

    Quantitative annular dark-field imaging in the scanning transmission electron microscope (ADF STEM), where image intensities are used to provide composition and thickness measurements, has enjoyed a renaissance during the last decade. Now, in a post-aberration-correction era, many aspects of the technique are being revisited. Here the recent progress and emerging best practice for such aberration-corrected quantitative ADF STEM are discussed, including issues relating to the proper acquisition of experimental data and its calibration, approaches for data analysis, the utility of such data, its interpretation, and limitations.

  18. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate the effects new therapies might have. A combination of complicated geometry and image variability, together with the need for high resolution and large image size, makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here were developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high-throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable, and accurate while allowing the pathologist to control the measurement process.

  19. An in Situ Intracellular Self-Assembly Strategy for Quantitatively and Temporally Monitoring Autophagy.

    PubMed

    Lin, Yao-Xin; Qiao, Sheng-Lin; Wang, Yi; Zhang, Ruo-Xin; An, Hong-Wei; Ma, Yang; Rajapaksha, R P Yeshan J; Qiao, Zeng-Ying; Wang, Lei; Wang, Hao

    2017-02-28

    Autophagy plays a crucial role in the metabolic process. So far, conventional methods have been incapable of rapid, precise, and real-time monitoring of autophagy in living objects. Herein, we describe an in situ intracellular self-assembly strategy for quantitative and temporal determination of autophagy in living objects. The intelligent building blocks (DPBP) are composed of a bulky dendrimer as a carrier, a bis(pyrene) derivative (BP) as a signal molecule, and a peptide linker as a responsive unit that can be cleaved by an autophagy-specific enzyme, i.e., ATG4B. DPBP maintains quenched fluorescence with monomeric BP. However, the responsive peptide is specifically cleaved upon activation of autophagy, resulting in self-aggregation of BP residues, which emit a 30-fold enhanced fluorescence. By measuring the intensity of the fluorescent signal, we are able to quantitatively evaluate the autophagic level. In comparison with traditional techniques, such as TEM, Western blot, and GFP-LC3, the reliability and accuracy of this method are finally validated. We believe this in situ intracellular self-assembly strategy provides a rapid, effective, real-time, and quantitative method for monitoring autophagy in living objects, and it will be a useful tool for autophagy-related fundamental and clinical research.

  20. Forty years of temporal analysis of products

    DOE PAGES

    Morgan, K.; Maguire, N.; Fushimi, R.; ...

    2017-05-16

    Detailed understanding of mechanisms and reaction kinetics are required in order to develop and optimize catalysts and catalytic processes. While steady state investigations are known to give a global view of the catalytic system, transient studies are invaluable since they can provide more detailed insight into elementary steps. For almost thirty years temporal analysis of products (TAP) has been successfully utilized for transient studies of gas phase heterogeneous catalysis, and there have been a number of advances in instrumentation and numerical modeling methods in that time. In the current work, the range of available TAP apparatus is discussed, while detailed explanations of the types of TAP experiment, the information that can be determined from these experiments, and the analysis methods are also included. TAP is a complex methodology and is often viewed as a niche specialty. Here, part of the intention of this work is to highlight the significant contributions TAP can make to catalytic research, while also discussing the issues which will make TAP more relevant and approachable to a wider segment of the catalytic research community. With this in mind, an outlook is also disclosed for the technique in terms of what is needed to revitalize the field and make it more applicable to the recent advances in catalyst characterization (e.g. operando modes).

  1. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046
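The abstract names four per-behavior metrics and a comprehensive "overall leak degree" but not the aggregation formula. As a purely illustrative stand-in, a weighted average shows how such a composite could be computed; the weights and the averaging itself are assumptions, not the paper's Privacy Petri Net-based definition:

```python
def overall_leak_degree(possibility, severity, crypticity, manipulability,
                        weights=(0.25, 0.25, 0.25, 0.25)):
    """Hypothetical composite of the four privacy-leak metrics, each
    assumed to be normalized to [0, 1]."""
    metrics = (possibility, severity, crypticity, manipulability)
    return sum(w * m for w, m in zip(weights, metrics))
```

Two software applications can then be ranked by a single number while the four component metrics remain available for per-aspect comparison.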

  2. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects.

  3. High-energy PIXE: quantitative analysis

    NASA Astrophysics Data System (ADS)

    Denker, A.; Opitz-Coutureau, J.; Campbell, J. L.; Maxwell, J. A.; Hopman, T.

    2004-06-01

    In recent years, high-energy PIXE has been applied successfully for qualitative analysis of art and archaeological objects, e.g. coins, bronzes, sculptures, and brooches. However, in the absence of software for quantitative analysis, the full benefit inherent in the PIXE technique was not obtained. For example, a bronze could easily be distinguished from a brass, but the concentrations could not be rigorously compared within a set of bronzes. In this paper, the first quantitative analysis by high-energy PIXE is presented. The Guelph PIXE Software Package GUPIX has been extended to proton energies up to 100 MeV, so that high-energy PIXE spectra can be evaluated and concentrations derived. Measurements on metal and alloy standards at two different proton energies have been performed and the obtained compositions were compared to the certified values. The results will be presented and deviations discussed.

  4. Quantitative analysis of colony morphology in yeast.

    PubMed

    Ruusuvuori, Pekka; Lin, Jake; Scott, Adrian C; Tan, Zhihao; Sorsa, Saija; Kallio, Aleksi; Nykter, Matti; Yli-Harja, Olli; Shmulevich, Ilya; Dudley, Aimée M

    2014-01-01

    Microorganisms often form multicellular structures such as biofilms and structured colonies that can influence the organism's virulence, drug resistance, and adherence to medical devices. Phenotypic classification of these structures has traditionally relied on qualitative scoring systems that limit detailed phenotypic comparisons between strains. Automated imaging and quantitative analysis have the potential to improve the speed and accuracy of experiments designed to study the genetic and molecular networks underlying different morphological traits. For this reason, we have developed a platform that uses automated image analysis and pattern recognition to quantify phenotypic signatures of yeast colonies. Our strategy enables quantitative analysis of individual colonies, measured at a single time point or over a series of time-lapse images, as well as the classification of distinct colony shapes based on image-derived features. Phenotypic changes in colony morphology can be expressed as changes in feature space trajectories over time, thereby enabling the visualization and quantitative analysis of morphological development. To facilitate data exploration, results are plotted dynamically through an interactive Yeast Image Analysis web application (YIMAA; http://yimaa.cs.tut.fi) that integrates the raw and processed images across all time points, allowing exploration of the image-based features and principal components associated with morphological development.

  5. Quantitative and Temporal Requirements Revealed for Zap-70 Catalytic Activity During T Cell Development

    PubMed Central

    Au-Yeung, Byron B.; Melichar, Heather J.; Ross, Jenny O.; Cheng, Debra A.; Zikherman, Julie; Shokat, Kevan M.; Robey, Ellen A.; Weiss, Arthur

    2014-01-01

    The catalytic activity of Zap-70 is crucial for T cell receptor (TCR) signaling, but the quantitative and temporal requirements for its function in thymocyte development are not known. Using a chemical-genetic system to selectively and reversibly inhibit Zap-70 catalytic activity in a model of synchronized thymic selection, we showed that CD4+CD8+ thymocytes integrate multiple, transient, Zap-70-dependent signals over more than 36 h to reach a cumulative threshold for positive selection, whereas one hour of signaling was sufficient for negative selection. Titration of Zap-70 activity resulted in graded reductions in positive and negative selection but did not decrease the cumulative TCR signals integrated by positively selected OT-I cells, revealing heterogeneity, even among CD4+CD8+ thymocytes expressing identical TCRs undergoing positive selection. PMID:24908390

  6. Phase noise optimization in temporal phase-shifting digital holography with partial coherence light sources and its application in quantitative cell imaging.

    PubMed

    Remmersmann, Christian; Stürwald, Stephan; Kemper, Björn; Langehanenberg, Patrik; von Bally, Gert

    2009-03-10

    In temporal phase-shifting-based digital holographic microscopy, high-resolution phase contrast imaging requires optimized conditions for hologram recording and phase retrieval. To optimize the phase resolution, a theoretical analysis of statistical errors, digitalization errors, uncorrelated errors, and errors due to a misaligned temporal phase shift is carried out for the example of a variable three-step algorithm. In a second step, the theoretically predicted results are compared to the measured phase noise obtained from comparative experimental investigations with several coherent and partially coherent light sources. Finally, the applicability for noise reduction is demonstrated by quantitative phase contrast imaging of pancreas tumor cells.
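For reference, the classic equal-shift three-step case (shifts of 0, 2π/3, and 4π/3) recovers the phase in closed form; the variable three-step algorithm studied in the paper generalizes the shift values, which this sketch does not:

```python
import math

def three_step_phase(i1, i2, i3):
    """Recover phase from three interferograms I_k = A + B*cos(phi + d_k)
    with shifts d_k = 0, 2*pi/3, 4*pi/3:
    phi = atan2(sqrt(3)*(I3 - I2), 2*I1 - I2 - I3)."""
    return math.atan2(math.sqrt(3.0) * (i3 - i2), 2.0 * i1 - i2 - i3)
```

Because three measurements determine the three unknowns A, B, and phi, the background and modulation terms cancel exactly; a misaligned phase shift violates the assumed d_k values, which is one error source the paper analyzes.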

  7. Good practices for quantitative bias analysis.

    PubMed

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage
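A concrete example of assigning values to bias parameters: correcting an observed exposure count for misclassification, given assumed classification sensitivity and specificity. This is a textbook back-calculation offered for illustration, not one taken from the paper:

```python
def corrected_exposed(observed_exposed, total, sensitivity, specificity):
    """Back-calculate the true number of exposed subjects in a group of
    `total`, given the observed (misclassified) count and the assumed
    sensitivity and specificity of exposure classification."""
    false_positive_rate = 1.0 - specificity
    return (observed_exposed - false_positive_rate * total) / \
           (sensitivity - false_positive_rate)
```

With 60 truly exposed among 100, a sensitivity of 0.9 and specificity of 0.8 produce an observed count of 0.9*60 + 0.2*40 = 62, and the formula recovers 60. Repeating the correction over plausible ranges of the bias parameters yields the uncertainty interval that quantitative bias analysis reports.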

  8. Spatial and temporal epidemiological analysis in the Big Data era.

    PubMed

    Pfeiffer, Dirk U; Stevens, Kim B

    2015-11-01

    Concurrent with global economic development in the last 50 years, the opportunities for the spread of existing diseases and emergence of new infectious pathogens, have increased substantially. The activities associated with the enormously intensified global connectivity have resulted in large amounts of data being generated, which in turn provides opportunities for generating knowledge that will allow more effective management of animal and human health risks. This so-called Big Data has, more recently, been accompanied by the Internet of Things which highlights the increasing presence of a wide range of sensors, interconnected via the Internet. Analysis of this data needs to exploit its complexity, accommodate variation in data quality and should take advantage of its spatial and temporal dimensions, where available. Apart from the development of hardware technologies and networking/communication infrastructure, it is necessary to develop appropriate data management tools that make this data accessible for analysis. This includes relational databases, geographical information systems and most recently, cloud-based data storage such as Hadoop distributed file systems. While the development in analytical methodologies has not quite caught up with the data deluge, important advances have been made in a number of areas, including spatial and temporal data analysis where the spectrum of analytical methods ranges from visualisation and exploratory analysis, to modelling. While there used to be a primary focus on statistical science in terms of methodological development for data analysis, the newly emerged discipline of data science is a reflection of the challenges presented by the need to integrate diverse data sources and exploit them using novel data- and knowledge-driven modelling methods while simultaneously recognising the value of quantitative as well as qualitative analytical approaches. Machine learning regression methods, which are more robust and can handle

  9. Decoding brain cancer dynamics: a quantitative histogram-based approach using temporal MRI

    NASA Astrophysics Data System (ADS)

    Zhou, Mu; Hall, Lawrence O.; Goldgof, Dmitry B.; Russo, Robin; Gillies, Robert J.; Gatenby, Robert A.

    2015-03-01

    Brain tumor heterogeneity remains a challenge for probing brain cancer evolutionary dynamics. In light of evolution, it is a priority to inspect the cancer system from a time-domain perspective since it explicitly tracks the dynamics of cancer variations. In this paper, we study the problem of exploring brain tumor heterogeneity from temporal clinical magnetic resonance imaging (MRI) data. Our goal is to discover evidence-based knowledge from such temporal imaging data, where multiple clinical MRI scans from Glioblastoma multiforme (GBM) patients are generated during therapy. In particular, we propose a quantitative histogram-based approach that builds a prediction model to measure the difference in histograms obtained from pre- and post-treatment. The study could significantly assist radiologists by providing a metric to identify distinctive patterns within each tumor, which is crucial for the goal of providing patient-specific treatments. We examine the proposed approach for a practical application - clinical survival group prediction. Experimental results show that our approach achieved 90.91% accuracy.
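The pre- versus post-treatment comparison rests on a distance between intensity histograms. As a minimal sketch, an L1 distance between normalized histograms serves as one such change feature; the paper's actual histogram features and prediction model are not reproduced here:

```python
def normalize(hist):
    # Convert raw bin counts to a discrete probability distribution
    total = float(sum(hist))
    return [count / total for count in hist]

def histogram_change(pre_counts, post_counts):
    """L1 distance between normalized pre- and post-treatment histograms;
    0 means identical distributions, 2 is the maximum possible change."""
    pre, post = normalize(pre_counts), normalize(post_counts)
    return sum(abs(a - b) for a, b in zip(pre, post))
```

Per-tumor change scores of this kind could then feed a classifier that separates survival groups, as in the reported experiment.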

  10. Quantitative analysis to guide orphan drug development.

    PubMed

    Lesko, L J

    2012-08-01

    The development of orphan drugs for rare diseases has made impressive strides in the past 10 years. There has been a surge in orphan drug designations, but new drug approvals have not kept up. This article presents a three-pronged hierarchical strategy for quantitative analysis of data at the descriptive, mechanistic, and systems levels of the biological system that could represent a standardized and rational approach to orphan drug development. Examples are provided to illustrate the concept.

  11. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.

  13. Temporal Mixture Analysis of Hypertemporal Antarctic Sea Ice Data in the Sense of Machine Learning

    NASA Astrophysics Data System (ADS)

    Chi, J.; Kim, H. C.

    2015-12-01

    Hypertemporal data, or time series acquired at high temporal frequencies, are often used to determine seasonal characteristics of environmental phenomena such as sea ice concentration. However, it is difficult to analyze long-term hypertemporal remote sensing data over extensive areas without prior information. Most pixels of hypertemporal data are highly mixed and contain several distinct temporal signals that represent seasonal characteristics of substances. This study performed temporal mixture analysis, which is algebraically similar to spectral mixture analysis, but occurs in the time domain instead of the spectral domain. Temporal mixture analysis was used to investigate the temporal characteristics of Antarctic sea ice. Two general steps were used to address mixing problems: 1) finding temporally unique signatures of pure components, which are referred to as temporal endmembers, and 2) unmixing each pixel in the time series data as a linear combination of the endmember fractional abundances. Because endmember selection is critical to the success of both spectral and temporal mixture analysis, it is important to select proper endmembers from large quantities of hypertemporal data. A machine learning algorithm was introduced to successfully identify endmembers without prior knowledge. A fully linear mixing model was then implemented in an attempt to produce more robust and physically meaningful abundance estimates. Experiments that quantitatively and qualitatively evaluated the proposed approaches were conducted. A temporal mixture analysis of high-temporal-dimensional data provides a unique summary of long-term Antarctic sea ice and noise-whitened reconstruction images via inverse processing. Further, comparisons of regional sea-ice fractions provide a better understanding of the overall Antarctic sea ice changes.
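The unmixing step in the second stage above can be sketched with a least-squares fit, assuming the temporal endmembers have already been identified. The projection onto the nonnegativity and sum-to-one constraints below is a crude stand-in for the paper's fully constrained linear mixing model; the endmember shapes are synthetic.

```python
import numpy as np

def unmix(pixel_series, endmembers):
    """Estimate endmember fractional abundances for one pixel's time
    series as a linear mixture, then project onto nonnegative,
    sum-to-one abundances."""
    A = np.asarray(endmembers).T                      # (n_times, n_endmembers)
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(pixel_series), rcond=None)
    coeffs = np.clip(coeffs, 0.0, None)               # crude nonnegativity
    return coeffs / coeffs.sum()                      # sum-to-one
```

Applied per pixel, the abundances form the fraction maps the abstract compares across regions.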

  14. Influence analysis in quantitative trait loci detection

    PubMed Central

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-01-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods—the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. PMID:24740424

  15. Influence analysis in quantitative trait loci detection.

    PubMed

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods-the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. © 2014 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
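The paper derives analytic influence functions for profile likelihoods; the brute-force diagnostic those formulas approximate is case deletion, which can be sketched directly. The statistic and data below are illustrative, not a LOD score computation.

```python
import numpy as np

def deletion_influence(values, statistic=np.mean):
    """Empirical influence of each individual on a summary statistic via
    case deletion: influence_i = statistic(all) - statistic(all but i)."""
    vals = np.asarray(values, dtype=float)
    full = statistic(vals)
    return np.array([full - statistic(np.delete(vals, i))
                     for i in range(len(vals))])
```

An individual whose removal moves the statistic the most is flagged as influential, which is the question the paper answers for whole LOD score curves.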

  16. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.
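A common performance-based resilience measure from this literature is the area between targeted and actual system performance over the disruption and recovery window; smaller values indicate a more resilient response. This sketch illustrates that quantity only and does not reproduce the report's control-design machinery.

```python
import numpy as np

def systemic_impact(time, performance, target=1.0):
    """Area between targeted and actual performance (trapezoidal rule)
    over a disruption/recovery window; smaller = more resilient."""
    t = np.asarray(time, dtype=float)
    gap = np.clip(target - np.asarray(performance, dtype=float), 0.0, None)
    return float(0.5 * ((gap[1:] + gap[:-1]) * np.diff(t)).sum())
```

A control strategy that restores performance faster shrinks this integral, which is how an optimal-control formulation can score recovery policies.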

  17. Improved accuracy of quantitative parameter estimates in dynamic contrast-enhanced CT study with low temporal resolution

    SciTech Connect

    Kim, Sun Mo; Jaffray, David A.

    2016-01-15

    Purpose: A previously proposed method to reduce radiation dose to the patient in dynamic contrast-enhanced (DCE) CT is enhanced by principal component analysis (PCA) filtering, which improves the signal-to-noise ratio (SNR) of time-concentration curves in the DCE-CT study. The efficacy of the combined method to maintain the accuracy of kinetic parameter estimates at low temporal resolution is investigated with pixel-by-pixel kinetic analysis of DCE-CT data. Methods: The method is based on DCE-CT scanning performed with low temporal resolution to reduce the radiation dose to the patient. The arterial input function (AIF) with high temporal resolution can be generated with a coarsely sampled AIF through a previously published method of AIF estimation. To increase the SNR of time-concentration curves (tissue curves), first, a region-of-interest is segmented into squares composed of 3 × 3 pixels in size. Subsequently, the PCA filtering combined with a fraction of residual information criterion is applied to all the segmented squares for further improvement of their SNRs. The proposed method was applied to each DCE-CT data set of a cohort of 14 patients at varying levels of down-sampling. Kinetic analyses using the modified Tofts model and the singular value decomposition method were then carried out for each of the down-sampling schemes at intervals from 2 to 15 s. The results were compared with analyses done with the measured data in high temporal resolution (i.e., original scanning frequency) as the reference. Results: The patients’ AIFs were estimated to high accuracy based on the 11 orthonormal bases of arterial impulse responses established in the previous paper. In addition, noise in the images was effectively reduced by using five principal components of the tissue curves for filtering. Kinetic analyses using the proposed method showed superior results compared to those with down-sampling alone; they were able to maintain the accuracy in the
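The PCA filtering of tissue curves can be sketched as a truncated SVD of the (curves × time points) matrix. Keeping five components mirrors the abstract; the curves below are synthetic, and the paper's residual-information criterion for choosing the component count is not reproduced.

```python
import numpy as np

def pca_filter(curves, n_components=5):
    """Denoise a stack of time-concentration curves by keeping only the
    leading principal components of the (curves x time points) matrix."""
    X = np.asarray(curves, dtype=float)
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    s[n_components:] = 0.0            # discard noise-dominated components
    return mean + (U * s) @ Vt
```

Because the signal concentrates in a few components while noise spreads over all of them, the reconstruction has a higher SNR than the raw curves.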

  18. Quantitative petrostructure analysis. Technical summary report

    SciTech Connect

    Warren, N.

    1980-09-01

    The establishment of quantitative techniques would lead to the development of predictive tools which would be of obvious importance in applied geophysics and engineering. In rock physics, it would help establish laws for averaging the effects of finite densities of real cracks and pores. It would also help in elucidating the relation between observed complex crack structures and various models for the mechanical properties of single cracks. The petrostructure study is addressed to this problem. The purpose of the effort is to quantitatively characterize the mineral and crack texture of granitic rock samples. The rock structures are to be characterized in such a way that the results can be used (1) to constrain the modelling of the effect of cracks on the physical properties of rocks, and (2) to test the possibility of establishing quantitative and predictive relations between petrographic observables and whole rock properties. Statistical techniques are being developed and being applied to the problem of parameterizing complex texture and crack patterns of rock, and of measuring correlation of these parameters to other measurable variables. The study is an application in factor analysis.

  19. Temporal quantitative phosphoproteomics of ADP stimulation reveals novel central nodes in platelet activation and inhibition

    PubMed Central

    Beck, Florian; Geiger, Jörg; Gambaryan, Stepan; Solari, Fiorella A.; Dell’Aica, Margherita; Loroch, Stefan; Mattheij, Nadine J.; Mindukshev, Igor; Pötz, Oliver; Jurk, Kerstin; Burkhart, Julia M.; Fufezan, Christian; Heemskerk, Johan W. M.; Walter, Ulrich

    2017-01-01

    Adenosine diphosphate (ADP) enhances platelet activation by virtually any other stimulant to complete aggregation. It binds specifically to the G-protein–coupled membrane receptors P2Y1 and P2Y12, stimulating intracellular signaling cascades, leading to integrin αIIbβ3 activation, a process antagonized by endothelial prostacyclin. P2Y12 inhibitors are among the most successful antiplatelet drugs; however, they show remarkable variability in efficacy. We asked whether a more detailed molecular understanding of ADP-induced protein phosphorylation could identify (1) critical hubs in platelet signaling toward aggregation and (2) novel molecular targets for antiplatelet treatment strategies. We applied quantitative temporal phosphoproteomics to study ADP-mediated signaling at unprecedented molecular resolution. Furthermore, to mimic the antagonistic efficacy of endothelial-derived prostacyclin, we determined how Iloprost reverses ADP-mediated signaling events. We provide temporal profiles of 4797 phosphopeptides, 608 of which showed significant regulation. Regulated proteins are implicated in well-known activating functions such as degranulation and cytoskeletal reorganization, but also in less well-understood pathways involving ubiquitin ligases and GTPase exchange factors/GTPase-activating proteins (GEF/GAP). Our data demonstrate that ADP-triggered phosphorylation occurs predominantly within the first 10 seconds, with many short rather than sustained changes. For a set of phosphorylation sites (eg, PDE3ASer312, CALDAG-GEFISer587, ENSASer109), we demonstrate an inverse regulation by ADP and Iloprost, suggesting that these are central modulators of platelet homeostasis. This study demonstrates an extensive spectrum of human platelet protein phosphorylation in response to ADP and Iloprost, which inversely overlap and represent major activating and inhibitory pathways. PMID:28060719

  20. Quantitative textural analysis of phenocryst zoning patterns

    NASA Astrophysics Data System (ADS)

    Niespolo, E.; Andrews, B. J.

    2011-12-01

    The textural complexity of phenocrysts has made quantitative analysis of large populations of crystals a challenging task. Because each phenocryst expresses a unique localized event in the volcanic interior, no single crystal necessarily records the complete pre-eruptive history of the magmatic system as a whole. Synthesizing the textural and compositional records of many crystals, however, should provide a more complete understanding of conditions prior to eruption. In this research, we present new techniques for quantitative analysis of individual crystals and across populations of crystals. We apply those techniques to back-scattered electron images of complexly zoned plagioclase from El Chichón volcano, Mexico. Analysis begins with Gaussian filtering to remove noise from the images and create more qualitatively distinct zoning patterns. Because pixel intensity is directly correlated with Anorthite content, compositional anisotropy is then calculated throughout each image by determining the distance from a grid point at which variation in pixel intensity exceeds a pre-determined standard deviation; both regular and adaptive grid spacings are used, and length scales are calculated in 8 directions. The resulting textural maps are analogous to a vector field and quantify 2-dimensional variation in texture. With both types of grid spacing, changes in magnitude and orientation of textural anisotropy and length scale indicate different crystal zones. The adaptive grid spacing, however, describes non-uniform textural variation more completely and has a higher measurement density in regions of high-frequency variation. In general, textural regions commonly described as clean or smooth show longer length scales and aligned anisotropies, whereas shorter length scales with variable anisotropies identify areas commonly described as patchy, dusty, or rough. The comparison and correlation of textural and compositional zoning help determine how different crystals record the
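The directional length-scale measurement described above can be sketched as an 8-direction march from a grid point that stops once the running standard deviation of pixel intensities exceeds a limit. The threshold value and the toy image are illustrative; the authors' filtering and adaptive-grid machinery are omitted.

```python
import numpy as np

DIRECTIONS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
              (0, 1), (1, -1), (1, 0), (1, 1)]

def length_scales(image, row, col, sd_limit):
    """From one grid point, march in 8 directions and record the distance
    (in pixels) travelled before the running standard deviation of the
    collected intensities exceeds sd_limit."""
    h, w = image.shape
    scales = []
    for dr, dc in DIRECTIONS:
        values = [float(image[row, col])]
        distance = 0
        k = 1
        while True:
            r, c = row + k * dr, col + k * dc
            if not (0 <= r < h and 0 <= c < w):
                break                      # ran off the image
            values.append(float(image[r, c]))
            if np.std(values) > sd_limit:
                break                      # texture changed: stop here
            distance = k
            k += 1
        scales.append(distance)
    return scales
```

Anisotropy shows up as unequal distances across the 8 directions, e.g. short toward a zone boundary and long along it.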

  1. Quantitative NIR Raman analysis in liquid mixtures.

    PubMed

    Sato-Berrú, R Ysacc; Medina-Valtierra, Jorge; Medina-Gutiérrez, Cirilo; Frausto-Reyes, Claudio

    2004-08-01

    The capability to obtain quantitative information in a simple way from Raman spectra is a subject of considerable interest. In this work, this is demonstrated for mixtures of ethanol with water and rhodamine-6G (R-6G) with methanol, which were analyzed directly in a glass vessel. The Raman intensities and a simple mathematical model have been used and applied for the analysis of liquid samples. The starting point is to generate a general expression from the experimental spectra as the sum of the particular expressions for each pure compound; this yields an expression for the mixtures that can be used to determine the concentrations of the components from the Raman spectrum of the mixture.
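The sum-of-pure-compound model above reduces to a linear least-squares problem once the pure spectra are measured. This is a generic sketch of that model, not the authors' exact expression; the Gaussian band shapes and centers below are invented for the example.

```python
import numpy as np

def mixture_concentrations(mixture, pure_spectra):
    """Solve for component concentrations assuming the measured Raman
    intensity at each wavenumber is a linear sum of the pure-compound
    spectra weighted by concentration."""
    A = np.asarray(pure_spectra).T        # (n_wavenumbers, n_components)
    c, *_ = np.linalg.lstsq(A, np.asarray(mixture), rcond=None)
    return c
```

With noiseless synthetic spectra the weights are recovered exactly; with real spectra the fit residual indicates how well the linear model holds.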

  2. Quantitative analysis of non-Hodgkin's lymphoma.

    PubMed Central

    Abbott, C R; Blewitt, R W; Bird, C C

    1982-01-01

    A preliminary attempt has been made to characterise a small series of non-Hodgkin's lymphomas (NHL) by morphometric means using the Quantimet 720 Kontron MOP/AMO3 image analysis systems. In most cases it was found that the distribution of nuclear area and correlation between mean nuclear area and frequency per unit field, corresponded closely with tumour classification determined by light microscopy. These results suggest that it may be possible to devise an objective and reproducible grading system for NHL using quantitative morphometric techniques. PMID:7040479

  3. Spatially, Temporally, and Quantitatively Controlled Delivery of Broad Range of Molecules into Selected Cells through Plasmonic Nanotubes.

    PubMed

    Messina, Gabriele C; Dipalo, Michele; La Rocca, Rosanna; Zilio, Pierfrancesco; Caprettini, Valeria; Proietti Zaccaria, Remo; Toma, Andrea; Tantussi, Francesco; Berdondini, Luca; De Angelis, Francesco

    2015-11-25

    A universal plasmonic/microfluidic platform for spatially and temporally controlled intracellular delivery is described. The system can inject/transfect the desired amount of molecules with an efficacy close to 100%. Moreover, it is highly scalable from single cells to large ensembles without administering the molecules to an extracellular bath. The latter enables quantitative control over the amount of injected molecules.

  4. An analysis of solar-cycle temporal relationships among activity indicators

    NASA Astrophysics Data System (ADS)

    Bachmann, K. T.; Maymani, H.; Nautiyal, K.; te Velde, V.

    2004-01-01

    Differences in the time development of solar activity indices are an important clue in the search for physical processes responsible for changing solar emission at various wavelengths. In this paper we describe our investigation of temporal relationships among two space-based indices, Lyman-α 121.6 nm emission (Lα) and the Mg II 280 nm core-to-wing ratio, and four ground-based indices - the 10.7 cm flux (F10), the He I 1083 nm equivalent width, the Ca II K 393.4 nm emission index, and the International Sunspot Number (ISN). We provide scatterplots of index pairs passed through a 2-year Gaussian filter during each available solar cycle, and we approximate the temporal relationships quantitatively as overall temporal offsets with uncertainties. We reconcile our findings with qualitative ideas concerning the variation of solar emissions with solar activity. Since the F10 and ISN time series are longer than four complete solar cycles, we are able to evaluate the reproducibility of temporal offsets over multiple solar cycles. The chief motivation for our work is to improve solar indicator analysis by providing a method of seeing and analyzing temporal relationships clearly and easily. We believe that future physical models of magnetic activity and spectral emissions in the solar chromosphere and transition region may make quantitative predictions of temporal relationships among full-disk solar indices for comparison with analyses such as ours.
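The temporal-offset estimation described here can be sketched as the lag that maximizes the cross-correlation of two standardized index series. The 2-year Gaussian smoothing step is omitted and the series are synthetic; this illustrates the lag search only, not the paper's uncertainty estimates.

```python
import numpy as np

def correlation_at_lag(a, b, lag):
    """Mean product of two series with b shifted by `lag` samples."""
    n = len(a)
    if lag >= 0:
        x, y = a[:n - lag], b[lag:]
    else:
        x, y = a[-lag:], b[:n + lag]
    return float(np.dot(x, y)) / len(x)

def temporal_offset(a, b, max_lag):
    """Lag (in samples) by which index b trails index a, found by
    maximizing cross-correlation of the standardized series."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return max(range(-max_lag, max_lag + 1),
               key=lambda lag: correlation_at_lag(a, b, lag))
```

Applied to pairs of solar indices per cycle, the recovered lags are the temporal offsets the abstract tabulates.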

  5. Spectral and Temporal Laser Fluorescence Analysis Such as for Natural Aquatic Environments

    NASA Technical Reports Server (NTRS)

    Chekalyuk, Alexander (Inventor)

    2015-01-01

    An Advanced Laser Fluorometer (ALF) can combine spectrally and temporally resolved measurements of laser-stimulated emission (LSE) for characterization of dissolved and particulate matter, including fluorescence constituents, in liquids. Spectral deconvolution (SDC) analysis of LSE spectral measurements can accurately retrieve information about individual fluorescent bands, such as can be attributed to chlorophyll-a (Chl-a), phycobiliprotein (PBP) pigments, or chromophoric dissolved organic matter (CDOM), among others. Improved physiological assessments of photosynthesizing organisms can use SDC analysis and temporal LSE measurements to assess variable fluorescence corrected for SDC-retrieved background fluorescence. Fluorescence assessments of Chl-a concentration based on LSE spectral measurements can be improved using photo-physiological information from temporal measurements. Quantitative assessments of PBP pigments, CDOM, and other fluorescent constituents, as well as basic structural characterizations of photosynthesizing populations, can be performed using SDC analysis of LSE spectral measurements.

  6. Quantitative analysis of retinal changes in hypertension

    NASA Astrophysics Data System (ADS)

    Giansanti, Roberto; Boemi, Massimo; Fumelli, Paolo; Passerini, Giorgio; Zingaretti, Primo

    1995-05-01

    Arterial hypertension is a high prevalence disease in Western countries and it is associated with increased risk for cardiovascular accidents. Retinal vessel changes are common findings in patients suffering from long-standing hypertensive disease. Morphological evaluations of the fundus oculi represent a fundamental tool for the clinical approach to the patient with hypertension. A qualitative analysis of the retinal lesions is usually performed and this implies severe limitations both in the classification of the different degrees of the pathology and in the follow-up of the disease. A diagnostic system based on a quantitative analysis of the retinal changes could overcome these problems. Our computerized approach was intended for this scope. The paper concentrates on the results and the implications of a computerized approach to the automatic extraction of numerical indexes describing morphological details of the fundus oculi. A previously developed image processing and recognition system, documented elsewhere and briefly described here, was successfully tested in pre-clinical experiments and applied in the evaluation of normal as well as of pathological fundus. The software system was developed to extract indexes such as caliber and path of vessels, local tortuosity of arteries and arterioles, positions and angles of crossings between two vessels. The reliability of the results, justified by their low variability, makes feasible the standardization of quantitative parameters to be used both in the diagnosis and in the prognosis of hypertension, and also allows prospective studies based upon them.
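One of the indexes mentioned, local tortuosity, is commonly computed as the arc-over-chord ratio of a traced vessel centerline. The paper does not specify its exact formula, so this is a sketch of that standard definition on synthetic coordinates.

```python
import numpy as np

def tortuosity(centerline):
    """Arc-over-chord tortuosity of a vessel path: total centerline
    length divided by the endpoint-to-endpoint distance
    (1.0 = perfectly straight)."""
    pts = np.asarray(centerline, dtype=float)
    segments = np.diff(pts, axis=0)
    arc = np.sqrt((segments ** 2).sum(axis=1)).sum()
    chord = np.sqrt(((pts[-1] - pts[0]) ** 2).sum())
    return float(arc / chord)
```

Higher values flag the winding arterioles that qualitative grading describes as "increased tortuosity."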

  7. Quantitative architectural analysis of bronchial intraepithelial neoplasia

    NASA Astrophysics Data System (ADS)

    Guillaud, Martial; MacAulay, Calum E.; Le Riche, Jean C.; Dawe, Chris; Korbelik, Jagoda; Lam, Stephen

    2000-04-01

    Considerable variation exists among pathologists in the interpretation of intraepithelial neoplasia, making it difficult to determine the natural history of these lesions and to establish management guidelines for chemoprevention. The aim of the study is to evaluate architectural features of pre-neoplastic progression in lung cancer, and to search for a correlation between architectural index and conventional pathology. Quantitative architectural analysis was performed on a series of normal lung biopsies and carcinoma in situ (CIS). Centers of gravity of the nuclei within a pre-defined region of interest were used as seeds to generate a Voronoi diagram. About 30 features derived from the Voronoi diagram, its dual the Delaunay tessellation, and the minimum spanning tree were extracted. A discriminant analysis was performed to separate the two groups. The architectural index was calculated for each of the bronchial biopsies that were interpreted as hyperplasia, metaplasia, or mild, moderate, or severe dysplasia by conventional histopathology criteria. As a group, lesions classified as CIS by conventional histopathology criteria could be distinguished from dysplasia using the architectural index. Metaplasia was distinct from hyperplasia and hyperplasia from normal. There was overlap between severe and moderate dysplasia, but mild dysplasia could be distinguished from moderate dysplasia. Bronchial intraepithelial neoplastic lesions can be graded objectively by architectural features. Combining architectural features and nuclear morphometric features may improve the quantitation of the changes occurring during the intra-epithelial neoplastic process.
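One family of the ~30 graph features mentioned is statistics of minimum spanning tree edge lengths over the nuclear centers. As a sketch, Prim's algorithm on the pairwise-distance matrix yields the MST edges, and their coefficient of variation serves as a toy "architectural index" (the paper's actual discriminant combination is not given here): regular nuclear spacing gives a low value, disordered spacing a higher one.

```python
import numpy as np

def mst_edge_lengths(points):
    """Edge lengths of the Euclidean minimum spanning tree over nuclei
    centers (Prim's algorithm on the full distance matrix)."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    in_tree = np.zeros(n, dtype=bool)
    in_tree[0] = True
    best = d[0].copy()          # cheapest connection of each node to the tree
    edges = []
    for _ in range(n - 1):
        best[in_tree] = np.inf
        j = int(np.argmin(best))
        edges.append(best[j])
        in_tree[j] = True
        best = np.minimum(best, d[j])
    return np.array(edges)

def architectural_index(points):
    """Toy index: coefficient of variation of MST edge lengths."""
    e = mst_edge_lengths(points)
    return float(e.std() / e.mean())
```

A perfectly regular lattice of nuclei scores 0; scattered nuclei score higher, tracking the disorder that accompanies neoplastic progression.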

  8. Cotton Defense Induction Patterns Under Spatially, Temporally and Quantitatively Varying Herbivory Levels.

    PubMed

    Eisenring, Michael; Meissle, Michael; Hagenbucher, Steffen; Naranjo, Steven E; Wettstein, Felix; Romeis, Jörg

    2017-01-01

    In its defense against herbivores, cotton (Gossypium sp.) relies in part on the production of a set of inducible, non-volatile terpenoids. Under uniform damage levels, in planta allocation of induced cotton terpenoids has been found to be highest in youngest leaves, supporting assumptions of the optimal defense theory (ODT) which predicts that plants allocate defense compounds to tissues depending on their value and the likelihood of herbivore attack. However, our knowledge is limited on how varying, and thus more realistic, damage levels might affect cotton defense organization. We hypothesized that the allocation of terpenoids and densities of terpenoid-storing glands in leaves aligns with assumptions of the ODT, even when plants are subjected to temporally, spatially and quantitatively varying caterpillar (Heliothis virescens) damage. As expected, cotton plants allocated most of their defenses to their youngest leaves regardless of damage location. However, defense induction in older leaves varied with damage location. For at least 14 days after damage treatments ended, plants reallocated defense resources from previously young leaves to newly developed leaves. Furthermore, we observed a positive hyperbolic relationship between leaf damage area and both terpenoid concentrations and gland densities, indicating that cotton plants can fine-tune defense allocation. Although it appears that factors like vascular constraints and chemical properties of individual defense compounds can affect defense levels, our results overall demonstrate that induced defense organization of cotton subjected to varying damage treatments is in alignment with key assumptions of the ODT.
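The "positive hyperbolic relationship" between damage area and terpenoid concentration has the saturating form y = a·x/(b + x), which can be fitted by linear regression after a reciprocal transform. The function and data below are a sketch with invented values, not the authors' statistical model.

```python
import numpy as np

def fit_hyperbolic(damage, response):
    """Fit response = a * damage / (b + damage) via linear regression on
    the reciprocal transform: 1/y = 1/a + (b/a) * (1/x)."""
    x = np.asarray(damage, dtype=float)
    y = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(1.0 / x, 1.0 / y, 1)
    a = 1.0 / intercept
    return a, slope * a
```

The fitted asymptote a is the ceiling on induced terpenoid concentration, consistent with the fine-tuned, saturating allocation the abstract describes.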

  9. Cotton Defense Induction Patterns Under Spatially, Temporally and Quantitatively Varying Herbivory Levels

    PubMed Central

    Eisenring, Michael; Meissle, Michael; Hagenbucher, Steffen; Naranjo, Steven E.; Wettstein, Felix; Romeis, Jörg

    2017-01-01

    In its defense against herbivores, cotton (Gossypium sp.) relies in part on the production of a set of inducible, non-volatile terpenoids. Under uniform damage levels, in planta allocation of induced cotton terpenoids has been found to be highest in youngest leaves, supporting assumptions of the optimal defense theory (ODT) which predicts that plants allocate defense compounds to tissues depending on their value and the likelihood of herbivore attack. However, our knowledge is limited on how varying, and thus more realistic, damage levels might affect cotton defense organization. We hypothesized that the allocation of terpenoids and densities of terpenoid-storing glands in leaves aligns with assumptions of the ODT, even when plants are subjected to temporally, spatially and quantitatively varying caterpillar (Heliothis virescens) damage. As expected, cotton plants allocated most of their defenses to their youngest leaves regardless of damage location. However, defense induction in older leaves varied with damage location. For at least 14 days after damage treatments ended, plants reallocated defense resources from previously young leaves to newly developed leaves. Furthermore, we observed a positive hyperbolic relationship between leaf damage area and both terpenoid concentrations and gland densities, indicating that cotton plants can fine-tune defense allocation. Although it appears that factors like vascular constraints and chemical properties of individual defense compounds can affect defense levels, our results overall demonstrate that induced defense organization of cotton subjected to varying damage treatments is in alignment with key assumptions of the ODT. PMID:28270830

  10. Spatial, Temporal, and Quantitative Manipulation of Intracellular Hydrogen Peroxide in Cultured Cells

    PubMed Central

    Alim, Ishraq; Haskew-Layton, Renee E.; Aleyasin, Hossein; Guo, Hengchang; Ratan, Rajiv R.

    2015-01-01

    Hydrogen peroxide (H2O2) is produced endogenously in a number of cellular compartments, including the mitochondria, the endoplasmic reticulum, peroxisomes, and at the plasma membrane, and can play divergent roles as a second messenger or a pathological toxin. It is assumed that the tuned production of H2O2 within neuronal and non-neuronal cells regulates a discreet balance between survival and death. However, a major challenge in understanding the physiological versus pathological role of H2O2 in cells has been the lack of validated methods that can spatially, temporally, and quantitatively modulate H2O2 production. A promising means of regulating endogenous H2O2 is through the expression of peroxide-producing enzyme D-amino acid oxidase (DAAO from Rhodotorula gracilis lacking a peroxisomal targeting sequence). Using viral vectors to express DAAO in distinct cell types and using targeting sequences to target DAAO to distinct subcellular sites, we can manipulate H2O2 production by applying the substrate D-alanine or permeable analogs of D-alanine. In this chapter, we describe the use of DAAO to produce H2O2 in culture models and the real-time visual validation of this technique using two-photon microscopy and chemoselective fluorescent probes. PMID:25416362

  11. Spatio-temporal analysis of irregular vocal fold oscillations: Biphonation due to desynchronization of spatial modes

    NASA Astrophysics Data System (ADS)

    Neubauer, Jürgen; Mergell, Patrick; Eysholdt, Ulrich; Herzel, Hanspeter

    2001-12-01

    This report is on direct observation and modal analysis of irregular spatio-temporal vibration patterns of vocal fold pathologies in vivo. The observed oscillation patterns are described quantitatively with multiline kymograms, spectral analysis, and spatio-temporal plots. The complex spatio-temporal vibration patterns are decomposed by empirical orthogonal functions into independent vibratory modes. It is shown quantitatively that biphonation can be induced either by left-right asymmetry or by desynchronized anterior-posterior vibratory modes, and the term "AP (anterior-posterior) biphonation" is introduced. The presented phonation examples show that for normal phonation the first two modes sufficiently explain the glottal dynamics. The spatio-temporal oscillation pattern associated with biphonation due to left-right asymmetry can be explained by the first three modes. Higher-order modes are required to describe the pattern for biphonation induced by anterior-posterior vibrations. Spatial irregularity is quantified by an entropy measure, which is significantly higher for irregular phonation than for normal phonation. Two asymmetry measures are introduced: the left-right asymmetry and the anterior-posterior asymmetry, as the ratios of the fundamental frequencies of left and right vocal fold and of anterior-posterior modes, respectively. These quantities clearly differentiate between left-right biphonation and anterior-posterior biphonation. This paper proposes methods to analyze quantitatively irregular vocal fold contour patterns in vivo and complements previous findings of desynchronization of vibration modes in computer models and in in vitro experiments.
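The empirical-orthogonal-function decomposition used here is, computationally, an SVD of the anomaly matrix of a (time × position) field. This sketch shows the generic computation on a synthetic two-mode field; it is not the authors' kymogram pipeline.

```python
import numpy as np

def eof_decompose(field, n_modes):
    """EOF analysis of a (time x position) data matrix: SVD of the
    temporal anomalies yields spatial modes, their time coefficients,
    and the fraction of variance each mode explains."""
    anomalies = np.asarray(field, dtype=float)
    anomalies = anomalies - anomalies.mean(axis=0)
    U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
    explained = s ** 2 / (s ** 2).sum()
    return Vt[:n_modes], U[:, :n_modes] * s[:n_modes], explained[:n_modes]
```

The number of modes needed to capture nearly all the variance is exactly the diagnostic the paper uses: two for normal phonation, more for biphonation.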

  12. Quantitative interactome analysis reveals a chemoresistant edgotype

    PubMed Central

    Chavez, Juan D.; Schweppe, Devin K.; Eng, Jimmy K.; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E.

    2015-01-01

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived, protein interaction network comprising 1,391 crosslinked peptides allows for ‘edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype. PMID:26235782

  13. The method of quantitative automatic metallographic analysis

    NASA Astrophysics Data System (ADS)

    Martyushev, N. V.; Skeeba, V. Yu

    2017-01-01

    A brief analysis of existing software for computer processing of microstructure photographs is presented, followed by a description of the software package developed by the authors. This software product is intended for quantitative metallographic analysis of digital photographs of the microstructure of materials. It calculates the volume fraction and the average particle size of the structure over several hundred secants (depending on the photograph's resolution) in one field of view. In addition, a built-in module assesses the degree to which the shape of individual particles and impurities deviates from spherical. The article presents the main algorithms used during the creation of the software product and the formulae by which the software calculates the parameters of the microstructure. It is shown that the reliability of the calculations depends on the quality of preparation of the microstructure.

  14. Quantitative laryngeal electromyography: turns and amplitude analysis.

    PubMed

    Statham, Melissa McCarty; Rosen, Clark A; Nandedkar, Sanjeev D; Munin, Michael C

    2010-10-01

    Laryngeal electromyography (LEMG) is primarily a qualitative examination, with no standardized approach to interpretation. The objectives of our study were to establish quantitative norms for motor unit recruitment in controls and to compare with interference pattern analysis in patients with unilateral vocal fold paralysis (VFP). Retrospective case-control study. We performed LEMG of the thyroarytenoid-lateral cricoarytenoid muscle complex (TA-LCA) in 21 controls and 16 patients with unilateral VFP. Our standardized protocol used a concentric needle electrode with subjects performing variable-force TA-LCA contraction. To quantify the interference pattern density, we measured turns and mean amplitude per turn for ≥10 epochs (each 500 milliseconds). Logarithmic regression analysis between amplitude and turns was used to calculate slope and intercept. Standard deviation was calculated to further define the confidence interval, enabling generation of a linear-scale graphical "cloud" of activity containing ≥90% of data points for controls and patients. Median age of controls and patients was similar (50.7 vs. 48.5 years). In controls, TA-LCA amplitude with variable contraction ranged from 145-1112 μV, and regression analysis comparing mean amplitude per turn to root-mean-square amplitude demonstrated high correlation (R = 0.82). In controls performing variable contraction, median turns per second was significantly higher compared to patients (450 vs. 290, P = .002). We first present interference pattern analysis in the TA-LCA in healthy adults and patients with unilateral VFP. Our findings indicate that motor unit recruitment can be quantitatively measured within the TA-LCA. Additionally, patients with unilateral VFP had significantly reduced turns when compared with controls.
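    The turns-and-amplitude measurement described above can be sketched as a direction-reversal counter over one EMG epoch. This is a generic interference-pattern sketch, assuming the common 100 µV reversal threshold; the function name and threshold are illustrative, not the study's exact protocol:

```python
import numpy as np

def turns_amplitude(signal_uv, fs, turn_threshold_uv=100.0):
    """Count 'turns' (direction reversals whose amplitude change exceeds
    a threshold) and the mean amplitude change per turn in one epoch.
    signal_uv: 1-D EMG samples in microvolts; fs: sampling rate in Hz.
    Returns (turns per second, mean amplitude per turn in uV)."""
    s = np.asarray(signal_uv, dtype=float)
    turns = 0
    amplitudes = []
    last_extremum = s[0]
    direction = 0  # +1 rising, -1 falling, 0 undetermined
    for x in s[1:]:
        d = np.sign(x - last_extremum)
        if d == 0:
            continue
        if direction == 0:
            direction = d
            last_extremum = x
        elif d == direction:
            last_extremum = x          # still moving the same way
        elif abs(x - last_extremum) >= turn_threshold_uv:
            turns += 1                 # reversal large enough to count
            amplitudes.append(abs(x - last_extremum))
            last_extremum = x
            direction = d
    duration_s = len(s) / fs
    mean_amp = float(np.mean(amplitudes)) if amplitudes else 0.0
    return turns / duration_s, mean_amp
```

    Plotting mean amplitude per turn against turns per second on log axes gives the regression "cloud" described in the abstract.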

  15. Topological Data Analysis of High-Resolution Temporal Rainfall

    NASA Astrophysics Data System (ADS)

    Carsteanu, Alin Andrei; Fernández Méndez, Félix; Vásquez Aguilar, Raciel

    2017-04-01

    This study applies topological data analysis (TDA) to the state space representations of high-resolution temporal rainfall intensity data from Iowa City (IIHR, U of Iowa). Using a sufficient embedding dimension, topological properties of the underlying manifold are depicted.
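    The state-space representation mentioned above is conventionally built by time-delay embedding of the scalar rainfall series before any topological analysis. A minimal sketch under that standard construction (names are illustrative):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar series x into R^dim with lag tau.
    Returns an (N - (dim-1)*tau, dim) array of state-space points, the
    usual first step before studying the topology of the attractor."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this dim/tau")
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
```
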

  16. Automated quantitative image analysis of nanoparticle assembly

    NASA Astrophysics Data System (ADS)

    Murthy, Chaitanya R.; Gao, Bo; Tao, Andrea R.; Arya, Gaurav

    2015-05-01

    The ability to characterize higher-order structures formed by nanoparticle (NP) assembly is critical for predicting and engineering the properties of advanced nanocomposite materials. Here we develop a quantitative image analysis software to characterize key structural properties of NP clusters from experimental images of nanocomposites. This analysis can be carried out on images captured at intermittent times during assembly to monitor the time evolution of NP clusters in a highly automated manner. The software outputs averages and distributions in the size, radius of gyration, fractal dimension, backbone length, end-to-end distance, anisotropic ratio, and aspect ratio of NP clusters as a function of time along with bootstrapped error bounds for all calculated properties. The polydispersity in the NP building blocks and biases in the sampling of NP clusters are accounted for through the use of probabilistic weights. This software, named Particle Image Characterization Tool (PICT), has been made publicly available and could be an invaluable resource for researchers studying NP assembly. To demonstrate its practical utility, we used PICT to analyze scanning electron microscopy images taken during the assembly of surface-functionalized metal NPs of differing shapes and sizes within a polymer matrix. PICT is used to characterize and analyze the morphology of NP clusters, providing quantitative information that can be used to elucidate the physical mechanisms governing NP assembly.
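    A few of the per-cluster properties listed above (size, radius of gyration, and an eigenvalue-based aspect ratio from the gyration tensor) can be sketched for a single cluster as follows. This is a generic illustration of those standard definitions, not PICT's actual implementation:

```python
import numpy as np

def cluster_metrics(coords):
    """Basic shape metrics for one NP cluster from particle coordinates
    (n, 2) in pixels: particle count, radius of gyration about the
    centre of mass, and the sqrt-eigenvalue ratio of the gyration tensor
    as a simple aspect ratio."""
    c = np.asarray(coords, dtype=float)
    com = c.mean(axis=0)
    rg = np.sqrt(((c - com) ** 2).sum(axis=1).mean())  # radius of gyration
    cov = np.cov((c - com).T)                          # 2x2 gyration tensor
    evals = np.sort(np.linalg.eigvalsh(cov))
    aspect = float(np.sqrt(evals[-1] / evals[0])) if evals[0] > 0 else np.inf
    return {"n": len(c), "rg": float(rg), "aspect": aspect}
```
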

  17. Forecasting Antarctic Sea Ice Concentrations Using Results of Temporal Mixture Analysis

    NASA Astrophysics Data System (ADS)

    Chi, Junhwa; Kim, Hyun-Cheol

    2016-06-01

    Sea ice concentration (SIC) data acquired by passive microwave sensors at daily temporal frequencies over extended areas provide seasonal characteristics of sea ice dynamics and play a key role as an indicator of global climate trends; however, such long-term time series are typically challenging to study. Of the various advanced remote sensing techniques that address this issue, temporal mixture analysis (TMA) methods are often used to investigate the temporal characteristics of environmental factors, including SICs in the case of the present study. This study aims to forecast daily SICs for one year using a combination of TMA and time series modeling in two stages. First, we identify temporally meaningful sea ice signatures, referred to as temporal endmembers, using machine learning algorithms, and then we decompose each pixel into a linear combination of these endmembers. Using the corresponding fractional abundances of endmembers, we apply an autoregressive model fitted to all Antarctic SIC data for 1979 to 2013 to forecast SIC values for 2014. We evaluate the forecasts against daily SIC data reconstructed from the actual fractional abundances derived from pixel unmixing and the temporal endmember signatures. The proposed method successfully forecasts new fractional abundance values, and the resulting images are qualitatively and quantitatively similar to the reference data.
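    The autoregressive step above can be sketched as an ordinary least-squares AR(p) fit on one endmember's fractional-abundance series, then rolled forward to forecast. The functions below are an illustrative sketch, not the authors' code:

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares fit of an AR(p) model
        x_t = a_1 x_{t-1} + ... + a_p x_{t-p}
    to a 1-D series. Returns the coefficient vector a (length p)."""
    x = np.asarray(x, dtype=float)
    y = x[p:]
    X = np.column_stack([x[p - k : len(x) - k] for k in range(1, p + 1)])
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a

def forecast(x, a, steps):
    """Iterate the fitted AR model forward 'steps' times."""
    x = list(x)
    for _ in range(steps):
        x.append(sum(c * x[-i - 1] for i, c in enumerate(a)))
    return x[-steps:]
```
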

  18. Materials characterization through quantitative digital image analysis

    SciTech Connect

    J. Philliber; B. Antoun; B. Somerday; N. Yang

    2000-07-01

    A digital image analysis system has been developed to allow advanced quantitative measurement of microstructural features. This capability is maintained as part of the microscopy facility at Sandia, Livermore. The system records images digitally, eliminating the use of film. Images obtained from other sources may also be imported into the system. Subsequent digital image processing enhances image appearance through contrast and brightness adjustments. The system measures a variety of user-defined microstructural features, including area fraction, particle size and spatial distributions, and grain sizes and orientations of elongated particles. These measurements are made in a semi-automatic mode through the use of macro programs and a computer-controlled translation stage. A routine has been developed to create large montages of 50+ separate images. Individual image frames are matched to the nearest pixel to create seamless montages. Results from three different studies are presented to illustrate the capabilities of the system.

  19. Near Real Time Quantitative Gas Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Herget, William F.; Tromp, Marianne L.; Anderson, Charles R.

    1985-12-01

    A Fourier transform infrared (FT-IR) - based system has been developed and is undergoing evaluation for near real time multicomponent quantitative analysis of undiluted gaseous automotive exhaust emissions. The total system includes: (1) a gas conditioning system (GCS) for tracer gas injection, gas mixing, and temperature stabilization; and (2) an exhaust gas analyzer (EGA) consisting of a sample cell, an FT-IR system, and a computerized data processing system. Tests have shown that the system can monitor about 20 individual species (concentrations down to the 1-20 ppm range) with a time resolution of one second. Tests have been conducted on a chassis dynamometer system utilizing different autos, different fuels, and different driving cycles. Results were compared with those obtained using a standard constant volume sampling (CVS) system.
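    Multicomponent quantitation from an FT-IR absorbance spectrum is commonly posed as a classical least-squares problem under Beer-Lambert linearity. A hedged sketch of that generic approach (an assumption for illustration, not necessarily the EGA's actual algorithm):

```python
import numpy as np

def fit_concentrations(A_mix, E, path_cm):
    """Classical least-squares (CLS) estimate of species concentrations
    from a mixture absorbance spectrum, assuming Beer-Lambert linearity:
        A(v) = path * sum_j eps_j(v) * c_j
    A_mix: absorbance at m wavenumbers; E: (m, n) matrix of reference
    absorptivities for n species; path_cm: optical path length in cm."""
    M = np.asarray(E, dtype=float) * path_cm
    c, *_ = np.linalg.lstsq(M, np.asarray(A_mix, dtype=float), rcond=None)
    return c
```
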

  20. Quantitative Analysis of Tremors in Welders

    PubMed Central

    Sanchez-Ramos, Juan; Reimer, Dacy; Zesiewicz, Theresa; Sullivan, Kelly; Nausieda, Paul A.

    2011-01-01

    Background: Workers chronically exposed to manganese in welding fumes may develop an extra-pyramidal syndrome with postural and action tremors. Objectives: To determine the utility of tremor analysis in distinguishing tremors among workers exposed to welding fumes, patients with Idiopathic Parkinson’s Disease (IPD) and Essential Tremor (ET). Methods: Retrospective study of recorded tremor in subjects from academic Movement Disorders Clinics and Welders. Quantitative tremor analysis was performed and associated with clinical status. Results: Postural tremor intensity was increased in Welders and ET and was associated with visibly greater amplitude of tremor with arms extended. Mean center frequencies (Cf) of welders and patients with ET were significantly higher than the mean Cf of PD subjects. Although both the welders and the ET group exhibited a higher Cf with arms extended, welders could be distinguished from the ET subjects by a significantly lower Cf of the rest tremor than that measured in ET subjects. Conclusions: In the context of an appropriate exposure history and neurological examination, tremor analysis may be useful in the diagnosis of manganese-related extra-pyramidal manifestations. PMID:21655131

  1. Nonlinear dynamics and quantitative EEG analysis.

    PubMed

    Jansen, B H

    1996-01-01

    Quantitative, computerized electroencephalogram (EEG) analysis appears to be based on a phenomenological approach to EEG interpretation, and is primarily rooted in linear systems theory. A fundamentally different approach to computerized EEG analysis, however, is making its way into the laboratories. The basic idea, inspired by recent advances in the area of nonlinear dynamics and chaos theory, is to view an EEG as the output of a deterministic system of relatively simple complexity, but containing nonlinearities. This suggests that studying the geometrical dynamics of EEGs, and the development of neurophysiologically realistic models of EEG generation may produce more successful automated EEG analysis techniques than the classical, stochastic methods. A review of the fundamentals of chaos theory is provided. Evidence supporting the nonlinear dynamics paradigm to EEG interpretation is presented, and the kind of new information that can be extracted from the EEG is discussed. A case is made that a nonlinear dynamic systems viewpoint to EEG generation will profoundly affect the way EEG interpretation is currently done.

  2. Temporal shape analysis via the spectral signature.

    PubMed

    Bernardis, Elena; Konukoglu, Ender; Ou, Yangming; Metaxas, Dimitris N; Desjardins, Benoit; Pohl, Kilian M

    2012-01-01

    In this paper, we adapt spectral signatures for capturing morphological changes over time. Advanced techniques for capturing temporal shape changes frequently rely on first registering the sequence of shapes and then analyzing the corresponding set of high dimensional deformation maps. Instead, we propose a simple encoding motivated by the observation that small shape deformations lead to minor refinements in the spectral signature composed of the eigenvalues of the Laplace operator. The proposed encoding does not require registration, since spectral signatures are invariant to pose changes. We apply our representation to the shapes of the ventricles extracted from 22 cine MR scans of healthy controls and Tetralogy of Fallot patients. We then measure the accuracy score of our encoding by training a linear classifier, which outperforms the same classifier based on volumetric measurements.
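    For a discrete shape or mesh, a spectral signature of this kind reduces to the leading eigenvalues of a Laplacian matrix, which are invariant to pose. The graph Laplacian below is a simplified stand-in (an assumption for illustration; the paper works with the Laplace operator on the shapes themselves):

```python
import numpy as np

def spectral_signature(adjacency, k):
    """First k nontrivial eigenvalues of the graph Laplacian L = D - A,
    used as a pose-invariant signature: relabeling or rigidly moving the
    shape leaves the spectrum unchanged."""
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    evals = np.linalg.eigvalsh(L)
    return evals[1 : k + 1]  # skip the trivial zero eigenvalue
```
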

  3. Analysis of brain patterns using temporal measures

    DOEpatents

    Georgopoulos, Apostolos

    2015-08-11

    A set of brain data representing a time series of neurophysiologic activity acquired by spatially distributed sensors arranged to detect neural signaling of a brain (such as by the use of magnetoencephalography) is obtained. The set of brain data is processed to obtain a dynamic brain model based on a set of statistically-independent temporal measures, such as partial cross correlations, among groupings of different time series within the set of brain data. The dynamic brain model represents interactions between neural populations of the brain occurring close in time, such as with zero lag, for example. The dynamic brain model can be analyzed to obtain the neurophysiologic assessment of the brain. Data processing techniques may be used to assess structural or neurochemical brain pathologies.

  4. Applying temporal network analysis to the venture capital market

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.

  5. Temporal variations of skin pigmentation in C57BL/6 mice affect optical bioluminescence quantitation.

    PubMed

    Curtis, Allison; Calabro, Katherine; Galarneau, Jean-Rene; Bigio, Irving J; Krucker, Thomas

    2011-12-01

    Depilation-induced skin pigmentation in C57BL/6 mice is a known occurrence and presents a unique problem for quantitative optical imaging of small animals, especially for bioluminescence. The work reported here quantitatively investigated the optical attenuation of bioluminescent light due to melanin pigmentation in the skin of transgenic C57BL/6 mice, modified such that luciferase expression is under the transcription control of a physiologically and pharmacologically inducible gene. Both in vivo and ex vivo experiments were performed to track bioluminescence signal attenuation through different stages of the mouse hair growth cycle. Simultaneous reflectance measurements were collected in vivo to estimate melanin levels. Biological variability of skin pigmentation was found to dramatically affect the collected bioluminescent signal emerging through the skin of the mice. When compared to signal through skin with no pigmentation, the signal through highly pigmented skin was attenuated by an average of 90%. A positive correlation was found between reflectance measurements and bioluminescence signal loss. A correction scheme is proposed based on this correlation, but signal variation due to non-melanin scattering and absorption sources introduces significant errors. Advanced spectral reflectance analysis will be necessary to develop a more reliable correction method in the future. Skin pigmentation is a significant variable in bioluminescent imaging and should be considered in experimental design and implementation for longitudinal studies, especially when sensitivity to small signal changes, or differences among animals, is required.

  6. Quantitative Analysis of Triple Mutant Genetic Interactions

    PubMed Central

    Braberg, Hannes; Alexander, Richard; Shales, Michael; Xu, Jiewei; Franks-Skiba, Kathleen E.; Wu, Qiuqin; Haber, James E.; Krogan, Nevan J.

    2014-01-01

    The quantitative analysis of genetic interactions between pairs of gene mutations has proven effective for characterizing cellular functions, but it can miss important interactions for functionally redundant genes. To address this limitation, we have developed an approach termed Triple Mutant Analysis (TMA). The procedure relies on a query strain that contains two deletions in a pair of redundant or otherwise related genes, which is crossed against a panel of candidate deletion strains to isolate triple mutants and measure their growth. A central feature of TMA is to interrogate mutants that are synthetically sick when two other genes are deleted but interact minimally with either single deletion. This approach has been valuable for discovering genes that restore critical functions when the principal actors are deleted. TMA has also uncovered double mutant combinations that produce severe defects because a third protein becomes deregulated and acts in a deleterious fashion, and it has revealed functional differences between proteins presumed to act together. The protocol is optimized for Singer ROTOR pinning robots, takes 3 weeks to complete, and measures interactions for up to 30 double mutants against a library of 1536 single mutants. PMID:25010907

  7. Seniors' online communities: a quantitative content analysis.

    PubMed

    Nimrod, Galit

    2010-06-01

    To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. There was a constant increase in the daily activity level during the research period. Content analysis identified 13 main subjects discussed in the communities, including (in descending order) "Fun on line," "Retirement," "Family," "Health," "Work and Study," "Recreation," "Finance," "Religion and Spirituality," "Technology," "Aging," "Civic and Social," "Shopping," and "Travels." The overall tone was somewhat more positive than negative. The findings suggest that the utilities of Information and Communications Technologies for older adults that were identified in previous research are valid for seniors' online communities as well. However, the findings suggest several other possible benefits, which may be available only to online communities. The communities may provide social support, contribute to self-preservation, and serve as an opportunity for self-discovery and growth. Because they offer both leisure activity and an expanded social network, it is suggested that active participation in the communities may contribute to the well-being of older adults. Directions for future research and applied implications are further discussed.

  8. Quantitative analysis of protein turnover in plants.

    PubMed

    Nelson, Clark J; Li, Lei; Millar, A Harvey

    2014-03-01

    Proteins are constantly being synthesised and degraded as plant cells age and as plants grow, develop and adapt the proteome. Given that plants develop through a series of events from germination to fruiting and even undertake whole organ senescence, an understanding of protein turnover as a fundamental part of this process in plants is essential. Both synthesis and degradation processes are spatially separated in a cell across its compartmented structure. The majority of protein synthesis occurs in the cytosol, while synthesis of specific components occurs inside plastids and mitochondria. Degradation of proteins occurs in both the cytosol, through the action of the plant proteasome, and in organelles and lytic structures through different protease classes. Tracking the specific synthesis and degradation rate of individual proteins can be undertaken using stable isotope feeding and the ability of peptide MS to track labelled peptide fractions over time. Mathematical modelling can be used to follow the isotope signature of newly synthesised protein as it accumulates and natural abundance proteins as they are lost through degradation. Different technical and biological constraints govern the potential for the use of (13)C, (15)N, (2)H and (18)O for these experiments in complete labelling and partial labelling strategies. Future development of quantitative protein turnover analysis will involve analysis of protein populations in complexes and subcellular compartments, assessing the effect of PTMs and integrating turnover studies into wider system biology study of plants. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
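    Under a steady-state, first-order turnover model, the labeled (newly synthesised) fraction of a protein grows as f(t) = 1 − exp(−k t), so a degradation rate k can be fit from a label-incorporation time course. A simplified sketch of that modeling step (the model form is an assumption for illustration, not the full pipeline described above):

```python
import math

def degradation_rate(times_h, labeled_fraction):
    """Estimate a first-order turnover rate k (per hour) from the
    fraction of label-containing protein, assuming f(t) = 1 - exp(-k t)
    at steady state. Linearizes to log(1 - f) = -k t and fits a
    least-squares line through the origin."""
    num = sum(t * (-math.log(1.0 - f))
              for t, f in zip(times_h, labeled_fraction))
    den = sum(t * t for t in times_h)
    return num / den
```
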

  9. Combining qualitative and quantitative spatial and temporal information in a hierarchical structure: Approximate reasoning for plan execution monitoring

    NASA Technical Reports Server (NTRS)

    Hoebel, Louis J.

    1993-01-01

    The problem of plan generation (PG) and the problem of plan execution monitoring (PEM), including updating, queries, and resource-bounded replanning, have different reasoning and representation requirements. PEM requires the integration of qualitative and quantitative information. PEM is the receiving of data about the world in which a plan or agent is executing. The problem is to quickly determine the relevance of the data, the consistency of the data with respect to the expected effects, and whether execution should continue. Only the spatial and temporal aspects of the plan are assessed for relevance in this work. Current temporal reasoning systems are deficient in computational aspects or expressiveness. This work presents a hybrid qualitative and quantitative system that is fully expressive in its assertion language while offering certain computational efficiencies. In order to proceed, methods incorporating approximate reasoning using hierarchies, notions of locality, constraint expansion, and absolute parameters must be used, and these are shown to suit the anytime nature of PEM.

  10. Network Analysis Using Spatio-Temporal Patterns

    NASA Astrophysics Data System (ADS)

    Miranda, Gisele H. B.; Machicao, Jeaneth; Bruno, Odemir M.

    2016-08-01

    Different network models have been proposed over the last years, inspired by real-world topologies. Characterizing these models implies understanding the underlying network phenomena, which accounts for both structural and dynamic properties. Several mathematical tools can be employed to characterize such properties, including cellular automata (CA), which can be defined as dynamical systems of discrete nature composed of spatially distributed units governed by deterministic rules. In this paper, we propose a method based on modeling one specific CA over distinct network topologies in order to classify the network model. The proposed methodology consists of modeling a binary totalistic CA over a network. The transition function that governs each CA cell is based on the density of living neighbors. Secondly, the distribution of the Shannon entropy is obtained from the evolved spatio-temporal pattern of the referred CA and used as a network descriptor. The experiments were performed using a dataset composed of four different types of networks: random, small-world, scale-free and geographical. We also used cross-validation for training purposes. We evaluated the accuracy of classification as a function of the initial number of living neighbors and, also, as a function of a threshold parameter related to the density of living neighbors. The results show high accuracy in distinguishing among the network models, which demonstrates the feasibility of the proposed method.
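    The descriptor pipeline above can be sketched as a totalistic CA iterated over an adjacency matrix, followed by a per-node Shannon entropy of the resulting on/off time series. The specific update rule below (a cell is alive next step iff at least half its neighbours are alive) is a hypothetical density rule for illustration, not the paper's exact one:

```python
import numpy as np

def ca_entropy(adjacency, steps=64, density=0.3, seed=0):
    """Evolve a binary totalistic CA on a network and return, per node,
    the Shannon entropy (in bits) of its fraction of 'alive' steps, a
    simple spatio-temporal descriptor in the spirit of the abstract."""
    rng = np.random.default_rng(seed)
    A = np.asarray(adjacency, dtype=float)
    deg = A.sum(axis=1)
    state = (rng.random(A.shape[0]) < density).astype(float)
    on_counts = np.zeros_like(state)
    for _ in range(steps):
        # density rule: alive iff at least half the neighbours are alive
        state = (A @ state >= 0.5 * deg).astype(float)
        on_counts += state
    p = on_counts / steps
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    return np.nan_to_num(h)  # 0 bits for nodes frozen on or off
```

    Feeding the entropy distribution to a classifier, as the abstract describes, then distinguishes the network families.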

  11. Evaluating the Effect of Virtual Reality Temporal Bone Simulation on Mastoidectomy Performance: A Meta-analysis.

    PubMed

    Lui, Justin T; Hoy, Monica Y

    2017-06-01

    Background: The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives: To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources: Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods: Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results: A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I² = 64.3%, P < .006). Conclusion: In the context of a diverse population of virtual reality simulation temporal bone surgery studies, meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.
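    Random-effects pooling of standardized mean differences of this kind is conventionally computed with the DerSimonian-Laird estimator, which also yields the heterogeneity statistics reported above. A textbook sketch (a generic implementation, not the study's actual software):

```python
def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes (e.g. standardized mean differences)
    under the DerSimonian-Laird random-effects model.
    Returns (pooled_effect, tau2, I2_percent)."""
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    Q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)                     # between-study variance
    i2 = max(0.0, (Q - (k - 1)) / Q) * 100.0 if Q > 0 else 0.0
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2, i2
```
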

  12. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  13. Quantitative color analysis for capillaroscopy image segmentation.

    PubMed

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Amorosi, Beatrice; D'Alessio, Tommaso; Palma, Claudio

    2012-06-01

    This communication introduces a novel approach for quantitatively evaluating the role of color space decomposition in digital nailfold capillaroscopy analysis. It is clinically recognized that any alterations of the capillary pattern at the periungual skin region are directly related to dermatologic and rheumatic diseases. The proposed algorithm for the segmentation of digital capillaroscopy images is optimized with respect to the choice of the color space and the contrast variation. Since the color space is a critical factor for segmenting low-contrast images, an exhaustive comparison between different color channels is conducted and a novel color channel combination is presented. Results from images of 15 healthy subjects are compared with annotated data, i.e. selected images approved by clinicians. From this comparison, a set of figures of merit is extracted that highlights the algorithm's ability to correctly segment capillaries and recover their shape and number. Experimental tests show that the optimized procedure for capillary segmentation, based on a novel color channel combination, yields average accuracy values higher than 0.8 and extracts capillaries of acceptable shape and granularity. The obtained results are particularly encouraging for future developments on the classification of capillary patterns with respect to dermatologic and rheumatic diseases.

  14. A spatial-temporal covariance model for rainfall analysis

    NASA Astrophysics Data System (ADS)

    Li, Sha; Shu, Hong; Xu, Zhengquan

    2009-10-01

    Many environmental phenomena are regarded as realizations of random functions that possess both spatial and temporal characteristics. In particular, geostatistics, by extending existing spatial techniques into the space-time domain, offers methods to model such processes. Although methods for the analysis of spatial-temporal data are becoming more important in many areas of application, they are less developed than those for the analysis of purely spatial or purely temporal data. In this paper, two kinds of spatial-temporal stationary covariance models are introduced, and the differences between the spatial and temporal domains are examined. A product-sum covariance model originally given by De Cesare is extended for spatial-temporal analysis of daily rainfall measurements in three provinces of Northeast China. Remarkably, this generalized non-separable model does not reduce to a metric model in space-time. The rainfall measurements used for this experiment were taken at 104 monitoring stations from January 2000 to December 2005. In the experiment, the product-sum variogram model is employed in ordinary kriging: interpolation of the monthly rainfall data from January 2000 to December 2004 is used to predict the monthly rainfall of 2005, and the true and predicted values are compared. The experimental results show that this product-sum covariance model is very effective for rainfall analysis.
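    The product-sum model combines purely spatial and purely temporal covariances as C(h, u) = k1·Cs(h)·Ct(u) + k2·Cs(h) + k3·Ct(u). A minimal sketch with illustrative exponential marginals (the ranges and coefficients below are assumptions, not the fitted values from the study):

```python
import math

def product_sum_cov(h, u, cs, ct, k1=1.0, k2=0.5, k3=0.5):
    """Product-sum space-time covariance (De Cesare-type model):
        C(h, u) = k1*Cs(h)*Ct(u) + k2*Cs(h) + k3*Ct(u)
    where Cs and Ct are purely spatial and purely temporal covariance
    functions, k1 > 0 and k2, k3 >= 0 keep the model valid, and the
    product term makes it non-separable."""
    return k1 * cs(h) * ct(u) + k2 * cs(h) + k3 * ct(u)

# Illustrative exponential marginals (ranges are assumed, not fitted):
cs = lambda h: math.exp(-h / 50.0)   # spatial separation h, e.g. in km
ct = lambda u: math.exp(-u / 30.0)   # temporal separation u, e.g. in days
```

    These covariance values would feed the kriging system whose solution provides the rainfall interpolations described in the abstract.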

  15. [When does post-ictal period start in temporal lobe epilepsy? A quantitative EEG perspective].

    PubMed

    Sanz-Garcia, A; Vega-Zelaya, L; Pastor, J; Sola, R G; Ortega, G J

    2017-04-16

    In partial seizures, characterization of the post-ictal period can be difficult from a clinical point of view or even from electroencephalography. Spectral and complex-network analysis could lead to a more accurate definition of its limits, as well as to a greater understanding of the seizures. Digital EEG recordings from scalp and foramen ovale electrodes were used: 32 seizures from 15 patients with drug-resistant mesial temporal lobe epilepsy (Engel I). We numerically analyzed the spectral entropy, the different frequency bands, and several variables used to characterize the cortical network: density of links, modularity, cluster coefficient and average path length. Variations of these variables in the post-ictal versus pre-ictal periods were quantified. The cortical network density of links increased during the post-ictal period of complex seizures, coinciding with a decrease in spectral entropy due mainly to increased delta-band activity. These variables reached extreme values around one minute after the seizure end as defined by classical electroencephalography. Our results can be explained by the appearance of an 'ending' mechanism that starts in the classically defined ictal period and reaches its maximum effect during the post-ictal period. These results could be useful to define the start of the post-ictal period as the moment of maximum synchrony, with the highest density of links and the lowest spectral entropy.
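    Spectral entropy, one of the variables analyzed above, is the Shannon entropy of the normalized power spectrum of an EEG epoch: it approaches 1 for a flat spectrum and drops when power concentrates in a narrow band, as with a post-ictal delta increase. A minimal sketch:

```python
import numpy as np

def spectral_entropy(x):
    """Normalized Shannon spectral entropy of one epoch: the entropy of
    the power-spectrum distribution divided by its maximum (log of the
    number of bins), so 1.0 means a flat spectrum and values near 0 mean
    power concentrated in few frequency bins."""
    psd = np.abs(np.fft.rfft(np.asarray(x, dtype=float))) ** 2
    psd = psd[1:]                      # drop the DC bin
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum() / np.log(len(psd)))
```
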

  16. Using repeated sources to quantitatively determine temporal change of medium properties: Theory and an example

    NASA Astrophysics Data System (ADS)

    Long, Hui; Wen, Lianxing

    2012-09-01

    We develop a theory that uses difference wavefields of repeated sources to locate and quantify temporal medium change, and we apply the theory to locate temporal change of seismic properties beneath the Japan subduction zone using repeated earthquakes. Our theory states that the difference wavefields of two repeated sources in a temporally changed medium can be equivalently treated as wavefields propagating from conceptual sources, located at the place of temporal change, with strengths equal to the product of the magnitude of the medium property change and the magnitude of the initial wavefields from the repeated sources. When the medium change extends over a finite region, the conceptual sources become volumetric sources distributed over the region of the medium change, propagating in the direction of the initial wave. This conceptualization establishes a theoretical framework for applications of difference wavefields to locating and quantifying temporal medium changes in the geological sciences, ultrasonic experiments, civil engineering, and medical imaging. We search for repeating earthquakes occurring in the Japan subduction zone, formulate an empirical procedure to extract the difference wavefields between repeating earthquakes, and determine temporal change of seismic properties using a back-projection method. We locate the temporal change of seismic properties beneath the Japan subduction zone at (37.2°N, 142°E) and estimate the magnitude of the conceptual body force associated with the temporal change to be 1.15 × 10^10 N, or, as a reference, a 0.87% density change for an assumed volume of temporal change of 10^3 km^3.
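
    The core operation, forming a difference wavefield between recordings of two repeated sources, can be illustrated with a minimal sketch. The integer-lag alignment by cross-correlation is an assumption for demonstration; the paper's empirical extraction procedure is more elaborate.

```python
import numpy as np

def difference_wavefield(trace1, trace2):
    """Align trace2 to trace1 by integer-lag cross-correlation, then subtract.
    For truly identical repeated sources in an unchanged medium the difference
    is ~0; residual energy reflects the part of the medium that changed."""
    corr = np.correlate(trace2, trace1, mode="full")
    lag = int(corr.argmax()) - (len(trace1) - 1)
    return trace1 - np.roll(trace2, -lag)
```

    In the theory above, the nonzero part of this residual behaves like a wavefield from a conceptual source at the location of the medium change, which is what the back-projection step then locates.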

  17. Quantitative Analysis of Hypoperfusion in Acute Stroke

    PubMed Central

    Nael, Kambiz; Meshksar, Arash; Liebeskind, David S.; Coull, Bruce M.; Krupinski, Elizabeth A.; Villablanca, J. Pablo

    2014-01-01

    Background and Purpose This study compares the concordance between arterial spin labeling (ASL) and dynamic susceptibility contrast (DSC) for the identification of regional hypoperfusion and diffusion-perfusion mismatch tissue classification using a quantitative method. Methods The inclusion criteria for this retrospective study were as follows: patients with acute ischemic syndrome with symptom onset <24 hours and acquisition of both ASL and DSC MR perfusion. The volumes of infarction and hypoperfused lesions were calculated on ASL and DSC multi-parametric maps. Patients were classified into reperfused, matched, or mismatch groups using time to maximum >6 sec as the reference. In a subset of patients who were successfully recanalized, the identical analysis was performed and the infarction and hypoperfused lesion volumes were used for paired pre- and posttreatment comparisons. Results Forty-one patients met our inclusion criteria. Twenty patients underwent successful endovascular revascularization (TICI>2a), resulting in a total of 61 ASL-DSC data pairs for comparison. The hypoperfusion volume on ASL-cerebral blood flow best approximated the DSC-time to peak volume (r=0.83) in the pretreatment group and time to maximum (r=0.46) after recanalization. Both ASL-cerebral blood flow and DSC-TTP overestimated the hypoperfusion volume compared with time to maximum volume in pretreatment (F=27.41, P<0.0001) and recanalized patients (F=8.78, P<0.0001). Conclusions ASL-cerebral blood flow overestimates the DSC time to maximum hypoperfusion volume and mismatch classification in patients with acute ischemic syndrome. Continued overestimation of hypoperfused volume after recanalization suggests that flow pattern and velocity changes, in addition to arterial transit delay, can affect the performance of ASL. PMID:23988646

  18. Temporal Analysis of Windows MRU Registry Keys

    NASA Astrophysics Data System (ADS)

    Zhu, Yuandong; Gladyshev, Pavel; James, Joshua

    The Microsoft Windows registry is an important resource in digital forensic investigations. It contains information about operating system configuration, installed software and user activity. Several researchers have focused on the forensic analysis of the Windows registry, but a robust method for associating past events with registry data values extracted from Windows restore points is not yet available. This paper proposes a novel algorithm for analyzing the most recently used (MRU) keys found in consecutive snapshots of the Windows registry. The algorithm compares two snapshots of the same MRU key and identifies data values within the key that have been updated in the period between the two snapshots. User activities associated with the newly updated data values can be assumed to have occurred during the period between the two snapshots.
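
    The comparison step can be illustrated with a simplified sketch. Assuming each snapshot is an ordered list of MRU data values (most recent first), any value that is new, or that moved ahead of a value that was previously fresher, must have been updated between the two snapshots. This is an illustrative reconstruction of the idea, not the authors' algorithm.

```python
def updated_mru_values(snap1, snap2):
    """Return values in snap2 that must have been updated since snap1.
    snap1/snap2: ordered lists of MRU data values, index 0 = most recent."""
    old_rank = {v: i for i, v in enumerate(snap1)}  # 0 = most recent in snap1
    updated = []
    min_rank = len(snap1)  # freshest old rank seen so far, scanning back-to-front
    for v in reversed(snap2):
        if v not in old_rank:
            updated.append(v)          # brand-new entry: used in the interval
        elif old_rank[v] > min_rank:
            updated.append(v)          # jumped ahead of a fresher old entry: re-used
        else:
            min_rank = old_rank[v]     # relative order preserved so far
    return list(reversed(updated))
```

    For example, if snapshot one is ["a", "b", "c"] and snapshot two is ["c", "a", "b"], only "c" needs to have been touched between the snapshots.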

  19. Quantitative and temporal definition of the Mla transcriptional regulon during barley-powdery mildew interactions.

    PubMed

    Moscou, Matthew J; Lauter, Nick; Caldo, Rico A; Nettleton, Dan; Wise, Roger P

    2011-06-01

    Barley Mildew resistance locus a (Mla) is a major determinant of immunity to the powdery mildew pathogen, Blumeria graminis f. sp. hordei. Alleles of Mla encode cytoplasmic- and membrane-localized coiled-coil, nucleotide binding site, leucine-rich repeat proteins that mediate resistance when complementary avirulence effectors (AVR(a)) are present in the pathogen. Presence of an appropriate AVR(a) protein triggers nuclear relocalization of MLA, in which MLA binds repressing host transcription factors. Timecourse expression profiles of plants harboring Mla1, Mla6, and Mla12 wild-type alleles versus paired loss-of-function mutants were compared to discover conserved transcriptional targets of MLA and downstream signaling cascades. Pathogen-dependent gene expression was equivalent or stronger in susceptible plants at 20 h after inoculation (HAI) and was attenuated at later timepoints, whereas resistant plants exhibited a time-dependent strengthening of the transcriptional response, increasing in both fold change and the number of genes differentially expressed. Deregulation at 20 HAI implicated 16 HAI as a crucial point in determining the future trajectory of this interaction and was interrogated by quantitative analysis. In total, 28 potential transcriptional targets of the MLA regulon were identified. These candidate targets possess a diverse set of predicted functions, suggesting that multiple pathways are required to mediate the hypersensitive reaction.

  20. Skeleton-based cerebrovascular quantitative analysis.

    PubMed

    Wang, Xingce; Liu, Enhui; Wu, Zhongke; Zhai, Feifei; Zhu, Yi-Cheng; Shui, Wuyang; Zhou, Mingquan

    2016-12-20

    Cerebrovascular disease is the most common cause of death worldwide, with millions of deaths annually. Interest is increasing toward understanding the geometric factors that influence cerebrovascular diseases, such as stroke. Cerebrovascular shape analyses are essential for the diagnosis and pathological identification of these conditions. The current study aimed to provide a stable and consistent methodology for quantitative Circle of Willis (CoW) analysis and to identify geometric changes in this structure. An entire pipeline was designed with emphasis on automating each step. The stochastic segmentation was improved and volumetric data were obtained. The L1 medial axis method was applied to vessel volumetric data, which yielded a discrete skeleton dataset. A B-spline curve was used to fit the skeleton, and geometric values were proposed for a one-dimensional skeleton and radius. The calculations used to derive these values were illustrated in detail. In one example (No. 47 in the open dataset), all values for the different branches of the CoW were calculated. The anterior communicating artery (ACo) was the shortest vessel, with a length of 2.6 mm. The range of the curvature of all vessels was (0.3, 0.9) ± (0.1, 1.4). The range of the torsion was (-12.4, 0.8) ± (0, 48.7). The mean radius value range was (3.1, 1.5) ± (0.1, 0.7) mm, and the mean angle value range was (2.2, 2.9) ± (0, 0.2). In addition to the torsion variance values in a few vessels, the variance values of all vessel characteristics remained near 1. The distribution of the radii of the symmetrical posterior cerebral artery (PCA) and the angle values of the symmetrical posterior communicating arteries (PCo) demonstrated a certain correlation between the corresponding values of symmetrical vessels on the CoW. The data verified the stability of our methodology. Our method was appropriate for the analysis of large medical image datasets derived from the automated pipeline for populations. This method was applicable to

  1. Quantitative Data Analysis--In the Graduate Curriculum

    ERIC Educational Resources Information Center

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  2. Quantitative Auger analysis of Nb-Ge superconducting alloys

    SciTech Connect

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron analysis for quantitative analysis was investigated by studying Nb3Ge thin-film Auger data with different approaches. A method based on elemental standards gave quantitative values consistent with reported Nb-Ge data. Alloy sputter yields were also calculated, and the results were consistent with those for pure elements.

  3. Variability of Soil Temperature: A Spatial and Temporal Analysis.

    ERIC Educational Resources Information Center

    Walsh, Stephen J.; And Others

    1991-01-01

    Discusses an analysis of the relationship of soil temperatures at 3 depths to various climatic variables along a 200-kilometer transect in west-central Oklahoma. Reports that temperature readings increased from east to west. Concludes that temperature variations were explained by a combination of spatial, temporal, and biophysical factors. (SG)

  4. Centrality measures in temporal networks with time series analysis

    NASA Astrophysics Data System (ADS)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun

    2017-05-01

    The study of identifying important nodes in networks has wide application in different fields. However, current research is mostly based on static or aggregated networks. Recently, increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to depict the temporal network structure. Using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality calculation is turned into the calculation of eigenvectors of several low-dimensional matrices through iteration, which effectively reduces the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, and the results show that our method is more efficient at discovering important nodes than the common aggregating method.
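
    A toy version of supra-matrix eigenvector centrality can be sketched with plain power iteration. The block-diagonal layout, the uniform inter-layer coupling, and the identity shift (added so power iteration converges even on bipartite-like structures) are illustrative assumptions, not the supra-evolution matrix defined in the paper.

```python
import numpy as np

def temporal_eigenvector_centrality(layers, coupling=0.5, iters=300):
    """Eigenvector centrality on a supra-matrix built from per-time adjacency
    matrices, with each node coupled to its own copy in the next layer.
    Returns each node's centrality summed over time layers."""
    n = layers[0].shape[0]
    T = len(layers)
    N = n * T
    supra = np.zeros((N, N))
    for t, A in enumerate(layers):
        supra[t*n:(t+1)*n, t*n:(t+1)*n] = A        # intra-layer structure
        if t + 1 < T:
            for i in range(n):                     # inter-layer self-coupling
                supra[t*n + i, (t+1)*n + i] = coupling
                supra[(t+1)*n + i, t*n + i] = coupling
    M = supra + np.eye(N)                          # shift: same eigenvectors
    v = np.ones(N)
    for _ in range(iters):                         # power iteration
        v = M @ v
        v /= np.linalg.norm(v)
    return v.reshape(T, n).sum(axis=0)
```

    The paper's contribution is precisely to avoid diagonalizing the full N×N supra-matrix by iterating over low-dimensional per-layer matrices; this sketch only conveys what quantity is being computed.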

  5. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density are developed. Accurate measurements of heading distribution, using a rotating-polarization radar to enhance the wingbeat-frequency method of identification, are presented.

  6. Some Epistemological Considerations Concerning Quantitative Analysis

    ERIC Educational Resources Information Center

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that…

  7. Dissociative mental states are canonically associated with decreased temporal theta activity on spectral analysis of EEG.

    PubMed

    Krüger, Christa; Bartel, Peter; Fletcher, Lizelle

    2013-01-01

    Quantitative electroencephalographic (QEEG) changes relating to dissociative experiences have only rarely been demonstrated, and dissociative states were not quantified in those studies. The aim of this study was to explore concurrent associations between quantified dissociative states and QEEG spectral parameters, in particular theta activity, in psychiatric patients. Fifty psychiatric patients completed the State Scale of Dissociation (SSD) immediately after a 15-min EEG recording. The EEG was assessed by conventional clinical visual analysis as well as by quantitative (QEEG) spectral analysis. Canonical analysis was performed between the set of SSD subscale scores and the following QEEG parameters: alpha-theta magnitude ratios, and relative as well as absolute theta magnitude obtained from right and left mid- to posterior-temporal and parieto-occipital derivations. The SSD transferred well to the present data in terms of reliability and internal criterion-related validity. The SSD and Dissociative Experiences Scale (DES) correlated significantly (r = .73, p < .001). Conventional EEG analysis identified 29 EEGs (58%) as abnormal. The main abnormality in 23 EEGs was slowing, maximal temporally in half of these cases. Canonical analyses confirmed a statistically significant relationship between the dissociation variables (especially conversion and depersonalization symptoms) and the QEEG variables (especially relative theta magnitude in the temporal regions; R = .72, p = .03, for SSD-QEEG; and R = .66, p = .04, for DES-QEEG). Quantified dissociative mental states are positively canonically associated with decreased temporal theta activity and increased alpha-theta ratios on QEEG in psychiatric patients with a high tendency to dissociate. The potential implications of the dissociation-theta-alpha relationship for understanding normal attentional processes need to be studied further.

  8. Quantitative analysis of planetary reflectance spectra with principal components analysis

    NASA Technical Reports Server (NTRS)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-01-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing over macroscopic mixing.
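
    The linear-mixing step behind such an analysis can be sketched as constrained least-squares unmixing of a spectrum into endmember abundances. The heavily weighted sum-to-one row and the synthetic endmember spectra are assumptions for illustration; the authors' principal-components procedure additionally distinguishes intimate from macroscopic mixing.

```python
import numpy as np

def unmix(spectrum, endmembers, w=1e3):
    """Least-squares abundance estimate for a linear spectral mixture,
    with a heavily weighted sum-to-one constraint row appended to the
    system (a fuller treatment would also enforce non-negativity)."""
    E = np.asarray(endmembers, dtype=float).T            # bands x endmembers
    s = np.asarray(spectrum, dtype=float)
    A = np.vstack([E, w * np.ones(E.shape[1])])          # constraint: sum(a) = 1
    b = np.append(s, w)
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    rms = float(np.sqrt(np.mean((E @ a - s) ** 2)))      # goodness of fit
    return a, rms
```

    The residual rms plays the same role as the "fit to within 2 percent rms" figure quoted in the abstract: it measures how well the chosen endmembers explain the observed spectrum.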

  10. Visualization analysis of multivariate spatial-temporal data of the Red Army Long March in China

    NASA Astrophysics Data System (ADS)

    Ma, Ding; Ma, Zhimin; Meng, Lumin; Li, Xia

    2009-10-01

    Recently, the visualization of spatial-temporal data from historic events has been emphasized by more and more people. Providing an efficient and effective approach to meet this requirement is the duty of geo-data modeling researchers. The aim of this paper is to visualize, from a new perspective, the multivariate spatial-temporal data of the Red Army Long March, one of the most important events of modern Chinese history. This research focuses on the extraction of relevant information from a 3-dimensional trajectory, which captures object locations in geographic space at specified temporal intervals. However, existing visualization methods cannot deal with multivariate spatial-temporal data effectively, so there is an opportunity to represent and analyze this kind of data in the case study. The study combines two visualization methods, the Space-Time-Cube for spatial-temporal data and Parallel Coordinates Plots (PCPs) for multivariable data, to develop a conceptual GIS database model that facilitates the exploration and analysis of multivariate spatial-temporal datasets by combining the 3D Space-Time-Path with 2D graphics. The designed model is supported by the geo-visualization environment, integrates diverse sets of multivariate spatial-temporal data, and builds up the dynamic process and relationships. It is concluded that this form of geo-visualization can effectively manipulate a large amount of distributed data, realize highly efficient transmission of quantitative and qualitative information, and also provide a new research mode in the field of the history of the CPC and military affairs.

  11. Effective and efficient analysis of spatio-temporal data

    NASA Astrophysics Data System (ADS)

    Zhang, Zhongnan

    Spatio-temporal data mining, i.e., mining knowledge from large amounts of spatio-temporal data, is a highly demanding field because huge amounts of spatio-temporal data have been collected in various applications, ranging from remote sensing to geographical information systems (GIS), computer cartography, environmental assessment and planning, etc. The collected data far exceed human ability to analyze, which makes it crucial to develop analysis tools. Recent studies on data mining have extended its scope from relational and transactional datasets to spatial and temporal datasets. Among the various forms of spatio-temporal data, remote sensing images play an important role, due to the growing deployment of satellites. In this dissertation, we propose two approaches to analyze remote sensing data. The first applies association rule mining to image processing. Each image was divided into a number of image blocks, and a spatial relationship was built for these blocks during the dividing process. Because each image was shot in a time series, this turned a large number of images into a spatio-temporal dataset. The second approach implements co-occurrence pattern discovery from these images; the generated patterns represent subsets of spatial features that are located together in space and time. A weather analysis is composed of individual analyses of several meteorological variables, including temperature, pressure, dew point, wind, clouds, visibility, and so on. Local-scale models provide detailed analysis and forecasts of meteorological phenomena ranging from a few kilometers to about 100 kilometers in size. When some of the above meteorological variables show particular change tendencies, some kind of severe weather will follow in most cases. Using association rule discovery, we found that changes in certain meteorological variables have a tight relation with the severe weather that follows.
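
    A minimal support/confidence rule miner conveys the idea behind the association-rule step. The transaction encoding, the variable names, and the thresholds are hypothetical, and a real system would use Apriori-style pruning rather than brute force.

```python
from itertools import combinations

def mine_rules(transactions, min_support=0.3, min_confidence=0.7):
    """Brute-force single-antecedent association rules A -> B.
    transactions: list of sets of items (e.g. discretized weather variables).
    Returns (antecedent, consequent, support, confidence) tuples."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})

    def support(itemset):
        return sum(1 for t in transactions if set(itemset) <= t) / n

    rules = []
    for a, b in combinations(items, 2):
        for ante, cons in ((a, b), (b, a)):
            sup = support((ante, cons))
            if sup >= min_support and support((ante,)) > 0:
                conf = sup / support((ante,))
                if conf >= min_confidence:
                    rules.append((ante, cons, round(sup, 3), round(conf, 3)))
    return rules
```

    With discretized meteorological events as items (e.g. "pressure_drop", "storm"), a high-confidence rule such as pressure_drop -> storm is exactly the kind of relation the dissertation reports between variable changes and subsequent severe weather.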

  12. Analysis of Two Quantitative Ultrasound Approaches.

    PubMed

    Muleki-Seya, Pauline; Han, Aiguo; Andre, Michael P; Erdman, John W; O'Brien, William D

    2017-09-01

    There are two well-known ultrasonic approaches to extracting sets of quantitative parameters: the Lizzi-Feleppa (LF) parameters: slope, intercept, and midband; and the quantitative ultrasound (QUS)-derived parameters: effective scatterer diameter (ESD) and effective acoustic concentration (EAC). In this study, the relation between the LF and QUS-derived parameters is studied theoretically and experimentally on ex vivo mouse livers. As expected from the theory, LF slope is correlated with ESD ([Formula: see text]), and from experimental data, LF midband is correlated with EAC ([Formula: see text]). However, LF intercept is correlated with neither ESD ([Formula: see text]) nor EAC ([Formula: see text]). The unexpected correlation observed between LF slope and EAC ([Formula: see text]) likely results from the high correlation between ESD and EAC due to the inversion process. For liver fat percentage estimation, an important potential medical application, the parameters showing the best correlation are EAC ([Formula: see text]) and LF midband ([Formula: see text]).

  13. Advancing sensitivity analysis to precisely characterize temporal parameter dominance

    NASA Astrophysics Data System (ADS)

    Guse, Björn; Pfannerstill, Matthias; Strauch, Michael; Reusser, Dominik; Lüdtke, Stefan; Volk, Martin; Gupta, Hoshin; Fohrer, Nicola

    2016-04-01

    Parameter sensitivity analysis is a strategy for detecting dominant model parameters. A temporal sensitivity analysis calculates daily sensitivities of model parameters. This allows a precise characterization of temporal patterns of parameter dominance and an identification of the related discharge conditions. To achieve this goal, the diagnostic information derived from temporal parameter sensitivity is advanced by including discharge information in three steps. In a first step, the temporal dynamics are analyzed by means of daily time series of parameter sensitivities. As the sensitivity analysis method, we used the Fourier Amplitude Sensitivity Test (FAST) applied directly to the modelled discharge. Next, the daily sensitivities are analyzed in combination with the flow duration curve (FDC). Through this step, we determine whether high sensitivities of model parameters are related to specific discharges. Finally, parameter sensitivities are separately analyzed for five segments of the FDC and presented as monthly averaged sensitivities. In this way, seasonal patterns of dominant model parameters are provided for each FDC segment. For this methodical approach, we used two contrasting catchments (an upland and a lowland catchment) to illustrate how parameter dominance changes seasonally in different catchments. For all of the FDC segments, the groundwater parameters are dominant in the lowland catchment, while in the upland catchment the controlling parameters change seasonally between parameters from different runoff components. The three methodical steps lead to clear temporal patterns, which represent the typical characteristics of the study catchments. Our methodical approach thus provides a clear idea of how the hydrological dynamics are controlled by model parameters for certain discharge magnitudes during the year. 
Overall, these three methodical steps precisely characterize model parameters and improve the understanding of process dynamics in hydrological
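
    The step of relating daily sensitivities to the flow duration curve can be sketched as follows; partitioning the FDC into equal-count segments and the synthetic series in the example are simplifying assumptions, not the authors' exact procedure.

```python
import numpy as np

def sensitivity_by_fdc_segment(discharge, sensitivity, n_segments=5):
    """Average a daily parameter-sensitivity series within segments of the
    flow duration curve. Segment 0 holds the highest flows, the last segment
    the lowest, mirroring the idea of tying parameter dominance to
    discharge magnitude."""
    q = np.asarray(discharge, dtype=float)
    s = np.asarray(sensitivity, dtype=float)
    # exceedance rank: 0 for the highest flow, 1 for the lowest
    ranks = np.argsort(np.argsort(-q)) / (len(q) - 1)
    seg = np.minimum((ranks * n_segments).astype(int), n_segments - 1)
    return np.array([s[seg == k].mean() for k in range(n_segments)])
```

    A parameter whose averaged sensitivity peaks in the low-flow segments (e.g. a groundwater parameter) is then flagged as dominant under those discharge conditions.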

  14. Quantitative analysis of diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) for brain disorders

    NASA Astrophysics Data System (ADS)

    Lee, Jae-Seung; Im, In-Chul; Kang, Su-Man; Goo, Eun-Hoe; Kwak, Byung-Joon

    2013-07-01

    This study aimed to quantitatively analyze data from diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) in patients with brain disorders and to assess its potential utility for analyzing brain function. DTI was obtained by performing 3.0-T magnetic resonance imaging in patients with Alzheimer's disease (AD) and vascular dementia (VD), and the data were analyzed using Matlab-based SPM software. The two-sample t-test was used for error analysis of the location of the activated pixels. We compared regions of white matter where the fractional anisotropy (FA) values were low and the apparent diffusion coefficients (ADCs) were increased. In the AD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right sub-lobar insula, and right occipital lingual gyrus, whereas the ADCs were significantly increased in the right inferior frontal gyrus and right middle frontal gyrus. In the VD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right limbic cingulate gyrus, and right sub-lobar caudate tail, whereas the ADCs were significantly increased in the left lateral globus pallidus and left medial globus pallidus. In conclusion, by using DTI and SPM analysis, we were able not only to determine the structural state of the regions affected by brain disorders but also to quantitatively analyze and assess brain function.
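
    The voxelwise two-sample comparison at the heart of such an SPM analysis reduces to a pooled t statistic computed independently at every voxel. This sketch assumes subjects-by-voxels arrays of FA or ADC values and omits SPM's spatial smoothing and multiple-comparison correction.

```python
import numpy as np

def voxelwise_ttest(group_a, group_b):
    """Pooled two-sample t statistic per voxel.
    group_a, group_b: arrays shaped (subjects, voxels), e.g. FA maps
    flattened per subject. Positive t means group_a > group_b."""
    a = np.asarray(group_a, dtype=float)
    b = np.asarray(group_b, dtype=float)
    na, nb = a.shape[0], b.shape[0]
    pooled_var = ((na - 1) * a.var(axis=0, ddof=1)
                  + (nb - 1) * b.var(axis=0, ddof=1)) / (na + nb - 2)
    return (a.mean(axis=0) - b.mean(axis=0)) / np.sqrt(pooled_var * (1/na + 1/nb))
```

    Voxels whose |t| exceeds a threshold (after correction) are the ones reported as regions of reduced FA or increased ADC.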

  15. Effect of improving spatial or temporal resolution on image quality and quantitative perfusion assessment with k-t SENSE acceleration in first-pass CMR myocardial perfusion imaging.

    PubMed

    Maredia, Neil; Radjenovic, Aleksandra; Kozerke, Sebastian; Larghat, Abdulghani; Greenwood, John P; Plein, Sven

    2010-12-01

    k-t Sensitivity-encoded (k-t SENSE) acceleration has been used to improve spatial resolution, temporal resolution, and slice coverage in first-pass cardiac magnetic resonance myocardial perfusion imaging. This study compares the effect of investing the speed-up afforded by k-t SENSE acceleration in spatial or temporal resolution. Ten healthy volunteers underwent adenosine stress myocardial perfusion imaging using four saturation-recovery gradient echo perfusion sequences: a reference sequence accelerated by sensitivity encoding (SENSE), and three k-t SENSE-accelerated sequences with higher spatial resolution ("k-t High"), shorter acquisition window ("k-t Fast"), or a shared increase in both parameters ("k-t Hybrid") relative to the reference. Dark-rim artifacts and image quality were analyzed. Semiquantitative myocardial perfusion reserve index (MPRI) and Fermi-derived quantitative MPR were also calculated. The k-t Hybrid sequence produced highest image quality scores at rest (P = 0.015). Rim artifact thickness and extent were lowest using k-t High and k-t Hybrid sequences (P < 0.001). There were no significant differences in MPRI and MPR values derived by each sequence. Maximizing spatial resolution by k-t SENSE acceleration produces the greatest reduction in dark rim artifact. There is good agreement between k-t SENSE and standard acquisition methods for semiquantitative and fully quantitative myocardial perfusion analysis.

  16. Investigation of Portevin-Le Chatelier band with temporal phase analysis of speckle interferometry

    NASA Astrophysics Data System (ADS)

    Jiang, Zhenyu; Zhang, Qingchuan; Wu, Xiaoping

    2003-04-01

    A new method combining temporal phase analysis with dynamic digital speckle pattern interferometry is proposed to study the Portevin-Le Chatelier (PLC) effect quantitatively. The principle is that the phase difference of interference speckle patterns is a time-dependent function related to the object's deformation. The interference speckle patterns of the specimen are recorded at a high sampling rate while the PLC effect occurs, and the 2D displacement map of the PLC band and its width are obtained by analyzing the displacement of the specimen with the proposed method.

  17. Quantitative measurement of zinc secretion from pancreatic islets with high temporal resolution using droplet-based microfluidics

    PubMed Central

    Easley, Christopher J.; Rocheleau, Jonathan V.; Head, W. Steven; Piston, David W.

    2009-01-01

    We assayed glucose-stimulated insulin secretion (GSIS) from live, murine islets of Langerhans in microfluidic devices by the downstream formation of aqueous droplets. Zinc ions, which are co-secreted with insulin from β-cells, were quantitatively measured from single islets with high temporal resolution using a fluorescent indicator, FluoZin-3. Real-time storage of secretions into droplets (volume of 0.470 ± 0.009 nL) effectively preserves the temporal chemical information, allowing reconstruction of the secretory time record. The use of passive flow control within the device removes the need for syringe pumps, requiring only a single handheld syringe. Under stimulatory glucose levels (11 mM), bursts of zinc as high as ~800 fg islet^-1 min^-1 were measured. Treatment with diazoxide effectively blocked zinc secretion, as expected. High temporal resolution reveals two major classes of oscillations in secreted zinc, with predominant periods at ~20-40 s and ~5-10 min. The more rapid oscillation periods match closely with those of intraislet calcium oscillations, while the slower oscillations are consistent with insulin pulses typically measured in bulk islet experiments or in the bloodstream. This droplet sampling technique should be widely applicable to time-resolved cellular secretion measurements, either in real-time or for post-processing. PMID:19874061
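
    Pulling the dominant oscillation period out of a reconstructed secretion time record can be done with a simple FFT peak pick. This is an illustrative sketch, not the authors' analysis; real secretion traces would need detrending and windowing first.

```python
import numpy as np

def dominant_period(series, dt):
    """Period (in the units of dt) of the strongest spectral peak of a
    mean-removed time series -- e.g. to separate ~20-40 s from ~5-10 min
    oscillations in a secretion record sampled every dt seconds."""
    x = np.asarray(series, dtype=float) - np.mean(series)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)
    peak = int(power[1:].argmax()) + 1   # skip the zero-frequency bin
    return 1.0 / freqs[peak]
```

    Applied twice, to a high-pass and a low-pass filtered copy of the trace, the same peak pick would recover the two oscillation classes described above.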

  18. Temporal quantitation of mutant Kit tyrosine kinase signaling attenuated by a novel thiophene kinase inhibitor OSI-930.

    PubMed

    Petti, Filippo; Thelemann, April; Kahler, Jen; McCormack, Siobhan; Castaldo, Linda; Hunt, Tony; Nuwaysir, Lydia; Zeiske, Lynn; Haack, Herbert; Sullivan, Laura; Garton, Andrew; Haley, John D

    2005-08-01

    OSI-930, a potent thiophene inhibitor of the Kit, KDR, and platelet-derived growth factor receptor tyrosine kinases, was used to selectively inhibit tyrosine phosphorylation downstream of juxtamembrane mutant Kit in the mast cell leukemia line HMC-1. Inhibition of Kit kinase activity resulted in a rapid dephosphorylation of Kit and inhibition of the downstream signaling pathways. Attenuation of Ras-Raf-Erk (phospho-Erk, phospho-p38), phosphatidyl inositol-3' kinase (phospho-p85, phospho-Akt, phospho-S6), and signal transducers and activators of transcription signaling pathways (phospho-STAT3/5/6) were measured by affinity liquid chromatography tandem mass spectrometry, by immunoblot, and by tissue microarrays of fixed cell pellets. To more globally define additional components of Kit signaling temporally altered by kinase inhibition, a novel multiplex quantitative isobaric peptide labeling approach was used. This approach allowed clustering of proteins by temporal expression patterns. Kit kinase, which dephosphorylates rapidly upon kinase inhibition, was shown to regulate both Shp-1 and BDP-1 tyrosine phosphatases and the phosphatase-interacting protein PSTPIP2. Interactions with SH2 domain adapters [growth factor receptor binding protein 2 (Grb2), Cbl, Slp-76] and SH3 domain adapters (HS1, cortactin, CD2BP3) were attenuated by inhibition of Kit kinase activity. Functional crosstalk between Kit and the non-receptor tyrosine kinases Fes/Fps, Fer, Btk, and Syk was observed. Inhibition of Kit modulated phosphorylation-dependent interactions with pathways controlling focal adhesion (paxillin, leupaxin, p130CAS, FAK1, the Src family kinase Lyn, Wasp, Fhl-3, G25K, Ack-1, Nap1, SH3P12/ponsin) and septin-actin complexes (NEDD5, cdc11, actin). The combined use of isobaric protein quantitation and expression clustering, immunoblot, and tissue microarray strategies allowed temporal measurement of signaling pathways modulated by mutant Kit inhibition in a model of mast cell

  19. Hierarchical Aligned Cluster Analysis for Temporal Clustering of Human Motion.

    PubMed

    Zhou, Feng; De la Torre, Fernando; Hodgins, Jessica K

    2013-03-01

    Temporal segmentation of human motion into plausible motion primitives is central to understanding and building computational models of human motion. Several issues contribute to the challenge of discovering motion primitives: the exponential nature of all possible movement combinations, the variability in the temporal scale of human actions, and the complexity of representing articulated motion. We pose the problem of learning motion primitives as one of temporal clustering, and derive an unsupervised hierarchical bottom-up framework called hierarchical aligned cluster analysis (HACA). HACA finds a partition of a given multidimensional time series into m disjoint segments such that each segment belongs to one of k clusters. HACA combines kernel k-means with the generalized dynamic time alignment kernel to cluster time series data. Moreover, it provides a natural framework to find a low-dimensional embedding for time series. HACA is efficiently optimized with a coordinate descent strategy and dynamic programming. Experimental results on motion capture and video data demonstrate the effectiveness of HACA for segmenting complex motions and as a visualization tool. We also compare the performance of HACA to state-of-the-art algorithms for temporal clustering on data of a honey bee dance. The HACA code is available online.
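
    As a toy illustration of temporal clustering, the sketch below splits a 1-D series into m contiguous segments, each assigned to one of k given cluster centers, via dynamic programming over cut points. This is far simpler than HACA (no kernel k-means, no dynamic time alignment kernel, and centers are assumed known), but it shows the segment-and-label structure that HACA optimizes.

```python
def segment_cost(seg, centers):
    # cost of assigning a contiguous segment to its best-matching cluster center
    return min(sum((x - c) ** 2 for x in seg) for c in centers)

def temporal_segmentation(series, m, centers):
    """Split `series` into m contiguous segments, each labelled with one of
    the given cluster centers, minimising total squared error (DP over cuts)."""
    n = len(series)
    INF = float("inf")
    # best[j][i] = min cost of covering series[:i] with j segments
    best = [[INF] * (n + 1) for _ in range(m + 1)]
    cut = [[0] * (n + 1) for _ in range(m + 1)]
    best[0][0] = 0.0
    for j in range(1, m + 1):
        for i in range(j, n + 1):
            for p in range(j - 1, i):
                if best[j - 1][p] == INF:
                    continue
                c = best[j - 1][p] + segment_cost(series[p:i], centers)
                if c < best[j][i]:
                    best[j][i], cut[j][i] = c, p
    # backtrack the optimal cut points
    bounds, i = [], n
    for j in range(m, 0, -1):
        bounds.append((cut[j][i], i))
        i = cut[j][i]
    return list(reversed(bounds))
```

    On a series with three constant runs, the DP recovers the run boundaries exactly; HACA additionally learns the clusters and aligns segments of different lengths.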

  20. Structural and quantitative analysis of Equisetum alkaloids.

    PubMed

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity for livestock. Several studies in the past addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are known alkaloids of E. palustre. An HPLC-ESI-MS/MS method in combination with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened with this method, and in most cases the set of all eight alkaloids was detected in all parts of the plant. A high variability of alkaloid content and distribution was found depending on plant organ, plant origin and season, ranging from 88 to 597 mg/kg dry weight. However, palustrine (1) and palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation and distribution analysis of Equisetum alkaloids was achieved.

  1. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  3. Proxy-to-proxy calibration: increasing the temporal resolution of quantitative climate reconstructions.

    PubMed

    von Gunten, Lucien; D'Andrea, William J; Bradley, Raymond S; Huang, Yongsong

    2012-01-01

    High-resolution paleoclimate reconstructions are often restricted by the difficulties of sampling geologic archives in great detail and the analytical costs of processing large numbers of samples. Using sediments from Lake Braya Sø, Greenland, we introduce a new method that provides a quantitative high-resolution paleoclimate record by combining measurements of the alkenone unsaturation index (U37(K)) with non-destructive scanning reflectance spectroscopic measurements in the visible range (VIS-RS). The proxy-to-proxy (PTP) method exploits two distinct calibrations: the in situ calibration of U37(K) to lake water temperature and the calibration of scanning VIS-RS data to down core U37(K) data. Using this approach, we produced a quantitative temperature record that is longer and has 5 times higher sampling resolution than the original U37(K) time series, thereby allowing detection of temperature variability in frequency bands characteristic of the AMO over the past 7,000 years.
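
    A minimal sketch of the proxy-to-proxy idea, assuming simple ordinary-least-squares calibrations chained together (the function and variable names are ours; the actual U37(K) and VIS-RS calibrations involve more careful statistics and uncertainty propagation):

```python
def linfit(x, y):
    """Ordinary least-squares slope/intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

def proxy_to_proxy(visrs_scan, visrs_sub, uk37_sub, uk37_cal, temp_cal):
    """Chain two calibrations: high-resolution VIS-RS -> U37(K), fitted on the
    coarsely sampled subset, then U37(K) -> temperature (in-situ pairs).
    Returns a temperature estimate for every high-resolution scan point."""
    a1, b1 = linfit(visrs_sub, uk37_sub)   # scanner reflectance -> proxy
    a2, b2 = linfit(uk37_cal, temp_cal)    # proxy -> lake temperature
    return [a2 * (a1 * v + b1) + b2 for v in visrs_scan]
```

    The payoff is exactly the one described in the abstract: the cheap, non-destructive scan is measured everywhere, while the expensive proxy is measured only at the calibration subset.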

  5. Quantitative imaging with Fucci and mathematics to uncover temporal dynamics of cell cycle progression.

    PubMed

    Saitou, Takashi; Imamura, Takeshi

    2016-01-01

    Cell cycle progression is strictly coordinated to ensure proper tissue growth, development, and regeneration of multicellular organisms. Spatiotemporal visualization of cell cycle phases directly helps us to obtain a deeper understanding of controlled, multicellular, cell cycle progression. The fluorescent ubiquitination-based cell cycle indicator (Fucci) system allows us to monitor, in living cells, the G1 and the S/G2/M phases of the cell cycle in red and green fluorescent colors, respectively. Since the introduction of Fucci technology, it has found numerous applications in the characterization of the timing of cell cycle phase transitions under diverse conditions and various biological processes. However, due to the complexity of cell cycle dynamics, understanding of specific patterns of cell cycle progression is still far from complete. In order to tackle this issue, quantitative approaches combined with mathematical modeling seem to be essential. Here, we review several studies that attempted to integrate Fucci technology and mathematical models to obtain quantitative information regarding cell cycle regulatory patterns. Focusing on the development of mathematical techniques for retrieving meaningful information from Fucci imaging data, we discuss how the combined methods advance a quantitative understanding of cell cycle regulation.
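
    As a toy example of the Fucci-plus-modeling approach, the sketch below integrates a two-phase linear ODE in which red (G1) cells convert to green (S/G2/M) cells, which then divide back into two red cells. The rates, the two-phase structure, and the forward-Euler scheme are illustrative assumptions, not a model taken from the reviewed studies.

```python
def cell_cycle_sim(k_g1s, k_div, t_end, dt=0.01):
    """Toy two-phase cell-cycle model: red (G1) cells transit to green
    (S/G2/M) at rate k_g1s; green cells divide at rate k_div into two red
    cells. Forward-Euler integration from one red cell at t = 0."""
    red, green, t = 1.0, 0.0, 0.0
    while t < t_end:
        flux_rg = k_g1s * red      # G1 -> S/G2/M transitions per unit time
        flux_div = k_div * green   # divisions per unit time
        red += dt * (2.0 * flux_div - flux_rg)
        green += dt * (flux_rg - flux_div)
        t += dt
    return red, green
```

    Fitting such phase-transition rates to the measured red and green cell counts is the kind of quantitative readout the reviewed Fucci-modeling studies pursue.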

  6. Qualitative and quantitative analysis of endocytic recycling.

    PubMed

    Reineke, James B; Xie, Shuwei; Naslavsky, Naava; Caplan, Steve

    2015-01-01

    Endocytosis, which encompasses the internalization and sorting of plasma membrane (PM) lipids and proteins to distinct membrane-bound intracellular compartments, is a highly regulated and fundamental cellular process by which eukaryotic cells dynamically regulate their PM composition. Indeed, endocytosis is implicated in crucial cellular processes that include proliferation, migration, and cell division as well as maintenance of tissue homeostasis such as apical-basal polarity. Once PM constituents have been taken up into the cell, either via clathrin-dependent endocytosis (CDE) or clathrin-independent endocytosis (CIE), they typically have two fates: degradation through the late-endosomal/lysosomal pathway or returning to the PM via endocytic recycling pathways. In this review, we will detail experimental procedures that allow for both qualitative and quantitative assessment of endocytic recycling of transmembrane proteins internalized by CDE and CIE, using the HeLa cervical cancer cell line as a model system.

  7. Temporal eating patterns: a latent class analysis approach.

    PubMed

    Leech, Rebecca M; Worsley, Anthony; Timperio, Anna; McNaughton, Sarah A

    2017-01-07

    There is some evidence that large energy intakes towards the end of the day are associated with adverse health outcomes; however, studies of temporal eating patterns across the day are rare. This study examines the temporal eating patterns of Australian adults using latent class analysis (LCA) as a novel approach. Dietary data (n = 2402 men and n = 2840 women, ≥19 years) from two 24-h recalls collected during the 2011-12 Australian National Nutrition and Physical Activity Survey were analyzed. LCA was performed to identify distinct temporal eating patterns based on whether or not an eating occasion (EO) occurred within each hour of the day. F and adjusted-chi(2) tests assessed differences in sociodemographic and eating patterns (e.g., meal, snack and EO frequency) between latent classes. Three patterns, labelled "Conventional" (men: 43%, women: 41%), "Later lunch" (men: 34%, women: 34%) and "Grazing" (men: 23%, women: 25%), were identified. Men and women with a "Grazing" pattern were significantly younger (P < 0.001), and a higher proportion were from major cities (P < 0.01) and were not married (men only, P = 0.01), compared to the "Conventional" and "Later lunch" patterns. The "Grazing" pattern was also characterized by a higher EO frequency (P < 0.01) and snack frequency (P < 0.001), and by consumption of a higher proportion of total energy intake from snacks but a lower proportion from meals (P < 0.001). This study identified three distinct temporal eating patterns in adults that varied by age, EO frequency, snack frequency and energy intake pattern. LCA is a useful approach to capture differences in EO timing across the day. Future research should examine associations between temporal eating patterns and health.
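
    The LCA step can be sketched as an EM fit of a mixture of independent Bernoulli items, one binary indicator per time slot. This is a bare-bones stand-in for the survey LCA actually used; the data layout, smoothing, and class count below are illustrative assumptions.

```python
import math, random

def lca_em(data, k, iters=200, seed=0):
    """Fit a k-class latent class model (independent Bernoulli items) by EM.
    data: list of binary vectors (e.g. one indicator per hour of the day).
    Returns (class weights pi, per-class item probabilities theta)."""
    rng = random.Random(seed)
    n, d = len(data), len(data[0])
    pi = [1.0 / k] * k
    theta = [[rng.uniform(0.25, 0.75) for _ in range(d)] for _ in range(k)]
    for _ in range(iters):
        # E-step: responsibilities proportional to pi[c] * Bernoulli likelihood
        resp = []
        for x in data:
            logp = []
            for c in range(k):
                s = math.log(pi[c])
                for j in range(d):
                    s += math.log(theta[c][j] if x[j] else 1.0 - theta[c][j])
                logp.append(s)
            m = max(logp)
            w = [math.exp(l - m) for l in logp]
            tot = sum(w)
            resp.append([wi / tot for wi in w])
        # M-step: re-estimate weights and item probabilities (light smoothing)
        for c in range(k):
            nc = sum(r[c] for r in resp)
            pi[c] = max(nc / n, 1e-9)
            theta[c] = [(sum(r[c] * x[j] for r, x in zip(resp, data)) + 0.5)
                        / (nc + 1.0) for j in range(d)]
    return pi, theta
```

    With clearly separated "morning eaters" and "evening eaters" the two recovered classes put high item probabilities on disjoint time slots, which is the structure behind labels like "Conventional" and "Grazing".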

  8. Joint association analysis of bivariate quantitative and qualitative traits.

    PubMed

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait of disease status at SNPs whose minor allele frequency is not too small.
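
    The data-generating side of this probit joint model can be sketched as follows: a genotype shifts both the quantitative trait and a latent liability whose residuals are correlated, and the qualitative trait is the threshold indicator of the liability. Parameter names and values are ours; the paper's likelihood-ratio machinery is not reproduced here.

```python
import math, random

def simulate_pleiotropic(n, beta, rho, threshold, seed=1):
    """Simulate the probit joint model: genotype g (additive coding) affects
    both a quantitative trait Q and a latent liability L whose residual is
    correlated (rho) with Q's residual; the qualitative trait is 1 when L
    exceeds the threshold. Returns (g, Q, qualitative trait) tuples."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        g = rng.choice([0, 1, 2])
        z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        q = beta * g + z1
        liab = beta * g + rho * z1 + math.sqrt(1.0 - rho ** 2) * z2
        out.append((g, q, int(liab > threshold)))
    return out
```

    A pleiotropic SNP shows up in both margins at once: the mean of Q and the case rate both climb with genotype, which is the signal the joint likelihood-ratio test exploits.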

  9. Pharmacoproteomic study of the natural product Ebenfuran III in DU-145 prostate cancer cells: the quantitative and temporal interrogation of chemically induced cell death at the protein level.

    PubMed

    Roumeliotis, Theodoros I; Halabalaki, Maria; Alexi, Xanthippi; Ankrett, Dyan; Giannopoulou, Eugenia G; Skaltsounis, Alexios-Leandros; Sayan, Berna S; Alexis, Michael N; Townsend, Paul A; Garbis, Spiros D

    2013-04-05

    A naturally occurring benzofuran derivative, Ebenfuran III (Eb III), was investigated for its antiproliferative effects using the DU-145 prostate cell line. Eb III was isolated from Onobrychis ebenoides of the Leguminosae family, a plant endemic in Central and Southern Greece. We have previously reported that Eb III exerts significant cytotoxic effects on certain cancer cell lines. This effect is thought to occur via the isoprenyl moiety at the C-5 position of the molecule. The study aim was to gain a deeper understanding of the pharmacological effect of Eb III on DU-145 cell death at the translational level using a relative quantitative and temporal proteomics approach. Proteins extracted from the cell pellets were subjected to solution-phase trypsin proteolysis followed by iTRAQ labeling. The labeled tryptic peptide extracts were then fractionated using strong cation exchange chromatography and the fractions were analyzed by nanoflow reverse phase ultraperformance liquid chromatography-nanoelectrospray ionization-tandem mass spectrometry analysis using a hybrid QqTOF platform. Using this approach, we compared the expression levels of 1360 proteins analyzed at ≤ 1% global protein false discovery rate (FDR), commonly present in untreated (control, vehicle only) and Eb III-treated cells at the different exposure time points. Through the iterative use of Ingenuity Pathway Analysis with hierarchical clustering of protein expression patterns, followed by bibliographic research, the temporal regulation of the Calpain-1, ERK2, PAR-4, RAB-7, and Bap31 proteins was identified as potential nodes of multipathway convergence to Eb III-induced DU-145 cell death. These proteins were further verified with Western blot analysis. This gel-free, quantitative 2DLC-MS/MS proteomics method effectively captured novel modulated proteins in the DU-145 cell line in response to Eb III treatment. This approach also provided greater insight into the multifocal and combinatorial signaling

  10. Quantitative infrared analysis of hydrogen fluoride

    SciTech Connect

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF{sub 6}. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered to be a non-ideal gas for many years. D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives. (1) Absorbance at 3877 cm{sup -1} as a function of pressure for 100% HF. (2) Absorbance at 3877 cm{sup -1} as a function of increasing partial pressure of HF, with the total pressure maintained at 300 mm HgA with nitrogen. (3) Absorbance at 3877 cm{sup -1} for constant partial pressure of HF, with the total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm{sup -1} can be quantitatively analyzed via infrared methods.
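
    In the ideal-gas (linear) regime the analysis reduces to a Beer-Lambert calibration: absorbance at 3877 cm^-1 proportional to HF partial pressure. A minimal sketch with made-up numbers, not the plant's actual calibration data:

```python
def beer_lambert_fit(pressures, absorbances):
    """Fit A = k * p through the origin (Beer-Lambert, ideal-gas regime):
    least-squares slope of absorbance versus HF partial pressure."""
    return (sum(p * a for p, a in zip(pressures, absorbances))
            / sum(p * p for p in pressures))

def mole_percent(absorbance, k, total_pressure):
    """Invert the calibration: partial pressure from absorbance, then mol%."""
    return 100.0 * (absorbance / k) / total_pressure
```

    Deviation of measured absorbances from this straight line is exactly the non-ideality the abstract reports above 35 mm HgA.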

  11. Quantitative Analysis of HIV-1 Preintegration Complexes

    PubMed Central

    Engelman, Alan; Oztop, Ilker; Vandegraaff, Nick; Raghavendra, Nidhanapati K.

    2009-01-01

    Retroviral replication proceeds through the formation of a provirus, an integrated DNA copy of the viral RNA genome. The linear cDNA product of reverse transcription is the integration substrate and two different integrase activities, 3′ processing and DNA strand transfer, are required for provirus formation. Integrase nicks the cDNA ends adjacent to phylogenetically-conserved CA dinucleotides during 3′ processing. After nuclear entry and locating a suitable chromatin acceptor site, integrase joins the recessed 3′-OHs to the 5′-phosphates of a double-stranded staggered cut in the DNA target. Integrase functions in the context of a large nucleoprotein complex, called the preintegration complex (PIC), and PICs are analyzed to determine levels of integrase 3′ processing and DNA strand transfer activities that occur during acute virus infection. Denatured cDNA end regions are monitored by indirect end-labeling to measure the extent of 3′ processing. Native PICs can efficiently integrate their viral cDNA into exogenously added target DNA in vitro, and Southern blotting or nested PCR assays are used to quantify the resultant DNA strand transfer activity. This study details HIV-1 infection, PIC extraction, partial purification, and quantitative analyses of integrase 3′ processing and DNA strand transfer activities. PMID:19233280

  12. Quantitative multi-modal NDT data analysis

    SciTech Connect

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.
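
    A high-level (decision-level) fusion scheme of the kind described can be sketched as a weighted average of normalised per-modality defect scores followed by thresholding. The weights, scores, and threshold below are invented for illustration; the paper's actual fusion scheme is not reproduced here.

```python
def fuse_scores(scores, weights, threshold):
    """Minimal decision-level fusion sketch: `scores` is one list of
    normalised per-pixel defect scores per modality; combine them by a
    weighted average, then threshold to get a fused detection map."""
    fused = [sum(w * s for w, s in zip(weights, pixel)) / sum(weights)
             for pixel in zip(*scores)]
    return fused, [f >= threshold for f in fused]
```

    A pixel flagged strongly by two of three modalities survives fusion, while an isolated single-modality response is suppressed, which is how redundancy buys specificity.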

  13. The quantitative failure of human reliability analysis

    SciTech Connect

    Bennett, C.T.

    1995-07-01

    This philosophical treatise examines the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. The author argues that historic and current HRA have failed to inform policy makers who make decisions based on the risk that humans contribute to system performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  14. Multiscale recurrence analysis of spatio-temporal data

    NASA Astrophysics Data System (ADS)

    Riedl, M.; Marwan, N.; Kurths, J.

    2015-12-01

    The description and analysis of spatio-temporal dynamics is a crucial task in many scientific disciplines. In this work, we propose a method which uses the mapogram as a similarity measure between spatially distributed data instances at different time points. The resulting similarity values of the pairwise comparison are used to construct a recurrence plot in order to benefit from established tools of recurrence quantification analysis and recurrence network analysis. In contrast to other recurrence tools for this purpose, the mapogram approach allows a specific focus on different spatial scales that can be used in a multi-scale analysis of spatio-temporal dynamics. We illustrate this approach by application to mixed dynamics, such as traveling parallel wave fronts with additive noise, as well as to more complicated examples: pseudo-random numbers and coupled map lattices with a semi-logistic mapping rule. The complicated examples in particular show the usefulness of the multi-scale consideration for taking spatial patterns of different scales and with different rhythms into account. This mapogram approach thus promises new insights into problems of climatology, ecology, or medicine.
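
    The recurrence-plot construction can be sketched generically: compare all pairs of spatial snapshots with a similarity measure and threshold the result. Plain Euclidean distance between flattened maps stands in here for the paper's mapogram similarity.

```python
def recurrence_plot(snapshots, eps, dist=None):
    """Binary recurrence matrix R[i][j] = 1 when snapshots i and j are closer
    than eps under `dist` (Euclidean distance between flattened spatial maps
    as a stand-in for the mapogram similarity)."""
    if dist is None:
        dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    n = len(snapshots)
    return [[1 if dist(snapshots[i], snapshots[j]) < eps else 0
             for j in range(n)] for i in range(n)]

def recurrence_rate(R):
    """Fraction of recurrent pairs: the simplest RQA measure."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)
```

    Swapping `dist` for a mapogram computed at a chosen spatial scale is what turns this generic construction into the multi-scale analysis the abstract describes.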

  15. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy.

    PubMed

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H; Jacobsen, Christina; Vainer, Ben

    2016-01-01

    The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To move beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis including staining may benefit. This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantify cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.
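
    The per-pixel quantitation step can be sketched as classifying each pixel by its autofluorescence intensities at the two wavelengths. The intensity boxes and component names below are hypothetical, not the rules derived in the study.

```python
def classify_pixels(i1, i2, rules):
    """Per-pixel tissue classification from autofluorescence intensities at
    two wavelengths. `rules` maps a component name to an intensity box
    (a_lo, a_hi, b_lo, b_hi); first matching rule wins. Returns per-pixel
    labels plus area counts (absolute quantitation in pixels)."""
    labels = []
    for a, b in zip(i1, i2):
        for name, (a_lo, a_hi, b_lo, b_hi) in rules.items():
            if a_lo <= a < a_hi and b_lo <= b < b_hi:
                labels.append(name)
                break
        else:
            labels.append("unclassified")
    counts = {}
    for l in labels:
        counts[l] = counts.get(l, 0) + 1
    return labels, counts
```

    The area counts correspond to the absolute component quantitation that the study validates against point-grid counts on Van Gieson-stained sections.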

  16. Temporal MDS Plots for Analysis of Multivariate Data.

    PubMed

    Jäckle, Dominik; Fischer, Fabian; Schreck, Tobias; Keim, Daniel A

    2016-01-01

    Multivariate time series data can be found in many application domains. Examples include data from computer networks, healthcare, social networks, or financial markets. Often, patterns in such data evolve over time among multiple dimensions and are hard to detect. Dimensionality reduction methods such as PCA and MDS allow analysis and visualization of multivariate data, but per se do not provide means to explore multivariate patterns over time. We propose Temporal Multidimensional Scaling (TMDS), a novel visualization technique that computes temporal one-dimensional MDS plots for multivariate data which evolve over time. Using a sliding window approach, MDS is computed for each data window separately, and the results are plotted sequentially along the time axis, taking care of plot alignment. Our TMDS plots enable visual identification of patterns based on multidimensional similarity of the data evolving over time. We demonstrate the usefulness of our approach in the field of network security and show in two case studies how users can iteratively explore the data to identify previously unknown, temporally evolving patterns.
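
    The core of TMDS can be sketched in two steps: classical MDS to one dimension for each sliding window (top eigenvector of the double-centred squared-distance matrix), then sign alignment between consecutive windows so the plotted bands vary smoothly. A pure-Python sketch, assuming Euclidean dissimilarities and omitting the paper's full alignment procedure:

```python
def mds_1d(D, iters=100):
    """Classical MDS to one dimension: top eigenvector of the double-centred
    squared-distance matrix, found by power iteration."""
    n = len(D)
    D2 = [[D[i][j] ** 2 for j in range(n)] for i in range(n)]
    rm = [sum(row) / n for row in D2]
    gm = sum(rm) / n
    # B = -0.5 * J D^2 J  (double centring)
    B = [[-0.5 * (D2[i][j] - rm[i] - rm[j] + gm) for j in range(n)]
         for i in range(n)]
    v = [1.0] + [0.0] * (n - 1)
    for _ in range(iters):
        w = [sum(B[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5 or 1.0
        v = [x / norm for x in w]
    lam = sum(v[i] * sum(B[i][j] * v[j] for j in range(n)) for i in range(n))
    return [vi * max(lam, 0.0) ** 0.5 for vi in v]

def tmds(series, win):
    """Sliding-window 1-D MDS plots; each window's projection is sign-flipped
    to agree with the previous window on their overlap."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    plots, prev = [], None
    for t in range(len(series) - win + 1):
        w = series[t:t + win]
        y = mds_1d([[dist(a, b) for b in w] for a in w])
        if prev is not None and sum(p * q for p, q in zip(prev[1:], y[:-1])) < 0:
            y = [-q for q in y]
        plots.append(y)
        prev = y
    return plots
```

    Each element of `plots` is one vertical slice of the TMDS plot: similar data items in a window land near each other on the 1-D axis.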

  17. Temporal Analysis of Atmospheric Data Using Open Standards

    NASA Astrophysics Data System (ADS)

    Campalani, P.; Beccati, A.; Mantovani, S.; Baumann, P.

    2014-04-01

    The continuous growth of remotely sensed data raises the need for efficient ways of accessing data archives. The classical model of accessing remote sensing (satellite) archives via distribution of large files is increasingly making way for a more dynamic and interactive data service. A challenge, though, is interoperability of such services, in particular when multi-dimensional data and advanced processing are involved. Individually crafted service interfaces typically do not allow substitution and combination of services. Open standards can provide a way forward if they are powerful enough to address both data and processing model. The OGC Web Coverage Service (WCS) is a modular service suite which provides high-level interface definitions for data access, subsetting, filtering, and processing of spatio-temporal raster data. WCS based service interfaces to data archives deliver data in their original semantics useful for further client-side processing, as opposed to the Web Map Service (WMS) (de la Beaujardière, 2006) which performs a pre-rendering into images only useful for display to humans. In this paper we present a case study where the OGC coverage data and service model defines the client/server interface for a climate data service. In particular, we show how flexible temporal analysis can be performed efficiently on massive spatio-temporal coverage objects. This service, which is operational on a multi-terabyte data holding, has been established as part of the EarthServer initiative focusing on Big Data in the Earth and Planetary sciences.
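
    As a sketch of how a client would request a spatio-temporally trimmed coverage, the helper below builds a WCS 2.0 KVP GetCoverage URL. The endpoint, coverage name, and axis labels are placeholders, not the EarthServer service, and the subset syntax should be checked against the server's WCS version.

```python
from urllib.parse import urlencode

def getcoverage_url(endpoint, coverage_id, subsets):
    """Build a WCS 2.0 KVP GetCoverage request with per-axis trimming.
    `subsets` maps an axis name to a (low, high) pair of strings; subset
    parameters are appended unencoded since they contain parentheses."""
    query = urlencode([("service", "WCS"), ("version", "2.0.1"),
                       ("request", "GetCoverage"), ("coverageId", coverage_id)])
    for axis, (lo, hi) in subsets.items():
        query += "&subset=" + axis + "(" + lo + "," + hi + ")"
    return endpoint + "?" + query
```

    Trimming on the time axis server-side is what makes temporal analysis of a multi-terabyte archive practical: only the requested slab crosses the network.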

  18. Quantitive and Sociological Analysis of Blog Networks

    NASA Astrophysics Data System (ADS)

    Bachnik, W.; Szymczyk, S.; Leszczynski, S.; Podsiadlo, R.; Rymszewicz, E.; Kurylo, L.; Makowiec, D.; Bykowska, B.

    2005-10-01

    This paper examines the emerging phenomenon of blogging, using three different Polish blogging services as the base of the research. The authors show that blog networks share the characteristics of complex networks (gamma coefficients, small worlds, cliques, etc.). Elements of sociometric analysis were used to prove the existence of social structures in the blog networks.

  19. Quantitative measurement of intracellular transport of nanocarriers by spatio-temporal image correlation spectroscopy

    NASA Astrophysics Data System (ADS)

    Coppola, S.; Pozzi, D.; Candeloro De Sanctis, S.; Digman, M. A.; Gratton, E.; Caracciolo, G.

    2013-03-01

    Spatio-temporal image correlation spectroscopy (STICS) is a powerful technique for assessing the nature of particle motion in complex systems, although it has rarely been used to investigate the intracellular dynamics of nanocarriers so far. Here we introduce a method for characterizing the mode of motion of nanocarriers and for quantifying their transport parameters on different length scales from single-cell to subcellular level. Using this strategy we were able to study the mechanisms responsible for the intracellular transport of DOTAP-DOPC/DNA (DOTAP: 1,2-dioleoyl-3-trimethylammonium-propane; DOPC: dioleoylphosphocholine) and DC-Chol-DOPE/DNA (DC-Chol: 3β-[N-(N,N-dimethylaminoethane)-carbamoyl] cholesterol; DOPE: dioleoylphosphatidylethanolamine) lipoplexes in CHO-K1 (CHO: Chinese hamster ovary) live cells. Measurement of both diffusion coefficients and velocity vectors (magnitude and direction) averaged over regions of the cell revealed the presence of distinct modes of motion. Lipoplexes diffused slowly on the cell surface (diffusion coefficient: D ≈ 0.003 μm2 s-1). In the cytosol, the lipoplexes’ motion was characterized by active transport with average velocity v ≈ 0.03 μm s-1 and random motion. The method permitted us to generate an intracellular transport map showing several regions of concerted motion of lipoplexes.
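
    Once a STICS-style analysis yields a mean-square-displacement curve and a correlation-peak displacement track per region, the transport parameters quoted above follow from simple fits: MSD(t) = 4Dt for 2-D diffusion and displacement = |v|t for directed flow. A sketch with illustrative numbers (not the paper's data):

```python
def diffusion_coefficient(times, msd):
    """D for a 2-D random walk from MSD(t) = 4*D*t
    (least-squares fit through the origin)."""
    return (sum(t * m for t, m in zip(times, msd))
            / (4.0 * sum(t * t for t in times)))

def mean_velocity(times, displacements):
    """|v| for directed transport from displacement = |v|*t
    (least-squares fit through the origin)."""
    return (sum(t * d for t, d in zip(times, displacements))
            / sum(t * t for t in times))
```

    Applying the two fits region by region is what produces a transport map distinguishing diffusive membrane motion from active cytosolic transport.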

  20. Quantitative analysis of Li by PIGE technique

    NASA Astrophysics Data System (ADS)

    Fonseca, M.; Mateus, R.; Santos, C.; Cruz, J.; Silva, H.; Luis, H.; Martins, L.; Jesus, A. P.

    2017-09-01

    In this work, the cross section of the reaction 7Li(p,pγ)7Li (γ: 478 keV) was measured over the proton energy range 2.0-4.2 MeV. The measurements were carried out at the 3 MV Tandem Accelerator at the CTN/IST Laboratory in Lisbon. To validate the obtained results, calculated gamma-ray yields were compared, at several proton energy values, with experimental yields for thick samples made of inorganic compounds containing lithium. In order to quantify the light elements present in the samples, we used a standard-free method for PIGE in thick samples, based on the Emitted Radiation Yield Analysis (ERYA) code, which integrates the nuclear reaction excitation function along the depth of the sample. We also demonstrated the capacity of the technique for the analysis of Li ores, such as spodumene, lithium muscovite and holmquistite, and of Li alloys for plasma-facing materials, showing that this is a reliable and accurate method for PIGE analysis of Li in thick samples.
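
    The ERYA-style thick-target yield is, at its core, an integral of the excitation function over the energy the proton loses in the sample. A minimal numerical sketch; the constant cross-section and stopping power are placeholders for the measured curves, and all physical prefactors are folded into arbitrary units:

```python
def thick_target_yield(e0, sigma, stopping, de=0.001):
    """Thick-target gamma yield (arbitrary units) for beam energy e0:
    Y ∝ ∫_0^{E0} σ(E) / S(E) dE, with σ the excitation function and S the
    stopping power. Midpoint-rule integration with step de."""
    y, e = 0.0, de / 2.0
    while e < e0:
        y += sigma(e) / stopping(e) * de
        e += de
    return y
```

    In the standard-free method, comparing this computed yield with the measured one at several beam energies gives the Li concentration without reference standards.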

  1. Atherosclerotic plaque characterization by spatial and temporal speckle pattern analysis

    NASA Astrophysics Data System (ADS)

    Tearney, Guillermo J.; Bouma, Brett E.

    2002-04-01

    Improved methods are needed to identify the vulnerable coronary plaques responsible for acute myocardial infarction or sudden cardiac death. We describe a method for characterizing the structure and biomechanical properties of atherosclerotic plaques based on speckle pattern fluctuations. Near-field speckle images were acquired from five human aortic specimens ex vivo. The speckle decorrelation time constant varied significantly for vulnerable aortic plaques (τ = 40 ms) versus stable plaques (τ = 400 ms) and normal aorta (τ = 500 ms). These initial results indicate that different atherosclerotic plaque types may be distinguished by analysis of temporal and spatial speckle pattern fluctuations.
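
    Extracting a time constant like those quoted above from a speckle decorrelation curve g(t) = exp(-t/τ) is a one-line log-linear fit; the sketch assumes a clean, normalised decay rather than real noisy data.

```python
import math

def decorrelation_time(times, g):
    """Estimate tau from g(t) = exp(-t/tau) by a log-linear least-squares
    fit: slope of log g versus t is -1/tau. Non-positive samples are skipped."""
    pairs = [(t, math.log(gi)) for t, gi in zip(times, g) if gi > 0]
    n = len(pairs)
    mx = sum(t for t, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    slope = (sum((t - mx) * (y - my) for t, y in pairs)
             / sum((t - mx) ** 2 for t, _ in pairs))
    return -1.0 / slope
```

    A short τ (fast decorrelation) corresponds to the mechanically soft, vulnerable plaques; stiff stable plaque and normal aorta decorrelate an order of magnitude more slowly.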

  2. Systems nanobiology: from quantitative single molecule biophysics to microfluidic-based single cell analysis.

    PubMed

    Martini, Joerg; Hellmich, Wibke; Greif, Dominik; Becker, Anke; Merkle, Thomas; Ros, Robert; Ros, Alexandra; Toensing, Katja; Anselmetti, Dario

    2007-01-01

    Detailed and quantitative information about structure-function relations, concentrations and interaction kinetics of biological molecules and subcellular components is a key prerequisite to understand and model cellular organisation and temporal dynamics. In systems nanobiology, cellular processes are quantitatively investigated at the sensitivity level of single molecules and cells. This approach provides direct access to biomolecular information without statistical ensemble-averaging, including the associated distribution functions and possible subpopulations. Moreover, at the single cell level, the interplay of regulated genomic information and proteomic variability can be investigated and attributed to functional peculiarities. These requirements necessitate the development of novel and ultrasensitive methods and instruments for single molecule detection, microscopy and spectroscopy that allow analysis without the need for amplification and preconcentration. In this chapter, we present three methodological applications that demonstrate how quantitative information representative of cellular processes or single cell analysis, such as gene expression regulation, intracellular protein translocation dynamics, and single cell protein fingerprinting, can be accessed. First, the interaction kinetics of transcriptionally regulated DNA-protein interactions can be quantitatively investigated with single molecule force spectroscopy, allowing a molecular affinity ranking. Second, the intracellular dynamics of a transcription regulator migrating from the nucleus to the cytoplasm can be quantitatively monitored by photoactivatable GFP and two-photon laser scanning microscopy. And third, a microfluidic-based method for label-free single cell proteomics and fingerprinting is presented, including the first label-free single cell electropherograms and the manipulation and steering of single cells in a microfluidic device.

  3. Quantitative Analysis of Immunohistochemistry in Melanoma Tumors.

    PubMed

    Lilyquist, Jenna; White, Kirsten Anne Meyer; Lee, Rebecca J; Philips, Genevieve K; Hughes, Christopher R; Torres, Salina M

    2017-04-01

    Identification of positive staining is often qualitative and subjective. This is particularly troublesome in pigmented melanoma lesions, because melanin is difficult to distinguish from the brown stain resulting from immunohistochemistry (IHC) using horseradish peroxidase developed with 3,3'-Diaminobenzidine (HRP-DAB). We sought to identify and quantify positive staining, particularly in melanoma lesions. We visualized G-protein coupled estrogen receptor (GPER) expression developed with HRP-DAB and counterstained with Azure B (stains melanin) in melanoma tissue sections (n = 3). Matched sections (n = 3), along with 22 unmatched sections, were stained only with Azure B as a control. Breast tissue (n = 1) was used as a positive HRP-DAB control. Images of the stained tissues were generated using a Nuance Spectral Imaging Camera. Analysis of the images was performed using the Nuance Spectral Imaging software and SlideBook. Data were analyzed using a Kruskal-Wallis one-way analysis of variance (ANOVA). We showed that a pigmented melanoma tissue doubly stained with anti-GPER HRP-DAB and Azure B can be unmixed using spectra derived from a matched, Azure B-only section and an anti-GPER HRP-DAB control. We unmixed each of the melanoma lesions using each of the Azure B spectra, evaluated the mean intensity of positive staining, and examined the distribution of the mean intensities (P = .73; Kruskal-Wallis). These results suggest that this method does not require a matched Azure B-only stained control tissue for every melanoma lesion, allowing precious tissues to be conserved for other studies. Importantly, this quantification method reduces the subjectivity of protein expression analysis, and provides a valuable tool for accurate evaluation, particularly for pigmented tissues.
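
    The distribution comparison above rests on a standard Kruskal-Wallis test. A minimal sketch with SciPy, using synthetic intensity values (hypothetical, not the study's data), shows the shape of such an analysis:

```python
import numpy as np
from scipy.stats import kruskal

# Hypothetical mean DAB intensities for one lesion unmixed with three
# different Azure B reference spectra (values are illustrative only).
rng = np.random.default_rng(0)
unmixed_a = rng.normal(100.0, 10.0, size=30)
unmixed_b = rng.normal(101.0, 10.0, size=30)
unmixed_c = rng.normal(99.0, 10.0, size=30)

# Kruskal-Wallis H-test: a non-significant P suggests the choice of
# reference spectrum does not shift the unmixed intensities.
stat, p = kruskal(unmixed_a, unmixed_b, unmixed_c)
print(f"H = {stat:.2f}, P = {p:.3f}")
```

    A large P here would support the paper's conclusion that a matched Azure B-only control is not required for every lesion.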

  4. Chromatic Image Analysis For Quantitative Thermal Mapping

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.
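
    The core computation, converting a per-pixel two-wavelength brightness ratio to temperature via a calibration curve, can be sketched as follows. The calibration arrays are hypothetical; real curves come from phosphor calibration in the tunnel:

```python
import numpy as np

# Hypothetical calibration: intensity ratio I(lambda1)/I(lambda2) of the
# phosphor versus surface temperature in kelvin (illustrative values).
cal_ratio = np.array([0.20, 0.35, 0.55, 0.80, 1.10])
cal_temp_k = np.array([300.0, 350.0, 400.0, 450.0, 500.0])

def temperature_map(img_l1, img_l2):
    """Per-pixel temperature from two single-wavelength brightness images."""
    ratio = img_l1 / np.maximum(img_l2, 1e-9)  # guard against divide-by-zero
    # A monotonic calibration curve can be inverted by interpolation.
    return np.interp(ratio, cal_ratio, cal_temp_k)

img1 = np.array([[0.20, 0.55], [0.80, 1.10]])
img2 = np.ones((2, 2))
print(temperature_map(img1, img2))
```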

  5. Influence of corrosion layers on quantitative analysis

    NASA Astrophysics Data System (ADS)

    Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Röhrich, J.; Strub, E.

    2005-09-01

    Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of the objects is required, this has to be done by penetrating the corrosion layer. In this work the influence of corrosion was studied on Chinese and Roman coins, where removal of oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed.

  6. Quantitative Analysis in Nuclear Medicine Imaging

    NASA Astrophysics Data System (ADS)

    Zaidi, Habib

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases.

  7. Segmentation and Quantitative Analysis of Epithelial Tissues.

    PubMed

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Epithelia are most often amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis have largely prevented tissue-scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  8. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  10. Spatially and temporally varying selection on intrapopulation quantitative trait loci for a life history trade-off in Mimulus guttatus.

    PubMed

    Mojica, Julius P; Lee, Young Wha; Willis, John H; Kelly, John K

    2012-08-01

    Why do populations remain genetically variable despite strong continuous natural selection? Mutation reconstitutes variation eliminated by selection and genetic drift, but theoretical and experimental studies each suggest that mutation-selection balance is insufficient to explain extant genetic variation in most complex traits. The alternative hypothesis of balancing selection, wherein selection maintains genetic variation, is an aggregate of multiple mechanisms (spatial and temporal heterogeneity in selection, frequency-dependent selection, antagonistic pleiotropy, etc.). Most of these mechanisms have been demonstrated for Mendelian traits, but there are few comparable data for loci affecting quantitative characters. Here, we report a 3-year field study of selection on intrapopulation quantitative trait loci (QTL) of flower size, a highly polygenic trait in Mimulus guttatus. The QTL exhibit antagonistic pleiotropy: alleles that increase flower size reduce viability but increase fecundity. The magnitude and direction of selection fluctuate yearly and on a spatial scale of metres. This study provides direct evidence of balancing selection mechanisms acting on QTL of an ecologically relevant trait. © 2012 Blackwell Publishing Ltd.

  11. Two developmentally temporal quantitative trait loci underlie convergent evolution of increased branchial bone length in sticklebacks.

    PubMed

    Erickson, Priscilla A; Glazer, Andrew M; Cleves, Phillip A; Smith, Alyson S; Miller, Craig T

    2014-08-07

    In convergent evolution, similar phenotypes evolve repeatedly in independent populations, often reflecting adaptation to similar environments. Understanding whether convergent evolution proceeds via similar or different genetic and developmental mechanisms offers insight towards the repeatability and predictability of evolution. Oceanic populations of threespine stickleback fish, Gasterosteus aculeatus, have repeatedly colonized countless freshwater lakes and streams, where new diets lead to morphological adaptations related to feeding. Here, we show that heritable increases in branchial bone length have convergently evolved in two independently derived freshwater stickleback populations. In both populations, an increased bone growth rate in juveniles underlies the convergent adult phenotype, and one population also has a longer cartilage template. Using F2 crosses from these two freshwater populations, we show that two quantitative trait loci (QTL) control branchial bone length at distinct points in development. In both populations, a QTL on chromosome 21 controls bone length throughout juvenile development, and a QTL on chromosome 4 controls bone length only in adults. In addition to these similar developmental profiles, these QTL show similar chromosomal locations in both populations. Our results suggest that sticklebacks have convergently evolved longer branchial bones using similar genetic and developmental programmes in two independently derived populations. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  12. Personalized structural image analysis in patients with temporal lobe epilepsy.

    PubMed

    Rummel, Christian; Slavova, Nedelina; Seiler, Andrea; Abela, Eugenio; Hauf, Martinus; Burren, Yuliya; Weisstanner, Christian; Vulliemoz, Serge; Seeck, Margitta; Schindler, Kaspar; Wiest, Roland

    2017-09-07

    Volumetric and morphometric studies have demonstrated structural abnormalities related to chronic epilepsies on a cohort- and population-based level. On a single-patient level, specific patterns of atrophy or cortical reorganization may be widespread and heterogeneous but represent potential targets for further personalized image analysis and surgical therapy. The goal of this study was to compare morphometric data analysis in 37 patients with temporal lobe epilepsies with expert-based image analysis, pre-informed by seizure semiology and ictal scalp EEG. Automated image analysis identified abnormalities exceeding expert-determined structural epileptogenic lesions in 86% of datasets. If EEG lateralization and expert MRI readings were congruent, automated analysis detected abnormalities consistent on a lobar and hemispheric level in 82% of datasets. However, in 25% of patients EEG lateralization and expert readings were inconsistent. Automated analysis localized to the site of resection in 60% of datasets in patients who underwent successful epilepsy surgery. Morphometric abnormalities beyond the mesiotemporal structures contributed to subtype characterisation. We conclude that subject-specific morphometric information is in agreement with expert image analysis and scalp EEG in the majority of cases. However, automated image analysis may provide non-invasive additional information in cases with equivocal radiological and neurophysiological findings.

  13. Quantitative Analysis of Seismicity in Iran

    NASA Astrophysics Data System (ADS)

    Raeesi, Mohammad; Zarifi, Zoya; Nilfouroushan, Faramarz; Boroujeni, Samar Amini; Tiampo, Kristy

    2016-12-01

    We use historical and recent major earthquakes and GPS geodetic data to compute seismic strain rate, geodetic slip deficit, static stress drop, the parameters of the magnitude-frequency distribution and geodetic strain rate in the Iranian Plateau to identify seismically mature fault segments and regions. Our analysis suggests that 11 fault segments are in the mature stage of the earthquake cycle, with the possibility of generating major earthquakes. These faults primarily are located in the north and the east of Iran. Four seismically mature regions in southern Iran with the potential for damaging strong earthquakes are also identified. We also delineate four additional fault segments in Iran that can generate major earthquakes without robust clues to their maturity. The most important fault segment in this study is the strike-slip system near the capital city of Tehran, with the potential to cause more than one million fatalities.

  15. Quantitative analysis of heart rate variability

    NASA Astrophysics Data System (ADS)

    Kurths, J.; Voss, A.; Saparin, P.; Witt, A.; Kleiner, H. J.; Wessel, N.

    1995-03-01

    In modern industrialized countries, several hundred thousand people die each year of sudden cardiac death. The individual risk of sudden cardiac death cannot be defined precisely by commonly available, noninvasive diagnostic tools such as Holter monitoring, highly amplified ECG and traditional linear analysis of heart rate variability (HRV). Therefore, we apply some rather unconventional methods of nonlinear dynamics to analyze the HRV. In particular, complexity measures based on symbolic dynamics, as well as a new measure, the renormalized entropy, detect abnormalities in the HRV of several patients who had been classified in the low-risk group by traditional methods. A combination of these complexity measures with parameters in the frequency domain seems to be a promising way to obtain a more precise definition of individual risk. These findings have to be validated in a representative number of patients.
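
    A minimal illustration of a symbolic-dynamics complexity measure, assuming equal-width binning of RR intervals and Shannon entropy of short symbol words; the paper's measures are more elaborate, so this is a sketch of the idea, not their algorithm:

```python
import numpy as np
from collections import Counter

def symbolic_word_entropy(rr_ms, n_symbols=4, word_len=3):
    """Shannon entropy (bits) of the symbol-word distribution of an RR series.

    RR intervals are binned into n_symbols equal-width levels, and
    consecutive word_len-grams of the symbol sequence are counted.
    """
    rr = np.asarray(rr_ms, dtype=float)
    edges = np.linspace(rr.min(), rr.max(), n_symbols + 1)
    symbols = np.clip(np.digitize(rr, edges[1:-1]), 0, n_symbols - 1)
    words = Counter(tuple(symbols[i:i + word_len])
                    for i in range(len(symbols) - word_len + 1))
    p = np.array(list(words.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

# A strictly alternating series produces only two equally likely words,
# hence exactly 1 bit of word entropy; irregular HRV scores higher.
print(symbolic_word_entropy([800, 900] * 50))
```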

  16. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability demonstrated no effect of culture media or plate-count technique on the estimated microbial count, whereas the effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was highly significant. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods.
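
    For independent components, relative uncertainties combine in quadrature (root-sum-of-squares). The component values below are illustrative, not the paper's estimates:

```python
import math

# Hypothetical relative-uncertainty components (as fractions) for a
# plate-count assay; the labels mirror the factors named in the abstract.
components = {
    "microorganism_type": 0.15,
    "product_matrix": 0.12,
    "reading_interpreting": 0.20,
}

# Root-sum-of-squares combination of independent relative uncertainties.
combined = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined relative uncertainty: {combined:.1%}")
```

    With these illustrative inputs the combined value is about 28%, below the 35% ceiling reported in the abstract.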

  17. An adaptive gyroscope-based algorithm for temporal gait analysis.

    PubMed

    Greene, Barry R; McGrath, Denise; O'Neill, Ross; O'Donovan, Karol J; Burns, Adrian; Caulfield, Brian

    2010-12-01

    Body-worn kinematic sensors have been widely proposed as the optimal solution for portable, low-cost, ambulatory monitoring of gait. This study aims to evaluate an adaptive gyroscope-based algorithm for automated temporal gait analysis using body-worn wireless gyroscopes. Gyroscope data from nine healthy adult subjects performing four walks at four different speeds were compared against data acquired simultaneously using two force plates and an optical motion capture system. Data from a poliomyelitis patient exhibiting pathological gait, walking with and without the aid of a crutch, were also compared to the force plate. Results show that the mean true error between the adaptive gyroscope algorithm and the force plate was -4.5 ± 14.4 ms and 43.4 ± 6.0 ms for initial contact (IC) and terminal contact (TC) points, respectively, in healthy subjects. Similarly, the mean true error when data from the polio patient were compared against the force plate was -75.61 ± 27.53 ms and 99.20 ± 46.00 ms for IC and TC points, respectively. A comparison of the present algorithm against temporal gait parameters derived from an optical motion analysis system showed good agreement for nine healthy subjects at four speeds. These results show that the algorithm reported here could constitute the basis of a robust, portable, low-cost system for ambulatory monitoring of gait.
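
    The general idea behind shank-gyroscope gait-event detection can be sketched with fixed thresholds; this is a crude illustration under stated assumptions, not the paper's adaptive algorithm. Mid-swing appears as a large positive peak in sagittal angular velocity, and IC/TC are approximated here by the negative troughs after and before each mid-swing peak:

```python
import numpy as np
from scipy.signal import find_peaks

def gait_events(omega, fs, min_swing=1.0):
    """Crude shank-gyroscope event detector: returns approximate IC and TC
    sample indices from sagittal angular velocity (rad/s)."""
    swing_peaks, _ = find_peaks(omega, height=min_swing,
                                distance=int(0.5 * fs))
    troughs, _ = find_peaks(-omega)
    ic = [t for p in swing_peaks for t in troughs[troughs > p][:1]]
    tc = [t for p in swing_peaks for t in troughs[troughs < p][-1:]]
    return np.array(ic), np.array(tc)

# Synthetic two-stride signal: sinusoid with a 1.2 s stride time.
fs = 100
t = np.arange(0, 2.4, 1 / fs)
omega = 2.0 * np.sin(2 * np.pi * t / 1.2)
ic, tc = gait_events(omega, fs)
print(ic.tolist(), tc.tolist())
```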

  18. Quantiprot - a Python package for quantitative analysis of protein sequences.

    PubMed

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python, which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches, and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
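
    One of the statistics mentioned, the Zipf's-law coefficient, is the slope of log(frequency) versus log(rank) for n-grams. The generic sketch below illustrates the idea; Quantiprot's own implementation may differ in detail, and the sequence is a toy example:

```python
import numpy as np
from collections import Counter

def zipf_coefficient(sequence, n=2):
    """Slope of log(frequency) vs log(rank) for n-grams of a sequence."""
    grams = Counter(sequence[i:i + n] for i in range(len(sequence) - n + 1))
    freqs = np.array(sorted(grams.values(), reverse=True), dtype=float)
    ranks = np.arange(1, len(freqs) + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
    return slope

seq = "MKVLAAGLLALLAASCATAMKVLAAGLLAL"  # toy protein-like sequence
print(round(zipf_coefficient(seq), 3))
```

    Because frequencies are sorted in decreasing order, the fitted slope is negative; sequences with more repetition give steeper slopes.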

  19. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representation of 'absorbance-wavenumber-retention time' data is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. Under these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and caffeine concentration are discussed at two steps of the data treatment.

  20. Quantitative flow cytometric analysis of membrane antigen expression.

    PubMed

    D'hautcourt, Jean-Luc

    2002-11-01

    Immunological analysis for cell antigens has been performed by flow cytometry in a qualitative fashion for over thirty years. During that time it has become increasingly apparent that quantitative measurements such as number of antigens per cell provide unique and useful information. This unit on quantitative flow cytometry (QFCM) describes the most commonly used protocols, both direct and indirect, and the major methods of analysis for the number of antibody binding sites on a cell or particle. Practical applications include detection of antigen under- or overexpression in hematological malignancies, distinguishing between B cell lymphoproliferative disorders, and precise diagnosis of certain rare diseases.
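
    A common QFCM workflow calibrates measured fluorescence against beads of known antibody-binding capacity (ABC) and then converts a cell's signal to antigens per cell. The bead values below are hypothetical and chosen to be exactly proportional, so this is a sketch of the calibration step only:

```python
import numpy as np

# Hypothetical calibration beads: known antibody-binding capacity (ABC)
# versus measured mean fluorescence intensity (MFI).
bead_abc = np.array([5_000.0, 25_000.0, 75_000.0, 250_000.0])
bead_mfi = np.array([120.0, 600.0, 1_800.0, 6_000.0])

# Assume MFI is linear in ABC and fit a calibration line.
slope, intercept = np.polyfit(bead_mfi, bead_abc, 1)

def antigens_per_cell(mfi):
    """Convert a cell population's MFI to an estimated antigen count."""
    return slope * mfi + intercept

print(round(antigens_per_cell(900.0)))
```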

  1. Evaluation of focus laterality in temporal lobe epilepsy: a quantitative study comparing double inversion-recovery MR imaging at 3T with FDG-PET.

    PubMed

    Morimoto, Emiko; Okada, Tomohisa; Kanagaki, Mitsunori; Yamamoto, Akira; Fushimi, Yasutaka; Matsumoto, Riki; Takaya, Shigetoshi; Ikeda, Akio; Kunieda, Takeharu; Kikuchi, Takayuki; Paul, Dominik; Miyamoto, Susumu; Takahashi, Ryosuke; Togashi, Kaori

    2013-12-01

    To quantitatively compare the diagnostic capability of double inversion-recovery (DIR) with F-18 fluorodeoxyglucose positron emission tomography (FDG-PET) for detection of seizure focus laterality in temporal lobe epilepsy (TLE). This study was approved by the institutional review board, and written informed consent was obtained. Fifteen patients with TLE and 38 healthy volunteers were enrolled. All magnetic resonance (MR) images were acquired using a 3T MRI system. Voxel-based analysis (VBA) was conducted for FDG-PET images and white matter segments of DIR images (DIR-WM), focused on the whole temporal lobe (TL) and the anterior part of the temporal lobe (ATL). The distribution of hypometabolic areas on FDG-PET and of increased signal intensity areas on DIR-WM was evaluated, and their laterality was compared with clinically determined seizure focus laterality. Correct diagnostic rates of laterality were evaluated, and agreement between DIR-WM and FDG-PET was assessed using κ statistics. Increased signal intensity areas on DIR-WM were located in the vicinity of the hypometabolic areas on FDG-PET, especially in the ATL. Correct diagnostic rates of seizure focus laterality for DIR-WM (0.80 and 0.67 for the TL and the ATL, respectively) were slightly higher than those for FDG-PET (0.67 and 0.60 for the TL and the ATL, respectively). Agreement of laterality between DIR-WM and FDG-PET was substantial for the TL and almost perfect for the ATL (κ = 0.67 and 0.86, respectively). The high agreement in localization between DIR-WM and FDG-PET and their nearly equivalent detection rates demonstrate an additional role for MRI in TLE. Wiley Periodicals, Inc. © 2013 International League Against Epilepsy.
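
    The κ agreement statistic used above is Cohen's kappa, which corrects raw agreement for chance. The textbook computation can be sketched as follows; the laterality labels are illustrative, not the study's data:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical labels, e.g. left/right
    laterality calls from two modalities."""
    assert len(a) == len(b)
    n = len(a)
    labels = sorted(set(a) | set(b))
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    pe = sum((a.count(l) / n) * (b.count(l) / n)        # chance agreement
             for l in labels)
    return (po - pe) / (1 - pe)

dir_wm  = ["L", "L", "R", "R", "L", "R", "L", "R", "R", "L"]
fdg_pet = ["L", "L", "R", "R", "L", "R", "L", "R", "L", "L"]
print(round(cohens_kappa(dir_wm, fdg_pet), 2))  # → 0.8
```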

  2. Experiments With Temporal Reasoning Applied To Analysis Of Telemetry Data

    NASA Astrophysics Data System (ADS)

    Perkins, W. A.; Austin, A.

    1987-10-01

    Many applications of expert systems to Space Station Automation, such as monitoring, planning, and scheduling will involve reasoning about attributes of objects at different times. For example, in monitoring, the system must reason about changes in signal parameters over time because causal relationships among events are important. Reasoning efficiently and concurrently about attributes with different values at different times, different time formats, and different time validity conditions requires more complex knowledge representations than are generally available in expert systems. Representation issues dealing with point times, intervals, and relative times must also be resolved. We have implemented a temporal reasoning capability in a generic expert system shell (LES) to address these issues and to increase the flexibility of the knowledge representation for a variety of applications. For its first application, we chose monitoring of telemetry data from a satellite (the Space Telescope). Our work involved just the RCE (Rotor Controlled Electronics) bearing, a component of the reaction-wheels subsystem, which has attributes such as ACTUAL-TEMPERATURE of the bearing, WHEEL-SPEED, and MOTOR-CURRENT. This task consists of collecting one attribute value per sensor per cycle, checking each value to see if it is within the acceptable range, and storing each value with a time tag in the database. Processing becomes more complex when one or more readings are out of their acceptable range. The analysis to discover the cause involves examining several cycles of readings, as well as comparing the readings of different sensors over time. The temporal reasoning capability in LES allowed us to compare the most recent readings of two sensors; or to compare one current reading with a value collected some time earlier; or to collect several consecutive readings which are analyzed for trends. In addition, having time tags associated with attribute values permitted us…

  3. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
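
    The claim that coverage significantly affects reliability can be illustrated with a textbook duplex model, a sketch under stated assumptions rather than the paper's digraph analysis. With coverage c, a covered failure of one unit lets the spare take over, while an uncovered failure brings the system down immediately:

```python
import math

def duplex_reliability(lam, t, c):
    """Reliability of a duplex system with failure rate lam per unit,
    mission time t, and coverage c (textbook perfect-switching model):
    R = r^2 + 2*c*r*(1 - r), where r = exp(-lam * t)."""
    r = math.exp(-lam * t)
    return r ** 2 + 2.0 * c * r * (1.0 - r)

lam, t = 1e-4, 1000.0  # illustrative failure rate (per hour), mission time (h)
for c in (1.0, 0.99, 0.9):
    print(c, round(duplex_reliability(lam, t, c), 6))
```

    Even a drop from c = 1.0 to c = 0.9 visibly lowers system reliability, which is why coverage belongs in the analysis.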

  5. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton1 and Carey N. Pope2
    1US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    2Department of...

  7. Quantitating the subtleties of microglial morphology with fractal analysis

    PubMed Central

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F.

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between “ramified resting” and “activated amoeboid” has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology. PMID:23386810
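
    The box-counting analysis reviewed above fits the log-log relationship between box size and occupied-box count. A minimal generic sketch on a binary image (not the specialized microglia pipelines the review covers):

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Box-counting dimension of a 2-D binary mask: count boxes containing
    foreground at several box sizes, then fit the log-log slope."""
    counts = []
    for s in sizes:
        h, w = mask.shape
        trimmed = mask[:h - h % s, :w - w % s]  # make the grid tile exactly
        boxes = trimmed.reshape(h // s, s, w // s, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Sanity check: a filled square is 2-dimensional.
filled = np.ones((64, 64), dtype=bool)
print(round(box_counting_dimension(filled), 2))  # → 2.0
```

    On a traced microglial silhouette the estimate falls between 1 and 2, and small differences in it can separate otherwise similar ramified forms.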

  9. Looking for complexity in quantitative semiology of frontal and temporal lobe seizures using neuroethology and graph theory.

    PubMed

    Bertti, Poliana; Tejada, Julian; Martins, Ana Paula Pinheiro; Dal-Cól, Maria Luiza Cleto; Terra, Vera Cristina; de Oliveira, José Antônio Cortes; Velasco, Tonicarlo Rodrigues; Sakamoto, Américo Ceiki; Garcia-Cairasco, Norberto

    2014-09-01

    Epileptic syndromes and seizures are the expression of complex brain systems. Because no analysis of complexity has been applied to epileptic seizure semiology, our goal was to apply neuroethology and graph analysis to the study of the complexity of behavioral manifestations of epileptic seizures in human frontal lobe epilepsy (FLE) and temporal lobe epilepsy (TLE). We analyzed the video recordings of 120 seizures of 18 patients with FLE and 28 seizures of 28 patients with TLE. All patients were seizure-free >1 year after surgery (Engel Class I). All patients' behavioral sequences were annotated by means of a glossary containing all behaviors and analyzed neuroethologically (Ethomatic software). The same series were used for graph analysis (CYTOSCAPE). Behaviors, displayed as nodes, were connected by edges to other nodes according to their temporal sequence of appearance. Neuroethology analysis confirmed data in the literature, such as, in FLE: brief/frequent seizures, complex motor behaviors, head and eye version, unilateral/bilateral tonic posturing, speech arrest, vocalization, and rapid postictal recovery; and in TLE: presence of epigastric aura, lateralized dystonias, impairment of consciousness/speech during ictal and postictal periods, and development of secondary generalization. Graph analysis metrics for FLE and TLE confirmed the data from the flowcharts; however, owing to the algorithms we used, they highlighted the connectivity and complex associations among behaviors more powerfully and more selectively, depending on the origin of the seizures. The algorithms we used are commonly employed to track brain connectivity from EEG and MRI sources, which makes our study very promising for future studies of complexity in this field. Copyright © 2014 Elsevier Inc. All rights reserved.
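
    The graph construction described (behaviors as nodes, temporal succession as directed edges) can be sketched with nothing more than a transition counter; the behavior labels below are illustrative, not the study's glossary, and the resulting edge list is what a tool such as CYTOSCAPE would consume:

```python
from collections import Counter

# Hypothetical ictal behavior sequence (labels are illustrative only).
sequence = ["aura", "stare", "dystonia", "automatism", "stare",
            "dystonia", "automatism", "recovery"]

# Each directed edge (src, dst) is weighted by how often behavior dst
# immediately follows behavior src in the recorded sequence.
edges = Counter(zip(sequence, sequence[1:]))

for (src, dst), w in sorted(edges.items()):
    print(f"{src} -> {dst}: {w}")
```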

  10. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis and from a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms, are presented. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  12. Quantitative transverse flow assessment using OCT speckle decorrelation analysis

    NASA Astrophysics Data System (ADS)

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Kang, Jin U.

    2013-03-01

    In this study, we demonstrate the use of inter-A-scan speckle decorrelation analysis of optical coherence tomography (OCT) images to assess fluid flow. This method allows quantitative measurement of fluid flow in a plane normal to the scanning beam. To validate the method, OCT images were obtained from a microfluidic channel with bovine milk flowing at different speeds. We also imaged blood vessels in in vivo animal models and performed speckle analysis to assess blood flow.
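
    A minimal sketch of the decorrelation statistic, assuming intensity A-scans are the rows of an array; the synthetic data below stand in for real OCT frames.

```python
import numpy as np

def decorrelation(a_scans):
    """Mean decorrelation (1 - Pearson r) between successive A-scans.

    Faster transverse flow scrambles the speckle pattern between
    consecutive A-scans, so the decorrelation rises with flow speed.
    """
    rs = [np.corrcoef(a_scans[i], a_scans[i + 1])[0, 1]
          for i in range(len(a_scans) - 1)]
    return 1.0 - float(np.mean(rs))

rng = np.random.default_rng(0)
static = np.tile(rng.normal(size=256), (10, 1))    # frozen speckle: no flow
flowing = rng.normal(size=(10, 256))               # fully decorrelated speckle
```

    In practice the measured decorrelation is calibrated against known flow speeds, as done here with the milk phantom.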

  13. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, using a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  14. Quantitative analysis of single-molecule superresolution images

    PubMed Central

    Coltharp, Carla; Yang, Xinxing; Xiao, Jie

    2014-01-01

    This review highlights the quantitative capabilities of single-molecule localization-based superresolution imaging methods. In addition to revealing fine structural details, the molecule coordinate lists generated by these methods provide the critical ability to quantify the number, clustering, and colocalization of molecules with 10 – 50 nm resolution. Here we describe typical workflows and precautions for quantitative analysis of single-molecule superresolution images. These guidelines include potential pitfalls and essential control experiments, allowing critical assessment and interpretation of superresolution images. PMID:25179006

  15. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.
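
    The core culturomics measurement, the relative frequency of an n-gram per year, reduces to simple counting; the two-entry corpus below is a toy stand-in for the digitized-books data.

```python
def ngram_frequency(corpus, word):
    """Relative frequency of a 1-gram per year in a {year: text} corpus."""
    freq = {}
    for year, text in corpus.items():
        tokens = text.lower().split()
        freq[year] = tokens.count(word) / len(tokens)
    return freq

# Tiny illustrative corpus; the real analysis runs over n-grams drawn
# from millions of digitized books.
corpus = {1900: "the slide rule the slide", 2000: "the computer era"}
f = ngram_frequency(corpus, "slide")
```

    Plotting such per-year frequencies over two centuries is what yields the trend curves the paper describes.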

  17. Quantitative numerical analysis of transient IR-experiments on buildings

    NASA Astrophysics Data System (ADS)

    Maierhofer, Ch.; Wiggenhauser, H.; Brink, A.; Röllig, M.

    2004-12-01

    Impulse thermography has been established as a fast and reliable tool in many areas of non-destructive testing. In recent years, several investigations have applied active thermography to civil engineering. For quantitative investigations in this area of application, finite-difference calculations have been performed for systematic studies of the influence of environmental conditions, heating power and duration, defect depth and size, and the thermal properties of the bulk material (concrete). The comparison of simulated and experimental data enables the quantitative analysis of defects.
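
    The finite-difference approach mentioned can be illustrated with a minimal explicit scheme for 1-D heat conduction; the material parameters below are illustrative, not those used in the study.

```python
import numpy as np

def heat_1d(u0, alpha, dx, dt, steps):
    """Explicit (FTCS) finite-difference solution of u_t = alpha * u_xx.

    Stable for r = alpha*dt/dx**2 <= 0.5. Both boundaries are held at
    their initial values (Dirichlet conditions).
    """
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable"
    u = u0.copy()
    for _ in range(steps):
        u[1:-1] = u[1:-1] + r * (u[2:] - 2 * u[1:-1] + u[:-2])
    return u

# Concrete-like slab, pulse-heated at one surface (illustrative numbers)
u0 = np.zeros(51)
u0[0] = 10.0                                   # heated surface, arbitrary units
u = heat_1d(u0, alpha=1e-6, dx=0.01, dt=25.0, steps=200)
```

    A buried defect would be modeled by locally changing `alpha`; comparing such simulated surface temperatures with measured ones is the basis of the quantitative defect analysis.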

  18. Brain regions involved in human movement perception: a quantitative voxel-based meta-analysis.

    PubMed

    Grosbras, Marie-Hélène; Beaton, Susan; Eickhoff, Simon B

    2012-02-01

    Face, hands, and body movements are powerful signals essential for social interactions. In the last two decades, a large number of brain imaging studies have explored the neural correlates of the perception of these signals. Formal synthesis is crucially needed, however, to extract the key circuits involved in human motion perception across the variety of paradigms and stimuli that have been used. Here, we used the activation likelihood estimation (ALE) meta-analysis approach with random effect analysis. We performed meta-analyses on three classes of biological motion: movement of the whole body, hands, and face. Additional analyses of studies of static faces or body stimuli and sub-analyses grouping experiments as a function of their control stimuli or task employed allowed us to identify main effects of movement and form perception, as well as effects of task demand. In addition to specific features, all conditions showed convergence in occipito-temporal and fronto-parietal regions, but with different peak locations and extents. The conjunction of the three ALE maps revealed convergence in all categories in a region of the right posterior superior temporal sulcus as well as in a bilateral region at the junction between the middle temporal and lateral occipital gyri. Activation in these regions was not a function of attentional demand and remained significant when controlling for non-specific motion perception. This quantitative synthesis points towards a special role for the posterior superior temporal sulcus in integrating human movement percepts, and supports a specific representation for body parts in middle temporal, fusiform, precentral, and parietal areas.
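
    At its core, the ALE statistic combines per-experiment modeled activation probabilities by a probabilistic union at each voxel; a minimal sketch with made-up probabilities (the real method first smooths each reported focus with a Gaussian kernel and tests the result against a null distribution):

```python
import numpy as np

def ale_map(foci_maps):
    """Activation likelihood estimate per voxel: the probabilistic union
    of the per-experiment modeled activation maps,
    ALE = 1 - prod_i (1 - p_i)."""
    p = np.asarray(foci_maps, dtype=float)
    return 1.0 - np.prod(1.0 - p, axis=0)

# Two hypothetical experiments, three voxels (illustrative values)
maps = [[0.5, 0.0, 0.1],
        [0.5, 0.0, 0.2]]
ale = ale_map(maps)
```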

  19. Genetic evolution, plasticity, and bet-hedging as adaptive responses to temporally autocorrelated fluctuating selection: A quantitative genetic model.

    PubMed

    Tufto, Jarle

    2015-08-01

    Adaptive responses to autocorrelated environmental fluctuations through evolution in mean reaction norm elevation and slope and an independent component of the phenotypic variance are analyzed using a quantitative genetic model. Analytic approximations expressing the mutual dependencies between all three response modes are derived and solved for the joint evolutionary outcome. Both genetic evolution in reaction norm elevation and plasticity are favored by slow temporal fluctuations, with plasticity, in the absence of microenvironmental variability, being the dominant evolutionary outcome for reasonable parameter values. For fast fluctuations, tracking of the optimal phenotype through genetic evolution and plasticity is limited. If residual fluctuations in the optimal phenotype are large and stabilizing selection is strong, selection then acts to increase the phenotypic variance (bet-hedging adaptive). Otherwise, canalizing selection occurs. If the phenotypic variance increases with plasticity through the effect of microenvironmental variability, this shifts the joint evolutionary balance away from plasticity in favor of genetic evolution. If microenvironmental deviations experienced by each individual at the time of development and selection are correlated, however, more plasticity evolves. The adaptive significance of evolutionary fluctuations in plasticity and the phenotypic variance, transient evolution, and the validity of the analytic approximations are investigated using simulations. © 2015 The Author(s). Evolution © 2015 The Society for the Study of Evolution.
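
    The interplay between an autocorrelated environment and a linear reaction norm can be sketched with an AR(1) environment; this toy simulation is an assumption-laden illustration (the names and parameters are mine), not the paper's analytic model.

```python
import numpy as np

def simulate(rho, elevation, slope, n=10000, seed=1):
    """Track a fluctuating optimum with a linear reaction norm.

    The environment follows an AR(1) process with autocorrelation rho
    and unit stationary variance; the expressed phenotype is
    z = elevation + slope * cue, where the cue here equals the
    selective environment (no microenvironmental deviation).
    """
    rng = np.random.default_rng(seed)
    env = np.empty(n)
    env[0] = 0.0
    for t in range(1, n):
        env[t] = rho * env[t - 1] + np.sqrt(1 - rho**2) * rng.normal()
    z = elevation + slope * env
    return env, z

env, z = simulate(rho=0.9, elevation=0.0, slope=1.0)
# With slope = 1 the phenotype tracks the environmental cue exactly;
# smaller slopes leave residual fluctuations in the optimum, the regime
# in which increased phenotypic variance (bet-hedging) can be favored.
```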

  20. Markov chain Monte Carlo linkage analysis of complex quantitative phenotypes.

    PubMed

    Hinrichs, A; Reich, T

    2001-01-01

    We report a Markov chain Monte Carlo analysis of the five simulated quantitative traits in Genetic Analysis Workshop 12 using the Loki software. Our objectives were to determine the efficacy of the Markov chain Monte Carlo method and to test a new scoring technique. Our initial blind analysis, on replicate 42 (the "best replicate") successfully detected four out of the five disease loci and found no false positives. A power analysis shows that the software could usually detect 4 of the 10 trait/gene combinations at an empirical point-wise p-value of 1.5 x 10(-4).

  1. Epilepsy surgery failure in children: a quantitative and qualitative analysis

    PubMed Central

    Englot, Dario J.; Han, Seunggu J.; Rolston, John D.; Ivan, Michael E.; Kuperman, Rachel A.; Chang, Edward F.; Gupta, Nalin; Sullivan, Joseph E.; Auguste, Kurtis I.

    2015-01-01

    Object Resection is a safe and effective treatment option for children with pharmacoresistant focal epilepsy, but some patients continue to experience seizures after surgery. While most studies of pediatric epilepsy surgery focus on predictors of postoperative seizure outcome, these factors are often not modifiable, and the reasons for surgical failure may remain unclear. Methods The authors performed a retrospective cohort study of children and adolescents who received focal resective surgery for pharmacoresistant epilepsy. Both quantitative and qualitative analyses of factors associated with persistent postoperative seizures were conducted. Results Records were reviewed from 110 patients, ranging in age from 6 months to 19 years at the time of surgery, who underwent a total of 115 resections. At a mean 3.1-year follow-up, 76% of patients were free of disabling seizures (Engel Class I outcome). Seizure freedom was predicted by temporal lobe surgery compared with extra-temporal resection, tumor or mesial temporal sclerosis compared with cortical dysplasia or other pathologies, and by a lower preoperative seizure frequency. Factors associated with persistent seizures (Engel Class II–IV outcome) included residual epileptogenic tissue adjacent to the resection cavity (40%), an additional epileptogenic zone distant from the resection cavity (32%), and the presence of a hemispheric epilepsy syndrome (28%). Conclusions While seizure outcomes in pediatric epilepsy surgery may be improved by the use of high-resolution neuroimaging and invasive electrographic studies, a more aggressive resection should be considered in certain patients, including hemispherectomy if a hemispheric epilepsy syndrome is suspected. Family counseling regarding treatment expectations is critical, and reoperation may be warranted in select cases. PMID:25127098

  2. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.

  3. Quantitative Rietveld analysis of CAC clinker phases using synchrotron radiation

    SciTech Connect

    Guirado, F. . E-mail: francesc.guirado@urv.cat; Gali, S.

    2006-11-15

    Quantitative Rietveld analyses of twenty CAC samples from four manufacturers worldwide, one synthetic mixture, and a NIST standard were performed using synchrotron radiation. Compared with conventional XRD, synchrotron powder diffraction made it possible to detect new minor phases, improved the characterization of solid solutions of the iron-rich CAC phases, and reduced preferred-orientation and microabsorption effects. Diffraction data were complemented with XRF and TG/DT analyses. The synchrotron results were used as a reference test to improve the performance of conventional powder diffraction, through an accurate selection of refinable profile and structural parameters, and yielded several recommendations for conventional quantitative Rietveld procedures. It is shown that, with these recommendations in mind, conventional XRD-based Rietveld analyses are comparable to those obtained from synchrotron data. In summary, quantitative XRD Rietveld analysis is confirmed as an excellent tool for the CAC cement industry.
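
    Quantitative phase fractions follow from the refined Rietveld scale factors via the standard Hill–Howard relation; the phase parameters below are illustrative numbers, not refined values for a real clinker.

```python
def weight_fractions(phases):
    """Quantitative phase analysis from Rietveld scale factors
    (Hill-Howard relation): W_p = S_p*(Z*M*V)_p / sum_i S_i*(Z*M*V)_i,
    where S is the refined scale factor, Z the formula units per cell,
    M the formula mass, and V the unit-cell volume.
    """
    zmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
    total = sum(zmv.values())
    return {name: x / total for name, x in zmv.items()}

# Illustrative (made-up) values for two calcium aluminate phases
phases = {"CA":    (1.0e-4, 12, 158.0, 1069.0),
          "C12A7": (2.0e-5, 2, 1386.7, 2049.0)}
w = weight_fractions(phases)
```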

  5. FMRI signals associated with memory strength in the medial temporal lobes: a meta-analysis.

    PubMed

    Wais, Peter E

    2008-12-01

    To identify patterns of memory-related neural activity in the medial temporal lobes (MTL), a quantitative meta-analysis of 17 functional magnetic resonance imaging (fMRI) studies was performed. The analysis shows that increased activity in the hippocampus and the parahippocampal cortex predicts subsequent memory strength. During retrieval, activity in the hippocampus increases in association with strong memory. In the perirhinal cortex, increased activity predicts subsequent recognition, whether based on weak or strong memory, whereas during retrieval activity decreases below the level for misses in association with both weak and strong memory. The results are consistent with the claim that the hippocampus selectively subserves recollection, whereas adjacent structures subserve familiarity [Eichenbaum, H., Yonelinas, A., & Ranganath, C. (2007). The medial temporal lobe and recognition memory. The Annual Review of Neuroscience, 30, 123-152]. However, this conclusion depends on a specific dual-process theory of recognition memory that has been used to interpret the results. An alternative dual-process model holds that the behavioral methods used to differentiate recollection from familiarity instead separate strong memories from weak memories. When the fMRI data are interpreted in terms of the alternative theory, the fMRI results do not point to selective roles for the hippocampus or the adjacent MTL structures. The fMRI data alone cannot distinguish between these two models, so other methods are needed to resolve the issue.

  6. Cognitive control for language switching in bilinguals: A quantitative meta-analysis of functional neuroimaging studies

    PubMed Central

    Luk, Gigi; Green, David W.; Abutalebi, Jubin; Grady, Cheryl

    2013-01-01

    In a quantitative meta-analysis, using the activation likelihood estimation method, we examined the neural regions involved in bilingual cognitive control, particularly when engaging in switching between languages. The purpose of this study was to evaluate the bilingual cognitive control model based on a qualitative analysis [Abutalebi, J., & Green, D. W. (2008). Control mechanisms in bilingual language production: Neural evidence from language switching studies. Language and Cognitive Processes, 23, 557–582.]. After reviewing 128 peer-reviewed articles, ten neuroimaging studies met our inclusion criteria and in each study, bilinguals switched between languages in response to cues. We isolated regions involved in voluntary language switching, by including reported contrasts between the switching conditions and high level baseline conditions involving similar tasks but requiring the use of only one language. Eight brain regions showed significant and reliable activation: left inferior frontal gyrus, left middle temporal gyrus, left middle frontal gyrus, right precentral gyrus, right superior temporal gyrus, midline pre-SMA and bilateral caudate nuclei. This quantitative result is consistent with bilingual aphasia studies that report switching deficits associated with lesions to the caudate nuclei or prefrontal cortex. It also extends the previously reported qualitative model. We discuss the implications of the findings for accounts of bilingual cognitive control. PMID:24795491

  7. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method can accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this method of quantitative epistasis enabled the detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
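
    Under the common multiplicative model, an epistasis score compares the observed double-perturbation phenotype with the product of the single-perturbation phenotypes; a minimal sketch (the measurements are hypothetical, and the paper's statistical treatment is more elaborate):

```python
def epistasis(w_a, w_b, w_ab, w_wt=1.0):
    """Multiplicative epistasis score for a quantitative phenotype:
    eps = W_ab - W_a * W_b, with phenotype values normalized to the
    wild type. eps < 0 suggests an aggravating interaction, eps > 0 an
    alleviating one; eps ~ 0 means no interaction under this model."""
    wa, wb, wab = w_a / w_wt, w_b / w_wt, w_ab / w_wt
    return wab - wa * wb

# Hypothetical normalized body-length measurements (illustrative only):
# singles at 0.8 and 0.9 predict 0.72 for the double; observing 0.5
# would indicate an aggravating interaction.
eps = epistasis(0.8, 0.9, 0.5)
```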

  8. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for high sensitive and high-throughput analysis of the enriched glycoproteome in aspects of quantitative assays as well as qualitative profiling of glycoproteins. Because it has been widely recognized that aberrant glycosylation in a glycoprotein may involve in progression of a certain disease, the development of efficient analysis tool for the aberrant glycoproteins is very important for deep understanding about pathological function of the glycoprotein and new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies mainly employing solid-phase extraction methods such as hydrizide-capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with the protein glycosylation-targeting enrichment technologies, by using a label-free MS, stable isotope-labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recent published studies.

  9. Study of Discussion Record Analysis Using Temporal Data Crystallization and Its Application to TV Scene Analysis

    DTIC Science & Technology

    2015-03-31

    Report documentation fragment for "Study of Discussion Record Analysis Using Temporal Data Crystallization and Its Application to TV Scene Analysis" (Contract FA2386-13-1-4044; Grant AOARD-134044). The research proposed two methods, scene analysis and logical analysis. Scene analysis focuses on topic change in the record and divides a record into several scenes using Temporal Data Crystallization (TDC). Then, by measuring the polarity of words and the volume of utterance, the atmosphere and emotion in the scene were …

  10. Temporal radiographic texture analysis in the detection of periprosthetic osteolysis

    SciTech Connect

    Wilkie, Joel R.; Giger, Maryellen L.; Chinander, Michael R.; Engh, Charles A. Sr.; Hopper, Robert H. Jr.; Martell, John M.

    2008-01-15

    Periprosthetic osteolysis is one of the most serious long-term problems in total hip arthroplasty. It has been primarily attributed to the body's inflammatory response to submicron polyethylene particles worn from the hip implant, and it leads to bone loss and structural deterioration in the surrounding bone. It was previously demonstrated that radiographic texture analysis (RTA) can distinguish between osteolysis and normal cases at the time of clinical detection of the disease; however, that analysis did not take into account the changes in texture over time. The goal of this preliminary analysis was to assess the ability of temporal radiographic texture analysis (tRTA) to distinguish between patients who develop osteolysis and normal cases. Two tRTA methods were used in the study: the RTA feature change from baseline at various follow-up intervals, and the slope of the best-fit line to the RTA data series. These tRTA methods included Fourier-based and fractal-based features calculated from digitized images of 202 total hip replacement cases, including 70 that developed osteolysis. Results show that separation between the osteolysis and normal groups increased over time for the feature-difference method, as the disease progressed, with area under the curve (AUC) values from receiver operating characteristic analysis of 0.65 to 0.72 at 15 years postsurgery. Separation for the slope method was also evident, with AUC values ranging from 0.65 to 0.76 for the task of distinguishing between osteolysis and normal cases. The results suggest that tRTA methods have the ability to measure changes in trabecular structure, and may be useful in the early detection of periprosthetic osteolysis.
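
    The slope method reduces to fitting a line to each patient's feature series and scoring the resulting slopes with ROC analysis; a sketch with hypothetical data (not the study's measurements):

```python
import numpy as np

def slope(series):
    """Least-squares slope of an RTA feature series against follow-up index."""
    t = np.arange(len(series), dtype=float)
    return np.polyfit(t, series, 1)[0]

def auc(pos, neg):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    wins = sum((p > neg).sum() + 0.5 * (p == neg).sum() for p in pos)
    return wins / (len(pos) * len(neg))

# Hypothetical per-patient feature series: rising texture features in
# osteolysis cases, flat in normals (illustrative only)
osteolysis_slopes = [slope(s) for s in ([1, 2, 4, 7], [0, 1, 3, 6])]
normal_slopes = [slope(s) for s in ([1, 1, 2, 1], [0, 1, 0, 1])]
```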

  11. New insight in quantitative analysis of vascular permeability during immune reaction (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Kalchenko, Vyacheslav; Molodij, Guillaume; Kuznetsov, Yuri; Smolyakov, Yuri; Israeli, David; Meglinski, Igor; Harmelin, Alon

    2016-03-01

    Fluorescence imaging of vascular permeability has become a gold standard for assessing inflammation during the experimental immune response in vivo, and optical fluorescence imaging provides a simple and useful tool for this purpose. The motivation comes from the need for robust, simple quantification and presentation of inflammation data based on vascular permeability. The change in fluorescence intensity as a function of time is a widely accepted measure of vascular permeability during inflammation related to the immune response. In the present study we bring a new dimension by applying a more sophisticated approach to the analysis of the vascular reaction: a quantitative analysis based on methods derived from astronomical observations, in particular space-time Fourier filtering followed by a polynomial orthogonal mode decomposition. We demonstrate that the temporal evolution of the fluorescence intensity observed at certain pixels correlates quantitatively with blood flow circulation under normal conditions. The approach allows determination of the regions of permeability and monitoring of both the fast kinetics related to the distribution of the contrast material in the circulatory system and the slow kinetics associated with its extravasation. Thus, we introduce a simple and convenient method for fast quantitative visualization of the leakage related to the inflammatory (immune) reaction in vivo.

  12. Local and global analysis of endocytic patch dynamics in fission yeast using a new “temporal superresolution” realignment method

    PubMed Central

    Berro, Julien; Pollard, Thomas D.

    2014-01-01

    Quantitative microscopy is a valuable tool for inferring molecular mechanisms of cellular processes such as clathrin-mediated endocytosis, but, for quantitative microscopy to reach its potential, both data collection and analysis needed improvement. We introduce new tools to track and count endocytic patches in fission yeast to increase the quality of the data extracted from quantitative microscopy movies. We present a universal method to achieve “temporal superresolution” by aligning temporal data sets with higher temporal resolution than the measurement intervals. These methods allowed us to extract new information about endocytic actin patches in wild-type cells from measurements of the fluorescence of fimbrin-mEGFP. We show that the time course of actin assembly and disassembly varies <600 ms between patches. Actin polymerizes during vesicle formation, but we show that polymerization does not participate in vesicle movement other than to limit the complex diffusive motions of newly formed endocytic vesicles, which move faster as the surrounding actin meshwork decreases in size over time. Our methods also show that the number of patches in fission yeast is proportional to cell length and that the variability in the repartition of patches between the tips of interphase cells has been underestimated. PMID:25143395
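
    One way to realize such temporal realignment is to estimate each trace's reference time with sub-sample precision and pool the shifted samples onto a common axis; this parabolic-interpolation sketch is one simple variant under my own assumptions, not necessarily the authors' algorithm.

```python
import numpy as np

def subsample_peak(y):
    """Sub-sample peak position by parabolic interpolation around the maximum."""
    i = int(np.argmax(y))
    if i == 0 or i == len(y) - 1:
        return float(i)
    a, b, c = y[i - 1], y[i], y[i + 1]
    return i + 0.5 * (a - c) / (a - 2 * b + c)

def realign(traces, dt):
    """Pool several traces on a common time axis after subtracting each
    trace's interpolated peak time; the pooled samples then have finer
    effective spacing than the acquisition interval dt."""
    pooled = []
    for y in traces:
        t0 = subsample_peak(y) * dt
        pooled += [(i * dt - t0, v) for i, v in enumerate(y)]
    return sorted(pooled)

# Synthetic example: the same Gaussian burst sampled with random
# sub-interval offsets, mimicking unsynchronized endocytic events
rng = np.random.default_rng(2)
dt, t = 1.0, np.arange(20.0)
traces = [np.exp(-0.5 * ((t - 10 - rng.uniform(-0.4, 0.4)) / 3) ** 2)
          for _ in range(5)]
aligned = realign(traces, dt)
```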

  13. Verus: A Tool for Quantitative Analysis of Finite-State Real-Time Systems.

    DTIC Science & Technology

    1996-08-12

    Reference fragments from the report: … and A. P. Sistla, "Automatic verification of finite-state concurrent systems using temporal logic specifications," ACM TOPLAS, 8(2):244–263, 1986; [6] E…, Proceedings of the 11th CHDL, 1993; [7] E. A. Emerson, A. K. Mok, A. P. Sistla, and J. Srinivasan, "Quantitative temporal reasoning," in Lecture Notes in …

  14. Temporal instability analysis of inviscid compound jets falling under gravity

    NASA Astrophysics Data System (ADS)

    Mohsin, Muhammad; Uddin, Jamal; Decent, Stephen P.; Afzaal, Muhammad F.

    2013-01-01

    Compound liquid jets can be used in a variety of industrial applications, ranging from capsule production in pharmaceutics to enhanced printing methods in ink-jet printing. An appreciation of how instability along compound jets can lead to breakup and droplet formation is thus critical in many fields of science and engineering. In this paper, we perform a theoretical analysis to examine the instability of an axisymmetric inviscid compound liquid jet that falls vertically under the influence of gravity. We use a long-wavelength, slender-jet asymptotic expansion to reduce the governing equations of the problem to a set of one-dimensional partial differential equations, which describe the evolution of the leading-order axial velocity of the jet as well as the radii of both the inner and the outer interfaces. We first determine the steady-state solutions of the one-dimensional model equations and then perform a linear temporal instability analysis to obtain a dispersion relation, which gives useful information about the maximum growth rate and the corresponding wavenumber of the imposed wave-like disturbance. We use our results to estimate the location and qualitative nature of breakup and then compare our results with numerical simulations.
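
    For the simpler case of a single inviscid jet, the classic long-wavelength (slender-jet) dispersion relation already shows how a maximum growth rate and a most-unstable wavenumber emerge; the compound-jet relation derived in the paper is more involved, so this is only an analogous sketch.

```python
import numpy as np

def growth_rate(ka, sigma=1.0, rho=1.0, a=1.0):
    """Long-wavelength growth rate of capillary instability on a single
    inviscid jet (1-D slender-jet model):
    omega^2 = sigma/(2*rho*a^3) * (ka)^2 * (1 - (ka)^2),
    unstable only for dimensionless wavenumbers ka < 1."""
    om2 = sigma / (2 * rho * a**3) * ka**2 * (1 - ka**2)
    return np.sqrt(np.maximum(om2, 0.0))

ka = np.linspace(0, 1.2, 1201)
k_max = ka[np.argmax(growth_rate(ka))]   # most unstable wavenumber
# In this model k_max is close to 1/sqrt(2); the fastest-growing mode
# sets the dominant drop spacing at breakup.
```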

  15. Multi Temporal InSAR Analysis in Northwestern Tibet

    NASA Astrophysics Data System (ADS)

    Daout, S.; Doin, M.-P.; Lasserre, C.; Peltzer, Gilles; Socquet, Anne

    2016-08-01

    We use Multi-Temporal Interferometric Synthetic Aperture Radar (MT-InSAR) to study hydrological and intracontinental deformation in northwestern Tibet, between longitudes 82.5°E and 87°E. We process the complete Envisat archive along three 800-km-long tracks extending from the Tarim basin to the central part of the Tibetan plateau. Interferograms are processed through a time series analysis with the NSBAS chain to provide average velocity maps. We face two main processing challenges. The first is related to the freeze-thaw cycles of the sedimentary basins on the highest part of the plateau. This seasonal signal is analyzed and extracted through a principal component analysis, which improves phase unwrapping across areas with high fringe rates. The second challenge is to correct atmospheric delays, enhanced by the 3 km topographic step from the Tarim basin to the Tibetan plateau. We analyze the atmospheric signal across this topographic feature and assess errors to improve the time series inversion and better discriminate tectonic deformation from seasonal signals. The velocity maps highlight the interaction between active thrusts in the Tarim basin and the left-lateral Altyn Tagh fault, along the western end of the Kunlun chain.
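
    A principal component analysis of pixel time series, as used here to isolate the seasonal signal, can be sketched with a plain SVD; the synthetic stack below stands in for real unwrapped-phase data.

```python
import numpy as np

def principal_components(X, n):
    """First n temporal principal components of X (pixels x epochs),
    computed by SVD after removing each pixel's temporal mean."""
    Xc = X - X.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:n], s[:n]            # temporal modes and their strengths

# Synthetic stack: a shared seasonal oscillation with per-pixel
# amplitude, plus noise (illustrative numbers only)
rng = np.random.default_rng(3)
t = np.arange(46)                                # e.g. 46 acquisition epochs
season = np.sin(2 * np.pi * t / 23.0)
X = (np.outer(rng.uniform(0.5, 2.0, 300), season)
     + 0.05 * rng.normal(size=(300, 46)))
modes, strengths = principal_components(X, 2)
# The leading mode recovers (up to sign) the shared seasonal signal.
```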

  16. A meta-analysis on progressive atrophy in intractable temporal lobe epilepsy: Time is brain?

    PubMed

    Caciagli, Lorenzo; Bernasconi, Andrea; Wiebe, Samuel; Koepp, Matthias J; Bernasconi, Neda; Bernhardt, Boris C

    2017-08-01

    It remains unclear whether drug-resistant temporal lobe epilepsy (TLE) is associated with cumulative brain damage, with no expert consensus and no quantitative syntheses of the available evidence. We conducted a systematic review and meta-analysis of MRI studies on progressive atrophy, searching PubMed and Ovid MEDLINE databases for cross-sectional and longitudinal quantitative MRI studies on drug-resistant TLE. We screened 2,976 records and assessed eligibility of 248 full-text articles. Forty-two articles met the inclusion criteria for quantitative evaluation. We observed a predominance of cross-sectional studies, use of different clinical indices of progression, and high heterogeneity in age-control procedures. Meta-analysis of 18/1 cross-sectional/longitudinal studies on hippocampal atrophy (n = 979 patients) yielded a pooled effect size of r = -0.42 for ipsilateral atrophy related to epilepsy duration (95% confidence interval [CI] -0.51 to -0.32; p < 0.0001; I² = 65.22%) and r = -0.35 related to seizure frequency (95% CI -0.47 to -0.22; p < 0.0001; I² = 61.97%). Sensitivity analyses did not change the results. Narrative synthesis of 25/3 cross-sectional/longitudinal studies on whole brain atrophy (n = 1,504 patients) indicated that >80% of articles reported duration-related progression in extratemporal cortical and subcortical regions. Detailed analysis of study design features yielded low to moderate levels of evidence for progressive atrophy across studies, mainly due to dominance of cross-sectional over longitudinal investigations, use of diverse measures of seizure estimates, and absence of consistent age control procedures. While the neuroimaging literature is overall suggestive of progressive atrophy in drug-resistant TLE, published studies have employed rather weak designs to directly demonstrate it. Longitudinal multicohort studies are needed to unequivocally differentiate aging from disease progression. © 2017 American Academy of Neurology.
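Pooled correlations and I² heterogeneity of this kind come from standard meta-analysis machinery, sketched below; the per-study r and n values are invented for illustration, not taken from the review.

```python
import numpy as np

# Inverse-variance pooling of correlations via Fisher's z, with Cochran's Q
# and I^2 heterogeneity. Study values below are illustrative assumptions.
r = np.array([-0.50, -0.35, -0.45, -0.30, -0.40])
n = np.array([60, 45, 80, 55, 70])

z = np.arctanh(r)             # Fisher z-transform of each study's r
w = n - 3                     # inverse-variance weights: Var(z) = 1/(n-3)
z_pool = np.sum(w * z) / np.sum(w)
r_pool = np.tanh(z_pool)      # back-transform pooled effect size

Q = np.sum(w * (z - z_pool) ** 2)                          # Cochran's Q
I2 = max(0.0, (Q - (len(r) - 1)) / Q) * 100 if Q > 0 else 0.0
print(round(r_pool, 3), round(I2, 1))
```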

  17. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability in the quantitative analysis associated with each sampling strategy to be assessed, and help define a proper number of replicates for future quantitative analyses.

  18. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
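The crossing of fragility curves with a hazard curve can be sketched in a few lines; the lognormal fragility parameters and the discretized hazard curve below are illustrative assumptions, not the paper's site-specific values.

```python
import numpy as np
from math import erf, sqrt, log

# Annual failure probability = sum over ground-motion bins of
#   P(PGA in bin per year) * P(loss of containment | PGA).
def fragility(pga, theta=0.4, beta=0.5):
    """P(loss of containment | PGA), lognormal with assumed median theta (g)."""
    return 0.5 * (1.0 + erf(log(pga / theta) / (beta * sqrt(2.0))))

pga_bins = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 0.8])      # PGA levels (g)
annual_p = np.array([2e-2, 8e-3, 2e-3, 8e-4, 2e-4, 4e-5])  # assumed PSHA output (1/yr)

p_fail = sum(p * fragility(g) for g, p in zip(pga_bins, annual_p))
print(f"{p_fail:.2e}")   # annual probability of a seismically induced release
```

The resulting failure frequency is what feeds the downstream consequence analysis and risk contour plots.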

  19. An improved quantitative analysis method for plant cortical microtubules.

    PubMed

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has so far been paid to the quantitative image analysis of plant cortical microtubules. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied to preprocess the original microtubule images. The Intrinsic Mode Function 1 (IMF1) image obtained from the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. In order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges while reducing noise, and that the geometrical character of the texture was obvious. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.
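A minimal version of the GLCM texture step can be written directly in NumPy; the image here is random noise standing in for an IMF1 image, and only two of the classic features (contrast, energy) are computed.

```python
import numpy as np

# Grey-level co-occurrence matrix for a horizontal one-pixel offset,
# plus two Haralick-style features of the kind used to compare textures.
def glcm(img, levels=8):
    q = (img * levels / (img.max() + 1e-9)).astype(int).clip(0, levels - 1)
    M = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):  # (pixel, right neighbour) pairs
        M[i, j] += 1
    return M / M.sum()                                     # normalize to probabilities

rng = np.random.default_rng(1)
img = rng.random((64, 64))           # stand-in for a decomposed microtubule image
P = glcm(img)
i, j = np.indices(P.shape)
contrast = np.sum(P * (i - j) ** 2)  # local intensity variation
energy = np.sum(P ** 2)              # texture uniformity
print(round(contrast, 3), round(energy, 4))
```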

  20. Quantitative analysis of synchrotron radiation intravenous angiographic images

    NASA Astrophysics Data System (ADS)

    Sarnelli, Anna; Nemoz, Christian; Elleaume, Hélène; Estève, François; Bertrand, Bernard; Bravin, Alberto

    2005-02-01

    A medical research protocol on clinical intravenous coronary angiography has been completed at the European Synchrotron Radiation Facility (ESRF) biomedical beamline. The aim was to investigate the accuracy of intravenous coronary angiography based on the K-edge digital subtraction technique for the detection of in-stent restenosis. For each patient, diagnosis was performed on the synchrotron radiation images and monitored with the conventional selective coronary angiography method taken as the gold standard. In this paper, the methods of image processing and the results of the quantitative analysis are described. Image processing includes beam harmonic contamination correction, spatial deconvolution and the extraction of a 'contrast' and a 'tissue' image from each pair of radiograms simultaneously acquired at energies bracketing the K-edge of iodine. Quantitative analysis includes the estimation of the vessel diameter, the calculation of the absolute iodine concentration profiles along the coronary arteries and the measurement of the degree of stenosis.

  1. Temporal Expectation and Information Processing: A Model-Based Analysis

    ERIC Educational Resources Information Center

    Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…

  3. Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces

    DTIC Science & Technology

    2012-03-01

    future MAC-enabled systems. A human-computer interaction (HCI) Index, originally applied to multi-function displays, was applied to the prototype Vigilant...Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and...two modified interface designs. The modified HCI Index incorporates the Hick-Hyman decision time, Fitts' Law time, and the physical actions
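The named components combine in a simple way; the sketch below shows one plausible form of such an index, with invented coefficients rather than the report's calibrated ones.

```python
from math import log2

# Hick-Hyman decision time plus Fitts' law movement time for one interaction
# step. Coefficients a, b are illustrative assumptions, not the report's values.
def hick_hyman(n_choices, b=0.15):
    """Decision time (s) for choosing among n equally likely options."""
    return b * log2(n_choices + 1)

def fitts(distance, width, a=0.05, b=0.1):
    """Movement time (s) to a target of given width at given distance (Shannon form)."""
    return a + b * log2(distance / width + 1)

# Example step: pick 1 of 7 controls, then click a 20 px target 300 px away.
t = hick_hyman(7) + fitts(300, 20)
print(round(t, 3))  # 0.9
```

Summing such terms over every step of a task gives a per-interface time estimate that can be compared quantitatively across designs.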

  4. Quantitative NMR Analysis of Partially Substituted Biodiesel Glycerols

    SciTech Connect

    Nagy, M.; Alleman, T. L.; Dyer, T.; Ragauskas, A. J.

    2009-01-01

    Phosphitylation of hydroxyl groups in biodiesel samples with 2-chloro-4,4,5,5-tetramethyl-1,3,2-dioxaphospholane followed by ³¹P-NMR analysis provides a rapid quantitative analytical technique for the determination of substitution patterns on partially esterified glycerols. The unique ³¹P-NMR chemical shift data were established with a series of mono- and di-substituted fatty acid esters of glycerol and then utilized to characterize an industrial sample of partially processed biodiesel.

  5. Quantitative and temporal differential recovery of articular and muscular limitations of knee joint contractures; results in a rat model.

    PubMed

    Trudel, Guy; Laneuville, Odette; Coletta, Elizabeth; Goudreau, Louis; Uhthoff, Hans K

    2014-10-01

    Joint contractures alter the mechanical properties of articular and muscular structures. Reversibility of a contracture depends on the restoration of the elasticity of both structures. We determined the differential contribution of articular and muscular structures to knee flexion contractures during spontaneous recovery. Rats (250, divided into 24 groups) had one knee joint surgically fixed in flexion for six different durations, from 1 to 32 wk, creating joint contractures of various severities. After the fixation was removed, the animals were left to spontaneously recover for 1 to 48 wk. After the recovery periods, animals were killed and the knee extension was measured before and after division of the transarticular posterior muscles using a motorized arthrometer. No articular limitation had developed in contracture of recent onset (≤2 wk of fixation, P > 0.05); muscular limitations were responsible for the majority of the contracture (34 ± 8° and 38 ± 6°, respectively; both P < 0.05). Recovery for 1 and 8 wk reversed the muscular limitation of contractures of recent onset (1 and 2 wk of fixation, respectively). Long-lasting contractures (≥4 wk of fixation) presented articular limitations, irreversible in all 12 durations of recovery compared with controls (all 12 P < 0.05). Knee flexion contractures of recent onset were primarily due to muscular structures, and they were reversible during spontaneous recovery. Long-lasting contractures were primarily due to articular structures and were irreversible. Comprehensive temporal and quantitative data on the differential reversibility of mechanically significant alterations in articular and muscular structures represent novel evidence on which to base clinical practice. Copyright © 2014 the American Physiological Society.

  6. a Temporal and Spatial Analysis of Urban Heat Island in Basin City Utilizing Remote Sensing Techniques

    NASA Astrophysics Data System (ADS)

    Chang, Hsiao-Tung

    2016-06-01

    Urban Heat Island (UHI) effects have become a key factor in the deterioration of the urban ecological environment. A spatial-temporal analysis of the UHI of a basin city, quantitatively evaluating the effect of rapid urbanization, provides a theoretical foundation for mitigating UHI effects. Based on Landsat 8, ETM+, and TM images of the Taipei basin area from 1990 to 2015, this article retrieves the land surface temperature (LST) at the summer solstice of each year and then analyses the spatial-temporal pattern and evolution of the UHI in the Taipei basin over this period. The results show that, as the built-up district expanded, the UHI area constantly spread from the city centre to the suburbs. In addition to higher temperatures in the city centre, the UHI of the Taipei basin also shows relatively high temperatures gathered along the boundaries at the foot of the surrounding mountains; we call this pattern a "sinking heat island". From 1990 to 2000, the higher-UHI areas differed markedly by land-use type, with obvious differences driven by public infrastructure works. In the following 15 years, up to 2015, the building density of the urban area increased gradually, and UHI intensity rose with urban land-use density; UHI hot spots in the Taipei basin show the same characteristics. The results suggest that anthropogenic heat release probably plays a significant role in the UHI effect, and must be considered in urban planning adaptation strategies.
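The first step of a Landsat 8 LST retrieval, converting thermal-band digital numbers to at-sensor brightness temperature, can be sketched as follows. The calibration constants are the published TIRS band-10 values; the DN array is synthetic, and emissivity and atmospheric corrections are omitted.

```python
import numpy as np

# Landsat 8 TIRS band-10 calibration (published metadata values).
ML, AL = 3.342e-4, 0.1                  # radiance rescaling gain/offset
K1, K2 = 774.8853, 1321.0789           # thermal conversion constants

dn = np.array([[20000, 22000], [24000, 26000]], dtype=float)  # toy DN values
radiance = ML * dn + AL                                       # W/(m^2 sr um)
bt_kelvin = K2 / np.log(K1 / radiance + 1.0)                  # brightness temperature (K)
bt_celsius = bt_kelvin - 273.15
print(np.round(bt_celsius, 1))
```

A full LST product would further correct for surface emissivity and atmospheric transmittance before any UHI mapping.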

  7. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies.

    PubMed

    Chau, Siu-Leung; Huang, Zhi-Bing; Song, Yan-Gang; Yue, Rui-Qi; Ho, Alan; Lin, Chao-Zhan; Huang, Wen-Hua; Han, Quan-Bin

    2016-08-19

    Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profiles. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); the saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, in which small molecules and monosaccharide/sucrose account for 0.18%-0.21% and 53.49%-58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the therapeutic efficacy and safety of complex Chinese medicine injections.

  8. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    PubMed

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-04

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates current available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide of feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery.

  9. Quantitative analysis of the heterogeneous population of endocytic vesicles.

    PubMed

    Kozlov, Konstantin; Kosheverova, Vera; Kamentseva, Rimma; Kharchenko, Marianna; Sokolkova, Alena; Kornilova, Elena; Samsonova, Maria

    2017-03-07

    The quantitative characterization of endocytic vesicles in images acquired with a microscope is critically important for deciphering endocytosis mechanisms. Image segmentation is the most important step of quantitative image analysis. In spite of the availability of many segmentation methods, accurate segmentation is challenging when the images are heterogeneous with respect to object shapes and signal intensities, which is typical of images of endocytic vesicles. We present a Morphological reconstruction and Contrast mapping segmentation method (MrComas) for the segmentation of the endocytic vesicle population that copes with this heterogeneity in shape and intensity. The method uses morphological opening and closing by reconstruction in the vicinity of local minima and maxima, respectively, thus creating strong contrast between their basins of attraction. As a consequence, the intensity is flattened within the objects and their edges are enhanced. The method accurately recovered quantitative characteristics of synthetic images that preserve characteristic features of the endocytic vesicle population. In benchmarks and quantitative comparisons with two other popular segmentation methods, namely manual thresholding and the Squash plugin, MrComas showed the best segmentation results on real biological images of EGFR (Epidermal Growth Factor Receptor) endocytosis. As a proof of feasibility, the method was applied to quantify the dynamical behavior of Early Endosomal Autoantigen 1 (EEA1)-positive endosome subpopulations during EGF-stimulated endocytosis.
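The reconstruction primitive underlying opening-by-reconstruction can be sketched with SciPy on a toy image; this illustrates the operator itself, not the published MrComas implementation.

```python
import numpy as np
from scipy import ndimage as ndi

# Greyscale reconstruction by dilation: iteratively dilate a marker image,
# clipping by the mask, until stable. Opening by reconstruction removes small
# bright noise while restoring larger objects to their original shape.
def reconstruct_by_dilation(marker, mask):
    prev = marker
    while True:
        cur = np.minimum(ndi.grey_dilation(prev, size=(3, 3)), mask)
        if np.array_equal(cur, prev):
            return cur
        prev = cur

img = np.zeros((16, 16))
img[4:8, 4:8] = 1.0      # a bright "vesicle"
img[10, 10] = 1.0        # a single-pixel noise spike

seed = ndi.grey_erosion(img, size=(3, 3))     # marker: eroded image
opened = reconstruct_by_dilation(seed, img)   # opening by reconstruction
print(opened[5, 5], opened[10, 10])           # blob kept, spike removed
```

The dual operation (closing by reconstruction, applied around local minima) flattens dark basins the same way, which is what produces the strong contrast the method exploits.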

  10. Multi-Temporal Analysis of WWII Reconnaissance Photos

    NASA Astrophysics Data System (ADS)

    Meixner, P.; Eckstein, M.

    2016-06-01

    There are millions of aerial photographs from the period of the Second World War available in the Allied archives, obtained by aerial photo reconnaissance and covering most of today's European countries. They span the time from 1938 until the end of the war and even beyond. Photo reconnaissance provided intelligence for the Allied headquarters and accompanied the bombing offensive against the German homeland and the occupied territories. One of the initial principal targets in Bohemia was the synthetic fuel works STW AG (Sudetenländische Treibstoffwerke AG) in Zaluzi (formerly Maltheuren) near Most (formerly Brüx), Czech Republic. The STW AG synthetic fuel plant was not only subject to bombing raids but also to quite intensive photo reconnaissance, long before the start of the bombing campaign. With a multi-temporal analysis of the available imagery from international archives we demonstrate the factory build-up during 1942 and 1943, the effects of the bombing raids in 1944, and the struggle to keep the plant working in the last year of the war. Furthermore, we show the impact the bombings have today, in the form of potential unexploded ordnance in the area adjacent to the open cast mines.

  11. A Spatial and Temporal Analysis of Upward Triggered Lightning

    NASA Astrophysics Data System (ADS)

    Ballweber, A. J.

    2013-12-01

    Alana Ballweber, John H. Helsdon Jr., and Tom A. Warner (South Dakota School of Mines and Technology)

    Ten tall communication towers lining the ridge in Rapid City, South Dakota, provide a unique opportunity to study the phenomenon of lightning-triggered upward lightning. The Upward Lightning Triggering Study (UPLIGHTS) seeks to determine if upward positive leaders are triggered from these towers by: (1) the approach of horizontally propagating negative stepped leaders associated with either intracloud development or following a positive cloud-to-ground (+CG) return stroke, and/or (2) a +CG return stroke as it propagates through a previously formed leader network near the towers. As part of the UPLIGHTS research, two separate lightning mapping devices were used to aid in a 3D re-creation of the triggering flash: a 3D digital interferometer and a Lightning Mapping Array. Using these two devices, we present findings based on the analysis of data collected during the 2013 storm season. Specifically, we quantify the spatial and temporal relationship of the triggering flash leader activity relative to the tall objects when upward leaders develop and when they fail to develop. Furthermore, the lightning mapping data were correlated with high-speed optical and electric field observations to provide further insight as to why certain flashes trigger upward lightning from tall structures and others do not.

  12. [Analysis of fusion waves created with temporal pacing].

    PubMed

    Ueda, M; Shigemi, K; Hayashi, K; Kakihara, K; Nakajima, Y; Fukushima, H; Tanaka, Y

    1998-08-01

    Fifty-five beats of fusion waves were recorded continuously on an audio digital tape, and the tape was replayed for analysis. A 45-year-old male (56 kg, 175 cm) with cervical spondylosis was scheduled to undergo laminoplasty of the cervical vertebrae (C2-C6). A temporary ventricular (VVI mode) pacing lead was inserted from the right cubital vein to the right ventricular apex to prevent bradycardia while manipulating the medulla. The height of the R wave decreased gradually and the depth of the S wave increased in the earlier period of the fusion beats, and this was reversed later. The narrow QRS width indicated that the electrode was placed near the cardiac conducting system. Gradually increasing intervals between P waves activated the pacing, and recovery of the P wave intervals inhibited the pacing. During the recovery phase, some beats were still activated by pacing instead of depressing the rate below the original rate. These beats suggest the importance of considering the atrial-ventricular conduction time. Arterial pressure fluctuated only slightly during the fusion beats, suggesting that despite the abnormality in the cardiac conduction system due to pacing, contraction of the ventricular muscles was only slightly affected in this case.

  13. Right ventricular strain analysis from three-dimensional echocardiography by using temporally diffeomorphic motion estimation

    PubMed Central

    Zhu, Meihua; Ashraf, Muhammad; Broberg, Craig S.; Sahn, David J.; Song, Xubo

    2014-01-01

    Purpose: Quantitative analysis of right ventricle (RV) motion is important for study of the mechanism of congenital and acquired diseases. Unlike the left ventricle (LV), motion estimation of the RV is more difficult because of its complex shape and thin myocardium. Although attempts at finite element models on MR images and speckle tracking on echocardiography have shown promising results for RV strain analysis, these methods can be improved since the temporal smoothness of the motion is not considered. Methods: The authors have proposed a temporally diffeomorphic motion estimation method in which a spatiotemporal transformation is estimated by optimization of a registration energy functional of the velocity field in their earlier work. The proposed motion estimation method is a fully automatic process for general image sequences. The authors apply the method, combined with a semiautomatic myocardium segmentation method, to the RV strain analysis of three-dimensional (3D) echocardiographic sequences of five open-chest pigs under different steady states. Results: The authors compare the peak two-point strains derived by their method with those estimated from the sonomicrometry; the results show that they are highly correlated. The motion of the right ventricular free wall is studied by using segmental strains. The baseline sequence results show that the segmental strains in their method are consistent with results obtained by other image modalities such as MRI. The image sequences of pacing steady states show that segments with the largest strain variation coincide with the pacing sites. Conclusions: The high correlation of the peak two-point strains of their method and sonomicrometry under different steady states demonstrates that their RV motion estimation has high accuracy. The closeness of the segmental strains of their method to those from MRI shows the feasibility of their method in the study of RV function by using 3D echocardiography. The strain analysis of the

  14. Right ventricular strain analysis from three-dimensional echocardiography by using temporally diffeomorphic motion estimation.

    PubMed

    Zhang, Zhijun; Zhu, Meihua; Ashraf, Muhammad; Broberg, Craig S; Sahn, David J; Song, Xubo

    2014-12-01

    Quantitative analysis of right ventricle (RV) motion is important for study of the mechanism of congenital and acquired diseases. Unlike the left ventricle (LV), motion estimation of the RV is more difficult because of its complex shape and thin myocardium. Although attempts at finite element models on MR images and speckle tracking on echocardiography have shown promising results for RV strain analysis, these methods can be improved since the temporal smoothness of the motion is not considered. The authors have proposed a temporally diffeomorphic motion estimation method in which a spatiotemporal transformation is estimated by optimization of a registration energy functional of the velocity field in their earlier work. The proposed motion estimation method is a fully automatic process for general image sequences. The authors apply the method, combined with a semiautomatic myocardium segmentation method, to the RV strain analysis of three-dimensional (3D) echocardiographic sequences of five open-chest pigs under different steady states. The authors compare the peak two-point strains derived by their method with those estimated from the sonomicrometry; the results show that they are highly correlated. The motion of the right ventricular free wall is studied by using segmental strains. The baseline sequence results show that the segmental strains in their method are consistent with results obtained by other image modalities such as MRI. The image sequences of pacing steady states show that segments with the largest strain variation coincide with the pacing sites. The high correlation of the peak two-point strains of their method and sonomicrometry under different steady states demonstrates that their RV motion estimation has high accuracy. The closeness of the segmental strains of their method to those from MRI shows the feasibility of their method in the study of RV function by using 3D echocardiography. The strain analysis of the pacing steady states shows the potential
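The "two-point strain" compared against sonomicrometry reduces to the fractional change in distance between two tracked myocardial points; a minimal sketch with made-up coordinates:

```python
import numpy as np

# Two tracked myocardial points at end-diastole and end-systole (mm);
# the coordinates are illustrative, not measured data.
p1_ed, p2_ed = np.array([0.0, 0.0, 0.0]), np.array([10.0, 0.0, 0.0])  # end-diastole
p1_es, p2_es = np.array([0.5, 0.2, 0.0]), np.array([8.9, 0.1, 0.0])   # end-systole

L0 = np.linalg.norm(p2_ed - p1_ed)     # reference length
L = np.linalg.norm(p2_es - p1_es)      # deformed length
strain = (L - L0) / L0                 # negative = shortening
print(round(100 * strain, 1))          # percent strain: -16.0
```

Peak values of this quantity over the cardiac cycle are what get correlated against the sonomicrometry crystal-pair measurements.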

  15. Spatio-Temporal Analysis of Suicide-Related Emergency Calls

    PubMed Central

    Gracia, Enrique; Lila, Marisol

    2017-01-01

    Considerable effort has been devoted to incorporating temporal trends in disease mapping. In this line, this work describes the importance of including the effect of seasonality in a particular setting related to suicides. In particular, the number of suicide-related emergency calls is modeled by means of an autoregressive approach to spatio-temporal disease mapping that allows for the possible interaction between temporal and spatial effects. Results show the importance of including a seasonality effect, as there are differences in the number of suicide-related emergency calls between the four seasons of each year. PMID:28684714

  16. Spatio-Temporal Analysis of Suicide-Related Emergency Calls.

    PubMed

    Marco, Miriam; López-Quílez, Antonio; Conesa, David; Gracia, Enrique; Lila, Marisol

    2017-07-06

    Considerable effort has been devoted to incorporating temporal trends in disease mapping. In this line, this work describes the importance of including the effect of seasonality in a particular setting related to suicides. In particular, the number of suicide-related emergency calls is modeled by means of an autoregressive approach to spatio-temporal disease mapping that allows for the possible interaction between temporal and spatial effects. Results show the importance of including a seasonality effect, as there are differences in the number of suicide-related emergency calls between the four seasons of each year.
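The seasonality term can be illustrated with a seasonal-dummy regression on simulated monthly counts, a simple stand-in for the full spatio-temporal autoregressive model; the data below are not the study's.

```python
import numpy as np

rng = np.random.default_rng(2)
months = np.arange(48)                      # four years of monthly counts
season = (months % 12) // 3                 # 0..3 = four seasons
lam = np.array([30, 40, 35, 25])[season]    # assumed true seasonal means
calls = rng.poisson(lam)                    # simulated call counts

# Design matrix: one indicator column per season; least squares then
# recovers the per-season mean call count.
X = np.eye(4)[season]
beta, *_ = np.linalg.lstsq(X, calls, rcond=None)
print(np.round(beta, 1))                    # estimated mean calls per season
```

Clear differences between the fitted seasonal means are the kind of signal that justifies keeping a seasonality effect in the disease-mapping model.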

  17. Spatial-temporal analysis of fetal bio-magnetic signals.

    PubMed

    Soni, P; Chan, Y; Eswaran, H; Wilson, J D; Murphy, P; Lowery, C L

    2007-05-15

    Non-invasive techniques such as magneto-encephalography (MEG), initially pioneered to study human brain signals, have found many other applications in medicine. The SQUID Array for Reproductive Assessment (SARA) is a unique non-invasive scanning device developed at the University of Arkansas for Medical Sciences (UAMS) that can detect fetal brain and other signals. The fetal magneto-encephalography (fMEG) signals often have many other bio-magnetic signals mixed in, for example from movement of the fetus or muscle contractions of the mother. As a result, the recorded signals may show unexpected patterns other than the target signal of interest. These "interventions" make it difficult for a physician to assess the exact fetal condition, including its response to various stimuli. We propose using intervention analysis and spatial-temporal auto-regressive moving-average (STARMA) modeling to address the problem. STARMA is a statistical method that models the current observations as a linear combination of past observations as well as observations at neighboring sensors. Through intervention analysis, the change in a pattern due to "interfering" signals can be accounted for. When these interferences are "removed," the end product is a "template" time series, or a typical signal from the target of interest. In this research, a "universal" template is obtained. The template is then used to detect interventions in other datasets by the method of template matching. By this method, it is possible to detect whether there is an intervention in any dataset, assisting physicians in monitoring the actual signals generated by the fetal brain and other organs of interest.
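A first-order space-time autoregression, the building block of the STARMA family, can be fitted by ordinary least squares; the sketch below uses synthetic sensor data and an assumed ring neighbourhood, not SARA recordings.

```python
import numpy as np

# STAR(1) model: x_s(t) = a*x_s(t-1) + b*mean_of_neighbours(t-1) + noise.
rng = np.random.default_rng(3)
n_sensors, T = 5, 500
a_true, b_true = 0.5, 0.3
x = np.zeros((n_sensors, T))
for t in range(1, T):
    nb = (np.roll(x[:, t - 1], 1) + np.roll(x[:, t - 1], -1)) / 2  # ring neighbours
    x[:, t] = a_true * x[:, t - 1] + b_true * nb + 0.1 * rng.standard_normal(n_sensors)

# Stack (own lag, neighbour lag) regressors and solve by least squares.
own = x[:, :-1].ravel()
nbr = ((np.roll(x, 1, axis=0) + np.roll(x, -1, axis=0)) / 2)[:, :-1].ravel()
y = x[:, 1:].ravel()
A = np.column_stack([own, nbr])
(a_hat, b_hat), *_ = np.linalg.lstsq(A, y, rcond=None)
print(round(a_hat, 2), round(b_hat, 2))   # should be near 0.5 and 0.3
```

Large residuals from such a fitted model are one way to flag the "interventions" that the template-matching step then isolates.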

  18. Quantitative Phosphoproteomic Analysis of T-Cell Receptor Signaling.

    PubMed

    Ahsan, Nagib; Salomon, Arthur R

    2017-01-01

    TCR signaling critically depends on protein phosphorylation across many proteins. Localization of each phosphorylation event relative to the T-cell receptor (TCR) and canonical T-cell signaling proteins will provide clues about the structure of TCR signaling networks. Quantitative phosphoproteomic analysis by mass spectrometry provides a wide-scale view of cellular phosphorylation networks. However, analysis of phosphorylation by mass spectrometry is still challenging due to the low abundance of phosphorylated proteins relative to all proteins and the extraordinary diversity of phosphorylation sites across the proteome. Highly selective enrichment of phosphorylated peptides is essential to provide the most comprehensive view of the phosphoproteome. Optimization of phosphopeptide enrichment methods coupled with highly sensitive mass spectrometry workflows significantly improves the sequencing depth of the phosphoproteome to over 10,000 unique phosphorylation sites from complex cell lysates. Here we describe a step-by-step method for phosphoproteomic analysis that has achieved widespread success for identification of serine, threonine, and tyrosine phosphorylation. Reproducible quantification of relative phosphopeptide abundance is provided by intensity-based label-free quantitation. An ideal set of mass spectrometry analysis parameters that optimizes the yield of identified sites is also provided. We also provide guidelines for the bioinformatic analysis of this type of data to assess the quality of the data and to comply with proteomic data reporting requirements.
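
    As a toy illustration of the intensity-based label-free quantitation mentioned above, the following sketch computes log2 fold changes from a hypothetical phosphopeptide intensity matrix. All values are invented; a real pipeline would also normalize runs and assess statistical significance:

```python
import numpy as np

# Hypothetical phosphopeptide intensity matrix: rows = phosphopeptides,
# columns = LC-MS runs (condition A in runs 0-1, condition B in runs 2-3).
intensities = np.array([
    [1.0e6, 1.1e6, 4.2e6, 3.8e6],  # phosphopeptide up-regulated in B
    [5.0e5, 5.2e5, 5.1e5, 4.9e5],  # unchanged
    [6.0e6, 6.2e6, 1.5e6, 1.6e6],  # down-regulated in B
])

# Intensity-based label-free quantitation in its simplest form:
# log2 fold change of mean condition-B intensity over condition A.
mean_a = intensities[:, :2].mean(axis=1)
mean_b = intensities[:, 2:].mean(axis=1)
log2fc = np.log2(mean_b / mean_a)
print(np.round(log2fc, 2))  # prints [ 1.93 -0.03 -1.98]
```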

  19. A Quantitative Method for Microtubule Analysis in Fluorescence Images.

    PubMed

    Lan, Xiaodong; Li, Lingfei; Hu, Jiongyu; Zhang, Qiong; Dang, Yongming; Huang, Yuesheng

    2015-12-01

    Microtubule analysis is of significant value for a better understanding of normal and pathological cellular processes. Although immunofluorescence microscopic techniques have proven useful in the study of microtubules, comparative results commonly rely on a descriptive and subjective visual analysis. We developed an objective and quantitative method based on image processing and analysis of fluorescently labeled microtubular patterns in cultured cells. We used a multi-parameter approach by analyzing four quantifiable characteristics to compose our quantitative feature set. Then we interpreted specific changes in the parameters and revealed the contribution of each feature set using principal component analysis. In addition, we verified that different treatment groups could be clearly discriminated using principal components of the multi-parameter model. High predictive accuracy of four commonly used multi-classification methods confirmed our method. These results demonstrated the effectiveness and efficiency of our method in the analysis of microtubules in fluorescence images. Application of the analytical methods presented here provides information concerning the organization and modification of microtubules, and could aid in the further understanding of structural and functional aspects of microtubules under normal and pathological conditions.

  20. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    ERIC Educational Resources Information Center

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  1. Cell poking: quantitative analysis of indentation of thick viscoelastic layers.

    PubMed

    Duszyk, M; Schwab, B; Zahalak, G I; Qian, H; Elson, E L

    1989-04-01

    A recently introduced device, the cell poker, measures the force required to indent the exposed surface of a cell adherent to a rigid substratum. The cell poker has provided phenomenological information about the viscoelastic properties of several different types of cells, about mechanical changes triggered by external stimuli, and about the role of the cytoskeleton in these mechanical functions. Except in special cases, however, it has not been possible to extract quantitative estimates of viscosity and elasticity moduli from cell poker measurements. This paper presents cell poker measurements of well characterized viscoelastic polymeric materials, polydimethylsiloxanes of different degrees of polymerization, in a simple shape, a flat, thick layer, which for our purposes can be treated as a half space. Analysis of the measurements in terms of a linear viscoelasticity theory yields viscosity values for three polymer samples in agreement with those determined by measurements on a macroscopic scale. Theoretical analysis further indicates that the measured limiting static elasticity of the layers may result from the tension generated at the interface between the polymer and water. This work demonstrates the possibility of obtaining quantitative viscoelastic material properties from cell poker measurements and represents the first step in extending these quantitative studies to more complicated structures including cells.

  2. Cell poking: quantitative analysis of indentation of thick viscoelastic layers.

    PubMed Central

    Duszyk, M; Schwab, B; Zahalak, G I; Qian, H; Elson, E L

    1989-01-01

    A recently introduced device, the cell poker, measures the force required to indent the exposed surface of a cell adherent to a rigid substratum. The cell poker has provided phenomenological information about the viscoelastic properties of several different types of cells, about mechanical changes triggered by external stimuli, and about the role of the cytoskeleton in these mechanical functions. Except in special cases, however, it has not been possible to extract quantitative estimates of viscosity and elasticity moduli from cell poker measurements. This paper presents cell poker measurements of well characterized viscoelastic polymeric materials, polydimethylsiloxanes of different degrees of polymerization, in a simple shape, a flat, thick layer, which for our purposes can be treated as a half space. Analysis of the measurements in terms of a linear viscoelasticity theory yields viscosity values for three polymer samples in agreement with those determined by measurements on a macroscopic scale. Theoretical analysis further indicates that the measured limiting static elasticity of the layers may result from the tension generated at the interface between the polymer and water. This work demonstrates the possibility of obtaining quantitative viscoelastic material properties from cell poker measurements and represents the first step in extending these quantitative studies to more complicated structures including cells. PMID:2720066

  3. Quantitative Assessment of Spatio-Temporal Desertification Rates in Azerbaijan Using Time-Series Landsat-8 Satellite Images

    NASA Astrophysics Data System (ADS)

    Bayramov, Emil; Mammadov, Ramiz

    2016-07-01

    The main goals of this research are the object-based land-cover classification of LANDSAT-8 multi-spectral satellite images in 2014 and 2015, quantification of Normalized Difference Vegetation Index (NDVI) rates within the land-cover classes, change detection analysis between the NDVIs derived from multi-temporal LANDSAT-8 satellite images, quantification of those changes within the land-cover classes, and detection of changes between land-cover classes. The object-based classification accuracy of the land-cover classes was validated through the standard confusion matrix, which revealed 80% land-cover classification accuracy for both years. The analysis revealed that the area of agricultural lands increased from 30911 sq. km. in 2014 to 31999 sq. km. in 2015. The area of barelands increased from 3933 sq. km. in 2014 to 4187 sq. km. in 2015. The area of forests increased from 8211 sq. km. in 2014 to 9175 sq. km. in 2015. The area of grasslands decreased from 27176 sq. km. in 2014 to 23294 sq. km. in 2015. The area of urban areas increased from 12479 sq. km. in 2014 to 12956 sq. km. in 2015. The decrease in the area of grasslands was mainly explained by landuse shifts of grasslands to agricultural and urban lands. The quantification of low and medium NDVI rates revealed an increase within the agricultural, urban and forest land-cover classes in 2015. However, the high NDVI rates within the agricultural, urban and forest land-cover classes in 2015 proved to be lower relative to 2014. The change detection analysis between land-cover types of 2014 and 2015 allowed us to determine that 7740 sq. km. of grasslands shifted to the agricultural land-cover type whereas 5442 sq. km. of agricultural lands shifted to rangelands. This means that the spatio-temporal patterns of agricultural activity changed in Azerbaijan, because some areas reduced agricultural activities whereas others changed their landuse type to agricultural. 
Based on the achieved results, it
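
    The NDVI computation and differencing at the core of this kind of change detection can be sketched as follows; the reflectance grids and the 0.2 change threshold are illustrative, not values from the study:

```python
import numpy as np

# NDVI differencing sketch for two co-registered scenes, assuming red
# (Landsat-8 band 4) and near-infrared (band 5) surface reflectance
# arrays; the 2x2 grids below are toy data.
def ndvi(nir, red):
    # NDVI = (NIR - Red) / (NIR + Red), bounded in [-1, 1]
    return (nir - red) / (nir + red)

red_2014 = np.array([[0.10, 0.12], [0.30, 0.11]])
nir_2014 = np.array([[0.40, 0.45], [0.32, 0.50]])
red_2015 = np.array([[0.09, 0.25], [0.28, 0.10]])
nir_2015 = np.array([[0.42, 0.27], [0.33, 0.52]])

delta = ndvi(nir_2015, red_2015) - ndvi(nir_2014, red_2014)

# Pixels with strongly negative change are candidate vegetation-loss
# areas (e.g. grassland converted to urban or bare land).
loss_mask = delta < -0.2
print(int(loss_mask.sum()))  # prints 1
```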

  4. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  6. Time-course analysis of temporal preparation on central processes.

    PubMed

    Leonhard, Tanja; Bratzke, Daniel; Schröter, Hannes; Ulrich, Rolf

    2012-03-01

    Participants usually respond faster to a response signal (RS) when this signal is preceded by a warning stimulus than when it is not. A question of theoretical importance is the locus of this facilitating effect within the information processing stream. Recently, Los and Schut (Cogn Psychol 57:20-55, 2008) suggested that temporal preparation acts on central processes while perception of the RS is under way. The present study provides a stochastic model (central preparation model, CPM) based on this hypothesis and presents three experiments testing this model. To track the complete time-course of temporal preparation, the warning signal could either precede or follow the RS. The data show some systematic deviation from the model's predictions, questioning CPM's assumption that temporal preparation acts only on central processes. An alternative mechanism of temporal preparation based on the parallel grains model [Miller and Ulrich (Cogn Psychol 46:101-151, 2003)] is discussed.

  7. Quantitative analysis of endocytosis with cytoplasmic pHluorin chimeras.

    PubMed

    Prosser, Derek C; Whitworth, Karen; Wendland, Beverly

    2010-09-01

    The pH-sensitive green fluorescent protein (GFP) variant pHluorin is typically fused to the extracellular domain of transmembrane proteins to monitor endocytosis. Here, we have turned pHluorin inside-out, and show that cytoplasmic fusions of pHluorin are effective quantitative reporters for endocytosis and multivesicular body (MVB) sorting. In yeast in particular, fusion of GFP and its variants on the extracellular side of transmembrane proteins can result in perturbed trafficking. In contrast, cytoplasmic fusions are well tolerated, allowing for the quantitative assessment of trafficking of virtually any transmembrane protein. Quenching of degradation-resistant pHluorin in the acidic vacuole permits quantification of extravacuolar cargo proteins at steady-state levels and is compatible with kinetic analysis of endocytosis in live cells.

  8. Common and distinct neural correlates of personal and vicarious reward: A quantitative meta-analysis

    PubMed Central

    Morelli, Sylvia A.; Sacchet, Matthew D.; Zaki, Jamil

    2015-01-01

    Individuals experience reward not only when directly receiving positive outcomes (e.g., food or money), but also when observing others receive such outcomes. This latter phenomenon, known as vicarious reward, is a perennial topic of interest among psychologists and economists. More recently, neuroscientists have begun exploring the neuroanatomy underlying vicarious reward. Here we present a quantitative whole-brain meta-analysis of this emerging literature. We identified 25 functional neuroimaging studies that included contrasts between vicarious reward and a neutral control, and subjected these contrasts to an activation likelihood estimate (ALE) meta-analysis. This analysis revealed a consistent pattern of activation across studies, spanning structures typically associated with the computation of value (especially ventromedial prefrontal cortex) and mentalizing (including dorsomedial prefrontal cortex and superior temporal sulcus). We further quantitatively compared this activation pattern to activation foci from a previous meta-analysis of personal reward. Conjunction analyses yielded overlapping VMPFC activity in response to personal and vicarious reward. Contrast analyses identified preferential engagement of the nucleus accumbens in response to personal as compared to vicarious reward, and in mentalizing-related structures in response to vicarious as compared to personal reward. These data shed light on the common and unique components of the reward that individuals experience directly and through their social connections. PMID:25554428

  9. Common and distinct neural correlates of personal and vicarious reward: A quantitative meta-analysis.

    PubMed

    Morelli, Sylvia A; Sacchet, Matthew D; Zaki, Jamil

    2015-05-15

    Individuals experience reward not only when directly receiving positive outcomes (e.g., food or money), but also when observing others receive such outcomes. This latter phenomenon, known as vicarious reward, is a perennial topic of interest among psychologists and economists. More recently, neuroscientists have begun exploring the neuroanatomy underlying vicarious reward. Here we present a quantitative whole-brain meta-analysis of this emerging literature. We identified 25 functional neuroimaging studies that included contrasts between vicarious reward and a neutral control, and subjected these contrasts to an activation likelihood estimate (ALE) meta-analysis. This analysis revealed a consistent pattern of activation across studies, spanning structures typically associated with the computation of value (especially ventromedial prefrontal cortex) and mentalizing (including dorsomedial prefrontal cortex and superior temporal sulcus). We further quantitatively compared this activation pattern to activation foci from a previous meta-analysis of personal reward. Conjunction analyses yielded overlapping VMPFC activity in response to personal and vicarious reward. Contrast analyses identified preferential engagement of the nucleus accumbens in response to personal as compared to vicarious reward, and in mentalizing-related structures in response to vicarious as compared to personal reward. These data shed light on the common and unique components of the reward that individuals experience directly and through their social connections. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. [Simultaneous quantitative analysis of four lignanoids in Schisandra chinensis by quantitative analysis of multi-components by single marker].

    PubMed

    He, Feng-Cheng; Li, Shou-Xin; Zhao, Zhi-Quan; Dong, Jin-Ping; Liu, Wu-Zhan; Su, Rui-Qiang

    2012-07-01

    The aim of the study is to establish a new method of quality evaluation and validate its feasibility through the simultaneous quantitative assay of four lignanoids in Schisandra chinensis. A new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with Schisandra chinensis. Four main lignanoids, schisandrin, schisantherin A, deoxyschizandrin and gamma-schizandrin, were selected as analytes, with schisandrin as the internal reference substance, to evaluate the quality. Their contents in 13 different batches of samples were determined by both the external standard method and QAMS. The method was evaluated by comparison of the quantitative results between the external standard method and QAMS. No significant differences were found in the quantitative results of the four lignanoids in the 13 batches of S. chinensis determined by the external standard method and QAMS. QAMS is feasible for the simultaneous determination of the four lignanoids when some authentic standard substances are unavailable, and the developed method can be used for quality control of S. chinensis.
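
    A minimal sketch of the QAMS calculation, assuming the standard relative-correction-factor formulation; all peak areas and concentrations are invented for illustration:

```python
# QAMS sketch assuming the usual relative correction factor (RCF)
# formulation; peak areas (A, arbitrary units) and concentrations
# (C, mg/mL) are made-up numbers, not values from the study.

def correction_factor(A_s, C_s, A_k, C_k):
    # f_k = (A_s / C_s) / (A_k / C_k), measured once on mixed standards
    return (A_s / C_s) / (A_k / C_k)

# From a mixed standard of schisandrin (s) and schisantherin A (k):
f_schisantherin_A = correction_factor(A_s=1200.0, C_s=0.10, A_k=900.0, C_k=0.10)

# In a sample, only the single marker (schisandrin) needs its own
# calibration; the other lignanoid is quantified via the factor:
#     C_k = f_k * A_k * (C_s / A_s)
A_s_sample, C_s_sample = 2400.0, 0.20
A_k_sample = 1500.0
C_k_sample = f_schisantherin_A * A_k_sample * (C_s_sample / A_s_sample)
print(round(C_k_sample, 4))  # prints 0.1667
```

    This is why QAMS needs only one authentic standard at assay time: the correction factors are determined once and reused.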

  11. Qualitative and temporal reasoning in engine behavior analysis

    NASA Technical Reports Server (NTRS)

    Dietz, W. E.; Stamps, M. E.; Ali, M.

    1987-01-01

    Numerical simulation models, engine experts, and experimental data are used to generate qualitative and temporal representations of abnormal engine behavior. Engine parameters monitored during operation are used to generate qualitative and temporal representations of actual engine behavior. Similarities between the representations of failure scenarios and the actual engine behavior are used to diagnose fault conditions which have already occurred, or are about to occur; to increase the surveillance by the monitoring system of relevant engine parameters; and to predict likely future engine behavior.

  13. Quantitative 3D analysis of huge nanoparticle assemblies

    NASA Astrophysics Data System (ADS)

    Zanaga, Daniele; Bleichrodt, Folkert; Altantzis, Thomas; Winckelmans, Naomi; Palenstijn, Willem Jan; Sijbers, Jan; de Nijs, Bart; van Huis, Marijn A.; Sánchez-Iglesias, Ana; Liz-Marzán, Luis M.; van Blaaderen, Alfons; Joost Batenburg, K.; Bals, Sara; van Tendeloo, Gustaaf

    2015-12-01

    Nanoparticle assemblies can be investigated in 3 dimensions using electron tomography. However, it is not straightforward to obtain quantitative information such as the number of particles or their relative position. This becomes particularly difficult when the number of particles increases. We propose a novel approach in which prior information on the shape of the individual particles is exploited. It improves the quality of the reconstruction of these complex assemblies significantly. Moreover, this quantitative Sparse Sphere Reconstruction approach yields directly the number of particles and their position as an output of the reconstruction technique, enabling a detailed 3D analysis of assemblies with as many as 10 000 particles. The approach can also be used to reconstruct objects based on a very limited number of projections, which opens up possibilities to investigate beam sensitive assemblies where previous reconstructions with the available electron tomography techniques failed.

  14. Quantitative MRI for analysis of peritumoral edema in malignant gliomas

    PubMed Central

    Warntjes, J. B. Marcel; Smedby, Örjan; Lundberg, Peter

    2017-01-01

    Background and purpose Damage to the blood-brain barrier with subsequent contrast enhancement is a hallmark of glioblastoma. Non-enhancing tumor invasion into the peritumoral edema is, however, not usually visible on conventional magnetic resonance imaging. New quantitative techniques using relaxometry offer additional information about tissue properties. The aim of this study was to evaluate longitudinal relaxation R1, transverse relaxation R2, and proton density in the peritumoral edema in a group of patients with malignant glioma before surgery to assess whether relaxometry can detect changes not visible on conventional images. Methods In a prospective study, 24 patients with suspected malignant glioma were examined before surgery. A standard MRI protocol was used with the addition of a quantitative MR method (MAGIC), which measured R1, R2, and proton density. The diagnosis of malignant glioma was confirmed after biopsy/surgery. In 19 patients synthetic MR images were then created from the MAGIC scan, and ROIs were placed in the peritumoral edema to obtain the quantitative values. Dynamic susceptibility contrast perfusion was used to obtain cerebral blood volume (rCBV) data of the peritumoral edema. Voxel-based statistical analysis was performed using a mixed linear model. Results R1, R2, and rCBV decrease with increasing distance from the contrast-enhancing part of the tumor. There is a significant increase in R1 gradient after contrast agent injection (P < .0001). There is a heterogeneous pattern of relaxation values in the peritumoral edema adjacent to the contrast-enhancing part of the tumor. Conclusion Quantitative analysis with relaxometry of peritumoral edema in malignant gliomas detects tissue changes not visualized on conventional MR images. The finding of decreasing R1 and R2 means shorter relaxation times closer to the tumor, which could reflect tumor invasion into the peritumoral edema. However, these findings need to be validated in the future. 

  15. Quantitative analysis of in vivo confocal microscopy images: a review.

    PubMed

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software packages are available, each with its own advantages and disadvantages. Criteria for analyzing corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definition. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  16. Quantitative analysis of agricultural land use change in China

    NASA Astrophysics Data System (ADS)

    Chou, Jieming; Dong, Wenjie; Wang, Shuyu; Fu, Yuqing

    This article reviews the potential impacts of climate change on land use change in China. Crop sown area is used as an index to quantitatively analyze the temporal-spatial changes and the utilization of agricultural land. A new concept, the potential multiple cropping index, is defined to reflect potential sowing ability. The impacting mechanism, land use status, and surplus capacity are investigated as well. The main conclusions are as follows: During 1949-2010, agricultural land was greatest in amount in the middle of China, followed by that in the country's eastern and western regions. The most rapid increase and decrease of agricultural land were observed in Xinjiang and North China, respectively; Northwest China and South China also changed rapidly. The variation trend before 1980 differed significantly from that after 1980. Agricultural land was affected by both natural and social factors, such as regional climate and environmental changes, population growth, economic development, and the implementation of policies. In this paper, the effects of temperature and urbanization on the coverage of agricultural land are evaluated, and the results show that urbanization greatly affects the amount of agricultural land in South China, Northeast China, Xinjiang, and Southwest China. From 1980 to 2009, the extent of agricultural land use increased as the surplus capacity decreased. Still, a large remaining potential space is available, but the future utilization of agricultural land should be carried out with scientific planning and management for sustainable development.

  17. Quantitative Analysis of Chromosome Condensation in Fission Yeast

    PubMed Central

    Petrova, Boryana; Dehler, Sascha; Kruitwagen, Tom; Hériché, Jean-Karim; Miura, Kota

    2013-01-01

    Chromosomes undergo extensive conformational rearrangements in preparation for their segregation during cell divisions. Insights into the molecular mechanisms behind this still poorly understood condensation process require the development of new approaches to quantitatively assess chromosome formation in vivo. In this study, we present a live-cell microscopy-based chromosome condensation assay in the fission yeast Schizosaccharomyces pombe. By automatically tracking the three-dimensional distance changes between fluorescently marked chromosome loci at high temporal and spatial resolution, we analyze chromosome condensation during mitosis and meiosis and deduce defined parameters to describe condensation dynamics. We demonstrate that this method can determine the contributions of condensin, topoisomerase II, and Aurora kinase to mitotic chromosome condensation. We furthermore show that the assay can identify proteins required for mitotic chromosome formation de novo by isolating mutants in condensin, DNA polymerase ε, and F-box DNA helicase I that are specifically defective in pro-/metaphase condensation. Thus, the chromosome condensation assay provides a direct and sensitive system for the discovery and characterization of components of the chromosome condensation machinery in a genetically tractable eukaryote. PMID:23263988

  18. Quantitative analysis of chromosome condensation in fission yeast.

    PubMed

    Petrova, Boryana; Dehler, Sascha; Kruitwagen, Tom; Hériché, Jean-Karim; Miura, Kota; Haering, Christian H

    2013-03-01

    Chromosomes undergo extensive conformational rearrangements in preparation for their segregation during cell divisions. Insights into the molecular mechanisms behind this still poorly understood condensation process require the development of new approaches to quantitatively assess chromosome formation in vivo. In this study, we present a live-cell microscopy-based chromosome condensation assay in the fission yeast Schizosaccharomyces pombe. By automatically tracking the three-dimensional distance changes between fluorescently marked chromosome loci at high temporal and spatial resolution, we analyze chromosome condensation during mitosis and meiosis and deduce defined parameters to describe condensation dynamics. We demonstrate that this method can determine the contributions of condensin, topoisomerase II, and Aurora kinase to mitotic chromosome condensation. We furthermore show that the assay can identify proteins required for mitotic chromosome formation de novo by isolating mutants in condensin, DNA polymerase ε, and F-box DNA helicase I that are specifically defective in pro-/metaphase condensation. Thus, the chromosome condensation assay provides a direct and sensitive system for the discovery and characterization of components of the chromosome condensation machinery in a genetically tractable eukaryote.
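
    The distance-tracking step at the core of this assay can be sketched as follows, with synthetic 3D locus coordinates standing in for the spot-detection output of the microscopy pipeline:

```python
import numpy as np

# Synthetic 3D coordinates (micrometers) for two fluorescently marked
# loci over 10 frames; the drift rate is an arbitrary assumption made
# to mimic condensation-driven compaction.
n_frames = 10
t = np.arange(n_frames)

locus_a = np.zeros((n_frames, 3))              # reference locus at origin
locus_b = np.column_stack([2.0 - 0.15 * t,     # drifts toward locus A
                           np.ones(n_frames),
                           np.zeros(n_frames)])

# Per-frame Euclidean distance between the two marked loci.
dist = np.linalg.norm(locus_b - locus_a, axis=1)

# A simple condensation parameter: relative distance change from the
# first to the last frame (negative values indicate compaction).
rel_change = (dist[-1] - dist[0]) / dist[0]
print(round(float(rel_change), 3))
```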

  19. Drought analysis in Switzerland: spatial and temporal features

    NASA Astrophysics Data System (ADS)

    Di Franca, Gaetano; Molnar, Peter; Burlando, Paolo; Bonaccorso, Brunella; Cancelliere, Antonino

    2015-04-01

    Drought as a natural hazard may have negative impacts even in regions characterized by a general abundance of water resources. The Swiss Alpine region has experienced several extreme meteorological events (heat waves, droughts) during the last fifty years that have caused human and economic losses. Though the Swiss climate is far from arid or semi-arid, natural climatic variability, exacerbated by climate change, could lead to more severe impacts from naturally occurring meteorological droughts (i.e. a lack or significant reduction of precipitation) in the future. In this work, spatial and temporal features of meteorological droughts in Switzerland have been explored through the identification and probabilistic characterization of historic drought events on gridded precipitation data for the period 1961-2012. The run method has been applied to both monthly and annual precipitation time series to probabilistically characterize drought occurrences as well as to analyze their spatial variability. Spatial features have also been investigated by means of Principal Component Analysis (PCA) applied to Standardized Precipitation Index (SPI) series at the 3-, 6-, and 12-month aggregation time scales, in order to detect areas with distinct precipitation patterns, accounting for seasonality throughout the year and including both wet and dry conditions. Furthermore, a probabilistic analysis of drought areal extent has been carried out by applying an SPI-based procedure to derive Severity-Area-Frequency (SAF) curves. The application of the run method reveals that Ticino and Valais are the most potentially drought-prone Swiss regions, since the accumulated precipitation deficit is significantly higher (up to two times) than in the rest of the country. Inspection of the SPI series reveals many events in which precipitation showed significant anomalies from the average in the period 1961-2012 at the investigated time scales. 
Anomalies in rainfall seem to exhibit high spatial correlation, showing uniform sub
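
A minimal sketch of the SPI computation underlying the analysis above. It is simplified: the full SPI fits a gamma distribution to the aggregated precipitation before mapping through the standard normal, whereas plain standardization stands in for that here, and the monthly values are invented.

```python
from statistics import mean, stdev

def spi_simplified(precip, k=3):
    """Simplified SPI at a k-month aggregation scale: sum precipitation
    over a sliding k-month window, then standardize the aggregates.
    (A full SPI fits a gamma distribution to the aggregates first.)"""
    agg = [sum(precip[i - k + 1:i + 1]) for i in range(k - 1, len(precip))]
    m, s = mean(agg), stdev(agg)
    return [(a - m) / s for a in agg]

# Four years of illustrative monthly precipitation (mm); negative SPI
# values flag drier-than-average 3-month windows.
monthly = [60, 80, 20, 10, 5, 90, 100, 70, 30, 15, 55, 65] * 4
spi3 = spi_simplified(monthly, k=3)
```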

  20. Match analysis and temporal patterns of fatigue in rugby sevens.

    PubMed

    Granatelli, Giampietro; Gabbett, Tim J; Briotti, Gianluca; Padulo, Johnny; Buglione, Antonio; D'Ottavio, Stefano; Ruscello, Bruno M

    2014-03-01

    Rugby sevens is a rapidly growing sport. Match analysis is increasingly being used by sport scientists and coaches to improve the understanding of the physical demands of this sport. This study investigated the physical and physiological demands of elite men's rugby sevens, with special reference to the temporal patterns of fatigue during match play. Nine players, 4 backs and 5 forwards (age 25.1 ± 3.1 years), participated during 2 "Roma 7" international tournaments (2010 and 2011). All players competed at the professional level in the highest Italian rugby union division, and 5 also competed at the international level. During the matches (n = 15), the players were filmed to assess game performance. Global positioning system, heart rate (HR), and blood lactate (BLa) concentration data were measured and analyzed. The mean total distance covered throughout matches was 1,221 ± 118 m (first half = 643 ± 70 m and second half = 578 ± 77 m; a decrease of 11.2%, p > 0.05, Effect Size [ES] = 0.29). The players achieved 88.3 ± 4.2 and 87.7 ± 3.4% of HRmax during the first and second halves, respectively. The BLa for the first and second halves was 3.9 ± 0.9 and 11.2 ± 1.4 mmol·L⁻¹, respectively. The decreases in performance occurred consistently in the final 3 minutes of the matches (-40.5% in the distance covered per minute). The difference found in relation to playing position, although not statistically significant (p = 0.11), showed a large ES (η² = 0.20), suggesting possible practical implications. These results demonstrate that rugby sevens is a demanding sport that places stress on both the anaerobic glycolytic and aerobic oxidative energy systems. Strength and conditioning programs designed to train these energy pathways may prevent fatigue-induced reductions in physical performance.

  1. Temporal Analysis of Remotely Sensed Precipitation Products for Hydrological Applications

    NASA Astrophysics Data System (ADS)

    Tobin, K. J.; Bennett, M. E.

    2011-12-01

    No study has systematically evaluated streamflow modeling between monthly and daily timescales. This study examines streamflow from eight watersheds across the United States where five different precipitation products were used as primary input into the Soil and Water Assessment Tool (SWAT) to generate simulated streamflow. Timescales examined include monthly, dekad (10 day), pentad (5 day), triad (3 day), and daily. The eight basins studied are the San Pedro (Arizona); Cimarron (north-central Oklahoma); mid-Nueces (south Texas); mid-Rio Grande (south Texas and northern Mexico); Yocano (northern Mississippi); Alapaha (south Georgia); Upper Tar (North Carolina); and mid-St. Francis (eastern Arkansas). The precipitation products used to drive simulations include rain gauge, NWS Multisensor Precipitation Estimator, Tropical Rainfall Measuring Mission (TRMM) Multi-Satellite Precipitation Analysis, TRMM 3B42-V6, and Climate Prediction Center Morphing Method (CMORPH). Understanding how streamflow varies at sub-monthly timescales is important because a host of hydrological applications, such as flood forecast guidance and reservoir inflow forecasts, reside in a temporal domain between monthly and daily timescales. The major finding of this study is the quantification of a strong positive correlation between performance metrics and the time step at which model performance deteriorates. Basically, better performing simulations, with Nash-Sutcliffe values of 0.80 and above, can support modeling at finer timescales, to at least daily and perhaps beyond into the sub-daily realm. These findings are significant in that they clearly document the ability of SWAT to support modeling at sub-monthly time steps, which is beyond the capability for which SWAT was initially designed.
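
The Nash-Sutcliffe efficiency used as the performance metric above can be computed directly; a minimal sketch with invented streamflow values, not data from the study:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of the
    observations. 1.0 is a perfect fit; the study treats values of
    0.80 and above as strong enough to support finer timescales."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_tot

# Illustrative monthly streamflow values (m^3/s).
obs = [1.0, 2.0, 3.0, 4.0, 5.0]
sim = [1.1, 1.9, 3.2, 3.8, 5.1]
```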

  2. Quantitative Remote Laser-Induced Breakdown Spectroscopy by Multivariate Analysis

    NASA Astrophysics Data System (ADS)

    Clegg, S. M.; Sklute, E. C.; Dyar, M. D.; Barefield, J. E.; Wiens, R. C.

    2007-12-01

    The ChemCam instrument selected for the Mars Science Laboratory (MSL) rover includes a remote Laser-Induced Breakdown Spectrometer (LIBS) that will quantitatively probe samples up to 9 m from the rover mast. LIBS is fundamentally an elemental analysis technique. LIBS involves focusing a Nd:YAG laser operating at 1064 nm onto the surface of the sample. The laser ablates material from the surface, generating an expanding plasma containing electronically excited ions, atoms, and small molecules. As these electronically excited species relax back to the ground state, they emit light at wavelengths characteristic of the species present in the sample. Some of this emission is directed into one of three dispersive spectrometers. In this paper, we studied a suite of 18 igneous and highly-metamorphosed samples from a wide variety of parageneses for which chemical analyses by XRF were already available. Rocks were chosen to represent a range of chemical composition from basalt to rhyolite, thus providing significant variations in all of the major element contents (Si, Fe, Al, Ca, Na, K, O, Ti, Mg, and Mn). These samples were probed at a 9 m standoff distance under experimental conditions that are similar to ChemCam. Extracting quantitative elemental concentrations from LIBS spectra is complicated by chemical matrix effects. Conventional methods for obtaining quantitative chemical data from LIBS analyses are compared with new multivariate analysis (MVA) techniques that appear to compensate for these chemical matrix effects. The traditional analyses use specific elemental peak heights or areas, which are compared with calibration curves for each element at one or more emission lines for a series of standard samples. Because of matrix effects, the calibration standards generally must have similar chemistries to the unknown samples, and thus this conventional approach imposes severe limitations on application of the technique to remote analyses. 
In this suite of samples, the use
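
The conventional univariate calibration-curve approach described above can be sketched as follows (illustrative data; the multivariate alternative the paper evaluates is not shown):

```python
def calibration_line(concentrations, peak_areas):
    """Least-squares fit of emission peak area vs. element concentration
    for a set of standards: the conventional univariate LIBS calibration
    that the paper compares against multivariate (MVA) methods."""
    n = len(concentrations)
    mx = sum(concentrations) / n
    my = sum(peak_areas) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(concentrations, peak_areas))
             / sum((x - mx) ** 2 for x in concentrations))
    return slope, my - slope * mx

def predict_concentration(slope, intercept, peak_area):
    """Invert the calibration line for an unknown sample; matrix effects
    are what make this step unreliable across dissimilar rock types."""
    return (peak_area - intercept) / slope
```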

  3. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100 and 60 micrometer(s) to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
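
The pseudometric idea above can be illustrated with one simple "output" function, a normalized density histogram. This is only a sketch: the paper's output functions also measure topological components, self-gravity, and filamentarity, and the map values here are invented.

```python
def density_histogram(column_density, bins=8):
    """Normalized histogram of a map's column-density values
    (one simple 'output' function characterizing a map)."""
    lo, hi = min(column_density), max(column_density)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in column_density:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    total = len(column_density)
    return [c / total for c in counts]

def pseudometric(map_a, map_b, bins=8):
    """L1 distance between density histograms. It is a pseudometric:
    d(A, A) = 0, it is symmetric, and the triangle inequality holds,
    but two distinct maps with identical histograms sit at distance 0."""
    ha = density_histogram(map_a, bins)
    hb = density_histogram(map_b, bins)
    return sum(abs(x - y) for x, y in zip(ha, hb))
```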

  4. Binary imaging analysis for comprehensive quantitative histomorphometry of peripheral nerve.

    PubMed

    Hunter, Daniel A; Moradzadeh, Arash; Whitlock, Elizabeth L; Brenner, Michael J; Myckatyn, Terence M; Wei, Cindy H; Tung, Thomas H H; Mackinnon, Susan E

    2007-10-15

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques.

  5. Variability in quantitative cardiac magnetic resonance perfusion analysis

    PubMed Central

    Bratis, K.

    2013-01-01

    By taking advantage of its high spatial resolution and noninvasive, nontoxic nature, first-pass perfusion cardiovascular magnetic resonance (CMR) has become an indispensable tool for the noninvasive detection of reversible myocardial ischemia. A potential advantage of perfusion CMR is its ability to quantitatively assess perfusion reserve within a myocardial segment, expressed semi-quantitatively by the myocardial perfusion reserve index (MPRI) and fully quantitatively by absolute myocardial blood flow (MBF). In contrast to the high accuracy and reliability of CMR in evaluating cardiac function and volumes, perfusion CMR is adversely affected by multiple potential sources of error during data acquisition as well as post-processing. Varying image acquisition techniques, contrast agents and doses, resting blood flow, and reactions to stress all influence the acquired data. Mechanisms underlying the variability in perfusion CMR post-processing, as well as their clinical significance, are yet to be fully elucidated. The development of a universal, reproducible, accurate and easily applicable tool for CMR perfusion analysis remains a challenge and would substantially reinforce the role of perfusion CMR in improving clinical care. PMID:23825774
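
The semi-quantitative MPRI mentioned above can be sketched as follows. This is a simplified illustration: the upslope is taken as a maximum finite difference rather than a fitted linear upslope, and the signal-intensity curves are invented.

```python
def upslope(intensities, times):
    """Maximum first-pass upslope of a myocardial signal-intensity curve
    (finite differences as a proxy; clinical tools fit the linear upslope)."""
    return max((intensities[i + 1] - intensities[i]) / (times[i + 1] - times[i])
               for i in range(len(times) - 1))

def mpri(rest_curve, stress_curve, times):
    """Semi-quantitative myocardial perfusion reserve index: stress upslope
    divided by rest upslope for the same segment. (Full quantification
    would instead estimate absolute myocardial blood flow, MBF.)"""
    return upslope(stress_curve, times) / upslope(rest_curve, times)

# Illustrative signal-intensity curves sampled once per second.
times = [0.0, 1.0, 2.0, 3.0, 4.0]
rest = [0.0, 1.0, 2.0, 3.0, 3.0]
stress = [0.0, 2.0, 4.0, 5.0, 5.0]
```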

  6. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
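
The bootstrap construction of pseudo-predator signatures can be sketched as follows. It is a toy example: the prey signatures, diet proportions, and bootstrap sample sizes are invented, whereas the paper's contribution is an algorithm for choosing those sample sizes objectively.

```python
import random

def pseudo_predator(prey_signatures, diet, sample_sizes, rng):
    """Construct one pseudo-predator fatty acid signature: bootstrap-sample
    `sample_sizes[prey]` signatures per prey type, average each sample,
    then mix the prey means by the assumed diet proportions."""
    n_fa = len(next(iter(prey_signatures.values()))[0])
    mixed = [0.0] * n_fa
    for prey, proportion in diet.items():
        sample = rng.choices(prey_signatures[prey], k=sample_sizes[prey])
        mean_sig = [sum(col) / len(sample) for col in zip(*sample)]
        mixed = [m + proportion * v for m, v in zip(mixed, mean_sig)]
    return mixed

# Toy two-fatty-acid signatures for two prey types (illustrative values).
prey = {"seal": [[0.6, 0.4], [0.6, 0.4]], "fish": [[0.2, 0.8], [0.2, 0.8]]}
signature = pseudo_predator(prey, {"seal": 0.5, "fish": 0.5},
                            {"seal": 3, "fish": 3}, random.Random(0))
```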

  8. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options for landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented in five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public private partnership is not the most feasible option, unlike the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised, after which a long term operating contract may follow. © The Author(s) 2016.
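
Step (v) above, scoring each risk element by probability of occurrence times expected impact, can be sketched as follows (the risk names, probabilities, and impact scores are illustrative, not values from the study):

```python
def rank_risks(risk_matrix):
    """Score each risk as probability-of-occurrence x expected impact
    and sort descending, so the largest expected losses surface first."""
    scored = {risk: prob * impact for risk, (prob, impact) in risk_matrix.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# (probability, impact on a 1-10 scale) -- invented example entries.
risks = {
    "currency devaluation": (0.3, 8),
    "tariff collection shortfall": (0.5, 6),
    "construction delay": (0.4, 4),
}
ranked = rank_risks(risks)
```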

  9. Quantitative phosphoproteomic analysis using iTRAQ method.

    PubMed

    Asano, Tomoya; Nishiuchi, Takumi

    2014-01-01

    The MAPK (mitogen-activated kinase) cascade plays important roles in plant perception of and reaction to developmental and environmental cues. Phosphoproteomics is useful to identify target proteins regulated by MAPK-dependent signaling pathways. Here, we introduce quantitative phosphoproteomic analysis using a chemical labeling method. The isobaric tag for relative and absolute quantitation (iTRAQ) method is an MS-based technique to quantify protein expression among up to eight different samples in one experiment. In this technique, peptides are labeled with stable isotope-coded covalent tags. We perform quantitative phosphoproteomics comparing Arabidopsis wild type and a stress-responsive mapkk mutant after phytotoxin treatment. To comprehensively identify the downstream phosphoproteins of MAPKK, total proteins were extracted from phytotoxin-treated wild-type and mapkk mutant plants. The phosphoproteins were purified by the Pro-Q(®) Diamond Phosphoprotein Enrichment Kit and digested with trypsin. The resulting peptides were labeled with iTRAQ reagents and were quantified and identified by a MALDI TOF/TOF analyzer. We identified many phosphoproteins that were decreased in the mapkk mutant compared with wild type.
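
Reporter-ion ratio quantitation of the kind iTRAQ enables can be sketched as follows. It is a toy illustration: the channel assignments and intensities are assumptions, and real pipelines additionally correct for isotope impurities and normalize across peptides.

```python
def itraq_ratios(reporter_intensities, reference_channel):
    """Relative quantitation from iTRAQ reporter-ion intensities:
    each channel's intensity divided by the reference channel's."""
    ref = reporter_intensities[reference_channel]
    return {ch: intensity / ref for ch, intensity in reporter_intensities.items()}

# Assumed layout: wild type in channel 114, mapkk mutant in channel 115.
# A ratio below 1 would flag a phosphoprotein decreased in the mutant.
ratios = itraq_ratios({"114": 200.0, "115": 100.0}, reference_channel="114")
```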

  10. Lipid biomarker analysis for the quantitative analysis of airborne microorganisms

    SciTech Connect

    Macnaughton, S.J.; Jenkins, T.L.; Cormier, M.R.

    1997-08-01

    There is an ever-increasing concern regarding the presence of airborne microbial contaminants within indoor air environments. Exposure to such biocontaminants can give rise to a wide range of health effects, including infectious diseases, allergenic responses, and respiratory problems. Biocontaminants typically found in indoor air environments include bacteria, fungi, algae, protozoa, and dust mites. Mycotoxins, endotoxins, pollens, and residues of organisms are also known to cause adverse health effects. A quantitative detection/identification technique independent of culturability, which assays both culturable and non-culturable biomass including endotoxin, is critical in defining risks from indoor air biocontamination. Traditionally, methods employed for monitoring microorganism numbers in indoor air environments involve classical culture-based techniques and/or direct microscopic counting. It has been repeatedly documented that viable microorganism counts account for only 0.1-10% of the total community detectable by direct counting. The classic viable microbiological approach does not provide accurate estimates of microbial fragments or other indoor air components that can act as antigens and induce or potentiate allergic responses. Although bioaerosol samplers are designed to damage the microbes as little as possible, microbial stress has been shown to result from air sampling, aerosolization, and microbial collection. Higher collection efficiency results in greater cell damage, while less cell damage often results in lower collection efficiency. Filtration can collect particulates at almost 100% efficiency, but captured microorganisms may become dehydrated and damaged, resulting in non-culturability. The lipid biomarker assays described herein, however, do not rely on cell culture: lipids are components universally distributed throughout cells, providing a means of assessment independent of culturability.

  11. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Temporal analysis of remotely sensed turbidity in a coastal archipelago

    NASA Astrophysics Data System (ADS)

    Suominen, Tapio; Tolvanen, Harri

    2016-07-01

    A topographically fragmented archipelago with dynamic waters sets the preconditions for assessing coherent remotely sensed information. We generated a turbidity dataset for an archipelago coast in the Baltic Sea from MERIS data (FSG L1b), using CoastColour L1P, L2R and L2W processors. We excluded land and mixed pixels by masking the imagery with accurate (1:10 000) shoreline data. Using temporal linear averaging (TLA), we produced satellite-imagery temporal composites for the summer seasons of three years. The turbidity assessments and temporally averaged data were compared with in situ observations from coastal monitoring programs. The ability of TLA to estimate missing pixel values was further assessed by cross-validation with the leave-one-out method. The correspondence between L2W turbidity and in situ observations was good (r = 0.89), and even after applying TLA the correspondence remained acceptable (r = 0.78). The datasets revealed spatially divergent temporal water characteristics, which may be relevant to management, the design of monitoring, and habitat models. Monitoring observations may be spatially biased if the temporal succession of water properties is not taken into account in coastal areas with anisotropic dispersion of waters and asynchronous annual cycles. Accordingly, areas of varying turbidity may offer a different habitat for aquatic biota than areas of static turbidity, even though they may appear similar if water properties are measured over short annual periods.
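
The gap-filling step of temporal linear averaging, and the leave-one-out check applied to it, can be sketched per pixel as follows (a simplified illustration with invented turbidity values; `None` marks cloud- or mask-dropped observations):

```python
def temporal_linear_average(series):
    """Fill gaps (None) in a per-pixel time series by linear interpolation
    between the nearest valid observations before and after the gap;
    gaps at the edges of the series are left unfilled."""
    filled = list(series)
    valid = [i for i, v in enumerate(series) if v is not None]
    for i, v in enumerate(series):
        if v is None:
            before = [j for j in valid if j < i]
            after = [j for j in valid if j > i]
            if before and after:
                j0, j1 = before[-1], after[0]
                w = (i - j0) / (j1 - j0)
                filled[i] = series[j0] * (1 - w) + series[j1] * w
    return filled

# Leave-one-out check: hide a valid observation, re-estimate it with TLA.
series = [2.0, None, 4.0, None, None, 10.0]
filled = temporal_linear_average(series)
```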

  13. Volumetric brain analysis in neurosurgery: Part 3. Volumetric CT analysis as a predictor of seizure outcome following temporal lobectomy.

    PubMed

    Mandell, Jason G; Hill, Kenneth L; Nguyen, Dan T D; Moser, Kevin W; Harbaugh, Robert E; McInerney, James; Nsubuga, Brian Kaaya; Mugamba, John K; Johnson, Derek; Warf, Benjamin C; Boling, Warren; Webb, Andrew G; Schiff, Steven J

    2015-02-01

    The incidence of temporal lobe epilepsy (TLE) due to mesial temporal sclerosis (MTS) can be high in developing countries. Current diagnosis of MTS relies on structural MRI, which is generally unavailable in developing world settings. Given widespread effects on temporal lobe structure beyond hippocampal atrophy in TLE, the authors propose that CT volumetric analysis can be used in patient selection to help predict outcomes following resection. Ten pediatric patients received preoperative CT scans and temporal resections at the CURE Children's Hospital of Uganda. Engel classification of seizure control was determined 12 months postoperatively. Temporal lobe volumes were measured from CT and from normative MR images using the Cavalieri method. Whole brain and fluid volumes were measured using particle filter segmentation. Linear discrimination analysis (LDA) was used to classify seizure outcome by temporal lobe volumes and normalized brain volume. Epilepsy patients showed normal to small brain volumes and small temporal lobes bilaterally. A multivariate measure of the volume of each temporal lobe separated patients who were seizure free (Engel Class IA) from those with incomplete seizure control (Engel Class IB/IIB) with LDA (p<0.01). Temporal lobe volumes also separate normal subjects, patients with Engel Class IA outcomes, and patients with Class IB/IIB outcomes (p<0.01). Additionally, the authors demonstrated that age-normalized whole brain volume, in combination with temporal lobe volumes, may further improve outcome prediction (p<0.01). This study shows strong evidence that temporal lobe and brain volume can be predictive of seizure outcome following temporal lobe resection, and that volumetric CT analysis of the temporal lobe may be feasible in lieu of structural MRI when the latter is unavailable. 
Furthermore, since the authors' methods are modality independent, these findings suggest that temporal lobe and normative brain volumes may further be useful in the
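
The Cavalieri volume estimation used above can be sketched as follows (a minimal illustration; the slice areas, spacing, and the normalization helper are invented for the example, not the study's measurements):

```python
def cavalieri_volume(slice_areas_mm2, spacing_mm):
    """Cavalieri estimator: volume = slice spacing x the sum of
    cross-sectional areas traced on systematically spaced slices."""
    return spacing_mm * sum(slice_areas_mm2)

def normalized_lobe_volume(lobe_volume, whole_brain_volume):
    """Express a temporal lobe volume as a fraction of whole-brain volume,
    so patients with different head sizes can be compared."""
    return lobe_volume / whole_brain_volume
```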

  14. Altered resting-state functional activity in posttraumatic stress disorder: A quantitative meta-analysis

    PubMed Central

    Wang, Ting; Liu, Jia; Zhang, Junran; Zhan, Wang; Li, Lei; Wu, Min; Huang, Hua; Zhu, Hongyan; Kemp, Graham J.; Gong, Qiyong

    2016-01-01

    Many functional neuroimaging studies have reported differential patterns of spontaneous brain activity in posttraumatic stress disorder (PTSD), but the findings are inconsistent and have not so far been quantitatively reviewed. The present study set out to determine consistent, specific regional brain activity alterations in PTSD, using the Effect Size Signed Differential Mapping technique to conduct a quantitative meta-analysis of resting-state functional neuroimaging studies of PTSD that used either a non-trauma (NTC) or a trauma-exposed (TEC) comparison control group. Fifteen functional neuroimaging studies were included, comparing 286 PTSDs, 203 TECs and 155 NTCs. Compared with NTC, PTSD patients showed hyperactivity in the right anterior insula and bilateral cerebellum, and hypoactivity in the dorsal medial prefrontal cortex (mPFC); compared with TEC, PTSD showed hyperactivity in the ventral mPFC. The pooled meta-analysis showed hypoactivity in the posterior insula, superior temporal, and Heschl’s gyrus in PTSD. Additionally, subgroup meta-analysis (non-medicated subjects vs. NTC) identified abnormal activation in the prefrontal-limbic system. In meta-regression analyses, mean illness duration was positively associated with activity in the right cerebellum (PTSD vs. NTC), and illness severity was negatively associated with activity in the right lingual gyrus (PTSD vs. TEC). PMID:27251865

  15. 3D visualization and quantitative analysis of human erythrocyte phagocytosis.

    PubMed

    Stachurska, Anna; Król, Teodora; Trybus, Wojciech; Szary, Karol; Fabijańska-Mitek, Jadwiga

    2016-11-01

    Since the erythrophagocytosis of opsonized erythrocytes is investigated mainly by calculating the phagocytic index using subjective light microscopy evaluation, we present methods for the quantitative and qualitative analysis of human cell erythrophagocytosis. Erythrocytes from two storage periods were used. Using Imaris software, we were able to create a three-dimensional model of erythrophagocytosis. The use of microscopy instead of cytometry revealed a significantly higher number of monocytes and erythrocytes that appeared active in phagocytosis. Spatial reconstruction allowed for detailed analysis of the process by precisely locating erythrocytes in phagocytes. Additionally, a technique of sequential image registration using Nis Elements software allowed for observation of the course of phagocytosis over a range of time intervals. This in vitro research may be helpful for understanding the cellular interactions between monocytes and erythrocytes. The cytometric method-being relatively rapid, sensitive, and specific-can serve as an alternative technique to microscopy in the quantitative analysis of erythrophagocytosis. This allows us to avoid counting the erythrocytes nonspecifically attached to monocytes and gives objective results. © 2016 International Federation for Cell Biology.

  16. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Mass Spectrom Rev 34:148–165, 2015. PMID:24889823

  17. Quantitative analysis of motion control in long term microgravity.

    PubMed

    Baroni, G; Ferrigno, G; Anolli, A; Andreoni, G; Pedotti, A

    1998-01-01

    In the frame of the 179-day EUROMIR '95 space mission, two in-flight experiments included quantitative three-dimensional human movement analysis in microgravity. For this aim, a space-qualified opto-electronic motion analyser based on passive markers was installed onboard the Russian space station MIR, and 8 in-flight sessions were performed. The technology and method for the collection of kinematic data are described, evaluating the accuracy of three-dimensional marker localisation. Results confirm the suitability of opto-electronic technology for quantitative human motion analysis on orbital modules and yield a set of "lessons learned" leading to the improvement of motion analyser performance while streamlining on-board operations. From the experimental program of T4, results of three voluntary posture perturbation protocols are described. The analysis suggests that a short-term reinterpretation of proprioceptive information and re-calibration of sensorimotor mechanisms seem to end within the first weeks of flight, while a continuous long-term adaptation process allows the refinement of motor performance, within the frame of never-abandoned terrestrial strategies.

  18. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    PubMed

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres. Dimers were selected on the basis of SEM images from multi-aggregate samples. The effects of the orientation of the dimer with respect to the polarization state of the laser light, and of the particle gap size, on the Raman signal intensity are observed. Additionally, calculations are performed to simulate the electric near-field enhancement. These simulations are based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated with near-field simulations, which is subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.
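
The surface-averaged enhancement factor quantified above is conventionally computed as (I_SERS / N_SERS) / (I_ref / N_ref); a minimal sketch with invented intensities and molecule counts:

```python
def sers_enhancement_factor(i_sers, n_sers, i_ref, n_ref):
    """Surface-averaged SERS enhancement factor: per-molecule signal
    on the nanostructure divided by per-molecule signal of the bare
    (non-enhanced) reference measurement."""
    return (i_sers / n_sers) / (i_ref / n_ref)

# Illustrative values: 1e4 reporter molecules on the dimer vs. 1e7
# molecules probed in the reference measurement.
ef = sers_enhancement_factor(i_sers=2000.0, n_sers=1e4, i_ref=100.0, n_ref=1e7)
```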

  19. Spatial and Temporal Variation in Selection of Genes Associated with Pearl Millet Varietal Quantitative Traits In situ

    PubMed Central

    Mariac, Cédric; Ousseini, Issaka S.; Alio, Abdel-Kader; Jugdé, Hélène; Pham, Jean-Louis; Bezançon, Gilles; Ronfort, Joelle; Descroix, Luc; Vigouroux, Yves

    2016-01-01

    Ongoing global climate changes imply new challenges for agriculture. Whether plants and crops can adapt to such rapid changes is still a widely debated question. We previously showed adaptation in the form of earlier flowering in pearl millet at the scale of a whole country over three decades. However, this analysis did not deal with variability of year-to-year selection. To understand and possibly manage plant and crop adaptation, we need more knowledge of how selection acts in situ. Is selection gradual, abrupt, and does it vary in space and over time? In the present study, we tracked the evolution of allele frequency in two genes associated with pearl millet phenotypic variation in situ. We sampled 17 populations of cultivated pearl millet over a period of 2 years. We tracked changes in allele frequencies in these populations by genotyping more than seven thousand individuals. We demonstrate that several allele frequency changes are compatible with selection, after correcting for allele frequency changes expected under genetic drift. We found marked variation in allele frequencies from year to year, suggesting a variable selection effect in space and over time. We estimated the strength of selection associated with variations in allele frequency. Our results suggest that the polymorphism maintained at the genes we studied is partially explained by the spatial and temporal variability of selection. In response to environmental changes, traditional pearl millet varieties could rapidly adapt thanks to this available functional variability. PMID:27507986
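    The drift-correction idea can be illustrated with a toy calculation: under neutral drift alone, one generation of binomial sampling changes an allele frequency p with variance p(1-p)/(2N) in a population of N diploid individuals, so an observed change much larger than that spread hints at selection. This is a minimal sketch of the principle only; the function names and the two-sigma screen are illustrative assumptions, not the authors' actual statistical procedure.

```python
import math

def drift_std(p, n_individuals, ploidy=2):
    """Expected one-generation standard deviation of allele-frequency
    change caused by genetic drift alone (binomial sampling of gametes)."""
    n_gametes = ploidy * n_individuals
    return math.sqrt(p * (1.0 - p) / n_gametes)

def exceeds_drift(p_before, p_after, n_individuals, k=2.0):
    """Flag an allele-frequency change larger than k standard deviations
    of the neutral drift expectation -- a crude screen for selection."""
    return abs(p_after - p_before) > k * drift_std(p_before, n_individuals)
```

    For example, a shift from 0.50 to 0.65 in a population of 100 diploids exceeds two drift standard deviations (about 0.07) and would be flagged, whereas a shift to 0.52 would not.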

  20. Temporal and multiple quantitative trait loci analyses of resistance to bacterial wilt in tomato permit the resolution of linked loci.

    PubMed Central

    Mangin, B; Thoquet, P; Olivier, J; Grimsley, N H

    1999-01-01

    Ralstonia solanacearum is a soil-borne bacterium that causes the serious disease known as bacterial wilt in many plant species. In tomato, several QTL controlling resistance have been found, but in different studies, markers spanning a large region of chromosome 6 showed strong association with the resistance. By using two different approaches to analyze the data from a field test F3 population, we show that at least two separate loci approximately 30 cM apart on this chromosome are most likely involved in the resistance. First, a temporal analysis of the progression of symptoms reveals a distal locus early in the development of the disease. As the disease progresses, the maximum LOD peak observed shifts toward the proximal end of the chromosome, obscuring the distal locus. Second, although classical interval mapping could only detect the presence of one locus, a statistical "two-QTL model" test, specifically adapted for the resolution of linked QTL, strongly supported the hypothesis for the presence of two loci. These results are discussed in the context of current molecular knowledge about disease resistance genes on chromosome 6 and observations made by tomato breeders during the production of bacterial wilt-resistant varieties. PMID:10049932

  1. Temporal and multiple quantitative trait loci analyses of resistance to bacterial wilt in tomato permit the resolution of linked loci.

    PubMed

    Mangin, B; Thoquet, P; Olivier, J; Grimsley, N H

    1999-03-01

    Ralstonia solanacearum is a soil-borne bacterium that causes the serious disease known as bacterial wilt in many plant species. In tomato, several QTL controlling resistance have been found, but in different studies, markers spanning a large region of chromosome 6 showed strong association with the resistance. By using two different approaches to analyze the data from a field test F3 population, we show that at least two separate loci approximately 30 cM apart on this chromosome are most likely involved in the resistance. First, a temporal analysis of the progression of symptoms reveals a distal locus early in the development of the disease. As the disease progresses, the maximum LOD peak observed shifts toward the proximal end of the chromosome, obscuring the distal locus. Second, although classical interval mapping could only detect the presence of one locus, a statistical "two-QTL model" test, specifically adapted for the resolution of linked QTL, strongly supported the hypothesis for the presence of two loci. These results are discussed in the context of current molecular knowledge about disease resistance genes on chromosome 6 and observations made by tomato breeders during the production of bacterial wilt-resistant varieties.

  2. Computer compensation for NMR quantitative analysis of trace components

    SciTech Connect

    Nakayama, T.; Fujiwara, Y.

    1981-07-22

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as a theoretical curve of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA.
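    The moving-average smoothing step mentioned in the abstract can be sketched in a few lines. This is a generic illustration (the original 1981 program is not published in the record); the centered-window edge handling is an assumption.

```python
def moving_average(values, window=5):
    """Smooth random noise in a sampled spectrum with a centered moving
    average. Edge points use a shrunken window so the output has the
    same length as the input."""
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        segment = values[lo:hi]
        smoothed.append(sum(segment) / len(segment))
    return smoothed
```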

  3. Quantitative analysis of sideband coupling in photoinduced force microscopy

    NASA Astrophysics Data System (ADS)

    Jahng, Junghoon; Kim, Bongsu; Lee, Eun Seong; Potma, Eric Olaf

    2016-11-01

    We present a theoretical and experimental analysis of the cantilever motions detected in photoinduced force microscopy (PiFM) using the sideband coupling detection scheme. In sideband coupling, the cantilever dynamics are probed at a combination frequency of a fundamental mechanical eigenmode and the modulation frequency of the laser beam. Using this detection mode, we develop a method for reconstructing the modulated photoinduced force gradient from experimental parameters in a quantitative manner. We show evidence, both theoretically and experimentally, that the sideband coupling detection mode provides PiFM images with superior contrast compared to images obtained when detecting the cantilever motions directly at the laser modulation frequency.

  4. Flow quantitation by radio frequency analysis of contrast echocardiography.

    PubMed

    Rovai, D; Lombardi, M; Mazzarisi, A; Landini, L; Taddei, L; Distante, A; Benassi, A; L'Abbate, A

    1993-03-01

    Contrast echocardiography has the potential for measuring cardiac output and regional blood flow. However, accurate quantitation is limited both by the use of non-standard contrast agents and by the electronic signal distortion inherent to the echocardiographic instruments. Thus, the aim of this study is to quantify flow by combining a stable contrast agent and a modified echo equipment, able to sample the radio frequency (RF) signal from a region of interest (ROI) in the echo image. The contrast agent SHU-454 (0.8 ml) was bolus injected into an in vitro calf vein, at 23 flow rates (ranging from 376 to 3620 ml/min) but constant volume and pressure. The ROI was placed in the centre of the vein, the RF signal was processed in real time and transferred to a personal computer to generate time-intensity curves. In the absence of recirculation, contrast washout slope and mean transit time (MTT) of curves (1.11-8.52 seconds) yielded excellent correlations with flow: r = 0.93 and 0.95, respectively. To compare the accuracy of RF analysis with that of conventional image processing as to flow quantitation, conventional images were collected in the same flow model by two different scanners: a) the mechanical sector scanner used for RF analysis, and b) a conventional electronic sector scanner. These images were digitized off-line, mean videodensity inside an identical ROI was measured and time-intensity curves were built. MTT by RF was shorter than by videodensitometric analysis of the images generated by the same scanner (p < 0.001). In contrast, MTT by RF was longer than by the conventional scanner (p < 0.001). Significant differences in MTT were also found with changes in the gain setting controls of the conventional scanner. To study the stability of the contrast effect, 6 contrast injections (20 ml) were performed at a constant flow rate during recirculation: the spontaneous decay in RF signal intensity (t1/2 = 64 +/- 8 seconds) was too long to affect MTT significantly
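    The two curve parameters used in this study, mean transit time and washout slope, can be computed from a background-subtracted time-intensity curve as follows. This is a minimal sketch with hypothetical function names, not the authors' actual processing code.

```python
import math

def mean_transit_time(times, intensities):
    """Intensity-weighted mean of the time axis: MTT = sum(t*I) / sum(I).
    Assumes background has been subtracted so intensities are >= 0."""
    total = sum(intensities)
    return sum(t * i for t, i in zip(times, intensities)) / total

def washout_slope(times, intensities):
    """Least-squares slope of ln(intensity) vs time over the washout
    phase; flow rate scales with the magnitude of this slope."""
    ys = [math.log(i) for i in intensities]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(times, ys))
    den = sum((x - mx) ** 2 for x in times)
    return num / den
```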

  5. Simulating the focal volume effect: a quantitative analysis

    NASA Astrophysics Data System (ADS)

    Scarborough, Timothy D.; Uiterwaal, Cornelis J. G. J.

    2013-12-01

    We present quantitative simulations of the focal volume effect. Intensity distributions in detection volumes with two- and three-dimensional spatial resolution are calculated. Results include an analysis of translations of these volumes in the focus along the direction of laser propagation as well as discussion of varying sizes of the spatially resolved volumes. We find that detection volumes less than half the 1/e full-width beam waist and less than half the Rayleigh length along the propagation direction offer an optimal compromise of maintaining intensity resolution without sacrificing peak intensity.
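    The intensity distribution underlying such simulations follows the standard Gaussian-beam formula I(r, z) = I0 (w0/w(z))^2 exp(-2 r^2 / w(z)^2), with beam radius w(z) = w0 sqrt(1 + (z/zR)^2) and Rayleigh length zR = pi w0^2 / lambda. The sketch below evaluates this textbook model only; it is not the paper's simulation code, and the parameter values in the usage note are arbitrary.

```python
import math

def gaussian_beam_intensity(r, z, w0, wavelength, i0=1.0):
    """Relative intensity of a focused Gaussian beam at radius r and
    axial position z (focus at z = 0), in the same units as w0."""
    z_r = math.pi * w0 ** 2 / wavelength          # Rayleigh length
    w_z = w0 * math.sqrt(1.0 + (z / z_r) ** 2)    # beam radius at z
    return i0 * (w0 / w_z) ** 2 * math.exp(-2.0 * r ** 2 / w_z ** 2)
```

    On axis, the intensity falls to half its peak value one Rayleigh length from the focus, which is why detection volumes short compared with the Rayleigh length preserve intensity resolution.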

  6. Neutron diffractometer INES for quantitative phase analysis of archaeological objects

    NASA Astrophysics Data System (ADS)

    Imberti, S.; Kockelmann, W.; Celli, M.; Grazzi, F.; Zoppi, M.; Botti, A.; Sodo, A.; Imperiale, M. Leo; de Vries-Melein, M.; Visser, D.; Postma, H.

    2008-03-01

    With the Italian Neutron Experimental Station (INES) a new general purpose neutron powder diffractometer is available at ISIS, characterized by a high resolution at low d-spacings, and particularly suited for the quantitative phase analysis of a wide range of archaeological materials. Time-of-flight neutron diffraction is notable for being a non-destructive technique, allowing a reliable determination of the phase compositions of multiphase artefacts, with or without superficial corrosion layers. A selection of archaeometric studies carried out during the first year of the INES user programme is presented here to demonstrate the capabilities of the instrument.

  7. The spatial and temporal analysis of forest resources and institutions

    NASA Astrophysics Data System (ADS)

    Schweik, Charles M.

    This study addresses a central puzzle facing the Human Dimensions of Global Change research community: How can we understand the influence of environmental policies on human behavior when little or no information is available on the condition of forest resources? This dissertation capitalizes on new research tools, methods and approaches to overcome the "no information about the resource" problem. Specifically, I combine (1) forest mensuration techniques, (2) Global Positioning Systems, (3) Geographic Information Systems (GIS), (4) spatial statistics, (5) remote sensing, and (6) institutional analysis to analyze forest vegetation patterns. I explain these patterns by considering the incentive structures driving human decision-making and activity, through two studies in very different empirical settings. Both studies apply relevant theory related to human behavior and action. Both examine the incentive structures individuals face as they undertake daily activities related to forest resources. The first study, set in East Chitwan, Nepal, identifies spatial patterns in georeferenced forest inventory data and links these to patterns predicted by optimal foraging subject to institutional constraints. The second study compares forest management in one state and one national forest in Indiana, U.S.A. In this effort, I identify spatio-temporal patterns in the forest vegetation captured by a time series of Landsat multispectral images. The combination of natural forest regrowth and property manager actions in response to incentives and constraints explain these patterns. Substantively, both studies identify change in forest resources associated with combinations of the physical, human community and institutional "landscapes" in their regions. In both cases, geographic attributes of institutions (e.g., laws, rules) are found to influence the type and location of human actions. Methodologically, the two studies provide examples of how to control

  8. Temporal scaling behavior of sea-level change in Hong Kong - Multifractal temporally weighted detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Yuanzhi; Ge, Erjia

    2013-01-01

    The rise in global sea levels has been recognized by many scientists as an important global research issue. The process of sea-level change has demonstrated a complex scaling behavior in space and time. Large numbers of tide gauge stations have been built to measure sea-level change in the North Pacific Ocean, Indian Ocean, North Atlantic Ocean, and Antarctic Ocean. Extensive studies have been devoted to exploring sea-level variation in Asia concerning the Bohai Gulf (China), the Yellow Sea (China), the Mekong Delta (Thailand), and Singapore. Hong Kong, however, a mega city with a population of over 7 million situated at the mouth of the Pearl River Estuary in the west and the South China Sea in the east, has yet to be studied, particularly in terms of the temporal scaling behavior of sea-level change. This article presents an approach to studying the temporal scaling behavior of sea-level change over multiple time scales by analyzing the time series of sea-level change in Tai Po Kou, Tsim Bei Tsui, and Quarry Bay from the periods of 1964-2010, 1974-2010, and 1986-2010, respectively. The detection of long-range correlation and multi-fractality of sea-level change seeks answers to the following questions: (1) Is the current sea-level rise associated with and responsible for the next rise over time? (2) Does the sea-level rise have specific temporal patterns manifested by multi-scaling behaviors? and (3) Is the sea-level rise temporally heterogeneous in the different parts of Hong Kong? Multifractal temporally weighted detrended fluctuation analysis (MF-TWDFA), an extension of multifractal detrended fluctuation analysis (MF-DFA), has been applied in this study to identify long-range correlation and multi-scaling behavior of the sea-level rise in Hong Kong. The experimental results show that the sea-level rise is long-range correlated and multi-fractal. The temporal patterns are heterogeneous over space. This finding implies that mechanisms associated with the
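    The core of detrended fluctuation analysis, integrating the series, detrending it piecewise, and measuring the residual fluctuation as a function of window size, can be sketched as below. This is plain first-order DFA for a single window size, shown for illustration; the temporally weighted, multifractal variant used in the study additionally applies a moving weighted fit and q-order averaging.

```python
import math

def dfa_fluctuation(series, window):
    """Root-mean-square fluctuation of the integrated series after
    removing a linear trend from each non-overlapping window -- one
    point F(s) of a DFA scaling curve."""
    mean = sum(series) / len(series)
    profile, total = [], 0.0
    for x in series:                      # integrate the mean-centered series
        total += x - mean
        profile.append(total)
    sq_sum, count = 0.0, 0
    for start in range(0, len(profile) - window + 1, window):
        seg = profile[start:start + window]
        xs = list(range(window))
        mx = (window - 1) / 2.0
        my = sum(seg) / window
        denom = sum((x - mx) ** 2 for x in xs)
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, seg)) / denom
        for x, y in zip(xs, seg):         # residuals about the local trend
            resid = y - (my + slope * (x - mx))
            sq_sum += resid * resid
            count += 1
    return math.sqrt(sq_sum / count)
```

    Repeating this over a range of window sizes and fitting log F(s) against log s gives the scaling exponent; in the multifractal extension the exponent becomes a function of the moment order q.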

  9. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain largely unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single-trait interaction analysis by a univariate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes.

  10. Quantitative chemical analysis of ocular melanosomes in the TEM.

    PubMed

    Eibl, O; Schultheiss, S; Blitgen-Heinecke, P; Schraermeyer, U

    2006-01-01

    Melanosomes in retinal tissues of a human, monkey and rat were analyzed by EDX in the TEM. Samples were prepared by ultramicrotomy at different thicknesses. The material was mounted on Al grids and samples were analyzed in a Zeiss 912 TEM equipped with an Omega filter and EDX detector with ultrathin window. Melanosomes consist of C and O as main components, mole fractions are about 90 and 3-10 at.%, respectively, and small mole fraction ratios, between 2 and 0.1 at.%, of Na, Mg, K, Si, P, S, Cl, Ca. All elements were measured quantitatively by standardless EDX with high precision. Mole fractions of transition metals Fe, Cu and Zn were also measured. For Fe a mole fraction ratio of less than 0.1 at.% was found; this iron gives the melanin its paramagnetic properties. Its mole fraction is, however, close to or below the minimum detectable mass fraction of the used equipment. Only in the human eye, and only in the retinal pigment epithelium (rpe), were the mole fractions of Zn (0.1 at.% or 5000 microg/g) and Cu clearly beyond the minimum detectable mass fraction. In the rat and monkey eye the mole fraction of Zn was at or below the minimum detectable mass fraction and could not be measured quantitatively. The obtained results yielded the chemical composition of the melanosomes in the choroidal tissue and the retinal pigment epithelium (rpe) of the three different species. The results of the chemical analysis are discussed by mole fraction correlation diagrams. Similarities and differences between the different species are outlined. Correlation behavior was found to hold across species, e.g. the Ca-O correlation. It indicates that Ca is bound to oxygen-rich sites in the melanin. These are the first quantitative analyses of melanosomes by EDX reported so far. The quantitative chemical analysis should open a deeper understanding of the metabolic processes in the eye that are of central importance for the understanding of a large number of eye-related diseases. The chemical analysis also

  11. Quantitative sonographic image analysis for hepatic nodules: a pilot study.

    PubMed

    Matsumoto, Naoki; Ogawa, Masahiro; Takayasu, Kentaro; Hirayama, Midori; Miura, Takao; Shiozawa, Katsuhiko; Abe, Masahisa; Nakagawara, Hiroshi; Moriyama, Mitsuhiko; Udagawa, Seiichi

    2015-10-01

    The aim of this study was to investigate the feasibility of quantitative image analysis to differentiate hepatic nodules on gray-scale sonographic images. We retrospectively evaluated 35 nodules from 31 patients with hepatocellular carcinoma (HCC), 60 nodules from 58 patients with liver hemangioma, and 22 nodules from 22 patients with liver metastasis. Gray-scale sonographic images were evaluated with subjective judgment and image analysis using ImageJ software. Reviewers classified the shape of nodules as irregular or round, and the surface of nodules as rough or smooth. Circularity values were lower in the irregular group than in the round group (median 0.823, 0.892; range 0.641-0.915, 0.784-0.932, respectively; P = 3.21 × 10(-10)). Solidity values were lower in the rough group than in the smooth group (median 0.957, 0.968; range 0.894-0.986, 0.933-0.988, respectively; P = 1.53 × 10(-4)). The HCC group had higher circularity and solidity values than the hemangioma group. The HCC and liver metastasis groups had lower median, mean, modal, and minimum gray values than the hemangioma group. Multivariate analysis showed circularity [standardized odds ratio (OR), 2.077; 95 % confidence interval (CI) = 1.295-3.331; P = 0.002] and minimum gray value (OR 0.482; 95 % CI = 0.956-0.990; P = 0.001) as factors predictive of malignancy. The combination of subjective judgment and image analysis provided 58.3 % sensitivity and 89.5 % specificity with AUC = 0.739, representing an improvement over subjective judgment alone (68.4 % sensitivity, 75.0 % specificity, AUC = 0.701) (P = 0.008). Quantitative image analysis for ultrasonic images of hepatic nodules may correlate with subjective judgment in predicting malignancy.
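    The two shape descriptors follow directly from their standard (ImageJ-style) definitions: circularity is 4*pi*area/perimeter^2 and solidity is the ratio of the area to the convex-hull area. A minimal sketch, taking area and perimeter as already-measured inputs:

```python
import math

def circularity(area, perimeter):
    """ImageJ-style circularity: 4*pi*area / perimeter**2.
    Equals 1.0 for a perfect circle and decreases for irregular outlines."""
    return 4.0 * math.pi * area / perimeter ** 2

def solidity(area, convex_area):
    """Ratio of the nodule's area to its convex-hull area.
    Equals 1.0 for a smooth convex outline and decreases for rough surfaces."""
    return area / convex_area
```

    For example, a unit square has circularity pi/4 (about 0.785), which is why rounder nodules score closer to 1.0 than irregular ones.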

  12. Analysis of Nociceptive Information Encoded in the Temporal Discharge Patterns of Cutaneous C-Fibers

    PubMed Central

    Cho, Kyeongwon; Jang, Jun Ho; Kim, Sung-Phil; Lee, Sang Hoon; Chung, Soon-Cheol; Kim, In Young; Jang, Dong Pyo; Jung, Sung Jun

    2016-01-01

    The generation of pain signals from primary afferent neurons is explained by a labeled-line code. However, this notion cannot apply in a simple way to cutaneous C-fibers, which carry signals from a variety of receptors that respond to various stimuli including agonist chemicals. To represent the discharge patterns of C-fibers according to different agonist chemicals, we have developed a quantitative approach using three consecutive spikes. By using this method, the generation of pain in response to chemical stimuli is shown to be dependent on the temporal aspect of the spike trains. Furthermore, under pathological conditions, gamma-aminobutyric acid resulted in pain behavior without change of spike number but with an altered discharge pattern. Our results suggest that information about the agonist chemicals may be encoded in specific temporal patterns of signals in C-fibers, and nociceptive sensation may be influenced by the extent of temporal summation originating from the temporal patterns. PMID:27917120
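    A simple way to represent three consecutive spikes, as in the approach described above, is to describe each spike triplet by its two interspike intervals. The sketch below is a generic illustration of that idea with a hypothetical function name, not the authors' exact quantitative method.

```python
def triplet_patterns(spike_times):
    """Represent a spike train by the interspike-interval pair of every
    three consecutive spikes -- a minimal temporal-pattern code."""
    patterns = []
    for i in range(len(spike_times) - 2):
        isi1 = spike_times[i + 1] - spike_times[i]
        isi2 = spike_times[i + 2] - spike_times[i + 1]
        patterns.append((isi1, isi2))
    return patterns
```

    Two trains with identical spike counts but different interval pairs then map to different pattern distributions, which is the property the study exploits.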

  13. Evaluation of bitterness in white wine applying descriptive analysis, time-intensity analysis, and temporal dominance of sensations analysis.

    PubMed

    Sokolowsky, Martina; Fischer, Ulrich

    2012-06-30

    Bitterness in wine, especially in white wine, is a complex and sensitive topic as it is a persistent sensation with negative connotations for consumers. However, the molecular basis for bitter taste in white wines is still largely unknown. At the same time, studies dealing with bitterness have to cope with the temporal dynamics of bitter perception. The most common method to describe bitter taste is the static measurement amongst other attributes during a descriptive analysis. A less frequently applied method, the time-intensity analysis, evaluates the temporal gustatory changes focusing on bitterness alone. The most recently developed multidimensional approach, the temporal dominance of sensations method, reveals the temporal dominance of bitter taste in relation to other attributes. In order to compare the results obtained with these different sensory methodologies, 13 commercial white wines were evaluated by the same panel. To facilitate a statistical comparison, parameters were extracted from bitterness curves obtained from time-intensity and temporal dominance of sensations analysis and were compared to bitter intensity as well as bitter persistency based on descriptive analysis. Analysis of variance differentiated the wines significantly regarding all measured bitterness parameters obtained from the three sensory techniques. Comparing the information of all sensory parameters by multiple factor analysis and correlation, each technique provided additional valuable information regarding the complex bitter perception in white wine.

  14. Quantitative Proteomic and Phosphoproteomic Analysis of Trypanosoma cruzi Amastigogenesis*

    PubMed Central

    Queiroz, Rayner M. L.; Charneau, Sébastien; Mandacaru, Samuel C.; Schwämmle, Veit; Lima, Beatriz D.; Roepstorff, Peter; Ricart, Carlos A. O.

    2014-01-01

    Chagas disease is a tropical neglected disease endemic in Latin America caused by the protozoan Trypanosoma cruzi. The parasite has four major life stages: epimastigote, metacyclic trypomastigote, bloodstream trypomastigote, and amastigote. The differentiation from infective trypomastigotes into replicative amastigotes, called amastigogenesis, takes place in vivo inside mammalian host cells after a period of incubation in an acidic phagolysosome. This differentiation process can be mimicked in vitro by incubating tissue-culture-derived trypomastigotes in acidic DMEM. Here we used this well-established differentiation protocol to perform a comprehensive quantitative proteomic and phosphoproteomic analysis of T. cruzi amastigogenesis. Samples from fully differentiated forms and two biologically relevant intermediate time points were Lys-C/trypsin digested, iTRAQ-labeled, and multiplexed. Subsequently, phosphopeptides were enriched using a TiO2 matrix. Non-phosphorylated peptides were fractionated via hydrophilic interaction liquid chromatography prior to LC-MS/MS analysis. LC-MS/MS and bioinformatics procedures were used for protein and phosphopeptide quantitation, identification, and phosphorylation site assignment. We were able to identify regulated proteins and pathways involved in coordinating amastigogenesis. We also observed that a significant proportion of the regulated proteins were membrane proteins. Modulated phosphorylation events coordinated by protein kinases and phosphatases that are part of the signaling cascade induced by incubation in acidic medium were also evinced. To our knowledge, this work is the most comprehensive quantitative proteomics study of T. cruzi amastigogenesis, and these data will serve as a trustworthy basis for future studies, and possibly for new potential drug targets. PMID:25225356

  15. Quantitative Analysis of Flow through Free-swimming Appendicularians

    NASA Astrophysics Data System (ADS)

    Sutherland, K.; Conley, K. R.; Gemmell, B. J.; Thompson, E.; Bouquet, J. M.

    2016-02-01

    Appendicularians are pelagic tunicates (Phylum: Chordata, Subphylum: Tunicata) that frequently dominate the mesozooplankton community and are key grazers in both coastal shallow seas and oligotrophic environments. Understanding of their feeding mechanisms, specifically selective feeding, has important ramifications for predicting their impact on particle distributions in the upper ocean. The goal of the current study was to determine the role of flow morphology in regulating particle capture within the houses of free-swimming appendicularians (Oikopleura dioica). We used two methods, standard Particle Image Velocimetry (PIV) with laser sheet illumination and bright field micro-PIV, to gain unprecedented spatial and temporal resolution of body kinematics and fluid motion through the mucous-mesh house. Analysis of small-scale fluid interactions at various parts of the house provided insight into factors that influence particle capture and selection in these important grazers.

  16. Automated monitoring and quantitative analysis of feeding behaviour in Drosophila.

    PubMed

    Itskov, Pavel M; Moreira, José-Maria; Vinnik, Ekaterina; Lopes, Gonçalo; Safarik, Steve; Dickinson, Michael H; Ribeiro, Carlos

    2014-08-04

    Food ingestion is one of the defining behaviours of all animals, but its quantification and analysis remain challenging. This is especially the case for feeding behaviour in small, genetically tractable animals such as Drosophila melanogaster. Here, we present a method based on capacitive measurements, which allows the detailed, automated and high-throughput quantification of feeding behaviour. Using this method, we were able to measure the volume ingested in single sips of an individual, and monitor the absorption of food with high temporal resolution. We demonstrate that flies ingest food by rhythmically extending their proboscis with a frequency that is not modulated by the internal state of the animal. Instead, hunger and satiety homeostatically modulate the microstructure of feeding. These results highlight similarities of food intake regulation between insects, rodents, and humans, pointing to a common strategy in how the nervous systems of different animals control food intake.
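    Counting discrete sips from a thresholded capacitance trace can be sketched as below. This is a toy detector with assumed parameter names (`threshold`, `min_gap`), not the published method, which relies on calibrated capacitive hardware and more careful signal conditioning.

```python
def count_sips(capacitance, threshold, min_gap=1):
    """Count sips as distinct runs of samples above threshold; runs
    separated by fewer than min_gap below-threshold samples are merged
    into a single sip (debouncing brief signal dropouts)."""
    sips = 0
    below = min_gap        # allow a sip at the very start of the trace
    in_sip = False
    for value in capacitance:
        if value > threshold:
            if not in_sip and below >= min_gap:
                sips += 1
            in_sip = True
            below = 0
        else:
            in_sip = False
            below += 1
    return sips
```

    With per-sip volumes calibrated separately, the sip count and the gaps between sips give the feeding microstructure that hunger and satiety modulate.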

  17. Automated monitoring and quantitative analysis of feeding behaviour in Drosophila

    PubMed Central

    Itskov, Pavel M.; Moreira, José-Maria; Vinnik, Ekaterina; Lopes, Gonçalo; Safarik, Steve; Dickinson, Michael H.; Ribeiro, Carlos

    2014-01-01

    Food ingestion is one of the defining behaviours of all animals, but its quantification and analysis remain challenging. This is especially the case for feeding behaviour in small, genetically tractable animals such as Drosophila melanogaster. Here, we present a method based on capacitive measurements, which allows the detailed, automated and high-throughput quantification of feeding behaviour. Using this method, we were able to measure the volume ingested in single sips of an individual, and monitor the absorption of food with high temporal resolution. We demonstrate that flies ingest food by rhythmically extending their proboscis with a frequency that is not modulated by the internal state of the animal. Instead, hunger and satiety homeostatically modulate the microstructure of feeding. These results highlight similarities of food intake regulation between insects, rodents, and humans, pointing to a common strategy in how the nervous systems of different animals control food intake. PMID:25087594

  18. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    PubMed Central

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-01-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first-principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interactions, clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  19. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
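
    A minimal numpy sketch of principal component regression, one of the two factor-analysis calibration methods named above, is given below. The synthetic "spectra", band shapes, and noise level are invented for illustration; a real calibration would use measured spectra and cross-validated factor selection.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration set: 30 "spectra" of 100 channels generated from an
# analyte band and an interferent band plus noise; y is the analyte level.
n, p = 30, 100
comp1 = np.exp(-((np.arange(p) - 30) ** 2) / 50.0)   # analyte band
comp2 = np.exp(-((np.arange(p) - 70) ** 2) / 80.0)   # interferent band
y = rng.uniform(0, 1, n)
interf = rng.uniform(0, 1, n)
X = np.outer(y, comp1) + np.outer(interf, comp2) + 0.01 * rng.standard_normal((n, p))

# Principal component regression: project mean-centred spectra onto the
# first k principal components, then regress y on the component scores.
def pcr_fit(X, y, k=2):
    x_mean, y_mean = X.mean(axis=0), y.mean()
    U, s, Vt = np.linalg.svd(X - x_mean, full_matrices=False)
    scores = U[:, :k] * s[:k]                 # sample scores on top-k PCs
    b = np.linalg.lstsq(scores, y - y_mean, rcond=None)[0]
    beta = Vt[:k].T @ b                       # regression vector in spectral space
    return x_mean, y_mean, beta

def pcr_predict(model, X_new):
    x_mean, y_mean, beta = model
    return y_mean + (X_new - x_mean) @ beta

model = pcr_fit(X, y, k=2)
y_hat = pcr_predict(model, X)
rmse = np.sqrt(np.mean((y_hat - y) ** 2))
```

Because the interferent varies independently of the analyte, a univariate calibration at any single channel would be biased; the two-factor model separates the overlapping bands, which is the core advantage the abstract attributes to multivariate calibration.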

  20. Quantitative analysis of live cells using digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Lewis, Tan Rongwei; Qu, Weijuan; Chee, Oi Choo; Singh, Vijay Raj; Asundi, Anand

    2010-03-01

    During its lifetime, a cell undergoes changes to the plasma membrane as well as its internal structures, which are especially distinctive during processes like cell division and death. Different types of microscopes are used to observe these variations. In our experiment, Vero cells have been investigated by using phase contrast microscopy and digital holographic microscopy (DHM). A comparison of the images obtained for cell division is presented here. The conventional phase contrast microscope provided a good imaging method in the real time analysis of cell division. The off-axis digital hologram recorded by the DHM system can be reconstructed to obtain both the intensity image and phase contrast image of the test object. These can be used for live cell imaging to provide multiple results from a single equipment setup. The DHM system, besides being a qualitative tool, is able to provide quantitative results and 3D images of the cell division process. The ability of DHM to provide quantitative analysis makes it an ideal tool for life science applications.
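
    The standard numerical route from an off-axis hologram to a quantitative phase image is Fourier demodulation: isolate one interference sideband, re-centre it, and take the phase of the inverse transform. The sketch below simulates this on an invented phase object and carrier frequency; it illustrates the principle, not the specific DHM system of the paper.

```python
import numpy as np

# Simulated off-axis hologram of a smooth "cell-like" phase bump (values
# illustrative): intensity = 2 + 2*cos(carrier + phase).
N = 128
y, x = np.mgrid[0:N, 0:N]
phase = 2.0 * np.exp(-((x - N/2)**2 + (y - N/2)**2) / (2 * 15.0**2))
carrier = 2 * np.pi * (16 * x / N)            # off-axis tilt: 16 fringes
hologram = 2 + 2 * np.cos(carrier + phase)

# Demodulation: keep a window around the +1 diffraction order in the
# Fourier plane, shift it to zero frequency, invert, and take the angle.
F = np.fft.fftshift(np.fft.fft2(hologram))
mask = np.zeros_like(F)
cx, cy = N//2 + 16, N//2                      # +1 order sits at the carrier bin
mask[cy-8:cy+8, cx-8:cx+8] = 1
sideband = np.roll(F * mask, -16, axis=1)     # re-centre the +1 order
recovered = np.angle(np.fft.ifft2(np.fft.ifftshift(sideband)))
```

The recovered array approximates the object phase in radians; for optically thicker objects the phase wraps modulo 2π and a separate unwrapping step is needed before converting to optical path length.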

  1. Quantitative analysis of live cells using digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Lewis, Tan Rongwei; Qu, Weijuan; Chee, Oi Choo; Singh, Vijay Raj; Asundi, Anand

    2009-12-01

    During its lifetime, a cell undergoes changes to the plasma membrane as well as its internal structures, which are especially distinctive during processes like cell division and death. Different types of microscopes are used to observe these variations. In our experiment, Vero cells have been investigated by using phase contrast microscopy and digital holographic microscopy (DHM). A comparison of the images obtained for cell division is presented here. The conventional phase contrast microscope provided a good imaging method in the real time analysis of cell division. The off-axis digital hologram recorded by the DHM system can be reconstructed to obtain both the intensity image and phase contrast image of the test object. These can be used for live cell imaging to provide multiple results from a single equipment setup. The DHM system, besides being a qualitative tool, is able to provide quantitative results and 3D images of the cell division process. The ability of DHM to provide quantitative analysis makes it an ideal tool for life science applications.

  2. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    DOE PAGES

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; ...

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  3. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    SciTech Connect

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  4. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis.

    PubMed

    Champain, Sabina; Mazel, Christian; Mitulescu, Anca; Skalli, Wafa

    2007-08-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to assess the value of quantified radiographic analysis of the lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon-Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantification of vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistics were performed to assess evolution for all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated with normal sagittal alignment; vertebral parameters revealed adjacent-level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion, which was confirmed in 80% of cases, doubtful in 16%, while pseudarthrosis seemed to occur in 4% (2) of cases. In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent-level degeneration and a significant correlation between clinical outcome and fusion. Furthermore, the criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem to be appropriate and in agreement with the surgeon's qualitative grading in 87% of cases.

  5. Quantitative analysis of Caenorhabditis elegans chemotaxis using a microfluidic device.

    PubMed

    Hu, Liang; Ye, Jinjuan; Tan, Haowei; Ge, Anle; Tang, Lichun; Feng, Xiaojun; Du, Wei; Liu, Bi-Feng

    2015-08-05

    Caenorhabditis elegans, one of the widely studied model organisms, sense external chemical cues and perform relative chemotaxis behaviors through its simple chemosensory neuronal system. To study the mechanism underlying chemosensory behavior, a rapid and reliable method for quantitatively analyzing the worms' behaviors is essential. In this work, we demonstrated a microfluidic approach for investigating chemotaxis responses of worms to chemical gradients. The flow-based microfluidic chip was consisted of circular tree-like microchannels, which was able to generate eight flow streams containing stepwise chemical concentrations without the difference in flow velocity. Worms' upstream swimming into microchannels with various concentrations was monitored for quantitative analysis of the chemotaxis behavior. By using this microfluidic chip, the attractive and repellent responses of C. elegans to NaCl were successfully quantified within several minutes. The results demonstrated the wild type-like repellent responses and severely impaired attractive responses in grk-2 mutant animals with defects in calcium influx. In addition, the chemotaxis analysis of the third stage larvae revealed that its gustatory response was different from that in the adult stage. Thus, our microfluidic method provided a useful platform for studying the chemosensory behaviors of C. elegans and screening of chemosensation-related chemical drugs.
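
    Assays of this kind are commonly summarized by a chemotaxis index of the form CI = (N_toward - N_away) / N_total. The function and the counts below are illustrative, not taken from the paper.

```python
# Hedged sketch of a chemotaxis index from channel-entry counts:
# worms entering attractant-side channels vs. control-side channels.
def chemotaxis_index(n_toward, n_away):
    total = n_toward + n_away
    if total == 0:
        return 0.0                      # no responders: undefined, report 0
    return (n_toward - n_away) / total  # +1 full attraction, -1 full repulsion

# Hypothetical counts: 40 worms enter NaCl-side channels, 10 enter buffer side.
ci_attract = chemotaxis_index(40, 10)   # positive: attraction
ci_repel = chemotaxis_index(8, 32)      # negative: repulsion
```

Normalizing by the total keeps the index comparable across runs with different numbers of responding animals, which is what makes within-minutes quantification across genotypes feasible.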

  6. Matching Pursuit Analysis of Auditory Receptive Fields' Spectro-Temporal Properties

    PubMed Central

    Bach, Jörg-Hendrik; Kollmeier, Birger; Anemüller, Jörn

    2017-01-01

    Gabor filters have long been proposed as models for spectro-temporal receptive fields (STRFs), with their specific spectral and temporal rate of modulation qualitatively replicating characteristics of STRF filters estimated from responses to auditory stimuli in physiological data. The present study builds on the Gabor-STRF model by proposing a methodology to quantitatively decompose STRFs into a set of optimally matched Gabor filters through matching pursuit, and by quantitatively evaluating spectral and temporal characteristics of STRFs in terms of the derived optimal Gabor-parameters. To summarize a neuron's spectro-temporal characteristics, we introduce a measure for the “diagonality,” i.e., the extent to which an STRF exhibits spectro-temporal transients which cannot be factorized into a product of a spectral and a temporal modulation. With this methodology, it is shown that approximately half of 52 analyzed zebra finch STRFs can each be well approximated by a single Gabor or a linear combination of two Gabor filters. Moreover, the dominant Gabor functions tend to be oriented either in the spectral or in the temporal direction, with truly “diagonal” Gabor functions rarely being necessary for reconstruction of an STRF's main characteristics. As a toy example for the applicability of STRF and Gabor-STRF filters to auditory detection tasks, we use STRF filters as features in an automatic event detection task and compare them to idealized Gabor filters and mel-frequency cepstral coefficients (MFCCs). STRFs classify a set of six everyday sounds with an accuracy similar to reference Gabor features (94% recognition rate). Spectro-temporal STRF and Gabor features outperform reference spectral MFCCs in quiet and in low noise conditions (down to 0 dB signal to noise ratio). PMID:28232791
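
    The decomposition strategy described, greedily matching an STRF against a dictionary of Gabor atoms, can be sketched in a few lines of numpy. The dictionary parameters, filter size, and toy "STRF" below are all illustrative and much smaller than in the study.

```python
import numpy as np

# Build a tiny dictionary of unit-norm 2D Gabor atoms over a spectro-temporal
# grid (spectral axis = rows, temporal axis = columns); parameters invented.
def gabor2d(shape, f_t, f_s, phase=0.0, sigma=2.5):
    t = np.arange(shape[1]) - shape[1] // 2
    s = np.arange(shape[0]) - shape[0] // 2
    S, T = np.meshgrid(s, t, indexing="ij")
    env = np.exp(-(S**2 + T**2) / (2 * sigma**2))
    g = env * np.cos(2 * np.pi * (f_t * T + f_s * S) + phase)
    return g / np.linalg.norm(g)

shape = (16, 16)
dictionary = [gabor2d(shape, ft, fs)
              for ft in (0.0, 0.1, 0.2)
              for fs in (0.0, 0.1, 0.2)
              if (ft, fs) != (0.0, 0.0)]

# Matching pursuit: repeatedly pick the atom most correlated with the
# residual and subtract its contribution.
def matching_pursuit(strf, atoms, n_iter=2):
    residual = strf.copy()
    chosen = []
    for _ in range(n_iter):
        coeffs = [np.sum(residual * a) for a in atoms]
        k = int(np.argmax(np.abs(coeffs)))
        chosen.append((k, coeffs[k]))
        residual = residual - coeffs[k] * atoms[k]
    return chosen, residual

# A toy "STRF" built from two dictionary atoms is recovered almost exactly.
strf = 1.5 * dictionary[0] + 0.7 * dictionary[5]
chosen, residual = matching_pursuit(strf, dictionary, n_iter=2)
recon_error = np.linalg.norm(residual) / np.linalg.norm(strf)
```

In the study's terms, the orientation parameters (f_t, f_s) of the selected atoms are what feed the "diagonality" measure: atoms with only one nonzero modulation rate are purely spectral or purely temporal, while jointly nonzero rates mark spectro-temporal transients.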

  7. Advances in Temporal Analysis in Learning and Instruction

    ERIC Educational Resources Information Center

    Molenaar, Inge

    2014-01-01

    This paper focuses on a trend to analyse temporal characteristics of constructs important to learning and instruction. Different researchers have indicated that we should pay more attention to time in our research to enhance explanatory power and increase validity. Constructs formerly viewed as personal traits, such as self-regulated learning and…

  8. Cubic map algebra functions for spatio-temporal analysis

    USGS Publications Warehouse

    Mennis, J.; Viger, R.; Tomlin, C.D.

    2005-01-01

    We propose an extension of map algebra to three dimensions for spatio-temporal data handling. This approach yields a new class of map algebra functions that we call "cube functions." Whereas conventional map algebra functions operate on data layers representing two-dimensional space, cube functions operate on data cubes representing two-dimensional space over a third-dimensional period of time. We describe the prototype implementation of a spatio-temporal data structure and selected cube function versions of conventional local, focal, and zonal map algebra functions. The utility of cube functions is demonstrated through a case study analyzing the spatio-temporal variability of remotely sensed, southeastern U.S. vegetation character over various land covers and during different El Niño/Southern Oscillation (ENSO) phases. Like conventional map algebra, the application of cube functions may demand significant data preprocessing when integrating diverse data sets, and is subject to limitations related to data storage and algorithm performance. Solutions to these issues include extending data compression and computing strategies for calculations on very large data volumes to spatio-temporal data handling.
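
    The local/focal/zonal taxonomy extends naturally to a (time, row, col) array. The sketch below shows one possible cube-function analogue of each class; the function names, cube contents, and zone mask are invented for illustration.

```python
import numpy as np

# Hypothetical data cube indexed as (time, row, col), e.g. 12 monthly
# vegetation-index grids over a 5x5 raster.
rng = np.random.default_rng(1)
cube = rng.uniform(0, 1, size=(12, 5, 5))

# Local cube function: per-cell operation down the time axis.
def local_mean(cube):
    return cube.mean(axis=0)                 # one 2-D layer of temporal means

# Focal cube function: 3x3 spatial neighbourhood mean within each time slice.
def focal_mean(cube):
    padded = np.pad(cube, ((0, 0), (1, 1), (1, 1)), mode="edge")
    out = np.zeros_like(cube)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            out += padded[:, 1 + dr:1 + dr + cube.shape[1],
                          1 + dc:1 + dc + cube.shape[2]]
    return out / 9.0

# Zonal cube function: mean within a zone (e.g. one land cover) per time step.
def zonal_mean(cube, zone_mask):
    return cube[:, zone_mask].mean(axis=1)   # one value per time step

zones = np.zeros((5, 5), dtype=bool)
zones[:2, :] = True                          # hypothetical land-cover zone
series = zonal_mean(cube, zones)
```

Note how each class reduces a different axis: local collapses time per cell, focal preserves the cube's shape while smoothing space, and zonal collapses space per time step into a time series suitable for comparison across ENSO phases.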

  9. Final Report for Dynamic Models for Causal Analysis of Panel Data. Methods for Temporal Analysis. Part I, Chapter 1.

    ERIC Educational Resources Information Center

    Hannan, Michael T.; Tuma, Nancy Brandon

    This document is part of a series of chapters described in SO 011 759. Working from the premise that temporal analysis is indispensable for the study of change, the document examines major alternatives in research design of this nature. Five sections focus on the features, advantages, and limitations of temporal analysis. Four designs which…

  10. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  11. 3D quantitative analysis of brain SPECT images

    NASA Astrophysics Data System (ADS)

    Loncaric, Sven; Ceskovic, Ivan; Petrovic, Ratimir; Loncaric, Srecko

    2001-07-01

    The main purpose of this work is to develop a computer-based technique for quantitative analysis of 3-D brain images obtained by single photon emission computed tomography (SPECT). In particular, the volume and location of the ischemic lesion and penumbra are important for early diagnosis and treatment of infarcted regions of the brain. SPECT imaging is typically used as a diagnostic tool to assess the size and location of the ischemic lesion. The segmentation method presented in this paper utilizes a 3-D deformable model in order to determine the size and location of the regions of interest. The evolution of the model is computed using a level-set implementation of the algorithm. In addition to the 3-D deformable model, the method utilizes edge detection and region growing for pre-processing. Initial experimental results have shown that the method is useful for SPECT image analysis.
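
    As a minimal sketch of the region-growing pre-processing step mentioned above: grow a region from a seed voxel, accepting 6-connected neighbours whose intensity lies within a tolerance of the seed. The toy volume, seed, and tolerance are invented; the paper's actual pipeline couples this with edge detection and a level-set deformable model.

```python
from collections import deque
import numpy as np

# Breadth-first region growing on a 3-D volume (6-connectivity).
def region_grow(vol, seed, tol=0.2):
    grown = np.zeros(vol.shape, dtype=bool)
    target = vol[seed]
    queue = deque([seed])
    grown[seed] = True
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            n = (z + dz, y + dy, x + dx)
            if all(0 <= n[i] < vol.shape[i] for i in range(3)) and not grown[n]:
                if abs(vol[n] - target) <= tol:   # intensity similar to seed
                    grown[n] = True
                    queue.append(n)
    return grown

# Toy volume: a bright 4x4x4 "lesion" inside a dark background.
vol = np.zeros((10, 10, 10))
vol[3:7, 3:7, 3:7] = 1.0
mask = region_grow(vol, seed=(5, 5, 5), tol=0.5)
```

The resulting binary mask can serve as an initialization for the level-set evolution, whose voxel count times the voxel volume gives the lesion volume estimate that the quantitative analysis is after.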

  12. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  13. [Quantitative analysis of transformer oil dissolved gases using FTIR].

    PubMed

    Zhao, An-xin; Tang, Xiao-jun; Wang, Er-zhen; Zhang, Zhong-hua; Liu, Jun-hua

    2013-09-01

    Because on-line monitoring of transformer dissolved gases by chromatography requires a carrier gas and regular calibration and offers limited safety, we attempted to establish a dissolved gas analysis (DGA) system based on Fourier transform infrared spectroscopy. Taking into account the small amounts of the characteristic gases, the many components involved, the detection limit and safety requirements, and the difficulty for the degasser of eliminating interfering gases, a quantitative analysis model was established based on sparse partial least squares, piecewise section correction, and a feature variable extraction algorithm using improved TR regularization. For the characteristic gases CH4, C2H6, C2H6, and CO2, the results show that FTIR meets DGA requirements with a spectral wavenumber resolution of 1 cm(-1) and an optical path of 10 cm.

  14. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    PubMed

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful to determine the type attribute of the object because it could present the content of the constituents. QPA by Rietveld method requires neither measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by Rietveld QPA method. The Rietveld refinement was accomplished with a material analysis using diffraction program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case which was broken successfully with the help of Rietveld QPA method was also introduced. This method will allow forensic investigators to acquire detailed information of the material evidence, which could point out the direction for case detection and court proceedings.
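
    Rietveld QPA converts refined scale factors into weight fractions via the Hill-Howard relation W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j, where S is the refined scale factor, Z the formula units per cell, M the formula mass, and V the unit-cell volume. The numbers below are illustrative placeholders, not values from the study.

```python
# Hill-Howard weight fractions from Rietveld scale factors.
def rietveld_weight_fractions(phases):
    # phases: {name: (scale factor S, Z, formula mass M, cell volume V)}
    szmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
    total = sum(szmv.values())
    return {name: val / total for name, val in szmv.items()}

phases = {
    # Hypothetical refinement of a potassium nitrate / sulfur mixture.
    "KNO3":   (1.15e-2, 4, 101.10, 437.0),
    "sulfur": (1.00e-4, 128, 32.06, 3299.0),
}
fractions = rietveld_weight_fractions(phases)
```

Because the fractions are ratios of refined quantities, the method needs no calibration data or internal standard, which is exactly the advantage the abstract highlights for forensic casework.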

  15. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.

  16. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  17. Leishmaniasis, conflict, and political terror: A spatio-temporal analysis.

    PubMed

    Berry, Isha; Berrang-Ford, Lea

    2016-10-01

    Leishmaniasis has been estimated to cause the ninth largest burden amongst global infectious diseases. Occurrence of the disease has been anecdotally associated with periods of conflict, leading to its referral as a disease of 'guerrilla warfare.' Despite this, there have been few studies that quantitatively investigate the extent to which leishmaniasis coincides with conflict or political terror. This study employed a longitudinal approach to empirically test for an association between cutaneous and visceral leishmaniasis incidence and the occurrence of conflict and political terror at the national level, annually for 15 years (1995-2010). Leishmaniasis incidence data were collected for 54 countries, and combined with UCDP/PRIO Armed Conflict and Amnesty International political terror datasets. Mixed effects negative binomial regression models clustered at the country level were constructed to evaluate the incidence rate ratios against the predictors, while controlling for wealth. Additionally, to understand how and why conflict-terror may be associated with leishmaniasis incidence, we conducted a historical analysis. We identified and discussed posited causal mechanisms in the literature, and critically assessed pathways by which leishmaniasis might occur in places and times of conflict-terror. There was a significant dose-response relationship for disease incidence based on increasing levels of conflict and terror. Country-years experiencing very high levels of conflict-terror were associated with a 2.38 times higher [95% CI: 1.40-4.05] and 6.02 times higher [95% CI: 2.39-15.15] incidence of cutaneous and visceral leishmaniasis, respectively. Historical analysis indicated that conflict and terror contribute to, or coincide with, leishmaniasis incidence through processes of population displacement and health system deterioration. This research highlights the potentially increased risks for cutaneous and visceral leishmaniasis incidence in areas of high conflict
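
    To make the reported effect measure concrete, the sketch below computes a crude incidence rate ratio with a Wald 95% confidence interval from aggregate counts. The counts and person-years are invented for illustration; the study itself fitted mixed-effects negative binomial models, which additionally adjust for wealth and country-level clustering.

```python
import math

# Crude IRR comparing case counts in "exposed" (high conflict-terror)
# vs. reference country-years, with a Wald interval on the log scale.
def incidence_rate_ratio(cases_exposed, py_exposed, cases_ref, py_ref):
    irr = (cases_exposed / py_exposed) / (cases_ref / py_ref)
    se_log = math.sqrt(1 / cases_exposed + 1 / cases_ref)  # Poisson counts
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, (lo, hi)

# Hypothetical aggregate: 120 cases per 1M person-years vs. 50 per 1M.
irr, ci = incidence_rate_ratio(cases_exposed=120, py_exposed=1.0e6,
                               cases_ref=50, py_ref=1.0e6)
```

An IRR of 2.4 here would read the same way as the paper's 2.38 for cutaneous leishmaniasis: the exposed stratum has roughly 2.4 times the incidence rate of the reference stratum.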

  18. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    NASA Astrophysics Data System (ADS)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high or low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results indicate an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia we discuss our results within the western Mediterranean framework, aiming to contribute to the understanding of its tectonic context. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area in contrast with the adjacent parts of the Nubia-Eurasia boundary is due to its extended

  19. Quick 96FASP for high throughput quantitative proteome analysis.

    PubMed

    Yu, Yanbao; Bekele, Shiferaw; Pieper, Rembert

    2017-08-23

    Filter aided sample preparation (FASP) is becoming a central method for proteomic sample cleanup and peptide generation prior to LC-MS analysis. We previously adapted this method to a 96-well filter plate, and applied it to prepare protein digests from cell lysate and body fluid samples in a high throughput quantitative manner. While the 96FASP approach is scalable and can handle multiple samples simultaneously, two key advantages over single FASP, it is also time-consuming. The centrifugation-based liquid transfer on the filter plate takes 3-5 times longer than on a single filter. To address this limitation, we now present a quick 96FASP (named q96FASP) approach that, relying on the use of filter membranes with a large MWCO size (~30kDa), significantly reduces centrifugal times. We show that q96FASP allows the generation of protein digests derived from whole cell lysates and body fluids in a quality similar to that of the single FASP method. Processing a sample in multiple wells in parallel, we observed excellent experimental repeatability by a label-free quantitation approach. We conclude that the q96FASP approach promises to be a cost- and time-effective method for shotgun proteomics and will be particularly useful in large scale biomarker discovery studies. High throughput sample processing is of particular interest for quantitative proteomics. The previously developed 96FASP is high throughput and appealing; however, it is time-consuming in the context of centrifugation-based liquid transfer (~1.5h per spin). This study presents a truly high throughput sample preparation method based on a large cut-off 96-well filter plate, which shortens the spin time to ~20min. To our knowledge, this is the first multi-well method that is entirely comparable with conventional FASP. This study thoroughly examined two types of filter plates and performed side-by-side comparisons with single FASP. Two types of samples, whole cell lysate of a UTI (urinary tract infection

  20. The workflow for quantitative proteome analysis of chloroplast development and differentiation, chloroplast mutants, and protein interactions by spectral counting.

    PubMed

    Friso, Giulia; Olinares, Paul Dominic B; van Wijk, Klaas J

    2011-01-01

    This chapter outlines a quantitative proteomics workflow using a label-free spectral counting technique. The workflow has been tested on different aspects of chloroplast biology in maize and Arabidopsis, including chloroplast mutant analysis, cell-type specific chloroplast differentiation, and the proplastid-to-chloroplast transition. The workflow involves one-dimensional SDS-PAGE of the proteomes of leaves or chloroplast subfractions, tryptic digestions, online LC-MS/MS using a mass spectrometer with high mass accuracy and duty cycle, followed by semiautomatic data processing. The bioinformatics analysis can effectively select best gene models and deals with quantification of closely related proteins; the workflow avoids overidentification of proteins and results in more accurate protein quantification. The final output includes pairwise comparative quantitative analysis, as well as hierarchical clustering for discovery of temporal and spatial patterns of protein accumulation. A brief discussion about potential pitfalls, as well as the advantages and disadvantages of spectral counting, is provided.
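
    A common way to turn spectral counts into relative abundances, consistent with the label-free counting strategy described, is the normalized spectral abundance factor, NSAF_i = (SpC_i / L_i) / Σ_j (SpC_j / L_j), which corrects for the fact that longer proteins yield more peptides. The protein table below is invented for illustration and is not from the chapter.

```python
# Label-free quantification by spectral counting (NSAF variant).
def nsaf(proteins):
    # proteins: {name: (spectral count SpC, protein length L in residues)}
    saf = {p: spc / length for p, (spc, length) in proteins.items()}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}

proteins = {
    # Hypothetical chloroplast proteins: (spectral count, length).
    "RbcL": (120, 479),
    "PsbA": (45, 353),
    "AtpB": (60, 498),
}
abundances = nsaf(proteins)
```

Because NSAF values sum to one within a run, they can be compared across samples, which is what enables the pairwise comparisons and hierarchical clustering of accumulation patterns described above.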

  1. Temporal proteomic analysis of HIV infection reveals remodelling of the host phosphoproteome by lentiviral Vif variants

    PubMed Central

    Greenwood, Edward JD; Matheson, Nicholas J; Wals, Kim; van den Boomen, Dick JH; Antrobus, Robin; Williamson, James C; Lehner, Paul J

    2016-01-01

    Viruses manipulate host factors to enhance their replication and evade cellular restriction. We used multiplex tandem mass tag (TMT)-based whole cell proteomics to perform a comprehensive time course analysis of >6500 viral and cellular proteins during HIV infection. To enable specific functional predictions, we categorized cellular proteins regulated by HIV according to their patterns of temporal expression. We focussed on proteins depleted with similar kinetics to APOBEC3C, and found the viral accessory protein Vif to be necessary and sufficient for CUL5-dependent proteasomal degradation of all members of the B56 family of regulatory subunits of the key cellular phosphatase PP2A (PPP2R5A-E). Quantitative phosphoproteomic analysis of HIV-infected cells confirmed Vif-dependent hyperphosphorylation of >200 cellular proteins, particularly substrates of the aurora kinases. The ability of Vif to target PPP2R5 subunits is found in primate and non-primate lentiviral lineages, and remodeling of the cellular phosphoproteome is therefore a second ancient and conserved Vif function. DOI: http://dx.doi.org/10.7554/eLife.18296.001 PMID:27690223
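    The categorization step described above, grouping cellular proteins by their patterns of temporal expression and searching for proteins depleted with kinetics similar to a known target, can be illustrated with a simple profile-correlation toy. This is not the authors' TMT pipeline; the protein names (other than PPP2R5A, taken from the abstract), profiles, and threshold are all illustrative.

```python
# Toy sketch: flag proteins whose time-course profiles correlate strongly
# with a reference depletion profile (e.g. a known Vif target).
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

reference = [1.0, 0.8, 0.4, 0.2, 0.1]           # depletion kinetics (toy data)
profiles = {
    "PPP2R5A": [1.0, 0.7, 0.5, 0.2, 0.15],      # co-depleted with reference
    "stableX": [1.0, 1.02, 0.98, 1.01, 0.99],   # unchanged during infection
}
codepleted = [p for p, v in profiles.items() if pearson(v, reference) > 0.9]
```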

  2. Quantitative analysis of cyclic beta-turn models.

    PubMed Central

    Perczel, A.; Fasman, G. D.

    1992-01-01

    The beta-turn is a frequently found structural unit in the conformation of globular proteins. Although the circular dichroism (CD) spectra of the alpha-helix and beta-pleated sheet are well defined, there remains some ambiguity concerning the pure component CD spectra of the different types of beta-turns. Recently, it has been reported (Hollósi, M., Kövér, K.E., Holly, S., Radics, L., & Fasman, G.D., 1987, Biopolymers 26, 1527-1572; Perczel, A., Hollósi, M., Foxman, B.M., & Fasman, G.D., 1991a, J. Am. Chem. Soc. 113, 9772-9784) that some pseudohexapeptides (e.g., the cyclo[(delta)Ava-Gly-Pro-Aaa-Gly] where Aaa = Ser, Ser(OtBu), or Gly) in many solvents adopt a conformational mixture of type I and the type II beta-turns, although the X-ray-determined conformation was an ideal type I beta-turn. In addition to these pseudohexapeptides, conformational analysis was also carried out on three pseudotetrapeptides and three pseudooctapeptides. The target of the conformation analysis reported herein was to determine whether the ring stress of the above beta-turn models has an influence on their conformational properties. Quantitative nuclear Overhauser effect (NOE) measurements yielded interproton distances. The conformational average distances so obtained were interpreted utilizing molecular dynamics (MD) simulations to yield the conformational percentages. These conformational ratios were correlated with the conformational weights obtained by quantitative CD analysis of the same compounds. The pure component CD curves of type I and type II beta-turns were also obtained, using a recently developed algorithm (Perczel, A., Tusnády, G., Hollósi, M., & Fasman, G.D., 1991b, Protein Eng. 4(6), 669-679). For the first time the results of a CD deconvolution, based on the CD spectra of 14 beta-turn models, were assigned by quantitative NOE results. The NOE experiments confirmed the ratios of the component curves found for the two major beta-turns by CD analysis. These results

  3. Quantitative analysis of cyclic beta-turn models.

    PubMed

    Perczel, A; Fasman, G D

    1992-03-01

    The beta-turn is a frequently found structural unit in the conformation of globular proteins. Although the circular dichroism (CD) spectra of the alpha-helix and beta-pleated sheet are well defined, there remains some ambiguity concerning the pure component CD spectra of the different types of beta-turns. Recently, it has been reported (Hollósi, M., Kövér, K.E., Holly, S., Radics, L., & Fasman, G.D., 1987, Biopolymers 26, 1527-1572; Perczel, A., Hollósi, M., Foxman, B.M., & Fasman, G.D., 1991a, J. Am. Chem. Soc. 113, 9772-9784) that some pseudohexapeptides (e.g., the cyclo[(delta)Ava-Gly-Pro-Aaa-Gly] where Aaa = Ser, Ser(OtBu), or Gly) in many solvents adopt a conformational mixture of type I and the type II beta-turns, although the X-ray-determined conformation was an ideal type I beta-turn. In addition to these pseudohexapeptides, conformational analysis was also carried out on three pseudotetrapeptides and three pseudooctapeptides. The target of the conformation analysis reported herein was to determine whether the ring stress of the above beta-turn models has an influence on their conformational properties. Quantitative nuclear Overhauser effect (NOE) measurements yielded interproton distances. The conformational average distances so obtained were interpreted utilizing molecular dynamics (MD) simulations to yield the conformational percentages. These conformational ratios were correlated with the conformational weights obtained by quantitative CD analysis of the same compounds. The pure component CD curves of type I and type II beta-turns were also obtained, using a recently developed algorithm (Perczel, A., Tusnády, G., Hollósi, M., & Fasman, G.D., 1991b, Protein Eng. 4(6), 669-679). For the first time the results of a CD deconvolution, based on the CD spectra of 14 beta-turn models, were assigned by quantitative NOE results. The NOE experiments confirmed the ratios of the component curves found for the two major beta-turns by CD analysis. These results
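    The deconvolution idea underlying the CD analysis, expressing a measured spectrum as a weighted sum of pure component curves and solving for the weights, can be sketched with a two-component least-squares fit. This is a generic illustration, not the cited algorithm; the component curves and mixing weights below are invented.

```python
# Two-component spectral deconvolution by ordinary least squares
# (solving the 2x2 normal equations directly).
def deconvolve2(measured, comp1, comp2):
    a11 = sum(c * c for c in comp1)
    a22 = sum(c * c for c in comp2)
    a12 = sum(x * y for x, y in zip(comp1, comp2))
    b1 = sum(m * c for m, c in zip(measured, comp1))
    b2 = sum(m * c for m, c in zip(measured, comp2))
    det = a11 * a22 - a12 * a12
    w1 = (b1 * a22 - b2 * a12) / det
    w2 = (a11 * b2 - a12 * b1) / det
    return w1, w2

type1 = [-3.0, -1.0, 2.0, 4.0]   # toy "pure type I" curve
type2 = [1.0, 3.0, -2.0, -1.0]   # toy "pure type II" curve
mix = [0.7 * a + 0.3 * b for a, b in zip(type1, type2)]
w1, w2 = deconvolve2(mix, type1, type2)
```

With a noise-free mixture the weights are recovered exactly; real CD deconvolution works on many wavelengths and more components, but the linear-algebra core is the same.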

  4. Quantitative image analysis in sonograms of the thyroid gland

    NASA Astrophysics Data System (ADS)

    Catherine, Skouroliakou; Maria, Lyra; Aristides, Antoniou; Lambros, Vlahos

    2006-12-01

    High-resolution, real-time ultrasound is a routine examination for assessing disorders of the thyroid gland. However, current diagnostic practice is based mainly on qualitative evaluation of the resulting sonograms and therefore depends on the physician's experience. Computerized texture analysis is widely employed on sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each of these matrices are contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the image characteristics. Most of the retained components depend mainly on correlation at very short or very long separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
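    The co-occurrence features named in the abstract can be computed from a gray-level co-occurrence matrix (GLCM) for one separation vector. The sketch below shows contrast, energy, and homogeneity on a tiny quantized image; it is a minimal illustration of the Haralick approach, not the study's implementation (which uses 52 separation vectors and adds correlation).

```python
# Build a normalized, symmetric GLCM for offset (dx, dy) and derive
# three Haralick-style texture features from it.
def glcm_features(img, dx, dy, levels):
    P = [[0.0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    n = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                P[img[y][x]][img[y2][x2]] += 1   # forward direction
                P[img[y2][x2]][img[y][x]] += 1   # symmetric counterpart
                n += 2
    contrast = energy = homogeneity = 0.0
    for i in range(levels):
        for j in range(levels):
            p = P[i][j] / n
            contrast += (i - j) ** 2 * p
            energy += p * p
            homogeneity += p / (1 + abs(i - j))
    return contrast, energy, homogeneity

img = [[0, 0, 1, 1],        # toy 4-level quantized image patch
       [0, 0, 1, 1],
       [0, 2, 2, 2],
       [2, 2, 3, 3]]
c, e, hgy = glcm_features(img, dx=1, dy=0, levels=4)
```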

  5. Phenotypic analysis of Arabidopsis mutants: quantitative analysis of root growth.

    PubMed

    Doerner, Peter

    2008-03-01

    INTRODUCTION: The growth of plant roots is very easy to measure and is particularly straightforward in Arabidopsis thaliana, because the increase in organ size is essentially restricted to one dimension. The precise measurement of root apical growth can be used to accurately determine growth activity (the rate of growth at a given time) during development in mutants, transgenic backgrounds, or in response to experimental treatments. Root growth is measured in a number of ways, the simplest of which is to grow the seedlings in a Petri dish and record the position of the advancing root tip at appropriate time points. The increase in root length is measured with a ruler and the data are entered into Microsoft Excel for analysis. When dealing with large numbers of seedlings, however, this procedure can be tedious, as well as inaccurate. An alternative approach, described in this protocol, uses "snapshots" of the growing plants, which are taken using gel-documentation equipment (i.e., a video camera with a frame-grabber unit, now commonly used to capture images from ethidium-bromide-stained electrophoresis gels). The images are analyzed using publicly available software (NIH-Image), which allows the user simply to cut and paste data into Microsoft Excel.
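    Once tip positions have been extracted from the snapshots, converting them to growth rates is a one-liner per interval. The sketch below is a generic post-processing step, not part of the NIH-Image workflow; the times and lengths are invented example measurements.

```python
# Average growth rate (mm/h) in each interval between successive
# root-length measurements taken from image snapshots.
def growth_rates(times_h, lengths_mm):
    pairs = zip(zip(times_h, lengths_mm), zip(times_h[1:], lengths_mm[1:]))
    return [(l1 - l0) / (t1 - t0) for (t0, l0), (t1, l1) in pairs]

times = [0, 24, 48, 72]             # hours after transfer to plates
lengths = [2.0, 6.8, 12.1, 17.9]    # root length (mm) measured from images
rates = growth_rates(times, lengths)
```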

  6. Quantitative FRET Analysis by Fast Acquisition Time Domain FLIM at High Spatial Resolution in Living Cells

    PubMed Central

    Padilla-Parra, Sergi; Audugé, Nicolas; Coppey-Moisan, Maïté; Tramier, Marc

    2008-01-01

    Quantitative analysis in Förster resonance energy transfer (FRET) experiments in live cells for protein interaction studies is still a challenging issue. In a two-component system (FRET and no FRET donor species), fitting of fluorescence lifetime imaging microscopy (FLIM) data gives the fraction of donor molecules involved in FRET (fD) and the intrinsic transfer efficiency. But when fast FLIM acquisitions are used to monitor dynamic changes in protein-protein interactions at high spatial and temporal resolutions in living cells, photon statistics and time resolution are limited. In this case, fitting procedures are not reliable, even for single lifetime donors. We introduce the new concept of a minimal fraction of donor molecules involved in FRET (mfD), coming from the mathematical minimization of fD. We find particular advantage in the use of mfD because it can be obtained without fitting procedures and it is derived directly from FLIM data. mfD constitutes an interesting quantitative parameter for live cell studies because it is related to the minimal relative concentration of interacting proteins. For multi-lifetime donors, the process of fitting complex fluorescence decays to find at least four reliable lifetimes is a near impossible task. Here, mfD extension for multi-lifetime donors is the only quantitative determinant. We applied this methodology for imaging the interaction between the bromodomains of TAFII250 and acetylated histones H4 in living cells at high resolution. We show the existence of discrete acetylated chromatin domains where the minimal fraction of bromodomain interacting with acetylated H4 oscillates from 0.26 to 0.36 and whose size is smaller than half of one micron cube. We demonstrate that mfD by itself is a useful tool to investigate quantitatively protein interactions in live cells, especially when using fast FRET-FLIM acquisition times. PMID:18539634
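    For intuition about the two-component picture above: if the donor-alone and donor-in-FRET lifetimes are known, the amplitude-weighted mean lifetime fixes the FRET fraction via <tau> = fD*tau_FRET + (1 - fD)*tau_D. This simplified inversion is not the paper's mfD (which is a fit-free lower bound obtained by minimization); the lifetimes below are toy values.

```python
# Two-component FRET bookkeeping: recover the fraction of donors in FRET
# from the mean lifetime, and the transfer efficiency from the lifetimes.
def fret_fraction(tau_mean, tau_donor, tau_fret):
    return (tau_donor - tau_mean) / (tau_donor - tau_fret)

def fret_efficiency(tau_donor, tau_fret):
    return 1.0 - tau_fret / tau_donor

tau_d, tau_f = 2.5, 1.0   # ns: donor alone vs donor undergoing FRET (toy)
fd = fret_fraction(tau_mean=2.05, tau_donor=tau_d, tau_fret=tau_f)
E = fret_efficiency(tau_d, tau_f)
```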

  7. Geovisualization Approaches for Spatio-temporal Crime Scene Analysis - Towards 4D Crime Mapping

    NASA Astrophysics Data System (ADS)

    Wolff, Markus; Asche, Hartmut

    This paper presents a set of methods and techniques for the analysis and multidimensional visualisation of crime scenes in a German city. As a first step the approach implies spatio-temporal analysis of crime scenes. Against this background a GIS-based application is developed that facilitates discovering initial trends in spatio-temporal crime scene distributions even for a GIS-untrained user. Based on these results further spatio-temporal analysis is conducted to detect variations of certain hotspots in space and time. In a next step these findings of crime scene analysis are integrated into a geovirtual environment. In this context the concept of the space-time cube is adopted to allow for visual analysis of repeat burglary victimisation. Since these procedures require incorporating temporal elements into virtual 3D environments, basic methods for 4D crime scene visualisation are outlined in this paper.
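    The discrete analogue of a space-time cube is a coarse (x, y, t) grid: binning events into cells makes repeat victimisation show up as cells with multiple hits. The sketch below is a generic illustration of that idea, not the paper's GIS application; coordinates, cell size, and period are invented.

```python
# Bin crime events into spatio-temporal cells and flag cells with
# repeat events (candidate hotspots / repeat victimisation).
from collections import Counter

def spacetime_cells(events, cell_m, period_days):
    """events: (x_m, y_m, day) tuples -> Counter over (ix, iy, it) cells."""
    return Counter((int(x // cell_m), int(y // cell_m), int(t // period_days))
                   for x, y, t in events)

events = [(120, 340, 3), (130, 360, 5), (900, 100, 4), (125, 350, 40)]
cells = spacetime_cells(events, cell_m=250, period_days=30)
hotspots = [c for c, n in cells.items() if n >= 2]
```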

  8. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which is obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristic and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/).
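    The motivating observation, that a protein quantified with many peptides carries more reproducible evidence than one quantified with few, is easy to see in simulation. This toy is only an illustration of that point, not EBprot's hierarchical model; all parameters below are invented.

```python
# Simulate protein log-ratios summarized from different numbers of
# noisy peptide-level ratios and compare their spread.
import random

def protein_log_ratio(n_peptides, true_log_ratio, noise_sd, rng):
    ratios = [rng.gauss(true_log_ratio, noise_sd) for _ in range(n_peptides)]
    return sum(ratios) / n_peptides

def rms(xs):
    return (sum(x * x for x in xs) / len(xs)) ** 0.5

rng = random.Random(0)
few = [protein_log_ratio(2, 0.0, 0.5, rng) for _ in range(2000)]
many = [protein_log_ratio(10, 0.0, 0.5, rng) for _ in range(2000)]
```

Proteins summarized from 10 peptides scatter far less around the true ratio than those summarized from 2, which is exactly the reproducibility information a ratio-only analysis discards.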

  9. In vivo osteogenesis assay: a rapid method for quantitative analysis.

    PubMed

    Dennis, J E; Konstantakos, E K; Arm, D; Caplan, A I

    1998-08-01

    A quantitative in vivo osteogenesis assay is a useful tool for the analysis of cells and bioactive factors that affect the amount or rate of bone formation. There are currently two assays in general use for the in vivo assessment of osteogenesis by isolated cells: diffusion chambers and porous calcium phosphate ceramics. Due to the relative ease of specimen preparation and reproducibility of results, the porous ceramic assay was chosen for the development of a rapid method for quantitating in vivo bone formation. The ceramic cube implantation technique consists of combining osteogenic cells with 27-mm3 porous calcium phosphate ceramics, implanting the cell-ceramic composites subcutaneously into an immuno-tolerant host, and, after 2-6 weeks, harvesting and preparing the ceramic implants for histologic analysis. A drawback to the analysis of bone formation within these porous ceramics is that the entire cube must be examined to find small foci of bone present in some samples; a single cross-sectional area is not representative. For this reason, image analysis of serial sections from ceramics is often prohibitively time-consuming. Two alternative scoring methodologies were tested and compared to bone volume measurements obtained by image analysis. The two subjective scoring methods were: (1) Bone Scale: the amount of bone within pores of the ceramic implant is estimated on a scale of 0-4 based on the degree of bone fill (0=no bone, 1=up to 25%, 2=25 to 75%, 4=75 to 100% fill); and (2) Percentage Bone: the amount of bone is estimated by determining the percentage of ceramic pores which contain bone. Every tenth section of serially sectioned cubes was scored by each of these methods under double-blind conditions, and the Bone Scale and Percentage Bone results were directly compared to image analysis measurements from identical samples. Correlation coefficients indicate that the Percentage Bone method was more accurate than the Bone Scale scoring method. The Bone Scale
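    The Percentage Bone score and its validation against image analysis reduce to a pore count and a correlation coefficient. The sketch below illustrates that calculation on invented data for three cubes; it is not the study's measurement pipeline.

```python
# Percentage Bone score per implant, plus a Pearson correlation against
# image-analysis bone volumes from the same cubes.
def percentage_bone(pores_with_bone, pores_total):
    return 100.0 * pores_with_bone / pores_total

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# toy data: (pores containing bone, pores counted) per implanted cube
scores = [percentage_bone(b, t) for b, t in [(2, 40), (10, 38), (25, 42)]]
volumes = [0.4, 2.2, 5.9]   # bone volume (mm^3) from image analysis
r = pearson(scores, volumes)
```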

  10. Functional linear models for association analysis of quantitative traits.

    PubMed

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study.
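    The core construction, expanding the genetic effect function in a small basis and fitting the reduced model, can be shown in a deliberately tiny form. This toy uses only a constant-plus-linear basis and ordinary least squares with no covariates or test statistic, so it is a caricature of the fixed effect functional linear model, not the paper's method; genotypes, positions, and effects are invented.

```python
# Toy fixed-effect functional linear model: beta(s) ~ theta0*1 + theta1*s,
# so the n x m genotype matrix G is projected onto a 2-column reduced
# design Z = G @ B, and theta is fit by solving the 2x2 normal equations.
def fit_flm(G, positions, y):
    Z = [[sum(g), sum(gj * s for gj, s in zip(g, positions))] for g in G]
    a11 = sum(z[0] * z[0] for z in Z)
    a12 = sum(z[0] * z[1] for z in Z)
    a22 = sum(z[1] * z[1] for z in Z)
    b1 = sum(z[0] * yi for z, yi in zip(Z, y))
    b2 = sum(z[1] * yi for z, yi in zip(Z, y))
    det = a11 * a22 - a12 * a12
    theta = ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a12 * b1) / det)
    fitted = [theta[0] * z[0] + theta[1] * z[1] for z in Z]
    return theta, fitted

G = [[0, 1, 2], [2, 0, 1], [1, 2, 0], [2, 2, 1]]   # variant counts, 4 people
positions = [0.1, 0.5, 0.9]                        # variant positions in [0, 1]
beta_true = [1 + s for s in positions]             # effect function beta(s) = 1 + s
y = [sum(gj * b for gj, b in zip(g, beta_true)) for g in G]
theta, fitted = fit_flm(G, positions, y)
```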

  11. From screening to quantitative sensitivity analysis. A unified approach

    NASA Astrophysics Data System (ADS)

    Campolongo, Francesca; Saltelli, Andrea; Cariboni, Jessica

    2011-04-01

    The present work is a sequel to a recent one published in this journal, where the superiority of the 'radial design' for computing the 'total sensitivity index' was ascertained. Both concepts belong to sensitivity analysis of model output. A radial design is one whereby, starting from a random point in the hyperspace of the input factors, one step in turn is taken for each factor. The procedure is iterated a number of times with a different starting random point so as to collect a sample of elementary shifts for each factor. The total sensitivity index is a powerful sensitivity measure which can be estimated based on such a sample. Given the similarity between the total sensitivity index and a screening test known as the method of the elementary effects (or method of Morris), we test the radial design on this method. Both methods are best practices: the total sensitivity index in the class of quantitative measures and the elementary effects in that of screening methods. We find that the radial design is indeed superior even for the computation of the elementary effects method. This opens the door to a sensitivity analysis strategy whereby the analyst can start with a small number of points (screening-wise) and then, depending on the results, possibly increase the number of points up to computing a fully quantitative measure. Also of interest to practitioners is that a radial design is nothing other than an iterated 'One factor At a Time' (OAT) approach: OAT is a radial design of size one. While OAT is not a good practice, modelers in all domains keep using it for sensitivity analysis for reasons discussed elsewhere (Saltelli and Annoni, 2010) [23]. With the present approach modelers are offered a straightforward and economical upgrade of their OAT which maintains OAT's appeal of moving just one factor at each step.
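    The elementary-effects procedure on a radial design can be sketched in a few lines: from each random base point, step one factor at a time and average the absolute normalized effects (the mu* measure). This is a minimal illustration of the method of Morris, not the paper's implementation; the test model and step size are invented.

```python
# Elementary effects (Morris screening) on a radial design: r base
# points, one step per factor from each, mu* = mean absolute effect.
import random

def elementary_effects(model, k, r, delta=0.1, seed=1):
    rng = random.Random(seed)
    effects = [[] for _ in range(k)]
    for _ in range(r):
        base = [rng.random() * (1 - delta) for _ in range(k)]  # keep steps in [0, 1]
        f0 = model(base)
        for i in range(k):
            stepped = list(base)
            stepped[i] += delta
            effects[i].append(abs(model(stepped) - f0) / delta)
    return [sum(e) / r for e in effects]   # mu* per factor

# toy model: factor 0 dominates, factor 2 is inert
mu_star = elementary_effects(lambda x: 10 * x[0] + x[1] + 0 * x[2], k=3, r=8)
```

For a linear model the elementary effects are exact, so mu* recovers the coefficients; screening then correctly ranks factor 0 first and flags factor 2 as negligible.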

  12. Functional Linear Models for Association Analysis of Quantitative Traits

    PubMed Central

    Fan, Ruzong; Wang, Yifan; Mills, James L.; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao

    2014-01-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. PMID:24130119

  13. Quantitative Analysis of Single-Molecule RNA-Protein Interaction

    PubMed Central

    Fuhrmann, Alexander; Schoening, Jan C.; Anselmetti, Dario; Staiger, Dorothee; Ros, Robert

    2009-01-01

    Abstract RNA-binding proteins impact gene expression at the posttranscriptional level by interacting with cognate cis elements within the transcripts. Here, we apply dynamic single-molecule force spectroscopy to study the interaction of the Arabidopsis glycine-rich RNA-binding protein AtGRP8 with its RNA target. A dwell-time-dependent analysis of the single-molecule data in combination with competition assays and site-directed mutagenesis of both the RNA target and the RNA-binding domain of the protein allowed us to distinguish and quantify two different binding modes. For dwell times <0.21 s an unspecific complex with a lifetime of 0.56 s is observed, whereas dwell times >0.33 s result in a specific interaction with a lifetime of 208 s. The corresponding reaction lengths are 0.28 nm for the unspecific and 0.55 nm for the specific AtGRP8-RNA interactions, indicating formation of a tighter complex with increasing dwell time. These two binding modes cannot be dissected in ensemble experiments. Quantitative titration in RNA bandshift experiments yields an ensemble-averaged equilibrium constant of dissociation of KD = 2 × 10^−7 M. Assuming comparable on-rates for the specific and nonspecific binding modes allows us to estimate their free energies as ΔG° = −42 kJ/mol and ΔG° = −28 kJ/mol for the specific and nonspecific binding modes, respectively. Thus, we show that single-molecule force spectroscopy with a refined statistical analysis is a potent tool for the analysis of protein-RNA interactions without the drawback of ensemble averaging. This makes it possible to discriminate between different binding modes or sites and to analyze them quantitatively. We propose that this method could be applied to complex interactions of biomolecules in general, and be of particular interest for the investigation of multivalent binding reactions. PMID:19527663
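    As a sanity check on the ensemble numbers, the standard relation ΔG° = RT ln(KD) (1 M standard state) converts the bandshift KD directly into a free energy. Assuming T = 298 K, the ensemble value lands at roughly −38 kJ/mol, between the −42 and −28 kJ/mol estimated for the specific and nonspecific modes, which is what one expects for an average over a mixture of the two.

```python
# Standard-state free energy of dissociation from the ensemble KD,
# deltaG0 = R * T * ln(KD / c0) with c0 = 1 M.
import math

R = 8.314      # J/(mol*K)
T = 298.0      # K (assumed room temperature)
KD = 2e-7      # M, ensemble-averaged dissociation constant from the bandshift
dG_kJ = R * T * math.log(KD) / 1000.0
```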

  14. QuASAR: quantitative allele-specific analysis of reads.

    PubMed

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. http://github.com/piquelab/QuASAR. fluca@wayne.edu or rpique@wayne.edu Supplementary Material is available at Bioinformatics online.
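    The baseline test mentioned above, counting reads per allele at a heterozygous site and testing the 1:1 null, is a plain binomial test. The sketch below shows that baseline only; QuASAR's contribution is precisely what this omits (genotype uncertainty, base-call error, and over-dispersion parameters). Read counts are invented.

```python
# Exact two-sided binomial test of a 1:1 allelic ratio at one site:
# sum the probabilities of all outcomes at most as likely as the
# observed one under Binomial(n, 0.5).
from math import comb

def binom_two_sided_p(ref_reads, alt_reads, p=0.5):
    n = ref_reads + alt_reads
    pmf = [comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]
    observed = pmf[ref_reads]
    return min(1.0, sum(q for q in pmf if q <= observed + 1e-12))

p_balanced = binom_two_sided_p(9, 11)   # near 1:1 -> not significant
p_skewed = binom_two_sided_p(18, 2)     # strong allelic imbalance
```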

  15. QuASAR: quantitative allele-specific analysis of reads

    PubMed Central

    Harvey, Chris T.; Moyerbrailean, Gregory A.; Davis, Gordon O.; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Motivation: Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. Results: We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu Supplementary information: Supplementary Material is available at Bioinformatics online. PMID:25480375

  16. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai Automotive Industry Cluster before (2001-2002) and after the CDA (2008-2009). The methodology adopted is collection of primary data from 100 ACI using a quantitative questionnaire, analyzed using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in a closer form. The KWT shows no significant difference between the three location clusters with respect to Net Profit, Production Cost, Marketing Costs, Procurement Costs and Gross Output, supporting the view that each location has contributed uniformly to the development of the automotive component cluster. The FMT shows no significant difference between industrial units with respect to costs such as Production, Infrastructure, Technology, Marketing and Net Profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity
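    The Kruskal-Wallis comparison across the three location clusters can be illustrated with the H statistic computed from pooled ranks. This is a generic sketch that ignores tie correction, not the study's analysis; the net-profit samples are invented, chosen so that, as in the abstract, H stays below the 5% chi-square critical value (5.99 for 2 degrees of freedom).

```python
# Kruskal-Wallis H = 12 / (N(N+1)) * sum_i n_i * rbar_i^2 - 3(N+1),
# computed from pooled ranks (ties broken arbitrarily for simplicity).
def kruskal_h(groups):
    pooled = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    ranks = [[] for _ in groups]
    for rank, (_, gi) in enumerate(pooled, start=1):
        ranks[gi].append(rank)
    n = len(pooled)
    s = sum(len(r) * (sum(r) / len(r)) ** 2 for r in ranks)
    return 12.0 / (n * (n + 1)) * s - 3 * (n + 1)

# toy net-profit samples, one list per industrial estate
ambattur = [12, 15, 14, 13]
thirumalizai = [13, 16, 12, 15]
thirumudivakkam = [14, 12, 16, 13]
H = kruskal_h([ambattur, thirumalizai, thirumudivakkam])
```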

  17. Hydrodynamics Flow and Transport Characterization of a Karstified Physical Model Using Temporal Moment Analysis

    NASA Astrophysics Data System (ADS)

    Anaya, A. A.; Padilla, I. Y.

    2013-12-01

    High productivity of karst groundwater systems is often associated with conduit flow and high matrix permeability. Spatial heterogeneities and anisotropy, among other factors, result in highly complex flow patterns in these systems. The same characteristics that make these aquifers very productive also make them highly vulnerable to contamination and a likely route for contaminant exposure. Understanding contaminant fate and transport processes in these complex aquifers demands different statistical and numerical approaches, such as Temporal Moment Analysis (TMA). TMA of solute breakthrough curves provides qualitative and quantitative results to characterize the hydrodynamic variables that affect the release, mobility, persistence, and possible pathways of contaminants in karst groundwater systems. The general objective of this work is to characterize flow and transport processes in conduit- and diffusion-dominated flow under low and high flow conditions using TMA in a karstified physical model. A multidimensional, laboratory-scale Geo-Hydrobed model (GHM) containing a karstified limestone block collected from the karst aquifer formation of northern Puerto Rico is used for this purpose. Experimental work entails injecting dissolved CaCl2 and trichloroethene (TCE) at the upstream boundary of the GHM while monitoring their concentrations spatially and temporally in the limestone under different groundwater flow regimes. Results from the TMA show a highly heterogeneous system with large preferential flow components and specific mass-transfer-limited zones, especially in diffuse flow areas. Flow variables such as velocity and Reynolds number indicate well-defined preferential flow paths that expand spatially as the flow rate increases. TMA results are qualitatively consistent with a previous novel statistical approach developed using mixed models. Comparison between the dissolved CaCl2 tracer and TCE shows implications for reactive contaminants in the karst
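    Temporal moment analysis of a breakthrough curve C(t) reduces to a few integrals: the zeroth moment (recovered mass proxy), the normalized first moment (mean arrival time), and the second central moment (spreading). The sketch below shows that calculation on an invented curve; it is the generic TMA recipe, not the study's processing code.

```python
# Temporal moments of a breakthrough curve by trapezoidal integration:
# M0 = int C dt, mean arrival = int t*C dt / M0, variance = 2nd central moment.
def trapz(y, x):
    return sum((x1 - x0) * (y0 + y1) / 2.0
               for x0, x1, y0, y1 in zip(x, x[1:], y, y[1:]))

def temporal_moments(t, c):
    m0 = trapz(c, t)
    mean_t = trapz([ci * ti for ci, ti in zip(c, t)], t) / m0
    var_t = trapz([ci * ti ** 2 for ci, ti in zip(c, t)], t) / m0 - mean_t ** 2
    return m0, mean_t, var_t

t = [0, 1, 2, 3, 4, 5, 6]                  # hours since injection (toy)
c = [0.0, 0.2, 1.0, 0.8, 0.4, 0.1, 0.0]   # tracer concentration at the outlet
m0, mean_t, var_t = temporal_moments(t, c)
```

A mass-transfer-limited zone shows up in these numbers as a later mean arrival time and a larger temporal variance (tailing) for the same injected mass.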

  18. Genetic architecture and temporal patterns of biomass accumulation in spring barley revealed by image analysis.

    PubMed

    Neumann, Kerstin; Zhao, Yusheng; Chu, Jianting; Keilwagen, Jens; Reif, Jochen C; Kilian, Benjamin; Graner, Andreas

    2017-08-10

    Genetic mapping of phenotypic traits generally focuses on a single time point, but biomass accumulates continuously during plant development. Resolution of the temporal dynamics that affect biomass recently became feasible using non-destructive imaging. With the aim to identify key genetic factors for vegetative biomass formation from the seedling stage to flowering, we explored growth over time in a diverse collection of two-rowed spring barley accessions. High heritabilities facilitated the temporal analysis of trait relationships and identification of quantitative trait loci (QTL). Biomass QTL tended to persist only a short period during early growth. More persistent QTL were detected around the booting stage. We identified seven major biomass QTL, which together explain 55% of the genetic variance at the seedling stage, and 43% at the booting stage. Three biomass QTL co-located with genes or QTL involved in phenology. The most important locus for biomass was independent from phenology and is located on chromosome 7HL at 141 cM. This locus explained ~20% of the genetic variance, was significant over a long period of time and co-located with HvDIM, a gene involved in brassinosteroid synthesis. Biomass is a dynamic trait and is therefore orchestrated by different QTL during early and late growth stages. Marker-assisted selection for high biomass at booting stage is most effective by also including favorable alleles from seedling biomass QTL. Selection for dynamic QTL may enhance genetic gain for complex traits such as biomass or, in the future, even grain yield.

  19. Temporal coherence and attention in auditory scene analysis.

    PubMed

    Shamma, Shihab A; Elhilali, Mounya; Micheyl, Christophe

    2011-03-01

    Humans and other animals can attend to one of multiple sounds and follow it selectively over time. The neural underpinnings of this perceptual feat remain mysterious. Some studies have concluded that sounds are heard as separate streams when they activate well-separated populations of central auditory neurons, and that this process is largely pre-attentive. Here, we argue instead that stream formation depends primarily on temporal coherence between responses that encode various features of a sound source. Furthermore, we postulate that only when attention is directed towards a particular feature (e.g. pitch) do all other temporally coherent features of that source (e.g. timbre and location) become bound together as a stream that is segregated from the incoherent features of other sources.

  20. Local Spatio-Temporal Analysis in Vision Systems

    DTIC Science & Technology

    1994-07-31

    stereopsis. (b) To determine the relevant spatio-temporal contrast information for the sensory/motor systems involved in vergence eye movements ... models of stereopsis. Progress on objectives (a) and (b) is described here; progress on objectives (c) and (d) is described under Aim 4. During the past ... stereopsis (Cormack). In this study Cormack is measuring processing asymmetries in the disparity domain. While one of the original theories about the

  1. Multi-Temporal Analysis of Landsat Imagery for Bathymetry.

    DTIC Science & Technology

    1983-05-01

    assembled for calibration areas 3A, 3B, 3C, and 3D (see Figure 1, page 10). Each of these data sets consisted of the average ship-measured depth over ... mathematical details of the parameter adjustment procedure were developed. Second, a multi-temporal data set was assembled from six previously processed ... surface and atmospheric effects can lead to significant errors in this term. The extinction coefficient could be highly variable both spatially and

  2. Quantitative Analysis of Peripheral Tissue Perfusion Using Spatiotemporal Molecular Dynamics

    PubMed Central

    Lee, Jungsul; Koh, Gou Young; Kwon, Kihwan; Choi, Chulhee

    2009-01-01

    Background: Accurate measurement of peripheral tissue perfusion is challenging but necessary to diagnose peripheral vascular insufficiency. Because near infrared (NIR) radiation can penetrate relatively deep into tissue, significant attention has been given to intravital NIR fluorescence imaging. Methodology/Principal Findings: We developed a new optical imaging-based strategy for quantitative measurement of peripheral tissue perfusion by time-series analysis of local pharmacokinetics of the NIR fluorophore, indocyanine green (ICG). Time-series NIR fluorescence images were obtained after injecting ICG intravenously in a murine hindlimb ischemia model. Mathematical modeling and computational simulations were used for translating time-series ICG images into quantitative pixel perfusion rates and a perfusion map. We could successfully predict the prognosis of ischemic hindlimbs based on the perfusion profiles obtained immediately after surgery, which were dependent on the preexisting collaterals. This method also reflected increases in perfusion and improvements in prognosis of ischemic hindlimbs induced by treatment with vascular endothelial growth factor and COMP-angiopoietin-1. Conclusions/Significance: We propose that this novel NIR-imaging-based strategy is a powerful tool for biomedical studies related to the evaluation of therapeutic interventions directed at stimulating angiogenesis. PMID:19169354

  3. Quantitative analysis of incipient mineral loss in hard tissues

    NASA Astrophysics Data System (ADS)

    Matvienko, Anna; Mandelis, Andreas; Hellen, Adam; Jeon, Raymond; Abrams, Stephen; Amaechi, Bennett

    2009-02-01

    A coupled diffuse-photon-density-wave and thermal-wave theoretical model was developed to describe the biothermophotonic phenomena in multi-layered hard tissue structures. Photothermal Radiometry was applied as a safe, non-destructive, and highly sensitive tool for the detection of early tooth enamel demineralization to test the theory. An extracted human tooth was treated sequentially with an artificial demineralization gel to simulate controlled mineral loss in the enamel. The experimental setup included a semiconductor laser (659 nm, 120 mW) as the source of the photothermal signal. Modulated laser light generated infrared blackbody radiation from teeth upon absorption and nonradiative energy conversion. The infrared flux emitted by the treated region of the tooth surface and sub-surface was monitored with an infrared detector, both before and after treatment. Frequency scans with a laser beam size of 3 mm were performed in order to guarantee one-dimensionality of the photothermal field. Transverse microradiography (TMR) images showed clear differences between sound and demineralized enamel; however, this technique is destructive. Dental radiographs did not indicate any changes. The photothermal signal showed clear change even after 1 min of gel treatment. As a result of the fittings, thermal and optical properties of sound and demineralized enamel were obtained, which allowed for quantitative differentiation of healthy and non-healthy regions. In conclusion, the developed model was shown to be a promising tool for non-invasive quantitative analysis of early demineralization of hard tissues.

  4. Quantitative analysis of tumor burden in mouse lung via MRI.

    PubMed

    Tidwell, Vanessa K; Garbow, Joel R; Krupnick, Alexander S; Engelbach, John A; Nehorai, Arye

    2012-02-01

    Lung cancer is the leading cause of cancer death in the United States. Despite recent advances in screening protocols, the majority of patients still present with advanced or disseminated disease. Preclinical rodent models provide a unique opportunity to test novel therapeutic drugs for targeting lung cancer. Respiratory-gated MRI is a key tool for quantitatively measuring lung-tumor burden and monitoring the time-course progression of individual tumors in mouse models of primary and metastatic lung cancer. However, quantitative analysis of lung-tumor burden in mice by MRI presents significant challenges. Herein, a method for measuring tumor burden based upon average lung-image intensity is described and validated. The method requires accurate lung segmentation; its efficiency and throughput would be greatly aided by the ability to automatically segment the lungs. A technique for automated lung segmentation in the presence of varying tumor burden levels is presented. The method includes development of a new, two-dimensional parametric model of the mouse lungs and a multi-faceted cost function to optimally fit the model parameters to each image. Results demonstrate a strong correlation (0.93), comparable with that of fully manual expert segmentation, between the automated method's tumor-burden metric and the tumor burden measured by lung weight.
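The tumor-burden metric described, the average image intensity within the segmented lung, reduces to a mean over a mask; a minimal sketch with a hypothetical 2-D slice, segmentation, and intensity values:

```python
import numpy as np

def tumor_burden_metric(image, lung_mask):
    # Mean intensity inside the segmented lung region. Tumour tissue
    # appears bright against air-filled lung in gated MRI, so the
    # mean rises with tumour burden.
    return float(image[lung_mask].mean())

# Toy 2-D slice: uniform dark lung field with one bright lesion
img = np.full((64, 64), 10.0)
mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 16:48] = True          # hypothetical lung segmentation
img[30:34, 30:34] = 200.0          # hypothetical tumour pixels
burden = tumor_burden_metric(img, mask)   # rises above the 10.0 baseline
```

The paper's contribution is the automated segmentation (the parametric lung model and cost function) that supplies `lung_mask` without manual tracing; the metric itself is this simple.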

  5. Advance in orientation microscopy: quantitative analysis of nanocrystalline structures.

    PubMed

    Seyring, Martin; Song, Xiaoyan; Rettenmayr, Markus

    2011-04-26

    The special properties of nanocrystalline materials are generally accepted to be a consequence of the high density of planar defects (grain and twin boundaries) and their characteristics. However, until now, nanograin structures have not been characterized with similar detail and statistical relevance as coarse-grained materials, due to the lack of an appropriate method. In the present paper, a novel method based on quantitative nanobeam diffraction in transmission electron microscopy (TEM) is presented to determine the misorientation of adjacent nanograins and subgrains. Spatial resolution of <5 nm can be achieved. This method is applicable to characterize orientation relationships in wire, film, and bulk materials with nanocrystalline structures. As a model material, nanocrystalline Cu is used. Several important features of the nanograin structure are discovered utilizing quantitative analysis: the fraction of twin boundaries is substantially higher than that observed in bright-field images in the TEM; small angle grain boundaries are prominent; there is an obvious dependence of the grain boundary characteristics on grain size distribution and mean grain size.

  6. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  7. Quantitative colorimetric-imaging analysis of nickel in iron meteorites.

    PubMed

    Zamora, L Lahuerta; López, P Alemán; Fos, G M Antón; Algarra, R Martín; Romero, A M Mellado; Calatayud, J Martínez

    2011-02-15

    A quantitative analytical imaging approach for determining the nickel content of metallic meteorites is proposed. The approach uses a digital image of a series of standard solutions of the nickel-dimethylglyoxime coloured chelate and a meteorite sample solution subjected to the same treatment as the nickel standards for quantitation. The image is processed with suitable software to assign a colour-dependent numerical value (analytical signal) to each standard. Such a value is directly proportional to the analyte concentration, which facilitates construction of a calibration graph where the value for the unknown sample can be interpolated to calculate the nickel content of the meteorite. The results thus obtained were validated by comparison with the official, ISO-endorsed spectrophotometric method for nickel. The proposed method is fairly simple and inexpensive; in fact, it uses a commercially available digital camera as measuring instrument and the images it provides are processed with highly user-friendly public domain software (specifically, ImageJ, developed by the National Institutes of Health and freely available for download on the Internet). In a scenario dominated by increasingly sophisticated and expensive equipment, the proposed method provides a cost-effective alternative based on simple, robust hardware that is affordable and can be readily accessed worldwide. This can be especially advantageous for countries where available resources for analytical equipment investments are scant. The proposed method is essentially an adaptation of classical chemical analysis to current, straightforward, robust, cost-effective instrumentation. Copyright © 2010 Elsevier B.V. All rights reserved.
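The calibration-graph step is ordinary linear regression followed by inverse interpolation; a sketch with hypothetical signal values (the actual method derives the analytical signal from the image's colour channels in ImageJ):

```python
import numpy as np

# Hypothetical image-derived analytical signals for five standards of
# the Ni-dimethylglyoxime chelate (concentrations in mg/L)
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
signal = np.array([0.5, 20.6, 40.4, 60.7, 80.3])

slope, intercept = np.polyfit(conc, signal, 1)   # linear calibration graph

def nickel_concentration(sample_signal):
    # Interpolate the unknown sample's signal on the calibration line.
    return (sample_signal - intercept) / slope

ni = nickel_concentration(50.2)   # meteorite sample solution signal
```

Any colour metric that is linear in chelate concentration would work the same way; only the numbers above are invented for illustration.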

  8. Multi-Temporal Analysis of Landscapes and Urban Areas

    NASA Astrophysics Data System (ADS)

    Nocerino, E.; Menna, F.; Remondino, F.

    2012-07-01

    This article presents a 4D modelling approach that employs multi-temporal and historical aerial images to derive spatio-temporal information for scenes and landscapes. Such imagery represents a unique data source which, combined with photo interpretation and reality-based 3D reconstruction techniques, can offer a more complete modelling procedure because it adds the fourth dimension of time to 3D geometrical representation and thus allows urban planners, historians, and others to identify, describe, and analyse changes in individual scenes and buildings as well as across landscapes. Particularly important to this approach are historical aerial photos, which provide data about the past that can be collected, processed, and then integrated as a database. The proposed methodology employs both historical (1945) and more recent (1973 and 2000s) aerial images from the Trentino region in North-eastern Italy in order to create a multi-temporal database of information to assist researchers in many disciplines such as topographic mapping, geology, geography, architecture, and archaeology as they work to reconstruct building phases and to understand landscape transformations (Fig. 1).

  9. Quantitative three-dimensional holographic interferometry for flow field analysis

    NASA Astrophysics Data System (ADS)

    Holden, C. M. E.; Parker, S. C. J.; Bryanston-Cross, P. J.

    Holographic interferometry offers the potential for quantitative, wholefield analysis of three-dimensional compressible flows. The technique is non-intrusive, does not require the introduction of seeding particles, and records the entire flow information within the pulse duration of a Q-switched ruby laser (~30ns). At present, however, holographic interferometry is mainly used qualitatively due to the practical restrictions of data recording, acquisition and processing. To address the potential of holographic flow analysis a prototype multi-channel interferometer has been designed and preliminary wind tunnel results have been obtained. The proposed configuration uses specular illumination which, unlike comparable diffuse systems, does not suffer from fringe localisation and speckle noise. Beam collimation and steering through the flow field is achieved in a single operation by the use of holographic optical elements (HOEs). The resulting design is compact, light efficient, has aberration compensation, and the recorded data are conducive to both tomographic analysis and direct comparison to computational fluid dynamics (CFD) predictions. Holograms have been recorded of simple two-dimensional and axisymmetric compressible flows, to compare the accuracy of holographic density measurements with data from conventional pressure sensors and CFD codes. Data extraction from the holograms, and the elimination of rigid body motion, was achieved using digital Fourier transform fringe analysis. The introduction of phase errors by image processing has been investigated by analysing simulated fringe patterns generated from a combination of experimental amplitude information and computer generated phase data.

  10. Multipoint quantitative-trait linkage analysis in general pedigrees.

    PubMed Central

    Almasy, L; Blangero, J

    1998-01-01

    Multipoint linkage analysis of quantitative-trait loci (QTLs) has previously been restricted to sibships and small pedigrees. In this article, we show how variance-component linkage methods can be used in pedigrees of arbitrary size and complexity, and we develop a general framework for multipoint identity-by-descent (IBD) probability calculations. We extend the sib-pair multipoint mapping approach of Fulker et al. to general relative pairs. This multipoint IBD method uses the proportion of alleles shared identical by descent at genotyped loci to estimate IBD sharing at arbitrary points along a chromosome for each relative pair. We have derived correlations in IBD sharing as a function of chromosomal distance for relative pairs in general pedigrees and provide a simple framework whereby these correlations can be easily obtained for any relative pair related by a single line of descent or by multiple independent lines of descent. Once calculated, the multipoint relative-pair IBDs can be utilized in variance-component linkage analysis, which considers the likelihood of the entire pedigree jointly. Examples are given that use simulated data, demonstrating both the accuracy of QTL localization and the increase in power provided by multipoint analysis with 5-, 10-, and 20-cM marker maps. The general pedigree variance component and IBD estimation methods have been implemented in the SOLAR (Sequential Oligogenic Linkage Analysis Routines) computer package. PMID:9545414

  11. Quantitative multi-image analysis for biomedical Raman spectroscopic imaging.

    PubMed

    Hedegaard, Martin A B; Bergholt, Mads S; Stevens, Molly M

    2016-05-01

    Imaging by Raman spectroscopy enables unparalleled label-free insights into cell and tissue composition at the molecular level. With established approaches limited to single image analysis, there are currently no general guidelines or consensus on how to quantify biochemical components across multiple Raman images. Here, we describe a broadly applicable methodology for the combination of multiple Raman images into a single image for analysis. This is achieved by removing image specific background interference, unfolding the series of Raman images into a single dataset, and normalisation of each Raman spectrum to render comparable Raman images. Multivariate image analysis is finally applied to derive the contributing 'pure' biochemical spectra for relative quantification. We present our methodology using four independently measured Raman images of control cells and four images of cells treated with strontium ions from substituted bioactive glass. We show that the relative biochemical distribution per area of the cells can be quantified. In addition, using k-means clustering, we are able to discriminate between the two cell types over multiple Raman images. This study shows a streamlined quantitative multi-image analysis tool for improving cell/tissue characterisation and opens new avenues in biomedical Raman spectroscopic imaging. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
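The unfold-normalise-cluster workflow can be sketched as below. The background estimate (per-wavenumber minimum), the synthetic spectra, and the bare-bones k-means are illustrative assumptions; the paper's pipeline uses more careful background removal and full multivariate analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

def preprocess(images):
    # Unfold several (pixels x wavenumbers) Raman images into one matrix:
    # remove an image-specific background (here crudely, the per-wavenumber
    # minimum) and vector-normalise each spectrum so images are comparable.
    stacked = []
    for img in images:
        img = img - img.min(axis=0)
        stacked.append(img / np.linalg.norm(img, axis=1, keepdims=True))
    return np.vstack(stacked)

def kmeans(X, k, iters=50):
    # Minimal Lloyd's-algorithm k-means for cluster imaging.
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Two synthetic "images": 100 spectra x 50 wavenumbers each, half drawn
# from profile p1 and half from p2, plus a little noise.
wn = np.linspace(0.0, 3.0, 50)
p1, p2 = np.sin(wn) + 1.0, np.cos(wn) + 1.0
images = [np.vstack([np.tile(p1, (50, 1)), np.tile(p2, (50, 1))])
          + 0.01 * rng.standard_normal((100, 50)) for _ in range(2)]
X = preprocess(images)
labels = kmeans(X, 2)   # the two spectral types separate across both images
```

The point of the normalisation step is exactly what the abstract describes: once each spectrum is rescaled, pixels from independently measured images can be pooled into one dataset and clustered together.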

  12. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
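The performance criterion, the mean square error between the original and reconstructed functions, can be evaluated numerically. This sketch compares linear and nearest-neighbour reconstruction of a band-limited test signal; the paper's interpolants (Hermite and nu splines, cubic convolution, etc.) and its frequency-domain formulation are more general:

```python
import numpy as np

def reconstruction_mse(interp, t_dense, f_dense, t_samp, f_samp):
    # Mean square error between the original function and its
    # reconstruction from samples -- the performance criterion above.
    return float(np.mean((f_dense - interp(t_dense, t_samp, f_samp)) ** 2))

def linear(t, ts, fs):
    return np.interp(t, ts, fs)

def nearest(t, ts, fs):
    return fs[np.abs(t[:, None] - ts[None, :]).argmin(axis=1)]

t_dense = np.linspace(0.0, 1.0, 2001)
f_dense = np.sin(4 * np.pi * t_dense)     # band-limited test signal
t_samp = np.linspace(0.0, 1.0, 21)        # ~10 samples per period
f_samp = np.sin(4 * np.pi * t_samp)

mse_lin = reconstruction_mse(linear, t_dense, f_dense, t_samp, f_samp)
mse_nn = reconstruction_mse(nearest, t_dense, f_dense, t_samp, f_samp)
# the smoother interpolant yields the lower mean square error
```

Ranking candidate interpolants by this error on representative signals is the "rational basis for selecting an optimal interpolant" the abstract refers to.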

  13. Quantitative multielement analysis using high energy particle bombardment

    NASA Technical Reports Server (NTRS)

    Clark, P. J.; Neal, G. F.; Allen, R. O.

    1974-01-01

    Charged particles ranging in energy from 0.8 to 4.0 MeV are used to induce resonant nuclear reactions, Coulomb excitation (gamma X-rays), and X-ray emission in both thick and thin targets. Quantitative analysis is possible for elements from Li to Pb in complex environmental samples, although the matrix can severely reduce the sensitivity. It is necessary to use a comparator technique for the gamma-rays, while for X-rays an internal standard can be used. A USGS standard rock is analyzed for a total of 28 elements. Water samples can be analyzed either by nebulizing the sample doped with Cs or Y onto a thin formvar film or by extracting the sample (with or without an internal standard) onto ion exchange resin which is pressed into a pellet.

  14. Quantitative analysis of creatinine in urine by metalized nanostructured parylene

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Malvadkar, Niranjan; Koytek, S.; Bylander, J.; Reeves, W. Brian; Demirel, Melik C.

    2010-03-01

    A highly accurate, real-time multisensor agent monitor for biomarker detection is required for early detection of kidney diseases. Urine creatinine level can provide useful information on the status of the kidney. We prepare nanostructured surface-enhanced Raman spectroscopy (SERS) substrates without template or lithography, which provides controllable, well-organized nanostructures on the surface, for the quantitative analysis of creatinine concentration in urine. We present our work on sensitivity of the SERS substrate to urine samples collected from diabetic patients and healthy persons. We report the preparation of a new type of SERS substrate, which provides fast (<10 s), highly sensitive (creatinine concentration <0.5 μg/mL) and reproducible (<5% variation) detection of urine. Our method to analyze the creatinine level in urine is in good agreement with the enzymatic method.

  15. Quantitative multielement analysis using high energy particle bombardment

    NASA Technical Reports Server (NTRS)

    Clark, P. J.; Neal, G. F.; Allen, R. O.

    1975-01-01

    Charged particles ranging in energy from 0.8 to 4.0 MeV are used to induce resonant nuclear reactions, Coulomb excitation (gamma-rays), and X-ray emission in both thick and thin targets. Quantitative analysis is possible for elements from Li to Pb in complex environmental samples, although the matrix can severely reduce the sensitivity. It is necessary to use a comparator technique for the gamma-rays, while for X-rays an internal standard can be used. A USGS standard rock is analyzed for a total of 28 elements. Water samples can be analyzed either by nebulizing the sample doped with Cs or Y onto a thin Formvar film or by extracting the sample onto ion exchange resin which is pressed into a pellet.

  16. Large-Scale Quantitative Analysis of Painting Arts

    NASA Astrophysics Data System (ADS)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.

  17. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877

  18. qfasar: quantitative fatty acid signature analysis with R

    USGS Publications Warehouse

    Bromaghin, Jeffrey

    2017-01-01

    Knowledge of predator diets provides essential insights into their ecology, yet diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) is a popular method of estimating diet composition that continues to be investigated and extended. However, software to implement QFASA has only recently become publicly available. I summarize a new R package, qfasar, for diet estimation using QFASA methods. The package also provides functionality to evaluate and potentially improve the performance of a library of prey signature data, compute goodness-of-fit diagnostics, and support simulation-based research. Several procedures in the package have not previously been published. qfasar makes traditional and recently published QFASA diet estimation methods accessible to ecologists for the first time. Use of the package is illustrated with signature data from Chukchi Sea polar bears and potential prey species.

  19. Automatic analysis of quantitative NMR data of pharmaceutical compound libraries.

    PubMed

    Liu, Xuejun; Kolpak, Michael X; Wu, Jiejun; Leo, Gregory C

    2012-08-07

    In drug discovery, chemical library compounds are usually dissolved in DMSO at a certain concentration and then distributed to biologists for target screening. Quantitative ¹H NMR (qNMR) is the preferred method for determining the actual concentrations of compounds because the relative single-proton peak areas of two chemical species represent the relative molar concentrations of the two compounds, that is, the compound of interest and a calibrant. Thus, an analyte concentration can be determined using a calibration compound at a known concentration. One particularly time-consuming step in the qNMR analysis of compound libraries is the manual integration of peaks. This report presents an automated method for performing this task without prior knowledge of compound structures, using an external calibration spectrum. The script for automated integration is fast and adaptable to large-scale data sets, eliminating the need for manual integration in ~80% of the cases.
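The single-proton-area ratio underlying qNMR reduces to one formula; a minimal sketch with hypothetical peak areas, proton counts, and calibrant concentration:

```python
def qnmr_concentration(area_analyte, n_h_analyte, area_cal, n_h_cal, conc_cal):
    # C_analyte = C_cal * (A_analyte / n_analyte) / (A_cal / n_cal):
    # the ratio of single-proton peak areas scales the known calibrant
    # concentration to give the analyte concentration.
    return conc_cal * (area_analyte / n_h_analyte) / (area_cal / n_h_cal)

# Hypothetical integrals: an analyte CH3 peak (3 protons) against a
# calibrant peak of 2 protons at a known 10.0 mM
conc = qnmr_concentration(4.5, 3, 2.0, 2, 10.0)   # -> 15.0 mM
```

The automated part of the paper's method is obtaining `area_analyte` without manual peak integration; the concentration arithmetic itself is this ratio.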

  20. Quantitative Image Analysis of HIV-1 Infection in Lymphoid Tissue

    NASA Astrophysics Data System (ADS)

    Haase, Ashley T.; Henry, Keith; Zupancic, Mary; Sedgewick, Gerald; Faust, Russell A.; Melroe, Holly; Cavert, Winston; Gebhard, Kristin; Staskus, Katherine; Zhang, Zhi-Qiang; Dailey, Peter J.; Balfour, Henry H., Jr.; Erice, Alejo; Perelson, Alan S.

    1996-11-01

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment.

  1. Cerebellar dyssynergia in humans--a quantitative analysis.

    PubMed

    Miller, R G; Freund, H J

    1980-12-01

    Patients with cerebellar lesions and limb ataxia performed two types of continuous tracking tasks involving flexion and extension of the index finger. In both tasks, patients were provided cutaneous and proprioceptive cues, but visual feedback was given in the first task (visual tracking) and not in the second (arbitrarily termed proprioceptive tracking). Raw records and Fourier-analyzed power spectra were compared with results in normal controls. Harmonic distortion was determined for each task. In all patients, as well as normal subjects, tracking performance was markedly improved and harmonic distortion substantially reduced during proprioceptive tracking. This surprising finding may result from a much shorter feedback loop for proprioceptive stimuli compared to visual stimuli. The tracking records, power spectra analysis, and determination of harmonic distortion provide both qualitative and quantitative data in patients with dyssynergia.
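Harmonic distortion of a tracking record can be estimated from its Fourier spectrum as the RMS of the harmonic amplitudes relative to the fundamental; this sketch uses a synthetic signal, not the patients' tracking data:

```python
import numpy as np

def harmonic_distortion(signal, fs, f0, n_harmonics=5):
    # RMS amplitude of harmonics 2..n relative to the fundamental,
    # read off the FFT magnitude spectrum at multiples of f0.
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    def amp(f):
        return spec[np.argmin(np.abs(freqs - f))]
    harmonics = np.sqrt(sum(amp(k * f0) ** 2
                            for k in range(2, n_harmonics + 1)))
    return harmonics / amp(f0)

fs, f0 = 1000, 5                       # sampling rate and target frequency (Hz)
t = np.arange(2000) / fs               # 2 s record, integer number of cycles
clean = np.sin(2 * np.pi * f0 * t)     # perfect tracking of the target
distorted = clean + 0.2 * np.sin(2 * np.pi * 2 * f0 * t)
hd = harmonic_distortion(distorted, fs, f0)   # -> 0.2
```

A cleaner tracking record concentrates power at the target frequency, so its distortion value falls, mirroring the reduction the authors report during proprioceptive tracking.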

  2. Quantitative image analysis for investigating cell-matrix interactions

    NASA Astrophysics Data System (ADS)

    Burkel, Brian; Notbohm, Jacob

    2017-07-01

    The extracellular matrix provides both chemical and physical cues that control cellular processes such as migration, division, differentiation, and cancer progression. Cells can mechanically alter the matrix by applying forces that result in matrix displacements, which in turn may localize to form dense bands along which cells may migrate. To quantify the displacements, we use confocal microscopy and fluorescent labeling to acquire high-contrast images of the fibrous material. Using a technique for quantitative image analysis called digital volume correlation, we then compute the matrix displacements. Our experimental technology offers a means to quantify matrix mechanics and cell-matrix interactions. We are now using these experimental tools to modulate mechanical properties of the matrix to study cell contraction and migration.
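Digital volume correlation recovers displacements from the cross-correlation peak between image regions; a minimal one-dimensional sketch (real DVC correlates 3-D subvolumes and adds subpixel refinement):

```python
import numpy as np

def displacement_1d(ref, deformed):
    # Integer-pixel displacement from the cross-correlation peak of two
    # mean-removed intensity profiles -- the core idea behind digital
    # volume correlation, here reduced to 1-D.
    ref0 = ref - ref.mean()
    def0 = deformed - deformed.mean()
    corr = np.correlate(def0, ref0, mode="full")
    return int(corr.argmax() - (len(ref0) - 1))

rng = np.random.default_rng(1)
profile = rng.random(200)               # synthetic fibre texture
shifted = np.roll(profile, 7)           # matrix displaced by 7 pixels
dx = displacement_1d(profile, shifted)  # -> 7
```

Applied over a grid of subvolumes in the confocal stacks, the same peak-finding yields the full matrix displacement field the authors describe.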

  3. Quantitative microstructure analysis of polymer-modified mortars.

    PubMed

    Jenni, A; Herwegh, M; Zurbriggen, R; Aberle, T; Holzer, L

    2003-11-01

    Digital light, fluorescence and electron microscopy in combination with wavelength-dispersive spectroscopy were used to visualize individual polymers, air voids, cement phases and filler minerals in a polymer-modified cementitious tile adhesive. In order to investigate the evolution and processes involved in formation of the mortar microstructure, quantifications of the phase distribution in the mortar were performed including phase-specific imaging and digital image analysis. The required sample preparation techniques and imaging related topics are discussed. As a form of case study, the different techniques were applied to obtain a quantitative characterization of a specific mortar mixture. The results indicate that the mortar fractionates during different stages ranging from the early fresh mortar until the final hardened mortar stage. This induces process-dependent enrichments of the phases at specific locations in the mortar. The approach presented provides important information for a comprehensive understanding of the functionality of polymer-modified mortars.

  4. Quantitative image analysis of WE43-T6 cracking behavior

    NASA Astrophysics Data System (ADS)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7 wt.% Zr, 0.8 wt.% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This work is part of an effort to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  5. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy.

    PubMed

    Singh, Vivek K; Singh, Vinita; Rai, Awadhesh K; Thakur, Surya N; Rai, Pradeep K; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  6. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy

    SciTech Connect

    Singh, Vivek K.; Singh, Vinita; Rai, Awadhesh K.; Thakur, Surya N.; Rai, Pradeep K.; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  7. Quantitative genetic analysis of injury liability in infants and toddlers

    SciTech Connect

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  8. Energy efficiency quantitative analysis method of discrete manufacturing system

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Ji, Zhicheng

    2017-07-01

    A difficulty in the energy efficiency analysis of discrete manufacturing systems is the lack of an evaluation index system. In this paper, a novel evaluation index system with three layers and ten indexes was presented to analyze the overall energy consumption level of the discrete manufacturing system. Then, given the difficulty of directly obtaining machine energy efficiency, a prediction method based on recursive identification with a variable forgetting factor was put forward to calculate it. Furthermore, a comprehensive quantitative evaluation method combining rough sets and an attribute hierarchical model was designed on the basis of the index structure to evaluate the energy efficiency level. Finally, an experiment was used to illustrate the effectiveness of the proposed evaluation index system and method.

  9. Quantitative analysis of forest island pattern in selected Ohio landscapes

    SciTech Connect

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.
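
    The factor-analysis step, reducing continuous woodlot-pattern variables to a few descriptive axes, can be approximated with a principal-component decomposition of standardized variables. The variables and data below are illustrative stand-ins for the study's size, shape, number, and spacing measures, not its actual measurements.

    ```python
    import numpy as np

    def principal_components(X, n_components=2):
        """Principal-component scores of standardized landscape variables.
        PCA is used here as a stand-in for the paper's factor analysis,
        which it closely resembles."""
        X = np.asarray(X, float)
        Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardize
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        return Z @ Vt[:n_components].T                      # component scores

    # 15 landscapes x 4 illustrative pattern variables
    rng = np.random.default_rng(2)
    mean_size = rng.uniform(2, 40, 15)                  # mean woodlot size, ha
    count = 500 / mean_size + rng.normal(0, 2, 15)      # more woodlots when smaller
    shape_idx = rng.uniform(1.0, 2.5, 15)               # perimeter/area shape index
    spacing = rng.uniform(50, 400, 15)                  # mean nearest-neighbor spacing, m
    X = np.column_stack([mean_size, count, shape_idx, spacing])
    scores = principal_components(X)
    ```

    The first component then captures the largest share of variation among the 15 landscapes, analogous to the dominant pattern axis in the study.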

  10. Quantitative surface-enhanced Raman spectroscopy for kinetic analysis of aldol condensation using Ag-Au core-shell nanocubes.

    PubMed

    Weatherston, Joshua D; Worstell, Nolan C; Wu, Hung-Jen

    2016-10-17

    Surface-enhanced Raman spectroscopy (SERS) is a powerful tool with high potential for multiplexed detection of dilute analytes. However, quantitative SERS of kinetic assays can be difficult due to the variation in enhancement factors caused by changing reaction conditions. We report a method for quantitative SERS kinetic analysis using colloidal Ag-Au core-shell nanocubes (Ag@AuNCs) as the SERS substrate. This substrate is mass producible, possesses large SERS enhancement, and is resistant to degradation in most environments. The SERS enhancement of the Ag@AuNCs was evaluated both experimentally and computationally. Quantitation was achieved by covalently attaching a non-reactive internal standard (IS) to substrate surfaces and normalizing SERS spectra to the IS signal. We demonstrated that IS normalization corrects for temporal variations in enhancement factor and particle concentration. Quantitation was demonstrated by monitoring the base-catalyzed aldol condensation of surface-bound 4-(methylthio)benzaldehyde with free acetone. The kinetic model of this reaction was fitted to IS normalized SERS data, resulting in kinetic parameters that agreed well with published values. This SERS platform is a robust and sensitive method for quantitative analysis of kinetic assays, with potential applications in many fields.
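
    The internal-standard normalization step can be sketched numerically: because the IS sits on the same nanocube surface as the analyte, taking a band ratio cancels shot-to-shot changes in enhancement factor and particle concentration. The band positions and simulated spectra below are illustrative assumptions, not the paper's data.

    ```python
    import numpy as np

    def is_normalized_intensity(spectrum, shift_axis, analyte_band, is_band, half_width=6.0):
        """Ratio of an analyte SERS band to a co-attached internal-standard
        (IS) band; the ratio is insensitive to global enhancement drift."""
        def band_intensity(center):
            mask = np.abs(shift_axis - center) <= half_width
            return spectrum[mask].sum()
        return band_intensity(analyte_band) / band_intensity(is_band)

    # Two simulated spectra with identical chemistry but 2x different enhancement
    shift = np.linspace(400, 1800, 1401)               # Raman shift axis, cm^-1
    def peak(center, amplitude):
        return amplitude * np.exp(-((shift - center) ** 2) / (2 * 4.0 ** 2))

    spec_lo = peak(1075, 1.0) + peak(1590, 0.5)        # IS band + analyte band
    spec_hi = 2.0 * spec_lo                            # enhancement-factor drift

    r_lo = is_normalized_intensity(spec_lo, shift, 1590, 1075)
    r_hi = is_normalized_intensity(spec_hi, shift, 1590, 1075)
    ```

    Both spectra yield the same normalized intensity, which is the property that makes kinetic quantitation possible under varying enhancement.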

  11. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis

    PubMed Central

    Mazel, Christian; Mitulescu, Anca

    2007-01-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to evaluate the interest of quantified radiographic analysis of lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon–Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantifying vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistics were performed to assess evolution for all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated to normal sagittal alignment; vertebral parameters objectified adjacent level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion that was confirmed in 80% of cases, doubtful in 16% and pseudarthrosis seemed to occur in 4% (2) of cases. In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent level’s degeneration and a significant correlation between clinical outcome and fusion. Furthermore, criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem to be appropriate and in agreement with surgeon’s qualitative grading in 87% of cases. PMID:17216227

  12. Role of the orbital prefrontal cortex in choice between delayed and uncertain reinforcers: a quantitative analysis.

    PubMed

    Kheramin, S; Body, S; Ho, M -Y.; Velázquez-Martinez, D N.; Bradshaw, C M.; Szabadi, E; Deakin, J F.W.; Anderson, I M.

    2003-10-31

    'Inter-temporal choice' refers to choice between two or more outcomes that differ with respect to their sizes, delays, and/or probabilities of occurrence. According to the multiplicative hyperbolic model of inter-temporal choice, the value of a reinforcer increases as a hyperbolic function of its size, and decreases as a hyperbolic function of its delay and the odds against its occurrence. These functions, each of which contains a single discounting parameter, are assumed to combine multiplicatively to determine the overall value of the reinforcer. The model gives rise to a quantitative methodology for analysing inter-temporal choice, based on a family of linear null equations which describe performance under conditions of indifference, when the values of the reinforcers are assumed to be equal. This approach was used to examine the effect of lesions of the orbital prefrontal cortex (OPFC) on inter-temporal choice in rats. Under halothane anaesthesia, rats received injections of the excitotoxin quinolinate into the OPFC, or sham lesions. They were trained to press two levers (A and B) for food-pellet reinforcers in discrete-trials schedules. In free-choice trials, a press on A resulted in delivery of a pellet after a delay d(A) with a probability P=0.5; a press on B resulted in delivery of a pellet with a probability P=1 after a delay d(B). d(B) was increased progressively across successive blocks of six trials in each session, while d(A) was manipulated systematically across phases of the experiment. The indifference delay d(B(50)) (the value of d(B) corresponding to 50% choice of B) was estimated for each rat in each phase. Linear functions of d(B(50)) versus d(A) were derived, and the parameters of the function compared between the groups. In both groups, d(B(50)) increased linearly with d(A).
The slope of the linear function was significantly steeper in the lesioned group than in the sham-lesioned group, whereas the intercept did not differ significantly between
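
    The linear null equation used in this design falls out of the multiplicative hyperbolic model directly: with V = A / ((1 + K·d)(1 + H·θ)) and θ the odds against delivery, equating the values of the certain and the uncertain reinforcer gives d(B50) linear in d(A) with slope 1 + H and intercept H/K. A sketch with illustrative parameter values (K and H below are assumptions, not the study's estimates):

    ```python
    import numpy as np

    # Multiplicative hyperbolic model of inter-temporal choice (sketch).
    # V = A / ((1 + K*d) * (1 + H*theta)); theta = (1-P)/P is the odds against.
    K, H = 0.2, 0.5   # delay- and odds-discounting parameters (illustrative)

    def value(delay, p, amount=1.0):
        theta = (1.0 - p) / p
        return amount / ((1.0 + K * delay) * (1.0 + H * theta))

    def indifference_delay(d_a):
        """Delay d_B at which a certain pellet (P=1) matches an uncertain
        one (P=0.5) delivered after delay d_a; solved from V_A = V_B."""
        return ((1.0 + K * d_a) * (1.0 + H) - 1.0) / K

    # The null equation predicts d_B50 to be linear in d_A:
    d_a = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
    d_b50 = indifference_delay(d_a)
    slope, intercept = np.polyfit(d_a, d_b50, 1)
    ```

    A lesion-induced change in the slope of this line, as reported above, is interpreted as a change in the odds-discounting parameter H.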

  13. Quantitative analysis of protein-ligand interactions by NMR.

    PubMed

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. 
This contrasts with the NMR methods that are used
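
    The chemical-shift titration experiment described above can be sketched quantitatively: in fast exchange the observed shift is the population-weighted average of the free and bound shifts, with the bound fraction given by the quadratic solution of the 1:1 binding equilibrium. All concentrations and shift values below are illustrative assumptions.

    ```python
    import numpy as np

    def bound_fraction(p_tot, l_tot, kd):
        """Fraction of protein bound at total protein and ligand
        concentrations p_tot and l_tot, from the quadratic solution of
        the 1:1 binding equilibrium P + L <-> PL."""
        b = p_tot + l_tot + kd
        pl = (b - np.sqrt(b * b - 4.0 * p_tot * l_tot)) / 2.0
        return pl / p_tot

    # Fast-exchange titration: the observed chemical shift is the
    # population-weighted average of the free and bound shifts.
    kd_true, p_tot = 50.0, 100.0              # uM (illustrative)
    d_free, d_bound = 0.0, 0.30               # ppm (illustrative)
    l_tot = np.array([0, 25, 50, 100, 200, 400, 800], float)
    d_obs = d_free + (d_bound - d_free) * bound_fraction(p_tot, l_tot, kd_true)
    ```

    Fitting this saturation curve to measured shifts yields KD; note that, as the abstract stresses, this population-averaged approach applies only in the fast-exchange regime.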

  14. Temporal-resolved characterization of laser-induced plasma for spectrochemical analysis of gas shales

    NASA Astrophysics Data System (ADS)

    Xu, Tao; Zhang, Yong; Zhang, Ming; He, Yi; Yu, Qiaoling; Duan, Yixiang

    2016-07-01

    Optical emission of laser ablation plasma on a shale target surface provides sensitive laser-induced breakdown spectrometry (LIBS) detection of major, minor or trace elements. An exploratory study for the characterization of the plasma induced on shale materials was carried out with the aim of enabling a crucial step towards quantitative LIBS measurement. In this work, the experimental strategies that optimize the plasma generation on a pressed shale pellet surface are presented. The temporal evolution properties of the plasma induced by a ns Nd:YAG laser pulse at the fundamental wavelength in air were investigated using time-resolved space-integrated optical emission spectroscopy. The electron density as well as the temperatures of the plasma were diagnosed as functions of the decay time for the bulk plasma analysis. In particular, the values of time-resolved atomic and ionic temperatures of shale elements, such as Fe, Mg, Ca, and Ti, were extracted using the well-known Boltzmann or Saha-Boltzmann plot method. Further comparison of these temperatures validated the local thermodynamic equilibrium (LTE) within a specific interval of the delay time. In addition, the temporal behaviors of the signal-to-noise ratio of shale elements, including Si, Al, Fe, Ca, Mg, Ba, Li, Ti, K, Na, Sr, V, Cr, and Ni, revealed the coincidence of their maximum values with the LIBS LTE condition in the time frame, providing practical implications for an optimized LIBS detection of shale elements. Analytical performance of LIBS was further evaluated with the linear calibration procedure for the trace elements of greatest interest, Sr, V, Cr, and Ni, present in different shales. Their limits of detection obtained are element-dependent and can be lower than tens of parts per million with the present LIBS experimental configurations. However, the occurrence of a saturation effect for the calibration curve is still observable with the increasing trace element content, indicating that, due to the
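
    The Boltzmann-plot temperature diagnostic mentioned above can be sketched in a few lines: for lines of one species, ln(I·λ/(g·A)) is linear in the upper-level energy with slope -1/(kB·T). The line data below are synthetic placeholders; in practice g, A, and E values come from a spectroscopic database such as NIST.

    ```python
    import numpy as np

    K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

    def boltzmann_temperature(intensity, wavelength_nm, g, A, E_upper_ev):
        """Excitation temperature from a Boltzmann plot:
        ln(I*lambda/(g*A)) vs upper-level energy has slope -1/(kB*T)."""
        y = np.log(intensity * wavelength_nm / (g * A))
        slope, _ = np.polyfit(E_upper_ev, y, 1)
        return -1.0 / (K_B_EV * slope)

    # Synthetic check: generate line intensities for a 9000 K plasma and recover T
    T_true = 9000.0
    E = np.array([3.2, 3.9, 4.5, 5.1, 5.6])             # upper-level energies, eV
    g = np.array([9.0, 7.0, 5.0, 11.0, 9.0])            # statistical weights
    A = np.array([2.0e7, 5.5e7, 1.3e7, 8.0e6, 3.1e7])   # transition rates, s^-1
    lam = np.array([404.6, 438.4, 527.0, 432.6, 373.7])  # wavelengths, nm
    I = g * A / lam * np.exp(-E / (K_B_EV * T_true))
    ```

    Repeating this fit at each gate delay gives the time-resolved temperatures whose agreement across species is used to validate LTE.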

  15. Quantitative analysis of EEG effects following experimental marginal magnesium and boron deprivation.

    PubMed

    Penland, J G

    1995-12-01

    Magnesium (115 and 315 mg/d) and boron (0.23 and 3.23 mg/d) were fed in a double-blind Latin squares design to 13 healthy postmenopausal women (aged 50-78 years) living on a metabolic unit. An eight-channel electroencephalogram (EEG) was recorded during the last week of each of four 6-week dietary periods. Power and coherence measures were determined for each of four EEG frequency bands: delta (1-3 Hz), theta (4-7 Hz), alpha (8-12 Hz), and beta (13-18 Hz). Compared to high dietary magnesium, the low magnesium intake increased total power in the frontal regions and right temporal and parietal regions and resulted in frequency-specific increases in left occipital delta power, theta power in all but the left temporal region, alpha power in the right frontal and right temporal regions, and beta power in the frontal regions. The proportion of theta to total power in the parietal regions also increased with the low magnesium intake. While magnesium effects were observed primarily during eyes-closed conditions, effects of dietary boron on EEG power were found only during eyes-open conditions. Relative to high dietary boron, the low boron intake increased delta power in the left parietal and left occipital regions, increased the proportion of delta to total power in the frontal regions, and decreased relative right frontal theta, right frontal alpha, and left frontal beta power. Additional magnesium and boron effects were evident in the measures of EEG coherence. Thus relatively short periods of marginal magnesium and boron deprivation can affect brain function in healthy older women. The findings extend previous qualitative observations of increased CNS activity following severe magnesium deprivation and deficiency to cases of experimentally induced marginal magnesium deficiency, and verify CNS hyperexcitability by quantitative analysis of the EEG.
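
    The band-power quantities analyzed above (absolute power per band and the proportion of band power to total power) can be sketched from a single EEG channel. The abstract does not specify the spectral estimator, so the plain periodogram and the sampling parameters below are illustrative assumptions; only the band definitions are taken from the study.

    ```python
    import numpy as np

    BANDS = {"delta": (1, 3), "theta": (4, 7), "alpha": (8, 12), "beta": (13, 18)}  # Hz

    def band_powers(eeg, fs):
        """Absolute power per EEG frequency band (periodogram sketch),
        plus total power over 1-18 Hz for proportion measures."""
        n = len(eeg)
        psd = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2 / (fs * n)
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        total = psd[(freqs >= 1) & (freqs <= 18)].sum()
        powers = {name: psd[(freqs >= lo) & (freqs <= hi)].sum()
                  for name, (lo, hi) in BANDS.items()}
        return powers, total

    # Example: a dominant 10 Hz (alpha) rhythm plus broadband noise
    rng = np.random.default_rng(1)
    fs = 128.0
    t = np.arange(0, 30, 1 / fs)
    eeg = 20 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
    powers, total = band_powers(eeg, fs)
    ```

    Ratios such as `powers["theta"] / total` correspond to the proportion-of-total-power measures reported for the parietal regions.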

  16. Quantitative Medical Image Analysis for Clinical Development of Therapeutics

    NASA Astrophysics Data System (ADS)

    Analoui, Mostafa

    There has been significant progress in the development of therapeutics for prevention and management of several disease areas in recent years, leading to increased average life expectancy, as well as quality of life, globally. However, due to the complexity of addressing a number of medical needs and the financial burden of developing new classes of therapeutics, there is a need for better tools for decision making and validation of efficacy and safety of new compounds. Numerous biological markers (biomarkers) have been proposed either as adjuncts to current clinical endpoints or as surrogates. Imaging biomarkers are a rapidly growing subset, being examined as a means to expedite effective and rational drug development. Clinical imaging often involves a complex set of multi-modality data sets that require rapid and objective analysis, independent of reviewer's bias and training. In this chapter, an overview of imaging biomarkers for drug development is offered, along with challenges that necessitate quantitative and objective image analysis. Examples of automated and semi-automated analysis approaches are provided, along with technical review of such methods. These examples include the use of 3D MRI for osteoarthritis, ultrasound vascular imaging, and dynamic contrast enhanced MRI for oncology. Additionally, a brief overview of regulatory requirements is discussed. In conclusion, this chapter highlights key challenges and future directions in this area.

  17. In vitro quantitative chemical analysis of tattoo pigments.

    PubMed

    Timko, A L; Miller, C H; Johnson, F B; Ross, E

    2001-02-01

    The composition of cosmetic tattoos might prove relevant to their treatment by high-powered lasers. The objectives were to test the accuracy and completeness of information supplied by the tattoo ink manufacturers and to perform an elemental assay of tattoo pigments using scanning electron microscopy with energy-dispersive x-ray analysis. Samples of 30 tattoo inks were examined using "standardless" energy-dispersive spectrometry, a quantitative electron x-ray microanalysis technique that reliably identifies all elements except those with atomic numbers less than 11. The setting was a major national referral laboratory for microscopic examination and biochemical analysis of tissue. The results were compared with ink compositions compiled from manufacturer-supplied material safety data sheets; the outcome measures were (1) the percentage of any given element in whole tattoo pigments and (2) the presence or absence of elements and/or compounds as recorded in those data sheets. Of the 30 tattoo inks studied, the most commonly identified elements were aluminum (87% of the pigments), oxygen (73% of the pigments), titanium (67% of the pigments), and carbon (67% of the pigments). The relative contribution of elements to the tattoo ink compositions was highly variable between different compounds. Overall, the manufacturer-supplied data sheets were consistent with the elemental analysis, but there were important exceptions. The composition of elements in tattoo inks varies greatly, even among like-colored pigments. Knowledge of the chemical composition of popular tattoo inks might aid the clinician in effective laser removal.

  18. A Quantitative Analysis of the Solar Composition Problem

    NASA Astrophysics Data System (ADS)

    Villante, F. L.; Serenelli, A. M.

    We perform a quantitative analysis of the solar composition problem by using a statistical approach that allows us to combine the information provided by helioseismic and solar neutrino data in an effective way. We show that the opacity profile of the Sun is well constrained by the solar observational properties. In the context of a two-parameter analysis in which elements are grouped as volatiles (i.e. C, N, O and Ne) and refractories (i.e. Mg, Si, S, Fe), the optimal surface composition is found by increasing the abundance of volatiles by (45 ± 4)% and that of refractories by (19 ± 3)% with respect to the values provided by Asplund et al., 2009. As an additional result of our analysis, we show that the best fit to the observational data is obtained with values of the input parameters of the standard solar models (radiative opacities, gravitational settling rate, the astrophysical factors S34 and S17) that differ at the ∼1σ level from those presently adopted.

  19. [Quantitative analysis of drug expenditures variability in dermatology units].

    PubMed

    Moreno-Ramírez, David; Ferrándiz, Lara; Ramírez-Soto, Gabriel; Muñoyerro, M Dolores

    2013-01-01

    Variability in adjusted drug expenditures among clinical departments raises the possibility of difficult access to certain therapies while avoidable expenditures may also exist. Nevertheless, drug expenditures are not usually included in analyses of clinical practice variability. The objective was to identify and quantify variability in drug expenditures in comparable dermatology departments of the Servicio Andaluz de Salud. A comparative economic analysis was performed of drug expenditures adjusted to population and health care production in 18 dermatology departments of the Servicio Andaluz de Salud. The 2012 cost and production data (homogeneous production units, HPU) were provided by Inforcoan, the cost accounting information system of the Servicio Andaluz de Salud. The observed drug expenditure ratio ranged from 0.97 €/inh to 8.90 €/inh and from 208.45 €/HPU to 1,471.95 €/HPU. The Pearson correlation between drug expenditure and population was 0.25, and the correlation between expenditure and homogeneous production was 0.35 (p=0.32 and p=0.15, respectively), both coefficients confirming the lack of correlation and a relevant degree of variability in drug expenditures. The quantitative analysis of variability performed through Pearson correlation confirmed the existence of drug expenditure variability among comparable dermatology departments. Copyright © 2013 SEFH. Published by AULA MEDICA. All rights reserved.
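
    The correlation test at the heart of this analysis is a plain Pearson coefficient between adjusted expenditure and an activity measure. A minimal sketch with made-up figures (the data below are illustrative, not the study's 18-department dataset):

    ```python
    import numpy as np

    def pearson_r(x, y):
        """Pearson correlation coefficient, as used to test whether drug
        expenditure scales with population or homogeneous production."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        xm, ym = x - x.mean(), y - y.mean()
        return (xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum())

    # Illustrative data: expenditure per department vs homogeneous production
    hpu = [8.1, 5.4, 9.9, 3.2, 7.5, 6.0]             # production, thousands of HPU
    spend = [410.0, 980.0, 260.0, 1250.0, 515.0, 700.0]  # €/HPU
    r = pearson_r(hpu, spend)
    ```

    An |r| near zero with a non-significant p-value, as reported above, means expenditure is not explained by activity, which is read as unexplained practice variability.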

  20. QTL analysis for some quantitative traits in bread wheat*

    PubMed Central

    Pushpendra, Kumar Gupta; Harindra, Singh Balyan; Pawan, Laxminarayan Kulwal; Neeraj, Kumar; Ajay, Kumar; Reyazul, Rouf Mir; Amita, Mohan; Jitendra, Kumar

    2007-01-01

    Quantitative trait loci (QTL) analysis was conducted in bread wheat for 14 important traits utilizing data from four different mapping populations and different approaches to QTL analysis. Analysis for grain protein content (GPC) suggested that the major part of genetic variation for this trait is due to environmental interactions. In contrast, pre-harvest sprouting tolerance (PHST) was controlled mainly by main-effect QTL (M-QTL) with very little genetic variation due to environmental interactions; a major QTL for PHST was detected on chromosome arm 3AL. For grain weight, one QTL each was detected on chromosome arms 1AS, 2BS and 7AS. The number of QTL for four growth-related traits taken together, detected by different methods, ranged from 37 to 40; nine QTL that were detected by single-locus as well as two-locus analyses were all M-QTL. Similarly, single-locus and two-locus QTL analyses for seven yield and yield-contributing traits in two populations respectively allowed detection of 25 and 50 QTL by composite interval mapping (CIM), 16 and 25 QTL by multiple-trait composite interval mapping (MCIM) and 38 and 37 QTL by two-locus analyses. These studies should prove useful in QTL cloning and wheat improvement through marker-aided selection. PMID:17973342

  1. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
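
    The parameter-measurement step on the time-intensity profiles can be sketched as below. The abstract does not list the exact parameters, so this set (peak enhancement, time to peak, maximum upslope) and the synthetic first-pass curve are assumptions based on common semi-quantitative perfusion measures.

    ```python
    import numpy as np

    def perfusion_parameters(times, intensities):
        """Semi-quantitative perfusion parameters from a myocardial
        time-intensity profile during first pass of the contrast agent
        (typical parameter set, assumed for illustration)."""
        t = np.asarray(times, float)
        s = np.asarray(intensities, float)
        baseline = s[:3].mean()                              # pre-contrast signal
        peak_idx = int(np.argmax(s))
        upslope = np.max(np.gradient(s, t)[: peak_idx + 1])  # max wash-in slope
        return {
            "peak_enhancement": s[peak_idx] - baseline,
            "time_to_peak": t[peak_idx] - t[0],
            "max_upslope": upslope,
        }

    # Example: synthetic first-pass enhancement curve (gamma-variate-like)
    t = np.linspace(0, 30, 61)                    # s
    s = 100 + 80 * (t / 8) ** 2 * np.exp(-(t - 8) / 4)
    params = perfusion_parameters(t, s)
    ```

    Computing these parameters per myocardial position yields the values that are visualized as color overlays in the method described above.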

  2. Quantitative polymerase chain reaction analysis by deconvolution of internal standard.

    PubMed

    Hirakawa, Yasuko; Medh, Rheem D; Metzenberg, Stan

    2010-04-29

    Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of absolute initial copy number of template, and controlled tests show it generates accurate results.

  3. A Temporal Examination of the Planktonic and Biofilm Proteome of Whole Cell Pseudomonas aeruginosa PAO1 Using Quantitative Mass Spectrometry*

    PubMed Central

    Park, Amber J.; Murphy, Kathleen; Krieger, Jonathan R.; Brewer, Dyanne; Taylor, Paul; Habash, Marc; Khursigara, Cezar M.

    2014-01-01

    Chronic polymicrobial lung infections are the chief complication in patients with cystic fibrosis. The dominant pathogen in late-stage disease is Pseudomonas aeruginosa, which forms recalcitrant, structured communities known as biofilms. Many aspects of biofilm biology are poorly understood; consequently, effective treatment of these infections is limited, and cystic fibrosis remains fatal. Here we combined in-solution protein digestion of triplicate growth-matched samples with a high-performance mass spectrometry platform to provide the most comprehensive proteomic dataset known to date for whole cell P. aeruginosa PAO1 grown in biofilm cultures. Our analysis included protein–protein interaction networks and PseudoCAP functional information for unique and significantly modulated proteins at three different time points. Secondary analysis of a subgroup of proteins using extracted ion currents validated the spectral counting data of 1884 high-confidence proteins. In this paper we demonstrate a greater representation of proteins related to metabolism, DNA stability, and molecular activity in planktonically grown P. aeruginosa PAO1. In addition, several virulence-related proteins were increased during planktonic growth, including multiple proteins encoded by the pyoverdine locus, uncharacterized proteins with sequence similarity to mammalian cell entry protein, and a member of the hemagglutinin family of adhesins, HecA. Conversely, biofilm samples contained an uncharacterized protein with sequence similarity to an adhesion protein with self-association characteristics (AidA). Increased levels of several phenazine biosynthetic proteins, an uncharacterized protein with sequence similarity to a metallo-beta-lactamase, and lower levels of the drug target gyrA support the putative characteristics of in situ P. aeruginosa infections, including competitive fitness and antibiotic resistance. This quantitative whole cell approach advances the existing P. aeruginosa

  4. Temporal dietary shift in jellyfish revealed by stable isotope analysis.

    PubMed

    Javidpour, Jamileh; Cipriano-Maack, Ashlie N; Mittermayr, Agnes; Dierking, Jan

    A temporal change in the stable isotope (SI) composition of jellyfish in the Kiel Fjord, Western Baltic Sea, was documented by analyzing δ(13)C, δ(15)N and δ(34)S of bell tissue of Aurelia aurita and Cyanea capillata in the period between June and October 2011. A strong and significant temporal change in all SI values of A. aurita was found, including an increase of ~3 ‰ in δ(13)C, a decrease of ~4 ‰ in δ(15)N and a sharp decline of ~7 ‰ in δ(34)S. While knowledge gaps in jellyfish isotope ecology, in particular the lack of reliable trophic enrichment factors, call for a conservative interpretation of our data, the observed changes, in particular in δ(34)S, as indicated by a MixSIR mixing model, would be consistent with a temporal dietary shift in A. aurita from mesozooplankton (>150 µm) to microplankton and small re-suspended particles (0.8-20 µm) from the benthos. The presence of a hitherto unidentified food source not included in the model could also contribute to the shift. During the 2-month occurrence of C. capillata, its isotope composition remained stable and was consistent with a mainly mesozooplanktonic diet. Mixing model output, mainly driven by δ(34)S values, indicated a lower proportion of A. aurita in the diet of C. capillata than previously reported, and thus a potentially lesser importance of intraguild predation among jellyfish in the Kiel Fjord. Overall, our results clearly highlight the potential for substantial intraspecific seasonal isotopic variation in jellyfish, which should be taken into account in future feeding ecology studies on this group.
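
The two-endmember mixing logic that underlies isotope mixing models can be sketched as follows. This is a simplified, deterministic illustration with hypothetical δ(34)S values, not the Bayesian MixSIR framework used in the study, which additionally models trophic enrichment factors and source variability:

```python
def two_source_mix_fraction(d_mix, d_a, d_b):
    """Solve the linear two-source mixing model
    d_mix = f * d_a + (1 - f) * d_b for the fraction f of source A."""
    if d_a == d_b:
        raise ValueError("source signatures must differ")
    return (d_mix - d_b) / (d_a - d_b)

# Hypothetical d34S signatures: mesozooplankton (18) vs. benthic particles (8);
# a consumer value of 12 implies 40% mesozooplankton in the diet.
f_meso = two_source_mix_fraction(d_mix=12.0, d_a=18.0, d_b=8.0)  # -> 0.4
```

Because MixSIR propagates uncertainty in both sources and enrichment factors, its output is a posterior distribution of dietary proportions rather than the point estimate shown here.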

  5. Temporal and spectral aspects of coarticulation in ataxic dysarthria: an acoustic analysis.

    PubMed

    Hertrich, I; Ackermann, H

    1999-04-01

    In order to analyze the impact of cerebellar disorders on temporal and spectral aspects of coarticulation, 9 individuals with cerebellar dysfunction and 9 controls were asked to produce test sentences comprising a target vowel (V = [a], [i], or [u]) within a schwa-t-V-t-schwa environment. The control speakers were investigated both at their habitual speech tempo and under a slow speaking condition. The squared distances between averaged FFT spectra served as a quantitative estimate of target-induced coarticulation, a method that can be applied to consonants as well as vowels, and which avoids the shortcomings of formant analysis. In order to test the significance of coarticulation effects at the level of individual speakers and to obtain F values as a further measure of the strength of coarticulation, multivariate tests of target effects were performed, with the first 6 principal components derived from the spectra of each speaker. First, inconsistent patterns of anticipatory vowel-to-vowel (VV) interactions emerged across individuals, and neither significant group differences nor any effects of speech rate could be detected. The underlying control mechanisms thus seem to be segment-linked subject-specific specifications of the pretarget vowel, uncompromised by cerebellar dysfunction. Second, all participants exhibited highly significant anticipatory [t(h)]-to-vowel (CV) coarticulation. This effect was slightly smaller in the cerebellar group than in the control group, which can be at least partially explained by reduced spectral distances among the 3 target vowels. Speech rate did not influence the CV effects of the control group. As concerns temporal aspects of coarticulation, no significant group differences emerged in terms of length adjustments of the pretarget consonant to the intrinsic duration of the target vowel. Third, ataxic speakers showed a tendency toward enlarged perseverative vowel-to-[t(h)] (VC) and VV effects if their slow speech rate was taken into

  6. The Use of Temporal Information in Food Image Analysis

    PubMed Central

    He, Ye; Zhu, Fengqing; Boushey, Carol; Delp, Edward

    2016-01-01

    We have developed a dietary assessment system that uses food images captured by a mobile device. Food identification is a crucial component of our system. Achieving high classification rates is challenging due to the large number of food categories and variability in food appearance. In this paper, we propose to improve food classification by incorporating temporal information. We employ recursive Bayesian estimation to incrementally learn from a person's eating history. We show that an improvement in food classification accuracy of 11% can be achieved. PMID:28573258
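
The recursive Bayesian update described above can be sketched as follows; the number of food categories, the classifier scores, and the repetition pattern are hypothetical, chosen only to show how repeated observations from an eating history sharpen an ambiguous classifier output:

```python
import numpy as np

def bayes_update(prior, likelihood):
    """One recursive Bayesian step: posterior is proportional to
    likelihood * prior, renormalized to sum to 1."""
    post = likelihood * prior
    return post / post.sum()

# Hypothetical 3-category example: start from a uniform prior and fold in
# the image classifier's per-meal scores for three similar observed meals.
prior = np.array([1 / 3, 1 / 3, 1 / 3])
classifier_scores = np.array([0.5, 0.3, 0.2])  # per-meal likelihoods
for _ in range(3):
    prior = bayes_update(prior, classifier_scores)
# After three meals the posterior concentrates on the first category.
```

In a deployed system the likelihoods would come from the image classifier and the prior would persist across meals, so frequently eaten foods become easier to recognize over time.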

  7. Analysis of temporal jitter in a copper vapor laser system

    NASA Astrophysics Data System (ADS)

    Kumar, D. Durga Praveen; Gantayet, L. M.; Singh, Sunita; Rawat, A. S.; Rana, Paramjit; V, Rajasree; Agarwalla, Sandeep K.; Chakravarthy, D. P.

    2012-02-01

    Temporal jitter in a magnetic pulse compression based copper vapor laser (CVL) system is analyzed by considering ripple present in the input dc power supply and ripple present in the magnetic core resetting power supply. It is shown that the jitter is a function of the ratio of operating voltage to the designed voltage, percentage ripple, and the total propagation delay of the magnetic pulse compression circuit. Experimental results from a CVL system operating at a repetition rate of 9 kHz are presented.

  8. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that, unlike standard principal component analysis (PCA), generates components with sparse loadings; it is used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.

  9. Quantitative analysis of night skyglow amplification under cloudy conditions

    NASA Astrophysics Data System (ADS)

    Kocifaj, Miroslav; Solano Lamphar, Héctor Antonio

    2014-10-01

    The radiance produced by artificial light is a major source of nighttime over-illumination. It can, however, be treated experimentally using ground-based and satellite data. These two types of data complement each other and together have a high information content. For instance, the satellite data enable upward light emissions to be normalized, and this in turn allows skyglow levels at the ground to be modelled under cloudy or overcast conditions. Excessive night lighting imposes an unacceptable burden on nature, humans and professional astronomy. For this reason, there is a pressing need to determine the total amount of downwelling diffuse radiation. Undoubtedly, cloudy periods can cause a significant increase in skyglow as a result of amplification owing to diffuse reflection from clouds. While it is recognized that the amplification factor (AF) varies with cloud cover, the effects of different types of clouds, of atmospheric turbidity and of the geometrical relationships between the positions of an individual observer, the cloud layer, and the light source are in general poorly known. In this paper the AF is quantitatively analysed considering different aerosol optical depths (AODs), urban layout sizes and cloud types with specific albedos and altitudes. The computational results show that the AF peaks near the edges of a city rather than at its centre. In addition, the AF appears to be a decreasing function of AOD, which is particularly important when modelling the skyglow in regions with apparent temporal or seasonal variability of atmospheric turbidity. The findings in this paper will be useful to those designing engineering applications or modelling light pollution, as well as to astronomers and environmental scientists who aim to predict the amplification of skyglow caused by clouds. 
In addition, the semi-analytical formulae can be used to estimate the AF levels, especially in densely populated metropolitan regions for which detailed computations may be CPU

  10. Functional analysis and quantitative determination of the expression profile of human parvovirus B19.

    PubMed

    Bonvicini, Francesca; Filippone, Claudia; Manaresi, Elisabetta; Zerbini, Marialuisa; Musiani, Monica; Gallinella, Giorgio

    2008-11-25

    Comprehension of the pathogenetic potential of human parvovirus B19 requires the definition of the complete spectrum of cellular tropism and a functional analysis of the viral genome in infected cells. In this study, we carried out a systematic functional analysis of the B19 virus genome in the course of infection of susceptible bone marrow mononuclear cells and myeloblastoid UT7/EpoS1 cells, in terms of dynamics of nucleic acid synthesis. A PCR array was designed and a comprehensive analysis was performed by quantitative PCR and RT-PCR, yielding extended information on the presence and abundance of the diverse classes of viral nucleic acids, on the temporal regulation of genome expression and on its relationship with the cell cycle. The analyses performed indicate that the synthesis of viral nucleic acids is correlated with progression through the S phase of the cell cycle, that an extended pattern of transcriptional activity occurs throughout the course of infection, with a maximal rate of transcription preceding the onset of S-phase-dependent replication of the viral genome, and that utilization of transcript processing signals is relatively constant throughout the course of infection. The information obtained led to the definition of a unified model of functional and expression profiling of the parvovirus B19 genome.

  11. Spatio-temporal Linear Stability Analysis of Multiple Reacting Wakes

    NASA Astrophysics Data System (ADS)

    Kunnumpuram Sebastian, Jacob; Emerson, Benjamin; Lieuwen, Tim

    2016-11-01

    Hydrodynamic stability of reacting shear flows plays a key role in controlling a variety of combustor behaviors, such as combustion instability, mixing and entrainment, and blowoff. A significant literature exists on the hydrodynamics of single bluff body flows, but not the multi-bluff body flows that are found in applications. The objective of this work was to compare the spatio-temporal stability of multiple reacting wakes and single reacting wakes, within the framework of linear stability theory. Spatio-temporal stability analyses are conducted on model velocity and density profiles, with key parameters being the density ratio across the flame, bluff body spacing, dimensionless shear, and asymmetry parameters (if the two wakes are dissimilar). The introduction of the additional bluff body can exert either a stabilizing or a destabilizing effect on the combined two-wake system, depending on the spatial separation between the bluff bodies. Furthermore, while the most rapidly amplified mode of the single wake is the sinuous (asymmetric) one, in the two-wake system the most rapidly amplified mode can be either sinuous or varicose (symmetric), also depending on spatial separation.

  12. Heart morphogenesis gene regulatory networks revealed by temporal expression analysis.

    PubMed

    Hill, Jonathon T; Demarest, Bradley; Gorsi, Bushra; Smith, Megan; Yost, H Joseph

    2017-10-01

    During embryogenesis the heart forms as a linear tube that then undergoes multiple simultaneous morphogenetic events to obtain its mature shape. To understand the gene regulatory networks (GRNs) driving this phase of heart development, during which many congenital heart disease malformations likely arise, we conducted an RNA-seq timecourse in zebrafish from 30 hpf to 72 hpf and identified 5861 genes with altered expression. We clustered the genes by temporal expression pattern, identified transcription factor binding motifs enriched in each cluster, and generated a model GRN for the major gene batteries in heart morphogenesis. This approach predicted hundreds of regulatory interactions and found batteries enriched in specific cell and tissue types, indicating that the approach can be used to narrow the search for novel genetic markers and regulatory interactions. Subsequent analyses confirmed the GRN using two mutants, Tbx5 and nkx2-5, and identified sets of duplicated zebrafish genes that do not show temporal subfunctionalization. This dataset provides an essential resource for future studies on the genetic/epigenetic pathways implicated in congenital heart defects and the mechanisms of cardiac transcriptional regulation. © 2017. Published by The Company of Biologists Ltd.

  13. 'MRI-negative PET-positive' temporal lobe epilepsy (TLE) and mesial TLE differ with quantitative MRI and PET: a case control study.

    PubMed

    Carne, Ross P; O'Brien, Terence J; Kilpatrick, Christine J; Macgregor, Lachlan R; Litewka, Lucas; Hicks, Rodney J; Cook, Mark J

    2007-06-24

    'MRI negative PET positive temporal lobe epilepsy' represents a substantial minority of temporal lobe epilepsy (TLE). Clinicopathological and qualitative imaging differences from mesial temporal lobe epilepsy are reported. We aimed to compare TLE with hippocampal sclerosis (HS+ve) and non-lesional TLE without HS (HS-ve) on MRI, with respect to quantitative FDG-PET and MRI measures. 30 consecutive HS-ve patients with well-lateralised EEG were compared with 30 age- and sex-matched HS+ve patients with well-lateralised EEG. Cerebral, cortical lobar and hippocampal volumetric and co-registered FDG-PET metabolic analyses were performed. There was no difference in whole brain, cerebral or cerebral cortical volumes. Both groups showed marginally smaller cerebral volumes ipsilateral to the epileptogenic side (HS-ve 0.99, p = 0.02, HS+ve 0.98, p < 0.001). In HS+ve, the ratio of epileptogenic cerebrum to whole brain volume was less (p = 0.02); the ratio of epileptogenic cerebral cortex to whole brain in the HS+ve group approached significance (p = 0.06). Relative volume deficits were seen in HS+ve in insular and temporal lobes. Both groups showed marked ipsilateral hypometabolism (p < 0.001), most marked in temporal cortex. Hypometabolism of the epileptogenic relative to the contralateral hippocampus was more marked in HS+ve (ratio: 0.86 vs 0.95, p < 0.001). The mean FDG-PET ratio of ipsilateral to contralateral cerebral cortex however was low in both groups (ratio: HS-ve 0.97, p < 0.0001; HS+ve 0.98, p = 0.003), and more marked in HS-ve across all lobes except insula. Overall, HS+ve patients showed more hippocampal, but also marginally more ipsilateral cerebral and cerebrocortical atrophy, greater ipsilateral hippocampal hypometabolism but similar ipsilateral cerebral cortical hypometabolism, confirming structural and functional differences between these groups.

  14. The quantitative electroencephalogram and the low-resolution electrical tomographic analysis in posttraumatic stress disorder.

    PubMed

    Todder, Doran; Levine, Joseph; Abujumah, Ahmad; Mater, Michael; Cohen, Hagit; Kaplan, Zeev

    2012-01-01

    The electroencephalogram (EEG) is the recording of the brain electrical activity as measured on the scalp. Using mathematical algorithms, the 3-dimensional (3D) distribution of the electrical potential inside the brain can be calculated. One of the methods to calculate it is the low-resolution electrical tomographic analysis (LORETA). In this research, we seek to find the brain structures that differentiate patients with posttraumatic stress disorder (PTSD) from controls. Ten right-handed consenting adult male patients were recruited from a PTSD clinic. All patients fulfilled Diagnostic and Statistical Manual of Mental Disorders (Fourth Edition, Text Revision [DSM-IV-TR]) criteria for chronic PTSD (duration >2 years) and were on drug treatment regimens that had been stable for at least 2 months (involving only serotonin reuptake inhibitors [SSRIs] and benzodiazepines). The control group consisted of 10 healthy hospital staff members. All study participants underwent 19-channel EEG measurements according to current standards of practice. All artifact-free EEG strips were examined for spectral as well as LORETA analysis, focusing on the theta (4-7 Hz) band, which is suggested to reflect the activity of the limbic system. The theta band showed a statistically significant difference (P < .05) between the 2 groups in the right temporal lobe and in both the right and left frontal lobes. Our findings support existing research data obtained via other imaging technologies, which demonstrated structural alterations in the right temporal and frontal areas in PTSD. These results indicate that combining quantitative EEG (QEEG) and the LORETA method, among other methods, may improve the neuroanatomical resolution of EEG data analysis.

  15. Quantitative analysis and parametric display of regional myocardial mechanics

    NASA Astrophysics Data System (ADS)

    Eusemann, Christian D.; Bellemann, Matthias E.; Robb, Richard A.

    2000-04-01

    Quantitative assessment of regional heart motion has significant potential for more accurate diagnosis of heart disease and/or cardiac irregularities. Local heart motion may be studied from medical imaging sequences. Using functional parametric mapping, regional myocardial motion during a cardiac cycle can be color mapped onto a deformable heart model to obtain better understanding of the structure-to-function relationships in the myocardium, including regional patterns of akinesis or diskinesis associated with ischemia or infarction. In this study, 3D reconstructions were obtained from the Dynamic Spatial Reconstructor at 15 time points throughout one cardiac cycle of pre-infarct and post-infarct hearts. Deformable models were created from the 3D images for each time point of the cardiac cycles. From these polygonal models, regional excursions and velocities of each vertex representing a unit of myocardium were calculated for successive time intervals. The calculated results were visualized through model animations and/or specially formatted static images. The time point of regional maximum velocity and excursion of myocardium through the cardiac cycle was displayed using color mapping. The absolute value of regional maximum velocity and maximum excursion were displayed in a similar manner. Using animations, the local myocardial velocity changes were visualized as color changes on the cardiac surface during the cardiac cycle. Moreover, the magnitude and direction of motion for individual segments of myocardium could be displayed. Comparison of these dynamic parametric displays suggests that the ability to encode quantitative functional information on dynamic cardiac anatomy enhances the diagnostic value of 4D images of the heart. Myocardial mechanics quantified this way adds a new dimension to the analysis of cardiac functional disease, including regional patterns of akinesis and diskinesis associated with ischemia and infarction. Similarly, disturbances in
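
The per-vertex excursion and velocity computation described above can be sketched as follows. This is a minimal illustration with random toy coordinates; the study used vertices of deformable models reconstructed at 15 time points per cardiac cycle:

```python
import numpy as np

def excursion_and_velocity(tracks, dt):
    """Per-vertex excursion (displacement magnitude between successive
    time points) and velocity, from vertex positions with shape
    (n_timepoints, n_vertices, 3)."""
    disp = np.diff(tracks, axis=0)            # (T-1, V, 3) displacement vectors
    excursion = np.linalg.norm(disp, axis=2)  # (T-1, V) per-interval excursion
    velocity = excursion / dt                 # per-interval speed
    return excursion, velocity

# Hypothetical toy data: 15 time points, 2 vertices, random 3D positions
rng = np.random.default_rng(0)
tracks = rng.normal(size=(15, 2, 3))
exc, vel = excursion_and_velocity(tracks, dt=1 / 15)
peak_time = vel.argmax(axis=0)  # time index of each vertex's maximum velocity
```

Quantities like `peak_time` and the per-vertex maxima are what the study color-mapped onto the deformable heart surface for parametric display.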

  16. Quantitative 3-dimensional computed tomography analysis of olecranon fractures.

    PubMed

    Lubberts, Bart; Janssen, Stein; Mellema, Jos; Ring, David

    2016-05-01

    Olecranon fractures have variable size of the proximal fragment, patterns of fragmentation, and subluxation of the ulnohumeral joint that might be better understood and categorized on the basis of quantitative 3-dimensional computed tomography analysis. Mayo type I fractures are undisplaced, Mayo type II are displaced and stable, and Mayo type III are displaced and unstable. The last is categorized into anterior and posterior dislocations. The purpose of this study was to further clarify fracture morphology between Mayo type I, II, and III fractures. Three-dimensional models were created for a consecutive series of 78 patients with olecranon fractures that were evaluated with computed tomography. We determined the total number of fracture fragments, the volume and articular surface area of each fracture fragment, and the degree of displacement of the most proximal olecranon fracture fragment. Displaced olecranon fractures were more comminuted than nondisplaced fractures (P = .02). Displaced fractures without ulnohumeral subluxation were smallest in terms of both volume (P < .001) and articular surface involvement (P < .001) of the most proximal olecranon fracture fragment. There was no difference in average displacement of the proximal fragment between displaced fractures with and without ulnohumeral subluxation (P = .74). Anterior olecranon fracture-dislocations created more displaced (P = .04) and smaller proximal fragments than posterior fracture-dislocations (P = .005), with comparable fragmentation on average (P = .60). The ability to quantify volume, articular surface area, displacement, and fragmentation using quantitative 3-dimensional computed tomography should be considered when increased knowledge of fracture morphology and fracture patterns might be useful. Copyright © 2016 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  17. Towards quantitative analysis of retinal features in optical coherence tomography.

    PubMed

    Baroni, Maurizio; Fortunato, Pina; La Torre, Agostino

    2007-05-01

    The purpose of this paper was to propose a new computer method for quantitative evaluation of representative features of the retina using optical coherence tomography (OCT). A multi-step approach was devised and positively tested for segmentation of the three main retinal layers: the vitreo-retinal interface and the inner and outer retina. Following a preprocessing step, three regions of interest were delimited. Significant peaks corresponding to high and low intensity strips were located along the OCT A-scan lines and accurate boundaries between different layers were obtained by maximizing an edge likelihood function. For a quantitative description, thickness measurement, densitometry, texture and curvature analyses were performed. As a first application, the effect of intravitreal injection of triamcinolone acetonide (IVTA) for the treatment of vitreo-retinal interface syndrome was evaluated. Almost all the parameters, measured on a set of 16 pathologic OCT images, were statistically different before and after IVTA injection (p<0.05). Shape analysis of the internal limiting membrane confirmed the reduction of the pathological traction state. Other significant parameters, such as reflectivity and texture contrast, exhibited relevant changes both at the vitreo-retinal interface and in the inner retinal layers. Texture parameters in the inner and outer retinal layers significantly correlated with the visual acuity restoration. According to these findings an IVTA injection might be considered a possible alternative to surgery for selected patients. In conclusion, the proposed approach appeared to be a promising tool for the investigation of tissue changes produced by pathology and/or therapy.

  18. Quantitative analysis of drug-induced tremor in mice.

    PubMed

    Shinozaki, H

    1984-12-01

    A method of analyzing tremor in mice was developed using a power spectral analysis of the random current induced in a wire coil by the movement of a magnet attached to a mouse. The power spectral density function defined the frequency composition of the tremor, and the mean square value of the data in any frequency range of concern was determined. It was possible to determine qualitative differences in the tremor caused by various tremorgenic agents, and to differentiate the drug-induced tremor from spontaneous motor activity. The power spectral densities of the tremorine- and oxotremorine-induced tremors were tentatively expressed as the sum of 3 main components (Cauchy distribution) with different peak frequencies, consisting of the spontaneous motor activity component and two tremor components. On the other hand, the power spectral densities of the harmaline-induced tremor were expressed as the sum of two components with two peak frequencies, consisting of the spontaneous motor activity component and a tremor component. The frequency of the peak spectral density was almost independent of the dose of tremorgenic agents, but changed slightly with the lapse of time after their injection. The severity of the tremor was determined quantitatively in terms of the sum of the mean square value. The sum of the mean square value for a period of 45 min after the injection of tremorine changed in a dose-dependent manner. The severity of the tremor was different for each of the mouse strains. The method studied in the present paper is expected to be utilized for the quantitative examination of the fine motor movement of the experimental animal, particularly for the screening test of new anti-tremor drugs.
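
The core quantity in this method, the mean square value of the signal within a chosen frequency band computed from the power spectral density, can be sketched as follows. The synthetic signal, sampling rate, and band edges below are hypothetical, chosen so the band isolates a "tremor" component from slower movement:

```python
import numpy as np

def band_mean_square(x, fs, f_lo, f_hi):
    """Mean square value of x within [f_lo, f_hi], integrated from the
    one-sided periodogram (by Parseval's theorem, the full-band integral
    equals the mean square of x)."""
    n = len(x)
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * n)
    # double the interior bins to account for negative frequencies
    if n % 2 == 0:
        psd[1:-1] *= 2
    else:
        psd[1:] *= 2
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[band].sum() * fs / n  # sum(PSD) * frequency resolution

# Synthetic record: a 10 Hz "tremor" over a slow 1 Hz movement component
fs = 200.0
t = np.arange(0, 5, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 1 * t)
ms_tremor = band_mean_square(x, fs, 8, 12)  # ~0.5, the mean square of a unit sine
```

Summing such band-limited mean square values over the recording period gives the dose-dependent severity measure described in the abstract.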

  19. Quantitative analysis with the optoacoustic/ultrasound system OPUS

    NASA Astrophysics Data System (ADS)

    Haisch, Christoph; Zell, Karin; Sperl, Jonathan; Vogel, Mika W.; Niessner, Reinhard

    2009-02-01

    The OPUS (Optoacoustic plus Ultrasound) system is a combination of a medical ultrasound scanner with a high-repetition-rate, wavelength-tunable laser system and a suitable triggering interface to synchronize the laser and the ultrasound system. The pulsed laser generates an optoacoustic (OA), or photoacoustic (PA), signal which is detected by the ultrasound system. Alternatively, imaging in conventional ultrasound mode can be performed. Both imaging modes can be superimposed. The laser light is coupled into the tissue laterally, parallel to the ultrasound transducer, which does not require any major modification to the transducer or the ultrasound beam forming. This was a basic requirement of the instrument, as the intention of the project was to establish the optoacoustic imaging modality as an add-on to a conventional standard ultrasound instrument. We believe that this approach may foster the introduction of OA imaging as a routine tool in medical diagnosis. Another key aspect of the project was to exploit the capabilities of OA imaging for quantitative analysis. The intention of the presented work is to summarize all steps necessary to extract the significant information from the PA raw data, which are required for the quantification of local absorber distributions. We show results of spatially resolved absorption measurements in scattering samples and a comparison of four different image reconstruction algorithms, regarding their influence on lateral resolution as well as on the signal to noise ratio for different sample depths and absorption values. The reconstruction algorithms are based on Fourier transformation, on a generalized 2D Hough transformation, on circular back-projection and the classical delay-and-sum approach which is implemented in most ultrasound scanners. Furthermore, we discuss the influence of a newly developed laser source, combining diode and flash lamp pumping. 
Compared to all-flash-lamp pumped systems it features a significantly higher

  20. Quantitative DNA Methylation Analysis of Candidate Genes in Cervical Cancer

    PubMed Central

    Siegel, Erin M.; Riggs, Bridget M.; Delmas, Amber L.; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D.

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97–1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459
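
The Methylation Index and binary cut-point described above reduce to a simple computation; the per-CpG percent-methylation values below are hypothetical:

```python
def methylation_index(cpg_percents):
    """Mean percent methylation across the CpG sites assayed for one gene."""
    return sum(cpg_percents) / len(cpg_percents)

def is_hypermethylated(cpg_percents, cutpoint=15.0):
    """Binary call at the >15% methylation cut-point used in the study."""
    return methylation_index(cpg_percents) > cutpoint

# Hypothetical pyrosequencing readout for one gene (~4-9 CpG sites per assay)
sample = [22.0, 18.5, 30.1, 12.4, 25.0]
mi = methylation_index(sample)  # 21.6 -> called hypermethylated
```

In the study these per-gene binary calls, combined with HPV status and age, were the inputs to the ROC/AUC analysis.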

  1. Quantitative analysis of bloggers' collective behavior powered by emotions

    NASA Astrophysics Data System (ADS)

    Mitrović, Marija; Paltoglou, Georgios; Tadić, Bosiljka

    2011-02-01

    Large-scale data resulting from users' online interactions provide the ultimate source of information to study emergent social phenomena on the Web. From individual actions of users to observable collective behaviors, different mechanisms involving emotions expressed in the posted text play a role. Here we combine approaches of statistical physics with machine-learning methods of text analysis to study the emergence of emotional behavior among Web users. Mapping the high-resolution data from digg.com onto bipartite networks of users and their comments onto posted stories, we identify user communities centered around certain popular posts and determine emotional contents of the related comments by the emotion classifier developed for this type of text. Applied over different time periods, this framework reveals strong correlations between the excess of negative emotions and the evolution of communities. We observe avalanches of emotional comments exhibiting significant self-organized critical behavior and temporal correlations. To explore the robustness of these critical states, we design a network-automaton model on realistic network connections and several control parameters, which can be inferred from the dataset. Dissemination of emotions by a small fraction of very active users appears to critically tune the collective states.

  2. Quantitative Analysis Of Acoustic Emission From Rock Fracture Experiments

    NASA Astrophysics Data System (ADS)

    Goodfellow, Sebastian David

    This thesis aims to advance the methods of quantitative acoustic emission (AE) analysis by calibrating sensors, characterizing sources, and applying the results to solve engineering problems. In the first part of this thesis, we built a calibration apparatus and successfully calibrated two commercial AE sensors. The ErgoTech sensor was found to have broadband velocity sensitivity and the Panametrics V103 was sensitive to surface normal displacement. These calibration results were applied to two AE data sets from rock fracture experiments in order to characterize the sources of AE events. The first data set was from an in situ rock fracture experiment conducted at the Underground Research Laboratory (URL). The Mine-By experiment was a large scale excavation response test where both AE (10 kHz - 1 MHz) and microseismicity (MS) (1 Hz - 10 kHz) were monitored. Using the calibration information, magnitude, stress drop, dimension and energy were successfully estimated for 21 AE events recorded in the tensile region of the tunnel wall. Magnitudes were in the range -7.5 < Mw < -6.8, which is consistent with other laboratory AE results, and stress drops were within the range commonly observed for induced seismicity in the field (0.1 - 10 MPa). The second data set was AE collected during a true-triaxial deformation experiment, where the objectives were to characterize laboratory AE sources and identify issues related to moving the analysis from ideal in situ conditions to more complex laboratory conditions in terms of the ability to conduct quantitative AE analysis. We found AE magnitudes in the range -7.8 < Mw < -6.7 and, as with the in situ data, stress release was within the expected range of 0.1 - 10 MPa. We identified four major challenges to quantitative analysis in the laboratory, which inhibited our ability to study parameter scaling (M0 ∝ fc^-3 scaling). These challenges were (1) limited knowledge of attenuation, which we proved was continuously evolving, (2
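The magnitude and stress-drop estimates above follow standard seismological source formulas. A hedged sketch using the Hanks-Kanamori moment magnitude and a Brune-type circular source model; the event parameters (seismic moment, corner frequency, shear velocity) are illustrative, not values from the thesis:

```python
import math

def moment_magnitude(m0_newton_meters):
    """Moment magnitude from seismic moment (Hanks & Kanamori, 1979)."""
    return (2.0 / 3.0) * math.log10(m0_newton_meters) - 6.07

def brune_source_radius(fc_hz, beta_m_s):
    """Brune (1970) source radius from corner frequency and shear velocity."""
    return 2.34 * beta_m_s / (2.0 * math.pi * fc_hz)

def stress_drop(m0_newton_meters, radius_m):
    """Circular-crack stress drop: (7/16) * M0 / r^3, in Pa."""
    return (7.0 / 16.0) * m0_newton_meters / radius_m ** 3

# Hypothetical AE event: M0 = 0.05 N*m, fc = 200 kHz, beta = 3000 m/s
m0, fc, beta = 0.05, 2.0e5, 3000.0
r = brune_source_radius(fc, beta)
print(round(moment_magnitude(m0), 2))  # ~ -6.94, within the reported range
print(stress_drop(m0, r) / 1e6)        # stress drop in MPa
```

With these illustrative inputs the stress drop lands near 0.13 MPa, inside the 0.1 - 10 MPa band the abstract cites for induced seismicity.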

  3. Quantitative analysis of cryptic splicing associated with TDP-43 depletion.

    PubMed

    Humphrey, Jack; Emmett, Warren; Fratta, Pietro; Isaacs, Adrian M; Plagnol, Vincent

    2017-05-26

    Reliable exon recognition is key to the splicing of pre-mRNAs into mature mRNAs. TDP-43 is an RNA-binding protein whose nuclear loss and cytoplasmic aggregation are a hallmark pathology in amyotrophic lateral sclerosis and frontotemporal dementia (ALS/FTD). TDP-43 depletion causes the aberrant inclusion of cryptic exons into a range of transcripts, but their extent, relevance to disease pathogenesis and whether they are caused by other RNA-binding proteins implicated in ALS/FTD are unknown. We developed an analysis pipeline to discover and quantify cryptic exon inclusion and applied it to publicly available human and murine RNA-sequencing data. We detected widespread cryptic splicing in TDP-43 depletion datasets but almost none for another ALS/FTD-linked protein, FUS. Sequence motif and iCLIP analysis of cryptic exons demonstrated that they are bound by TDP-43. Unlike the cryptic exons seen in hnRNP C depletion, those repressed by TDP-43 cannot be linked to transposable elements. Cryptic exons are poorly conserved and inclusion overwhelmingly leads to nonsense-mediated decay of the host transcript, with reduced transcript levels observed in differential expression analysis. RNA-protein interaction data on 73 different RNA-binding proteins showed that, in addition to TDP-43, 7 specifically bind TDP-43 linked cryptic exons. This suggests that TDP-43 competes with other splicing factors for binding to cryptic exons and can repress cryptic exon inclusion. Our quantitative analysis pipeline confirms the presence of cryptic exons during the depletion of TDP-43 but not FUS, providing new insight into RNA-processing dysfunction as a cause or consequence in ALS/FTD.

  4. A quantitative risk analysis approach to port hydrocarbon logistics.

    PubMed

    Ronza, A; Carol, S; Espejo, V; Vílchez, J A; Arnaldos, J

    2006-01-16

    A method is presented that allows quantitative risk analysis to be performed on marine hydrocarbon terminals sited in ports. A significant gap was identified in the technical literature on QRA for the handling of hazardous materials in harbours published prior to this work. The analysis is extended to tanker navigation through port waters and loading and unloading facilities. The steps of the method are discussed, beginning with data collection. As to accident scenario identification, an approach is proposed that takes into account minor and massive spills due to loading arm failures and tank rupture. Frequency estimation is thoroughly reviewed and a shortcut approach is proposed for frequency calculation. This allows for the two-fold possibility of a tanker colliding/grounding at/near the berth or while navigating to/from the berth. A number of probability data defining the possibility of a cargo spill after an external impact on a tanker are discussed. As to consequence and vulnerability estimates, a scheme is proposed for the use of ratios between the numbers of fatal victims, injured and evacuated people. Finally, an example application is given, based on a pilot study conducted in the Port of Barcelona, where the method was tested.
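A shortcut frequency calculation of the kind described, combining traffic volume with conditional probabilities of impact and spill, split across the berth and transit cases, can be sketched as follows. All numbers are hypothetical placeholders, not figures from the Port of Barcelona study:

```python
def spill_frequency(movements_per_year, p_impact_per_movement,
                    p_spill_given_impact):
    """Shortcut frequency estimate: expected cargo spills per year from
    impact events (collision/grounding) during tanker berthing or transit."""
    return movements_per_year * p_impact_per_movement * p_spill_given_impact

# Hypothetical figures for one terminal:
f_berth   = spill_frequency(500, 1.0e-4, 0.2)   # impacts at/near the berth
f_transit = spill_frequency(500, 5.0e-5, 0.1)   # impacts while navigating
print(f_berth + f_transit)  # combined expected spills per year
```

The two-fold split mirrors the method's distinction between a tanker struck at the berth and one struck while navigating to or from it; the conditional spill probabilities would come from the impact-resistance data the paper reviews.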

  5. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    DOE PAGES

    Xu, Zhe; Wu, Chaochao; Xie, Fang; ...

    2014-10-28

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Additionally, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. In conclusion, peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  6. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    SciTech Connect

    Charland, P.; Peters, T.

    1996-10-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters (namely the shape, color, and depth cue associated with a cursor), as well as the image filtering and observer position, play a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions.

  7. Comparative analysis of quantitative efficiency evaluation methods for transportation networks

    PubMed Central

    He, Yuxin; Hong, Jian

    2017-01-01

    An effective evaluation of transportation network efficiency could offer guidance for the optimal control of urban traffic. Based on the introduction and related mathematical analysis of three quantitative evaluation methods for transportation network efficiency, this paper compares the information measured by them, including network structure, traffic demand, travel choice behavior and other factors that affect network efficiency. Accordingly, the applicability of the various evaluation methods is discussed. Analysis of different transportation network examples shows that the Q-H method reflects well the influence of network structure, traffic demand and user route choice behavior on transportation network efficiency. In addition, the transportation network efficiency measured by this method and Braess's Paradox are mutually consistent, indicating a better evaluation of the real operating condition of a transportation network. The analysis of the network efficiency calculated by the Q-H method also shows that a specific appropriate demand level exists for a given transportation network. Meanwhile, under fixed demand, both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure yielding the largest transportation network efficiency can be identified. PMID:28399165
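The abstract does not give the Q-H formula, but one classic quantitative measure of network efficiency against which such methods are compared is the Latora-Marchiori global efficiency: the average of inverse shortest-path lengths over all node pairs. A sketch for an unweighted network (illustrative of the general concept only; this is not the Q-H method itself):

```python
def global_efficiency(n, edges):
    """Latora-Marchiori global efficiency: average of 1/d(i,j) over all
    ordered node pairs, with shortest-path lengths d found by BFS."""
    adj = {i: set() for i in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    total = 0.0
    for src in range(n):
        dist = {src: 0}
        frontier = [src]
        while frontier:                      # breadth-first search from src
            nxt = []
            for u in frontier:
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        nxt.append(v)
            frontier = nxt
        total += sum(1.0 / d for node, d in dist.items() if node != src)
    return total / (n * (n - 1))

# A 4-node path graph: distant pairs contribute only 1/2 and 1/3
print(round(global_efficiency(4, [(0, 1), (1, 2), (2, 3)]), 4))  # 0.7222
```

A complete graph scores exactly 1.0; removing links lowers the score, which is the kind of structure-versus-efficiency trade-off the compared methods quantify.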

  8. Therapeutic electrical stimulation for spasticity: quantitative gait analysis.

    PubMed

    Pease, W S

    1998-01-01

    Improvement in motor function following electrical stimulation is related to strengthening of the stimulated spastic muscle and inhibition of the antagonist. A 26-year-old man with familial spastic paraparesis presented with gait dysfunction and bilateral lower limb spastic muscle tone. Clinically, muscle strength and sensation were normal. He was considered appropriate for a trial of therapeutic electrical stimulation following failed trials of physical therapy and baclofen. No other treatment was used concurrent with the electrical stimulation. Before treatment, quantitative gait analysis revealed 63% of normal velocity and a crouched gait pattern, associated with excessive electromyographic activity in the hamstrings and gastrocnemius muscles. Based on these findings, bilateral stimulation of the quadriceps and anterior compartment musculature was performed two to three times per week for three months. Repeat gait analysis was conducted three weeks after the cessation of stimulation treatment. A 27% increase in velocity was noted associated with an increase in both cadence and right step length. Right hip and bilateral knee stance motion returned to normal (rather than "crouched"). No change in the timing of dynamic electromyographic activity was seen. These findings suggest a role for the use of electrical stimulation for rehabilitation of spasticity. The specific mechanism of this improvement remains uncertain.

  9. Correlation between two methods of florbetapir PET quantitative analysis.

    PubMed

    Breault, Christopher; Piper, Jonathan; Joshi, Abhinay D; Pirozzi, Sara D; Nelson, Aaron S; Lu, Ming; Pontecorvo, Michael J; Mintun, Mark A; Devous, Michael D

    2017-01-01

    This study evaluated performance of a commercially available standardized software program for calculation of florbetapir PET standard uptake value ratios (SUVr) in comparison with an established research method. Florbetapir PET images for 183 subjects clinically diagnosed as cognitively normal (CN), mild cognitive impairment (MCI) or probable Alzheimer's disease (AD) (45 AD, 60 MCI, and 78 CN) were evaluated using two software processing algorithms. The research method uses a single florbetapir PET template generated by averaging both amyloid positive and amyloid negative registered brains together. The commercial software simultaneously optimizes the registration between the florbetapir PET images and three templates: amyloid negative, amyloid positive, and an average. Cortical average SUVr values were calculated across six predefined anatomic regions with respect to the whole cerebellum reference region. SUVr values were well correlated between the two methods (r2 = 0.98). The relationship between the methods computed from the regression analysis is: Commercial method SUVr = (0.9757*Research SUVr) + 0.0299. A previously defined cutoff SUVr of 1.1 for distinguishing amyloid positivity by the research method corresponded to 1.1 (95% CI = 1.098, 1.11) for the commercial method. This study suggests that the commercial method is comparable to the published research method of SUVr analysis for florbetapir PET images, thus facilitating the potential use of standardized quantitative approaches to PET amyloid imaging.
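The reported regression makes conversion between the two pipelines a one-line calculation. A sketch using the published coefficients, confirming that the research-method amyloid-positivity cutoff of 1.1 maps to approximately 1.1 under the commercial method, as the abstract states:

```python
def commercial_suvr(research_suvr):
    """Regression mapping reported in the study between the two pipelines:
    Commercial SUVr = 0.9757 * Research SUVr + 0.0299."""
    return 0.9757 * research_suvr + 0.0299

# The research-method cutoff of 1.1 corresponds to ~1.1 commercially,
# consistent with the reported 95% CI of (1.098, 1.11).
print(round(commercial_suvr(1.1), 3))  # 1.103
```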

  10. Quantitative Analysis of Intracellular Motility Based on Optical Flow Model

    PubMed Central

    Li, Heng

    2017-01-01

    Analysis of cell mobility is a key issue for abnormality identification and classification in cell biology research. However, since cell deformation induced by various biological processes is random and cell protrusion is irregular, it is difficult to measure cell morphology and motility in microscopic images. To address this dilemma, we propose an improved variational optical flow model for quantitative analysis of intracellular motility, which not only extracts intracellular motion fields effectively but also deals with the optical flow computation problem at the border by taking advantage of formulations based on the L1 and L2 norm, respectively. In the energy functional of our proposed optical flow model, the data term is in the form of the L2 norm; the smoothness term changes with regional features through an adaptive parameter, using the L1 norm near the edge of the cell and the L2 norm away from the edge. We further extract histograms of oriented optical flow (HOOF) after the optical flow field of intracellular motion is computed. Then distances between different HOOFs are calculated as the intracellular motion features to grade the intracellular motion. Experimental results show that the features extracted from HOOFs provide new insights into the relationship between cell motility and specific pathological conditions.
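A simplified HOOF sketch along the lines described: bin each flow vector by orientation, weight by magnitude, normalise, then compare histograms with an L1 distance as a motion feature. The paper's exact binning and distance measure may differ, and the flow vectors here are hypothetical:

```python
import math

def hoof(flow_vectors, n_bins=8):
    """Histogram of oriented optical flow (simplified): bin each flow vector
    by its orientation, weight by magnitude, normalise to sum to 1."""
    hist = [0.0] * n_bins
    for vx, vy in flow_vectors:
        mag = math.hypot(vx, vy)
        if mag == 0.0:
            continue
        angle = math.atan2(vy, vx) % (2.0 * math.pi)
        hist[min(int(angle / (2.0 * math.pi) * n_bins), n_bins - 1)] += mag
    total = sum(hist)
    return [h / total for h in hist] if total else hist

def hist_distance(h1, h2):
    """L1 distance between two HOOFs, usable as a motion feature."""
    return sum(abs(a - b) for a, b in zip(h1, h2))

# Hypothetical flow fields from two frames of an intracellular sequence
h_a = hoof([(1.0, 0.0), (0.0, 2.0), (1.0, 1.0)])
h_b = hoof([(-1.0, 0.0), (0.0, -2.0)])
print(round(hist_distance(h_a, h_b), 6))  # 2.0: motions in opposite directions
```

Because both histograms are normalised, the L1 distance is bounded by 2, reached when the two motion fields occupy disjoint orientation bins, as in this example.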

  11. Inside Single Cells: Quantitative Analysis with Advanced Optics and Nanomaterials

    PubMed Central

    Cui, Yi; Irudayaraj, Joseph

    2014-01-01

    Single cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single cell activity. In order to obtain quantitative information (e.g. molecular quantity, kinetics and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single cell studies both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live cell analysis. Although a considerable proportion of single cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single cell analysis. PMID:25430077

  12. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  13. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    PubMed Central

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for midpalatal suture maturation evaluation. Methods The study included 131 subjects over 18 years of age (range 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimensions were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794, p < 0.001). A test in which fractal dimension was used to predict the resulting variable that splits maturation stages into ABC and D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
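The fractal dimension of a skeletonized image is typically estimated by box counting: cover the foreground pixels with boxes of decreasing size and fit the slope of log N(s) against log(1/s). A minimal sketch under that assumption (the study's actual image-processing pipeline is not specified in the abstract):

```python
import math

def box_counting_dimension(points, sizes=(1, 2, 4, 8)):
    """Estimate the fractal dimension of a binary skeleton, given as a set of
    (row, col) foreground pixels, by least-squares fit of log N(s) vs log(1/s)."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(r // s, c // s) for r, c in points}  # occupied s x s boxes
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den   # slope of the log-log fit

# A straight 32-pixel line should come out at dimension 1
line = {(0, c) for c in range(32)}
print(round(box_counting_dimension(line), 2))  # 1.0
```

A filled square scores 2.0 under the same estimator; an interdigitated (more mature) suture skeleton would fall between those extremes, which is what the reported 1.0235 cut-off discriminates.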

  14. A Computational Tool for Quantitative Analysis of Vascular Networks

    PubMed Central

    Zudaire, Enrique; Gambardella, Laure; Kurcz, Christopher; Vermeren, Sonja

    2011-01-01

    Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopic analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are being used widely and they are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software tool, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called “branching index” (branch points / unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge. PMID:22110636
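The metrics listed (vessel area, vascular density, branching index as branch points per unit area) reduce to simple arithmetic once segmentation has produced the underlying counts. A hedged sketch with hypothetical numbers; this is an illustration of the definitions, not AngioTool's actual implementation:

```python
def vessel_metrics(vessel_area_px, image_area_px, total_length_px,
                   n_branch_points, px_per_mm):
    """AngioTool-style summary metrics, computed from counts that an
    upstream segmentation/skeletonization step would provide."""
    mm2 = image_area_px / px_per_mm ** 2       # field of view in mm^2
    return {
        "vessel_area_fraction": vessel_area_px / image_area_px,
        "vascular_density_mm_per_mm2": (total_length_px / px_per_mm) / mm2,
        "branching_index_per_mm2": n_branch_points / mm2,
    }

# Hypothetical retina image: 1000 x 1000 px at 100 px/mm (10 x 10 mm field)
m = vessel_metrics(vessel_area_px=150_000, image_area_px=1_000_000,
                   total_length_px=40_000, n_branch_points=260, px_per_mm=100)
print(m["vessel_area_fraction"])     # 0.15
print(m["branching_index_per_mm2"])  # 2.6 branch points per mm^2
```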

  15. Quantitative analysis of triple-mutant genetic interactions.

    PubMed

    Braberg, Hannes; Alexander, Richard; Shales, Michael; Xu, Jiewei; Franks-Skiba, Kathleen E; Wu, Qiuqin; Haber, James E; Krogan, Nevan J

    2014-08-01

    The quantitative analysis of genetic interactions between pairs of gene mutations has proven to be effective for characterizing cellular functions, but it can miss important interactions for functionally redundant genes. To address this limitation, we have developed an approach termed triple-mutant analysis (TMA). The procedure relies on a query strain that contains two deletions in a pair of redundant or otherwise related genes, which is crossed against a panel of candidate deletion strains to isolate triple mutants and measure their growth. A central feature of TMA is to interrogate mutants that are synthetically sick when two other genes are deleted but interact minimally with either single deletion. This approach has been valuable for discovering genes that restore critical functions when the principal actors are deleted. TMA has also uncovered double-mutant combinations that produce severe defects because a third protein becomes deregulated and acts in a deleterious fashion, and it has revealed functional differences between proteins presumed to act together. The protocol is optimized for Singer ROTOR pinning robots, takes 3 weeks to complete and measures interactions for up to 30 double mutants against a library of 1,536 single mutants.
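The core TMA readout, whether a triple mutant grows worse than expected from its parents, is commonly scored against a multiplicative fitness model. A sketch under that assumption, with hypothetical colony-size fitnesses (the published protocol's exact scoring may differ):

```python
def interaction_score(observed_fitness, query_fitness, array_fitness):
    """Multiplicative-model genetic interaction score:
    epsilon = observed - expected, where expected is the product of the
    parents' fitnesses (the double-deletion query and the array single)."""
    return observed_fitness - query_fitness * array_fitness

# Hypothetical colony-size fitness values (wild type = 1.0)
query = 0.9          # double deletion of two redundant genes
array_single = 0.95  # candidate single deletion, near-neutral alone
triple = 0.30        # triple mutant grows far worse than expected
print(round(interaction_score(triple, query, array_single), 3))  # -0.555
```

A strongly negative epsilon for the triple mutant, combined with near-zero scores against either single deletion, is exactly the signature TMA uses to flag genes that restore critical functions when the principal actors are deleted.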

  16. A spatio-temporal analysis of suicide in El Salvador.

    PubMed

    Carcach, Carlos

    2017-04-20

    In 2012, international statistics showed El Salvador's suicide rate as 40th in the world and the highest in Latin America. Over the last 15 years, national statistics show the suicide death rate declining as opposed to an increasing rate of homicide. Though completed suicide is an important social and health issue, little is known about its prevalence, incidence, etiology and spatio-temporal behavior. The primary objective of this study was to examine completed suicide and homicide using the stream analogy to lethal violence within a spatio-temporal framework. A Bayesian model was applied to examine the spatio-temporal evolution of the tendency of completed suicide over homicide in El Salvador. Data on numbers of suicides and homicides at the municipal level were obtained from the Instituto de Medicina Legal (IML) and population counts, from the Dirección General de Estadística y Censos (DIGESTYC), for the period of 2002 to 2012. Data on migration were derived from the 2007 Population Census, and inequality data were obtained from a study by Damianović, Valenzuela and Vera. The data reveal a stable standardized rate of total lethal violence (completed suicide plus homicide) across municipalities over time; a decline in suicide; and a standardized suicide rate decreasing with income inequality but increasing with social isolation. Municipalities clustered in terms of both total lethal violence and suicide standardized rates. Spatial effects for suicide were stronger among municipalities located in the north-east and center-south sides of the country. New clusters of municipalities with large suicide standardized rates were detected in the north-west, south-west and center-south regions, all of which are part of time-stable clusters of homicide. Prevention efforts to reduce income inequality and mitigate the negative effects of weak relational systems should focus upon municipalities forming time-persistent clusters with a large rate of death by suicide. In

  17. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  19. Temporal Stability of Implicit and Explicit Measures: A Longitudinal Analysis.

    PubMed

    Gawronski, Bertram; Morrison, Mike; Phills, Curtis E; Galdi, Silvia

    2017-03-01

    A common assumption about implicit measures is that they reflect early experiences, whereas explicit measures are assumed to reflect recent experiences. This assumption subsumes two distinct hypotheses: (a) Implicit measures are more resistant to situationally induced changes than explicit measures; (b) individual differences on implicit measures are more stable over time than individual differences on explicit measures. Although the first hypothesis has been the subject of numerous studies, the second hypothesis has received relatively little attention. The current research addressed the second hypothesis in two longitudinal studies that compared the temporal stability of individual differences on implicit and explicit measures in three content domains (self-concept, racial attitudes, political attitudes). In both studies, implicit measures showed significantly lower stability over time (weighted average r = .54) than conceptually corresponding explicit measures (weighted average r = .75), despite comparable estimates of internal consistency. Implications for theories of implicit social cognition and interpretations of implicit and explicit measures are discussed.

  20. Multi-scale spatio-temporal analysis of human mobility.

    PubMed

    Alessandretti, Laura; Sapiezynski, Piotr; Lehmann, Sune; Baronchelli, Andrea

    2017-01-01

    The recent availability of digital traces generated by phone calls and online logins has significantly increased the scientific understanding of human mobility. Until now, however, limited data resolution and coverage have hindered a coherent description of human displacements across different spatial and temporal scales. Here, we characterise mobility behaviour across several orders of magnitude by analysing ∼850 individuals' digital traces sampled every ∼16 seconds for 25 months with ∼10 meters spatial resolution. We show that the distributions of distances and waiting times between consecutive locations are best described by log-normal and gamma distributions, respectively, and that natural time-scales emerge from the regularity of human mobility. We point out that log-normal distributions also characterise the patterns of discovery of new places, implying that they are not a simple consequence of the routine of modern life.

  1. Multi-scale spatio-temporal analysis of human mobility

    PubMed Central

    Alessandretti, Laura; Sapiezynski, Piotr; Lehmann, Sune; Baronchelli, Andrea

    2017-01-01

    The recent availability of digital traces generated by phone calls and online logins has significantly increased the scientific understanding of human mobility. Until now, however, limited data resolution and coverage have hindered a coherent description of human displacements across different spatial and temporal scales. Here, we characterise mobility behaviour across several orders of magnitude by analysing ∼850 individuals’ digital traces sampled every ∼16 seconds for 25 months with ∼10 meters spatial resolution. We show that the distributions of distances and waiting times between consecutive locations are best described by log-normal and gamma distributions, respectively, and that natural time-scales emerge from the regularity of human mobility. We point out that log-normal distributions also characterise the patterns of discovery of new places, implying that they are not a simple consequence of the routine of modern life. PMID:28199347
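Fitting the log-normal displacement distribution reported above is straightforward by maximum likelihood: the parameters are the mean and standard deviation of the log-transformed samples. A sketch with hypothetical displacement values, not the study's ~850-user dataset:

```python
import math

def fit_lognormal(samples):
    """Maximum-likelihood log-normal fit: mu and sigma are the mean and
    standard deviation of the log-transformed samples."""
    logs = [math.log(x) for x in samples]
    n = len(logs)
    mu = sum(logs) / n
    var = sum((l - mu) ** 2 for l in logs) / n
    return mu, math.sqrt(var)

def lognormal_median(mu, sigma):
    """The median of a log-normal is exp(mu), independent of sigma."""
    return math.exp(mu)

# Hypothetical displacements (meters) spanning several orders of magnitude,
# as the multi-scale analysis in the paper would encounter
displacements = [12.0, 45.0, 80.0, 300.0, 1200.0, 9000.0]
mu, sigma = fit_lognormal(displacements)
print(lognormal_median(mu, sigma))  # the geometric-mean displacement
```

Working in log space is what makes a single distribution family cover displacements from meters to hundreds of kilometers, the "several orders of magnitude" the abstract emphasises.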

  2. Advanced spatio-temporal filtering techniques for photogrammetric image sequence analysis in civil engineering material testing

    NASA Astrophysics Data System (ADS)

    Liebold, F.; Maas, H.-G.

    2016-01-01

    The paper shows advanced spatial, temporal and spatio-temporal filtering techniques which may be used to reduce noise effects in photogrammetric image sequence analysis tasks and tools. As a practical example, the techniques are validated in a photogrammetric spatio-temporal crack detection and analysis tool applied in load tests in civil engineering material testing. The load test technique is based on monocular image sequences of a test object under varying load conditions. The first image of a sequence is defined as a reference image under zero load, wherein interest points are determined and connected in a triangular irregular network structure. For each epoch, these triangles are compared to the reference image triangles to search for deformations. The result of the feature point tracking and triangle comparison process is a spatio-temporally resolved strain value field, wherein cracks can be detected, located and measured via local discrepancies. The strains can be visualized as a color-coded map. In order to improve the measuring system and to reduce noise, the strain values of each triangle must be treated in a filtering process. The paper shows the results of various filter techniques in the spatial and in the temporal domain as well as spatio-temporal filtering techniques applied to these data. The best results were obtained by a bilateral filter in the spatial domain and by a spatio-temporal EOF (empirical orthogonal function) filtering technique.
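The bilateral filter that performed best in the spatial domain weights each neighbour by both spatial distance and value difference, so random noise is smoothed while the sharp strain discontinuities that mark cracks are preserved. A minimal 3x3-neighbourhood sketch on a toy strain field; the sigma values are illustrative, not those used in the paper:

```python
import math

def bilateral_filter(grid, sigma_s=1.0, sigma_r=0.1):
    """Bilateral filter on a 2-D strain-value field: each cell becomes a
    weighted mean of its 3x3 neighbourhood, with weights falling off with
    both spatial distance (sigma_s) and value difference (sigma_r)."""
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            num = den = 0.0
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        w = math.exp(-(dr * dr + dc * dc) / (2 * sigma_s ** 2)
                                     - (grid[rr][cc] - grid[r][c]) ** 2
                                     / (2 * sigma_r ** 2))
                        num += w * grid[rr][cc]
                        den += w
            out[r][c] = num / den
    return out

# A sharp strain step (a crack) survives filtering almost untouched
field = [[0.0, 0.0, 1.0, 1.0]] * 3
print(bilateral_filter(field)[1])  # edge between columns 1 and 2 preserved
```

With a small range sigma, neighbours across the strain step get near-zero weight, which is why the filter denoises the strain map without blurring the crack locations the tool needs to detect.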

  3. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pairs long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet--a webserver implementation of AMPHORA2--, MG-RAST, and MEGAN5) for their capability to assign quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software package to this "genome length bias." Therefore, we have made a simple benchmark for the evaluation of "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on this simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results and the other two packages were under
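The benchmark construction described above (equal copy numbers of genomes of different lengths, shredded into ~150 bp reads and mixed) can be sketched as follows. The helper names `shred` and `make_benchmark` are our own, and the synthetic ACGT strings stand in for real genomes:

```python
import random

def shred(genome, n_reads, read_len=150, rng=None):
    """Sample n_reads random substrings of read_len from a genome string."""
    rng = rng or random.Random(0)
    reads = []
    for _ in range(n_reads):
        start = rng.randrange(0, len(genome) - read_len + 1)
        reads.append(genome[start:start + read_len])
    return reads

def make_benchmark(genomes, copies=100, read_len=150, rng=None):
    """Equal copy numbers of each genome; longer genomes yield proportionally
    more reads, which is exactly the genome-length bias the benchmark probes."""
    rng = rng or random.Random(42)
    reads = []
    for g in genomes:
        # Reads per genome: copies * genome length / read length.
        n = copies * len(g) // read_len
        reads.extend(shred(g, n, read_len, rng))
    rng.shuffle(reads)
    return reads
```

A tool that is resistant to the bias should recover equal taxon proportions from this read set even though the longer genome contributes more reads.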

  4. Qualitative and quantitative analysis of atmospheric organosulfates in Centreville, Alabama

    NASA Astrophysics Data System (ADS)

    Hettiyadura, Anusha P. S.; Jayarathne, Thilina; Baumann, Karsten; Goldstein, Allen H.; de Gouw, Joost A.; Koss, Abigail; Keutsch, Frank N.; Skog, Kate; Stone, Elizabeth A.

    2017-01-01

Organosulfates are components of secondary organic aerosols (SOA) that form from oxidation of volatile organic compounds (VOCs) in the presence of sulfate. In this study, the composition and abundance of organosulfates were determined in fine particulate matter (PM2.5) collected from Centreville, AL, during the Southern Oxidant and Aerosol Study (SOAS) in summer 2013. Six organosulfates were quantified using hydrophilic interaction liquid chromatography (HILIC) with triple quadrupole mass spectrometry (TQD) against authentic standards. Among these, the three most abundant species were glycolic acid sulfate (0.5-52.5 ng m-3), lactic acid sulfate (0.5-36.7 ng m-3), and hydroxyacetone sulfate (0.5-14.3 ng m-3). These three species were strongly inter-correlated, suggesting similar precursors and/or formation pathways. Further correlations with sulfate, isoprene, and isoprene oxidation products indicate important roles for these precursors in organosulfate formation in Centreville. Positive filter sampling artifacts associated with these organosulfates due to gas adsorption or reaction of gas phase precursors of organosulfates with sulfuric acid were assessed for a subset of samples and were less than 7.8 % of their PM2.5 concentrations. Together, the quantified organosulfates accounted for < 0.3 % of organic carbon mass in PM2.5. To gain insights into other organosulfates in PM2.5 collected from Centreville, semi-quantitative analysis was employed by way of monitoring characteristic product ions of organosulfates (HSO4- at m/z 97 and SO4-• at m/z 96) and evaluating relative signal strength by HILIC-TQD. Molecular formulas of organosulfates were determined by high-resolution time-of-flight (TOF) mass spectrometry. The major organosulfate signal across all samples corresponded to 2-methyltetrol sulfates, which accounted for 42-62 % of the total bisulfate ion signal. Conversely, glycolic acid sulfate, the most abundant organosulfate quantified in this study, was 0

  5. Dynamics and spatio-temporal variability of environmental factors in Eastern Australia using functional principal component analysis

    USGS Publications Warehouse

    Szabo, J.K.; Fedriani, E.M.; Segovia-Gonzalez, M. M.; Astheimer, L.B.; Hooper, M.J.

    2010-01-01

This paper introduces a new technique in ecology to analyze spatial and temporal variability in environmental variables. By using simple statistics, we explore the relations between abiotic and biotic variables that influence animal distributions. However, spatial and temporal variability in rainfall, a key variable in ecological studies, can cause difficulties for any basic model that includes time evolution. The study was at a landscape scale (three million square kilometers in eastern Australia), mainly over the period 1998-2004. We simultaneously considered qualitative spatial (soil and habitat types) and quantitative temporal (rainfall) variables in a Geographical Information System environment. In addition to some techniques commonly used in ecology, we applied a new method, Functional Principal Component Analysis, which proved to be very suitable for this case, as it explained more than 97% of the total variance of the rainfall data, providing us with substitute variables that are easier to manage and are even able to explain rainfall patterns. The main variable came from a habitat classification that showed strong correlations with rainfall values and soil types. © 2010 World Scientific Publishing Company.
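As a rough finite-dimensional analogue of the Functional Principal Component Analysis used in the paper, ordinary PCA of a sites-by-time rainfall matrix via SVD yields component scores and explained-variance ratios. This is an illustrative sketch under our own assumptions, not the authors' implementation:

```python
import numpy as np

def principal_components(series, n_components=3):
    """PCA of a (sites x time) matrix of rainfall series via SVD.

    Returns the leading component scores per site and the fraction of
    total variance each component explains (the paper reports >97%
    explained variance for its functional PCA of rainfall).
    """
    X = series - series.mean(axis=0)          # remove the mean curve
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    var = s**2 / np.sum(s**2)                 # explained-variance ratios
    scores = U[:, :n_components] * s[:n_components]
    return scores, var[:n_components]
```

The scores act as the "substitute variables" mentioned in the abstract: a handful of numbers per site that summarise its rainfall history.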

  6. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis

    PubMed Central

    Radzikowski, Jacek; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    Background The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. Objective This paper presents a study of Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, both in terms of its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as presenting a quantitative interdisciplinary approach to analyze such open-source data in the context of health narratives. Methods We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. Results The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. Analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. Conclusions The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more

  7. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis.

    PubMed

    Radzikowski, Jacek; Stefanidis, Anthony; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. This paper presents a study of Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, both in terms of its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as presenting a quantitative interdisciplinary approach to analyze such open-source data in the context of health narratives. We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. Analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. 
Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more effective strategies that take into account the complex

  8. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    PubMed

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

The aim of this article is to present trends in patent filings for applications of nanotechnology in the automobile sector across the world, using a keyword-based patent search. An overview of the patents related to nanotechnology in the automobile industry is provided. The work started with a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. In the next step, various graphs were produced to give insight into the trends, and the patents were analyzed under the various classifications. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. The classification of patents based on the solution they provide was performed by reading the claims, titles, abstracts, and full texts separately. The patentability of nanotechnology inventions is discussed with a view to giving an idea of the requirements and statutory bars to the patentability of such inventions. Another objective of the current work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a suggested strategy for patenting the related inventions. For example, the US patent with publication number US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified as falling under automobile parts; on study, it was deduced that the patent solves the problem of friction in the engine. One classification is based on the automobile part, while the other is based on the problem being solved; hence, two classes, reduction in friction and engine, were created. Similarly, after studying all the patents, a classification matrix was created.

  9. Quantitative assessment of hip osteoarthritis based on image texture analysis.

    PubMed

    Boniatis, I S; Costaridou, L I; Cavouras, D A; Panagiotopoulos, E C; Panayiotakis, G S

    2006-03-01

A non-invasive method was developed to investigate the potential capacity of digital image texture analysis in evaluating the severity of hip osteoarthritis (OA) and in monitoring its progression. Nineteen textural features evaluating patterns of pixel intensity fluctuations were extracted from 64 images of radiographic hip joint spaces (HJS), corresponding to 32 patients with verified unilateral or bilateral OA. Images were enhanced employing custom-developed software for the delineation of the articular margins on digitized pelvic radiographs. The severity of OA for each patient was assessed by expert orthopaedists employing the Kellgren and Lawrence (KL) scale. Additionally, an index expressing HJS-narrowing was computed considering patients from the unilateral OA-group. A textural feature that quantified pixel distribution non-uniformity (grey level non-uniformity, GLNU) demonstrated the strongest correlation with the HJS-narrowing index among all extracted features and was utilized in further analysis. Classification rules employing the GLNU feature were introduced to characterize a hip as normal or osteoarthritic and to assign it to one of three severity categories, formed in accordance with the KL scale. Application of the proposed rules resulted in relatively high classification accuracies in characterizing a hip as normal or osteoarthritic (90.6%) and in assigning it to the correct KL scale category (88.9%). Furthermore, the strong correlation between the HJS-narrowing index and the pathological GLNU (r = -0.9, p<0.001) was utilized to provide percentages quantifying hip OA-severity. Texture analysis may contribute to the quantitative assessment of OA-severity, the monitoring of OA-progression and the evaluation of chondroprotective therapy.
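The study's GLNU feature comes from texture statistics of the joint-space images. As a simplified, hedged analogue (the function name and quantisation scheme are our own, and the original feature is computed from run-length statistics rather than a plain histogram), a histogram-based non-uniformity measure can be written as the sum of squared grey-level counts normalised by the total pixel count:

```python
import numpy as np

def grey_level_non_uniformity(image, levels=16):
    """Histogram-based non-uniformity: sum over grey levels of the squared
    pixel counts, normalised by the total count. Uniformly spread intensity
    distributions give low values; images dominated by a few grey levels
    give high values."""
    img = np.asarray(image, dtype=float)
    # Quantise to `levels` bins over the image's intensity range.
    lo, hi = img.min(), img.max()
    q = np.floor((img - lo) / (hi - lo + 1e-12) * levels).astype(int)
    q = np.clip(q, 0, levels - 1)
    counts = np.bincount(q.ravel(), minlength=levels)
    return np.sum(counts.astype(float)**2) / counts.sum()
```

A constant image concentrates all pixels in one level (maximal non-uniformity), while an image whose intensities spread evenly over the levels scores much lower, mirroring how GLNU separates uniform from non-uniform joint-space textures.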

  10. Temporal Changes in Sequential Quantitative Thallium-201 Imaging Following Myocardial Infarction in Dogs: Comparison of Four and Twenty-Four Hour Infarct Images

    PubMed Central

    Umbach, Richard E.; Lange, Robert C.; Lee, John C.; Zaret, Barry L.

    1978-01-01

Thallium-201 (201Tl) myocardial perfusion imaging allows definition of zones of myocardial infarction and ischemia. The temporal changes in sequential quantitative 201Tl infarct imaging were studied at 4 and 24 hours in dogs subjected to closed-chest anterior wall myocardial infarction. A temporal decrease in 201Tl imaged infarct areas was noted in 10 of 13 animals. In no animal did the infarct area increase. The imaged infarct area decreased by an average of 30%, from 12.9 ± 6.2 cm2 at 4 hours to 9.1 ± 5.1 cm2 at 24 hours (p < 0.001), and involved 34 ± 16% of the total 201Tl left ventricular distribution at 4 hours and 22 ± 14% at 24 hours (p < 0.001). The magnitude of temporal change in imaged infarct area was not predicted by initial image defect or final histopathologic infarct size. Thus, the results of 201Tl infarct imaging in the early period of infarction are clearly dependent upon the time at which the procedure is performed. PMID:752198

  11. Quantitative risk analysis for landslides -- Examples from Bíldudalur, NW-Iceland

    NASA Astrophysics Data System (ADS)

    Bell, R.; Glade, T.

    2004-03-01

Although various methods to carry out quantitative landslide risk analyses are available, applications are still rare and mostly dependent on the occurrence of disasters. In Iceland, two catastrophic snow avalanches killed 34 people in 1995. As a consequence, the Ministry of the Environment issued a new regulation on hazard zoning due to snow avalanches and landslides in 2000, which aims to prevent people from living or working within the areas most at risk until 2010. The regulation requires landslide and snow avalanche risk analyses to be carried out; however, a method to calculate landslide risk adapted to Icelandic conditions is still missing. Therefore, the ultimate goal of this study is to develop such a method for landslides, focussing on debris flows and rock falls, and to test it in Bíldudalur, NW-Iceland. Risk analysis, besides risk evaluation and risk management, is part of the holistic concept of risk assessment. Within this study, only risk analysis is considered, focussing on the risks to life. To calculate landslide risk, the spatial and temporal probability of occurrence of potential damaging events, as well as the distribution of the elements at risk in space and time, considering also changing vulnerabilities, must be determined. Within this study, a new raster-based approach is developed. Thus, all existing vector data are transferred into raster data using a resolution of 1 m x 1 m. The specific attribute data are assigned to the grid cells, resulting in specific raster data layers for each input parameter. The calculation of the landslide risk follows a function of the input parameters hazard, damage potential of the elements at risk, vulnerability, probability of the spatial impact, probability of the temporal impact and probability of the seasonal occurrence. Finally, results are upscaled to a resolution of 20 m x 20 m and are presented as individual risk to life and object risk to life for each process.
Within the quantitative landslide risk analysis the
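The multiplicative raster calculation described above can be sketched with plain array operations. Layer names and values here are hypothetical, and the real analysis involves per-process hazard modelling well beyond this illustration:

```python
import numpy as np

def individual_risk(hazard, vulnerability, p_spatial, p_temporal, p_seasonal):
    """Cell-wise risk to life as the product of the input layers, following
    the multiplicative structure described in the abstract. Inputs are 2-D
    arrays on the same 1 m x 1 m grid (scalars broadcast for illustration)."""
    return hazard * vulnerability * p_spatial * p_temporal * p_seasonal

def upscale(raster, factor=20):
    """Aggregate a fine raster to coarser cells (e.g. 1 m -> 20 m) by
    taking the block mean, as in the final presentation step."""
    h, w = raster.shape
    h2, w2 = h // factor * factor, w // factor * factor
    blocks = raster[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
    return blocks.mean(axis=(1, 3))
```

Each input layer corresponds to one of the rasterised parameters (hazard, vulnerability, and the spatial, temporal and seasonal probabilities); the product is then upscaled for mapping.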

  12. Temporal and Spatial Analysis of Monogenetic Volcanic Fields

    NASA Astrophysics Data System (ADS)

    Kiyosugi, Koji

Achieving an understanding of the nature of monogenetic volcanic fields depends on identification of the spatial and temporal patterns of volcanism in these fields, and their relationships to structures mapped in the shallow crust and inferred in the deep crust and mantle through interpretation of geochemical, radiometric and geophysical data. We investigate the spatial and temporal distributions of volcanism in the Abu Monogenetic Volcano Group, Southwest Japan. The E-W elongated volcano distribution, identified by a nonparametric kernel method, is found to be consistent with the spatial extent of P-wave velocity anomalies in the lower crust and upper mantle, supporting the idea that the spatial density map of volcanic vents reflects the geometry of a mantle diapir. The estimated basalt supply to the lower crust is constant. This observation and the spatial distribution of volcanic vents suggest stability of magma productivity and an essentially constant two-dimensional size of the source mantle diapir. We mapped conduits, dike segments, and sills in the San Rafael sub-volcanic field, Utah, where the shallowest part of a Pliocene magmatic system is exceptionally well exposed. The distribution of conduits matches the major features of the dike distribution, including the development of clusters and the distribution of outliers. The comparison of the San Rafael conduit distribution and the distributions of volcanoes in several recently active volcanic fields supports the use of statistical models, such as nonparametric kernel methods, in probabilistic hazard assessment for distributed volcanism. We developed a new recurrence rate calculation method that uses a Monte Carlo procedure to better reflect and understand the impact of uncertainties of radiometric age determinations on uncertainty of recurrence rate estimates for volcanic activity in the Abu, Yucca Mountain Region, and Izu-Tobu volcanic fields.
Results suggest that the recurrence rates of volcanic fields can change by more
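A nonparametric kernel estimate of vent spatial density, of the kind used to identify the E-W elongated distribution, can be illustrated with a plain Gaussian kernel density estimate. The vent coordinates below are synthetic, and the fixed bandwidth is a stand-in for whatever bandwidth selection the actual analysis used:

```python
import numpy as np

def kde2d(points, xs, ys, bandwidth=1.0):
    """Gaussian kernel density estimate of 2-D point data on a grid.

    points: (N, 2) array of vent coordinates; xs, ys: grid axes.
    Returns density values with shape (len(xs), len(ys))."""
    gx, gy = np.meshgrid(xs, ys, indexing="ij")
    density = np.zeros_like(gx, dtype=float)
    norm = 1.0 / (2 * np.pi * bandwidth**2 * len(points))
    for px, py in points:
        d2 = (gx - px)**2 + (gy - py)**2
        density += np.exp(-d2 / (2 * bandwidth**2))
    return norm * density
```

Applied to a vent population that is more spread out E-W than N-S, the density map comes out elongated in the same direction, which is the signal compared against the P-wave anomaly geometry.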

  13. Brain networks of temporal preparation: A multiple regression analysis of neuropsychological data.

    PubMed

    Triviño, Mónica; Correa, Ángel; Lupiáñez, Juan; Funes, María Jesús; Catena, Andrés; He, Xun; Humphreys, Glyn W

    2016-11-15

There are only a few studies on the brain networks involved in the ability to prepare in time, and most of them followed a correlational rather than a neuropsychological approach. The present neuropsychological study performed multiple regression analysis to address the relationship between both grey and white matter (measured by magnetic resonance imaging in patients with brain lesions) and different effects in temporal preparation (Temporal orienting, Foreperiod and Sequential effects). Two versions of a temporal preparation task were administered to a group of 23 patients with acquired brain injury. In one task, the cue presented (a red versus green square) to inform participants about the time of appearance (early versus late) of a target stimulus was blocked, while in the other task the cue was manipulated on a trial-by-trial basis. The duration of the cue-target time intervals (400 versus 1400 ms) was always manipulated within blocks in both tasks. Regression analyses were conducted between either the grey matter lesion size or the white matter tract disconnection and the three temporal preparation effects separately. The main finding was that each temporal preparation effect was predicted by a different network of structures, depending on cue expectancy. Specifically, the Temporal orienting effect was related to both prefrontal and temporal brain areas. The Foreperiod effect was related to right and left prefrontal structures. Sequential effects were predicted by both parietal cortex and left subcortical structures. These findings show a clear dissociation of brain circuits involved in the different ways to prepare in time, showing for the first time the involvement of temporal areas in the Temporal orienting effect, as well as the parietal cortex in the Sequential effects.

  14. Quantitative texture analysis of talc in mantle hydrated mylonites

    NASA Astrophysics Data System (ADS)

    Benitez-Perez, J. M.; Gomez Barreiro, J.; Wenk, H. R.; Vogel, S. C.; Soda, Y.; Voltolini, M.; Martinez-Catalan, J. R.

    2014-12-01

A quantitative texture analysis of talc-serpentinite mylonites developed in highly deformed ultramafic rocks from different orogenic contexts has been performed with neutron diffraction at HIPPO (Los Alamos National Laboratory). The mineral assemblage, metamorphic evolution and deformative fabric of these samples could be correlated with those verified along the shallow levels (<100 km; <5 GPa) of a subduction zone. The hydration of mantle (ultramafic) rocks at those levels is likely to occur dynamically, with important implications for seismogenesis. Given the high anisotropy of the major phases in the samples (i.e., talc and antigorite), they are expected to influence the seismic anisotropy of the whole system in the presence of texture. However, to date there have been no data on the crystallographic preferred orientation of talc, and examples of antigorite textures are very limited. We explore the contribution of talc texture to the seismic anisotropy of mantle hydrated mylonites. Acknowledgements: This work has been funded by research project CGL2011-22728 of the Spanish Ministry of Economy and Competitiveness. JGB and JMBP are grateful to the Ramón y Cajal and FPI funding programs. Access to HIPPO (LANSCE) to conduct diffraction experiments is kindly acknowledged.

  15. Quantitative analysis of dynamic association in live biological fluorescent samples.

    PubMed

    Ruusuvuori, Pekka; Paavolainen, Lassi; Rutanen, Kalle; Mäki, Anita; Huttunen, Heikki; Marjomäki, Varpu

    2014-01-01

Determining vesicle localization and association in live microscopy may be challenging due to non-simultaneous imaging of rapidly moving objects with two excitation channels. Besides errors due to movement of objects, imaging may also introduce shifting between the image channels, and traditional colocalization methods cannot handle such situations. Our approach to quantifying the association between tagged proteins is to use an object-based method where an exact match of object locations is not assumed. Point-pattern matching provides a measure of correspondence between two point sets under various changes between the sets. Thus, it can be used for robust quantitative analysis of vesicle association between image channels. Results for a large set of synthetic images show that the novel association method based on point-pattern matching robustly detects association of closely located vesicles in live-cell microscopy where traditional colocalization methods fail to produce results. In addition, the method outperforms the Iterated Closest Point registration method used for comparison. Results for fixed and live experimental data show that the association method performs comparably to traditional methods in colocalization studies for fixed cells and performs favorably in association studies for live cells.
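The idea of object-based association without assuming an exact positional match can be illustrated by greedy nearest-neighbour pairing within a distance tolerance. This is a simplified stand-in for the paper's point-pattern matching, with a hypothetical function name:

```python
import numpy as np

def association_fraction(points_a, points_b, tolerance):
    """Fraction of points in A that can be paired with a distinct point in B
    within `tolerance`, pairing greedily by increasing distance. Unlike
    pixel-wise colocalization, a uniform shift between the channels does
    not break the pairing as long as it stays within the tolerance."""
    dists = np.linalg.norm(points_a[:, None, :] - points_b[None, :, :], axis=2)
    pairs = sorted(
        ((dists[i, j], i, j)
         for i in range(len(points_a)) for j in range(len(points_b))),
        key=lambda t: t[0],
    )
    used_a, used_b, matched = set(), set(), 0
    for d, i, j in pairs:
        if d > tolerance:
            break  # remaining pairs are even farther apart
        if i not in used_a and j not in used_b:
            used_a.add(i); used_b.add(j); matched += 1
    return matched / len(points_a)
```

With a channel shift smaller than the tolerance, every vesicle still finds its counterpart, whereas an intensity-overlap measure would report near-zero colocalization for the same data.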

  16. Quantitative analysis of brain magnetic resonance imaging for hepatic encephalopathy

    NASA Astrophysics Data System (ADS)

    Syh, Hon-Wei; Chu, Wei-Kom; Ong, Chin-Sing

    1992-06-01

High intensity lesions around the ventricles have recently been observed in T1-weighted brain magnetic resonance images of patients suffering from hepatic encephalopathy. The exact etiology that causes these magnetic resonance imaging (MRI) gray scale changes has not been fully understood. The objective of our study was to investigate, through quantitative means, (1) the amount of change to brain white matter due to the disease process, and (2) the extent and distribution of these high intensity lesions, since it is believed that the abnormality may not be entirely limited to the white matter. Eleven patients with proven hepatic encephalopathy and three normal persons without any evidence of liver abnormality constituted our current database. Trans-axial, sagittal, and coronal brain MRI were obtained on a 1.5 Tesla scanner. All processing was carried out on a microcomputer-based image analysis system in an off-line manner. Histograms were decomposed into regular brain tissues and lesions. Gray scale ranges coded as lesion were then mapped back to the original images to identify the distribution of abnormality. Our results indicated that the disease process involved the pallidus, mesencephalon, and subthalamic regions.
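Decomposing an intensity histogram into regular-tissue and high-intensity lesion classes can be illustrated with a simple between-class-variance (Otsu-style) threshold. This is a generic sketch, not the study's actual decomposition procedure:

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Split a grey-level distribution into two classes (e.g. regular
    tissue vs high-intensity lesion) by maximising the between-class
    variance over all candidate thresholds."""
    counts, edges = np.histogram(values, bins=bins)
    counts = counts.astype(float)
    total = counts.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = centers[0], -1.0
    w0 = np.cumsum(counts)                # cumulative class-0 pixel counts
    m = np.cumsum(counts * centers)       # cumulative intensity mass
    for k in range(bins - 1):
        n0, n1 = w0[k], total - w0[k]
        if n0 == 0 or n1 == 0:
            continue
        mu0, mu1 = m[k] / n0, (m[-1] - m[k]) / n1
        var_between = n0 * n1 * (mu0 - mu1)**2
        if var_between > best_var:
            best_var, best_t = var_between, centers[k]
    return best_t
```

Grey values above the returned threshold would be coded as lesion and mapped back onto the images, mirroring the histogram-decomposition step described in the abstract.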

  17. A quantitative analysis of cardiac myocyte relaxation: a simulation study.

    PubMed

    Niederer, S A; Hunter, P J; Smith, N P

    2006-03-01

The determinants of relaxation in cardiac muscle are poorly understood, yet compromised relaxation accompanies various pathologies and impaired pump function. In this study, we develop a model of active contraction to elucidate the relative importance of the [Ca2+]i transient magnitude, the unbinding of Ca2+ from troponin C (TnC), and the length-dependence of tension and Ca2+ sensitivity on relaxation. Using the framework proposed by one of our researchers, we extensively reviewed the experimental literature to quantitatively characterize the binding of Ca2+ to TnC, the kinetics of tropomyosin, the availability of binding sites, and the kinetics of crossbridge binding after perturbations in sarcomere length. Model parameters were determined from multiple experimental results and modalities (skinned and intact preparations), and model results were validated against data from length step, caged Ca2+, isometric twitch, and half-time to relaxation with increasing sarcomere length experiments. A factorial analysis found that the [Ca2+]i transient and the unbinding of Ca2+ from TnC were the primary determinants of relaxation, with a fivefold greater effect than that of length-dependent maximum tension and twice the effect of tension-dependent binding of Ca2+ to TnC and length-dependent Ca2+ sensitivity. The effects of the [Ca2+]i transient and the unbinding rate of Ca2+ from TnC were tightly coupled, with the effect of increasing either factor depending on the reference [Ca2+]i transient and unbinding rate.

  18. Quantitative and graphic acoustic analysis of phonatory modulations: the modulogram.

    PubMed

    Buder, Eugene H; Strand, Edythe A

    2003-04-01

    A method is presented for analyzing phonatory instabilities that occur as modulations of fundamental frequency (f0) and sound pressure level (SPL) on the order of 0.2 to 20 cycles per second. Such long-term phonatory instabilities, including but not limited to traditional notions of tremor, are distinct from cycle-to-cycle perturbation such as jitter or shimmer. For each of the 2 parameters (f0, in Hz, and SPL, in dB), 3 frequency domains are proposed: (a) flutter (10-20 Hz), (b) tremor (2-10 Hz), and (c) wow (0.2-2.0 Hz), yielding 6 types of instability. Analyses were implemented using fast Fourier transforms (FFTs) with domain-specific analysis parameters. Outputs include a graphic display in the form of a set of low-frequency spectrograms (the "modulogram") and quantitative measures of the frequencies, magnitudes, durations, and sinusoidal form of the instabilities. An index of a given instability is developed by combining its duration and average modulation magnitude into a single quantity. Performance of the algorithms was assessed by analyzing test signals with known degrees of modulation, and a range of applications was reviewed to provide a rationale for use of modulograms in phonatory assessment.
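The core of the modulogram computation, an FFT of a low-frequency parameter contour inspected within the flutter/tremor/wow bands, can be sketched on a synthetic f0 contour with a known 5 Hz tremor. The sampling rate and modulation depth are illustrative, not the paper's analysis parameters:

```python
import numpy as np

# Synthetic f0 contour: 120 Hz carrier with a 5 Hz, 3 Hz-deep tremor,
# sampled at 100 contour values per second for 4 s.
fs = 100.0
t = np.arange(0, 4, 1 / fs)
f0 = 120.0 + 3.0 * np.sin(2 * np.pi * 5.0 * t)

# Modulation spectrum: FFT magnitude of the mean-removed contour.
spectrum = np.abs(np.fft.rfft(f0 - f0.mean()))
freqs = np.fft.rfftfreq(len(f0), 1 / fs)

# Inspect the tremor band (2-10 Hz) for the dominant modulation.
band = (freqs >= 2.0) & (freqs <= 10.0)
tremor_freq = freqs[band][np.argmax(spectrum[band])]
```

Repeating this over sliding windows and stacking the spectra as columns yields the low-frequency spectrogram display; the flutter (10-20 Hz) and wow (0.2-2 Hz) bands are read off the same spectrum with different masks.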

  19. Quantitative SERS sensors for environmental analysis of naphthalene.

    PubMed

    Péron, O; Rinnert, E; Toury, T; Lamy de la Chapelle, M; Compère, C

    2011-03-07

In the investigation of chemical pollutants such as PAHs (Polycyclic Aromatic Hydrocarbons) at low concentration in aqueous medium, Surface-Enhanced Raman Scattering (SERS) offers an alternative that overcomes the inherently low cross-section of normal Raman scattering. Indeed, SERS is a very sensitive spectroscopic technique owing to the excitation of the surface plasmon modes of the nanostructured metallic film. The surface of quartz substrates was coated with a hydrophobic film obtained by silanization and subsequently reacted with polystyrene (PS) beads coated with gold nanoparticles. The hydrophobic surface of the SERS substrates pre-concentrates non-polar molecules such as naphthalene. Under laser excitation, the SERS-active substrates allow the detection and identification of target molecules localized close to the gold nanoparticles. The morphology of the SERS substrates based on polystyrene beads surrounded by gold nanoparticles was characterized by scanning electron microscopy (SEM). Furthermore, the Raman fingerprint of the polystyrene serves as an internal spectral reference. On this basis, an innovative method to detect and quantify organic molecules such as naphthalene, in the range of 1 to 20 ppm in aqueous media, was carried out. Such SERS-active substrates lend themselves to application as quantitative SERS sensors for the environmental analysis of naphthalene.

  20. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    SciTech Connect

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand side resource programs is challenging. Due to the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g. shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  1. Quantitative analysis of plasma interleukin-6 by immunoassay on microchip

    NASA Astrophysics Data System (ADS)

    Abe, K.; Hashimoto, Y.; Yatsushiro, S.; Yamamura, S.; Tanaka, M.; Ooie, T.; Baba, Y.; Kataoka, M.

    2012-03-01

    Sandwich enzyme-linked immunosorbent assay (ELISA) is one of the most frequently employed assays for clinical diagnosis, since it enables the investigator to identify specific protein biomarkers. However, the conventional assay using a 96-well microtitration plate is time- and sample-consuming, and is therefore not suitable for rapid diagnosis. To overcome these drawbacks, we performed a sandwich ELISA on a microchip. We employed piezoelectric inkjet printing for deposition and fixation of the first antibody on the microchannel surface (300 μm width and 100 μm depth). The model analyte was interleukin-6 (IL-6), an inflammatory cytokine. After blocking the microchannel, the antigen, a biotin-labeled second antibody, and avidin-labeled peroxidase were infused into the microchannel and incubated for 20 min, 10 min, and 5 min, respectively. The assay could detect 2 pg/ml of IL-6 and quantitatively measure concentrations over the range 0-32 pg/ml. Linear regression analysis of plasma IL-6 concentrations obtained by the microchip and conventional methods showed a significant relationship (R2 = 0.9964). The microchip assay reduced the time for the antigen-antibody reactions to 1/6, and the consumption of samples and reagents to 1/50, compared with the conventional method. It thus determines plasma IL-6 accurately, with high sensitivity, a short assay time, and low sample and reagent consumption, and should be applicable to clinical diagnosis.
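
    The method comparison above rests on an ordinary least-squares fit and its coefficient of determination. A minimal sketch of that computation, using invented paired IL-6 readings rather than the study's data:

```python
def linear_fit_r2(x, y):
    """Ordinary least-squares fit y = a*x + b; returns (a, b, R^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Invented paired IL-6 readings (pg/ml): conventional plate vs. microchip
plate = [2.1, 3.9, 8.3, 15.6, 31.8]
chip = [2.0, 4.0, 8.0, 16.0, 32.0]
slope, intercept, r2 = linear_fit_r2(plate, chip)
```

    With well-correlated methods the slope approaches 1 and R^2 approaches the reported 0.9964.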

  2. Separation and quantitative analysis of alkyl sulfate ethoxymers by HPLC.

    PubMed

    Morvan, Julien; Hubert-Roux, Marie; Agasse, Valérie; Cardinael, Pascal; Barbot, Florence; Decock, Gautier; Bouillon, Jean-Philippe

    2008-01-01

    Separation of alkyl sulfate ethoxymers was investigated on various high-performance liquid chromatography (HPLC) stationary phases: Acclaim C18 Surfactant, Surfactant C8, and Hypercarb. For a fixed alkyl chain length, ethoxymers are eluted in order of increasing number of ethoxylate units on Acclaim C18 Surfactant, whereas the reverse elution order is observed on Surfactant C8 and Hypercarb. Moreover, on the Acclaim C18 Surfactant column, non-ethoxylated compounds elute within their ethoxymer distributions, and the use of a sodium acetate additive in the mobile phase leads to co-elution of the ethoxymers. HPLC stationary phases dedicated to surfactant analysis were evaluated by means of the Tanaka test: Surfactant C8 exhibits high silanol activity, whereas Acclaim C18 Surfactant shows high steric selectivity. For alkyl sulfates, the linearity of the calibration curve and the limits of detection and quantitation were evaluated. The amount of sodium laureth sulfate raw material found in a commercial body product was in agreement with the manufacturer's specification.
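
    Limits of detection and quantitation from a linear calibration are commonly estimated with the ICH convention LOD = 3.3·s/S and LOQ = 10·s/S, where S is the calibration slope and s the standard deviation of the regression residuals. A sketch of that convention (not necessarily the authors' exact procedure), with an invented calibration series:

```python
import math

def lod_loq(conc, signal):
    """Detection and quantitation limits from a linear calibration,
    using the common ICH convention LOD = 3.3*s/S and LOQ = 10*s/S,
    where S is the slope and s the residual standard deviation."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc, signal))
             / sum((x - mx) ** 2 for x in conc))
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(conc, signal)]
    s = math.sqrt(sum(r * r for r in resid) / (n - 2))
    return 3.3 * s / slope, 10.0 * s / slope

# Invented calibration: peak area vs. concentration (arbitrary units)
lod, loq = lod_loq([1, 2, 5, 10, 20], [10.2, 19.8, 50.5, 99.7, 200.1])
```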

  3. Quantitative analysis of regulatory flexibility under changing environmental conditions

    PubMed Central

    Edwards, Kieron D; Akman, Ozgur E; Knox, Kirsten; Lumsden, Peter J; Thomson, Adrian W; Brown, Paul E; Pokhilko, Alexandra; Kozma-Bognar, Laszlo; Nagy, Ferenc; Rand, David A; Millar, Andrew J

    2010-01-01

    The circadian clock controls 24-h rhythms in many biological processes, allowing appropriate timing of biological rhythms relative to dawn and dusk. Known clock circuits include multiple, interlocked feedback loops. Theory suggested that multiple loops provide the flexibility for molecular rhythms to track multiple phases of the external cycle. Clear dawn- and dusk-tracking rhythms illustrate this flexibility of timing in Ipomoea nil. Molecular clock components in Arabidopsis thaliana showed complex, photoperiod-dependent regulation, which was analysed by comparison with three contrasting models. A simple, quantitative measure, Dusk Sensitivity, was introduced to compare the behaviour of clock models with varying loop complexity. Evening-expressed clock genes showed photoperiod-dependent dusk sensitivity, as predicted by the three-loop model, whereas the one- and two-loop models tracked dawn and dusk, respectively. Output genes for starch degradation achieved dusk-tracking expression through light regulation, rather than through a dusk-tracking rhythm. Model analysis predicted which biochemical processes could be manipulated to extend dusk tracking. Our results reveal how an operating principle of biological regulators applies specifically to the plant circadian clock. PMID:21045818
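
    The abstract does not give the exact definition of the Dusk Sensitivity measure, but the underlying idea, how much an expression peak shifts per unit shift in dusk time, can be sketched as a finite difference across photoperiods. All numbers below are invented for illustration:

```python
def dusk_sensitivity(dusk_times, peak_times):
    """Finite-difference sensitivity of an expression peak to dusk time:
    values near 1 indicate dusk tracking, values near 0 dawn tracking."""
    return [(peak_times[i + 1] - peak_times[i]) /
            (dusk_times[i + 1] - dusk_times[i])
            for i in range(len(dusk_times) - 1)]

# Invented peak phases (h after dawn) for a dusk-tracking gene under
# 8, 12 and 16 h photoperiods (dusk at 8, 12 and 16 h after dawn)
s = dusk_sensitivity([8.0, 12.0, 16.0], [9.0, 12.8, 16.9])
```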

  4. Quantitative analysis of 3-OH oxylipins in fermentation yeast.

    PubMed

    Potter, Greg; Xia, Wei; Budge, Suzanne M; Speers, R Alex

    2017-02-01

    Despite the ubiquitous distribution of oxylipins in plants, animals, and microbes, and the application of numerous analytical techniques to study these molecules, 3-OH oxylipins have never been quantitatively assayed in yeasts. The formation of heptafluorobutyrate methyl ester derivatives and subsequent analysis with gas chromatography - negative chemical ionization - mass spectrometry allowed for the first determination of yeast 3-OH oxylipins. The concentration of 3-OH 10:0 (0.68-4.82 ng/mg dry cell mass) in the SMA strain of Saccharomyces pastorianus grown in laboratory-scale beverage fermentations was elevated relative to oxylipin concentrations in plant tissues and macroalgae. In fermenting yeasts, the onset of 3-OH oxylipin formation has been related to fermentation progression and flocculation initiation. When the SMA strain was grown in laboratory-scale fermentations, the maximal sugar consumption rate preceded the lowest concentration of 3-OH 10:0 by ∼4.5 h and a distinct increase in 3-OH 10:0 concentration by ∼16.5 h.
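
    Quantitation in GC-MS work of this kind is typically done against a spiked internal standard. The paper's exact calibration is not specified here; the sketch below shows the generic internal-standard arithmetic, normalized to dry cell mass, with all values invented:

```python
def oxylipin_conc_ng_per_mg(area_analyte, area_istd, ng_istd_added,
                            rrf, dry_cell_mass_mg):
    """Internal-standard quantitation: analyte mass is the peak-area
    ratio scaled by the spiked internal-standard mass and the relative
    response factor (RRF), then normalized to dry cell mass."""
    ng_analyte = (area_analyte / area_istd) * ng_istd_added / rrf
    return ng_analyte / dry_cell_mass_mg

# Invented values: area ratio 0.5, 100 ng standard, RRF 1.0, 25 mg cells
c = oxylipin_conc_ng_per_mg(1000.0, 2000.0, 100.0, 1.0, 25.0)
```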

  5. Quantitative analysis of cellular metabolic dissipative, self-organized structures.

    PubMed

    de la Fuente, Ildefonso Martínez

    2010-09-27

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life.
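
    The dissipative, self-organized structures discussed above arise in models as sustained oscillations far from equilibrium. As a toy illustration (not one of the paper's metabolic networks), the Brusselator is a classic two-variable dissipative reaction scheme that settles onto a limit cycle:

```python
def brusselator(a=1.0, b=3.0, x0=1.0, y0=1.0, dt=0.01, steps=20000):
    """Forward-Euler integration of the Brusselator, which has a
    limit cycle when b > 1 + a**2:
        dx/dt = a - (b + 1)*x + x**2 * y
        dy/dt = b*x - x**2 * y
    Returns the x(t) trajectory."""
    x, y = x0, y0
    xs = []
    for _ in range(steps):
        dx = a - (b + 1.0) * x + x * x * y
        dy = b * x - x * x * y
        x += dt * dx
        y += dt * dy
        xs.append(x)
    return xs

xs = brusselator()
late = xs[len(xs) // 2:]           # discard the initial transient
amplitude = max(late) - min(late)  # > 0 for a sustained oscillation
```

    The sustained, bounded oscillation after the transient is the hallmark of a dissipative structure: order maintained by continuous throughput rather than by equilibrium.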

  6. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    SciTech Connect

    Haase, A.T.; Zupancic, M.; Cavert, W.

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.
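
    The burden estimate described above combines the two compartments additively. A back-of-envelope sketch of that bookkeeping, with every number hypothetical rather than taken from the study:

```python
def estimated_virions(fdc_pool, productive_cells, virions_per_cell):
    """Back-of-envelope tissue burden: the stable FDC-associated virion
    pool plus the output of productively infected cells."""
    return fdc_pool + productive_cells * virions_per_cell

# All numbers hypothetical, for illustration only
total = estimated_virions(fdc_pool=10**8, productive_cells=10**6,
                          virions_per_cell=100)
```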

  7. Assessment of hair surface roughness using quantitative image analysis.

    PubMed

    Park, K H; Kim, H J; Oh, B; Lee, E; Ha, J

    2017-07-19

    Interest in hair and the hair cuticle is increasing. The cuticle is the first layer to be exposed to damage and the primary protective layer, so hair product manufacturers consider cuticle protection important. However, previous studies examined the cuticle only by visual assessment. This study aimed to characterize cuticle changes and measure hair surface roughness using a HIROX microscope. A total of 23 female subjects used the same products daily for 4 weeks. Three hair samples per subject were collected from three different areas of the head, and measurements were taken before and after the 4 weeks of daily product use. Changes to the hair surface were clearly observed in the captured images, and hair surface roughness was quantified using various parameters in the HIROX software. After 4 weeks of daily product use, the roughness parameter value of the hair surface was significantly decreased. Our results suggest that this hair roughness analytical method using HIROX can be a new paradigm for high-quality quantitative analysis of the hair cuticle. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
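
    The specific roughness parameters computed by the HIROX software are not named in the abstract; the standard amplitude parameters Ra (mean absolute deviation) and Rq (root-mean-square deviation) about the mean line illustrate the kind of quantity involved:

```python
import math

def roughness(profile):
    """Standard amplitude roughness parameters from a height profile:
    Ra (mean absolute deviation) and Rq (RMS deviation) about the
    mean line of the profile."""
    n = len(profile)
    mean = sum(profile) / n
    dev = [h - mean for h in profile]
    ra = sum(abs(d) for d in dev) / n
    rq = math.sqrt(sum(d * d for d in dev) / n)
    return ra, rq

# A toy square-wave profile: Ra and Rq both equal 1
ra, rq = roughness([1.0, -1.0, 1.0, -1.0])
```

    A smoother cuticle after treatment would show both parameters decreasing, matching the reported result.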

  8. Quantitative Analysis of Cellular Metabolic Dissipative, Self-Organized Structures

    PubMed Central

    de la Fuente, Ildefonso Martínez

    2010-01-01

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life. PMID:20957111

  9. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) analysis using a synchrotron radiation (SR) microbeam was applied to investigate the distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify trace and matrix elements at the single-cell level. This code has been used in studies of several important neurodegenerative diseases, such as Alzheimer's disease (AD), Parkinson's disease (PD), and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine elemental changes in cells due to the incorporation of foreign metal elements. Substantia nigra (SN) tissue obtained from autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and from control cases was examined. Quantitative XRF analysis showed that neuromelanin granules of parkinsonian SN contained higher levels of Fe than those of the controls, with concentrations in the ranges of 2300-3100 ppm and 2000-2400 ppm, respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than in the controls; in particular, Zn was less than 40 ppm in SN tissue from the PDC case, compared with 560-810 ppm in the controls. These changes are considered to be closely related to neurodegeneration and cell death.
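
    One common way to turn XRF counts into concentrations (not necessarily the authors' code) is to divide the background-subtracted count rate by an element-specific sensitivity factor calibrated against reference standards:

```python
def xrf_concentration_ppm(net_counts, sensitivity_cps_per_ppm, live_time_s):
    """Sensitivity-factor XRF quantitation: concentration equals the
    background-subtracted count rate divided by an element-specific
    sensitivity measured on reference standards."""
    return (net_counts / live_time_s) / sensitivity_cps_per_ppm

# Invented values: 2500 net counts in 100 s at 0.01 cps per ppm
ppm = xrf_concentration_ppm(2500.0, 0.01, 100.0)
```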

  10. High throughput, quantitative analysis of human osteoclast differentiation and activity.

    PubMed

    Diepenhorst, Natalie A; Nowell, Cameron J; Rueda, Patricia; Henriksen, Kim; Pierce, Tracie; Cook, Anna E; Pastoureau, Philippe; Sabatini, Massimo; Charman, William N; Christopoulos, Arthur; Summers, Roger J; Sexton, Patrick M; Langmead, Christopher J

    2017-02-15

    Osteoclasts are multinuclear cells that degrade bone under both physiological and pathophysiological conditions. Osteoclasts are therefore a major target of osteoporosis therapeutics aimed at preserving bone. Consequently, analytical methods for osteoclast activity are useful for the development of novel biomarkers and/or pharmacological agents for the treatment of osteoporosis. The nucleation state of an osteoclast is indicative of its maturation and activity. To date, activity is routinely measured at the population level with only approximate consideration of the nucleation state (an 'osteoclast population' is typically defined as cells with ≥3 nuclei). Using a fluorescent substrate for tartrate-resistant acid phosphatase (TRAP), a routinely used marker of osteoclast activity, we developed a multi-labelled imaging method for quantitative measurement of osteoclast TRAP activity at the single cell level. Automated image analysis enables interrogation of large osteoclast populations in a high throughput manner using open source software. Using this methodology, we investigated the effects of receptor activator of nuclear factor kappa-B ligand (RANK-L) on osteoclast maturation and activity and demonstrated that TRAP activity directly correlates with osteoclast maturity (i.e. nuclei number). This method can be applied to high throughput screening of osteoclast-targeting compounds to determine changes in maturation and activity.
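
    The per-cell analysis above hinges on the ≥3-nuclei convention and on relating TRAP activity to nucleation class. A minimal sketch of that bookkeeping, with hypothetical (nuclei, TRAP) tuples standing in for the segmented image data:

```python
def osteoclast_summary(cells):
    """Per-cell analysis: cells is a list of (nuclei_count, trap_activity)
    tuples. Cells with >= 3 nuclei count as osteoclasts, and mean TRAP
    activity is reported per nucleation class."""
    osteoclasts = [c for c in cells if c[0] >= 3]
    by_nuclei = {}
    for n, trap in osteoclasts:
        by_nuclei.setdefault(n, []).append(trap)
    mean_trap = {n: sum(v) / len(v) for n, v in by_nuclei.items()}
    return len(osteoclasts), mean_trap

# Hypothetical segmented cells: one mononuclear, three multinucleated
count, mean_trap = osteoclast_summary(
    [(1, 5.0), (3, 10.0), (3, 12.0), (5, 30.0)])
```

    In the hypothetical data, mean TRAP rises with nuclei number, mirroring the reported correlation between activity and maturity.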

  11. Quantitative produced water analysis using mobile 1H NMR

    NASA Astrophysics Data System (ADS)

    Wagner, Lisabeth; Kalli, Chris; Fridjonsson, Einar O.; May, Eric F.; Stanwix, Paul L.; Graham, Brendan F.; Carroll, Matthew R. J.; Johns, Michael L.

    2016-10-01

    Measurement of oil contamination of produced water is required in the oil and gas industry at the parts-per-million (ppm) level prior to discharge in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile 1H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid-phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided comparable 1H NMR signal intensities for the oil and the solvent (chloroform), with the internal reference 1H signal from the chloroform rendering the measurement effectively self-calibrating. The method was applied to water contaminated with hexane or crude oil over the range 1-30 ppm, and the results were validated against known solubility limits as well as infrared analysis and gas chromatography.
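
    The self-calibrating aspect comes from ratioing the oil signal against the chloroform reference signal. A simplified sketch of that ratio arithmetic, correcting for the number of protons contributing to each peak; the SPE preconcentration step and real peak integration are omitted, and all values are invented:

```python
def oil_ppm_from_nmr(area_oil, area_ref, ref_ppm_equiv,
                     protons_oil=2.0, protons_ref=1.0):
    """Internal-reference 1H NMR quantitation: concentration scales as
    peak area per contributing proton, ratioed against the chloroform
    reference of known ppm-equivalent concentration."""
    return (area_oil / area_ref) * (protons_ref / protons_oil) * ref_ppm_equiv

# Invented peak areas and a 10 ppm-equivalent chloroform reference
ppm = oil_ppm_from_nmr(10.0, 5.0, 10.0)
```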

  12. Application of 3D Spatio-Temporal Data Modeling, Management, and Analysis in DB4GEO

    NASA Astrophysics Data System (ADS)

    Kuper, P. V.; Breunig, M.; Al-Doori, M.; Thomsen, A.

    2016-10-01

    Many of today's worldwide challenges, such as climate change, water supply and transport systems in cities, or movements of crowds, require spatio-temporal data to be examined in detail. Thus the number of studies in 3D space dealing with geospatial objects moving in space and time, or even changing their shapes in time, will rapidly increase in the future. Prominent spatio-temporal applications are subsurface reservoir modeling, water supply after seawater desalination, and the development of transport systems in megacities. All of these applications generate large spatio-temporal data sets. However, the modeling, management, and analysis of 3D geo-objects with shapes and attributes that change over time remain a challenge for geospatial database architectures. In this article we describe the application of concepts for the modeling, management, and analysis of 2.5D and 3D spatial plus 1D temporal objects implemented in DB4GeO, our service-oriented geospatial database architecture. An example application with spatio-temporal data of a landfill near the city of Osnabrück, Germany, demonstrates the usage of the concepts. Finally, we give an outlook on our future research, which focuses on new applications with big-data analysis in three spatial plus one temporal dimension in the United Arab Emirates, especially the Dubai area.
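
    A 3D-plus-time geo-object of the kind described can be represented as a sequence of time-stamped vertex snapshots, with intermediate states recovered by interpolation. The sketch below is a hypothetical illustration of that idea, not DB4GeO's actual API:

```python
from dataclasses import dataclass
from typing import List, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class GeoObjectSnapshot:
    """One time-stamped state of a deforming 3D geo-object."""
    t: float
    vertices: List[Point3D]

def interpolate(a: GeoObjectSnapshot, b: GeoObjectSnapshot,
                t: float) -> List[Point3D]:
    """Linearly interpolate vertex positions between two snapshots
    (assumes both snapshots share the same vertex correspondence)."""
    w = (t - a.t) / (b.t - a.t)
    return [tuple(pa + w * (pb - pa) for pa, pb in zip(va, vb))
            for va, vb in zip(a.vertices, b.vertices)]

# A single vertex rising 2 m while drifting 10 m east over 10 years
before = GeoObjectSnapshot(t=0.0, vertices=[(0.0, 0.0, 0.0)])
after = GeoObjectSnapshot(t=10.0, vertices=[(10.0, 0.0, 2.0)])
midpoint = interpolate(before, after, 5.0)
```

    Real systems must additionally handle changing topology (vertices appearing or disappearing between snapshots), which is part of what makes such databases challenging.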

  13. Communication about vaccinations in Italian websites: a quantitative analysis.

    PubMed

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Parents of babies and people looking for information about vaccination often visit anti-vaccine movements' websites, or blogs by naturopathic physicians or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of the websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination" combined with the function "OR," in order to identify the most frequently visited websites. The two keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed, yielding 149 selected websites. Of these, 53% belonged to associations, groups, or scientific societies, 32.2% (n = 48) were personal blogs, and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. Overall, 37.6% reported the webmaster's name, 67.8% a webmaster e-mail address, 28.6% the date of the last update, and 46.6% the author's name. The quality score for government sites was on average higher than for anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that must promote vaccination cannot avoid investing in web communication, which cannot be managed by private efforts alone but must be the result of synergy among Public Health bodies, private and scientific associations, and social movements.
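
    The paper's exact scoring rubric is not given in the abstract, but the transparency attributes it reports (webmaster name, contact e-mail, last-update date, author name) suggest a simple checklist score. A hypothetical sketch of such a score:

```python
def quality_score(site):
    """Hypothetical transparency score (0-4): one point for each of
    webmaster name, contact e-mail, last-update date, and author name.
    `site` is a dict; missing or empty fields score zero."""
    criteria = ("webmaster_name", "webmaster_email", "last_update", "author")
    return sum(1 for c in criteria if site.get(c))

# A site disclosing two of the four attributes scores 2
score = quality_score({"webmaster_name": "J. Doe", "author": "A. B."})
```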

  14. Quantitative Analysis of Intracellular Fluorescent Foci in Live Bacteria

    PubMed Central

    Moolman, M. Charl; Kerssemakers, Jacob W.J.; Dekker, Nynke H.

    2015-01-01

    Fluorescence microscopy has revolutionized in vivo cellular biology. Through the specific labeling of a protein of interest with a fluorescent protein, one is able to study movement and colocalization, and even count individual proteins in a live cell. Different algorithms exist to quantify the total intensity and position of a fluorescent focus. Although these algorithms have been rigorously studied for in vitro conditions, which differ greatly from the inhomogeneous and variable cellular environment, their exact limits and applicability in the context of a live cell have not been thoroughly and systematically evaluated. In this study, we quantitatively characterize the influence of different background subtraction algorithms on several focus analysis algorithms. We use, to our knowledge, a novel approach to assess the sensitivity of the focus analysis algorithms to background removal, in which simulated and experimental data are combined to maintain full control over the sensitivity of a focus within a realistic background of cellular fluorescence. We demonstrate that the choice of algorithm and the corresponding error depend on both the brightness of the focus and the cellular context. Expectedly, focus intensity estimation and localization accuracy suffer in all algorithms at low focus-to-background ratios, with the bacteroidal background subtraction in combination with the median excess algorithm, and the region-of-interest background subtraction in combination with a two-dimensional Gaussian fit algorithm, performing best. We furthermore show that the choice of background subtraction algorithm depends on the expression level of the protein under investigation, and that the localization error depends on the distance of a focus from the bacterial edge and pole. Our results establish a set of guidelines for which signals can be analyzed to give a targeted spatial and intensity accuracy within a bacterial cell. PMID:26331246
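
    The core idea of region-of-interest background subtraction can be shown in a deliberately simplified 1D form (the paper works with 2D images): estimate the local background from pixels ringing the ROI, then sum the background-corrected intensity inside it. The pixel values below are synthetic:

```python
def roi_background_subtract(pixels, roi, pad=2):
    """ROI background subtraction on a 1D intensity trace: the focus
    intensity is the ROI sum minus a local background level taken as
    the median of `pad` pixels on each side of the ROI."""
    lo, hi = roi
    ring = pixels[max(0, lo - pad):lo] + pixels[hi:hi + pad]
    bg = sorted(ring)[len(ring) // 2]  # median of the surrounding ring
    return sum(p - bg for p in pixels[lo:hi])

# Synthetic trace: flat background of 10 with a focus of net intensity 30
px = [10, 10, 10, 10, 15, 30, 15, 10, 10, 10, 10]
net = roi_background_subtract(px, (4, 7))
```

    The choice of background estimator (local median here, versus a whole-cell or fitted background) is exactly the kind of design decision whose error the study quantifies.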

  15. Quantitative analysis of technological innovation in minimally invasive surgery.

    PubMed

    Hughes-Hallett, A; Mayer, E K; Pratt, P J; Vale, J A; Darzi, A W

    2015-01-01

    In the past 30 years surgical practice has changed considerably owing to the advent of minimally invasive surgery (MIS). This paper investigates the changing surgical landscape chronologically and quantitatively, examining the technologies that have played, and are forecast to play, the largest part in this shift in surgical practice. Electronic patent and publication databases were searched over the interval 1980-2011 for ('minimally invasive' OR laparoscopic OR laparoscopy OR 'minimal access' OR 'key hole') AND (surgery OR surgical OR surgeon). The resulting patent codes were allocated into technology clusters. Technology clusters referred to repeatedly in the contemporary surgical literature were also included in the analysis. Growth curves of patents and publications for the resulting technology clusters were then plotted. The initial search revealed 27,920 patents and 95,420 publications meeting the search criteria. The clusters meeting the criteria for in-depth analysis were: instruments, image guidance, surgical robotics, sutures, single-incision laparoscopic surgery (SILS) and natural-orifice transluminal endoscopic surgery (NOTES). Three patterns of growth were observed among these technology clusters: an S-shape (instruments and sutures), a gradual exponential rise (surgical robotics and image guidance), and a rapid contemporaneous exponential rise (NOTES and SILS). Technological innovation in MIS has been largely stagnant since its initial inception nearly 30 years ago, with few novel technologies emerging. The present study adds objective data to the previous claims that SILS, a surgical technique currently adopted by very few, represents an important part of the future of MIS. © 2015 BJS Society Ltd. Published by John Wiley & Sons Ltd.
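
    The three growth patterns described, S-shaped, gradual exponential, and rapid contemporaneous exponential, can be distinguished from cumulative patent counts by looking at when yearly increments peak. A crude heuristic sketch (not the authors' method), using a logistic curve as the canonical S-shape:

```python
import math

def logistic(t, k, r, t0):
    """Logistic (S-shaped) cumulative count: k / (1 + exp(-r*(t - t0)))."""
    return k / (1.0 + math.exp(-r * (t - t0)))

def classify_growth(counts):
    """Crude shape heuristic on yearly increments: 'S-shape' if growth
    has already peaked and declined, otherwise 'exponential'."""
    inc = [b - a for a, b in zip(counts, counts[1:])]
    peak = inc.index(max(inc))
    if peak < len(inc) - 1 and inc[-1] < max(inc):
        return "S-shape"
    return "exponential"

s_counts = [logistic(t, 100.0, 1.0, 5.0) for t in range(11)]
exp_counts = [1, 2, 4, 8, 16]
```

    Mature clusters such as instruments and sutures would fall into the first class, while SILS and NOTES, still on their rising segment, fall into the second.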

  16. Bi-Temporal Analysis of Landscape Changes in the Easternmost Mediterranean Deltas Using Binary and Classified Change Information

    NASA Astrophysics Data System (ADS)

    Alphan, Hakan

    2013-03-01

    The aim of this study is (1) to quantify landscape changes in the easternmost Mediterranean deltas using bi-temporal binary change detection approach and (2) to analyze relationships between conservation/management designations and various categories of change that indicate type, degree and severity of human impact. For this purpose, image differencing and ratioing were applied to Landsat TM images of 1984 and 2006. A total of 136 candidate change images including normalized difference vegetation index (NDVI) and principal component analysis (PCA) difference images were tested to understand performance of bi-temporal pre-classification analysis procedures in the Mediterranean delta ecosystems. Resul