12 CFR 221.7 - Supplement: Maximum loan value of margin stock and other collateral.
Code of Federal Regulations, 2010 CFR
2010-01-01
... value of margin stock and other collateral. (a) Maximum loan value of margin stock. The maximum loan... nonmargin stock and all other collateral. The maximum loan value of nonmargin stock and all other collateral...
Liu, Jiabin; Behrens, Timothy W.; Kearney, John F.
2014-01-01
Marginal Zone (MZ) B cells play an important role in the clearance of blood-borne bacterial infections via rapid T-independent IgM responses. We have previously demonstrated that MZ B cells respond rapidly and robustly to bacterial particulates. To determine the MZ-specific genes that are expressed to allow for this response, MZ and Follicular (FO) B cells were sort-purified and analyzed via DNA microarray analysis. We identified 181 genes that were significantly different between the two B cell populations: 99 genes were more highly expressed in MZ B cells, while 82 genes were more highly expressed in FO B cells. To further understand the molecular mechanisms by which MZ B cells respond so rapidly to bacterial challenge, idiotype-positive and idiotype-negative MZ B cells were sort-purified before (0 hour) or after (1 hour) i.v. immunization with heat-killed Streptococcus pneumoniae, R36A, and analyzed via DNA microarray analysis. We identified genes specifically upregulated or downregulated at 1 hour following immunization in the idiotype-positive MZ B cells. These results give insight into the gene expression patterns in resting MZ vs. FO B cells and the specific regulation of gene expression in antigen-specific MZ B cells following interaction with antigen. PMID:18453586
Multiclass classification of microarray data samples with a reduced number of genes
2011-01-01
Background Multiclass classification of microarray data samples with a reduced number of genes is a rich and challenging problem in Bioinformatics research. The problem gets harder as the number of classes is increased. In addition, the performance of most classifiers is tightly linked to the effectiveness of mandatory gene selection methods. Critical to gene selection is the availability of estimates about the maximum number of genes that can be handled by any classification algorithm. Lack of such estimates may lead to either computationally demanding explorations of a search space with thousands of dimensions or classification models based on gene sets of unrestricted size. In the former case, unbiased but possibly overfitted classification models may arise. In the latter case, biased classification models unable to support statistically significant findings may be obtained. Results A novel bound on the maximum number of genes that can be handled by binary classifiers in binary-mediated multiclass classification algorithms of microarray data samples is presented. The bound suggests that high-dimensional binary output domains might favor the existence of accurate and sparse binary-mediated multiclass classifiers for microarray data samples. Conclusions A comprehensive experimental work shows that the bound is indeed useful to induce accurate and sparse multiclass classifiers for microarray data samples. PMID:21342522
Marginal Maximum A Posteriori Item Parameter Estimation for the Generalized Graded Unfolding Model
ERIC Educational Resources Information Center
Roberts, James S.; Thompson, Vanessa M.
2011-01-01
A marginal maximum a posteriori (MMAP) procedure was implemented to estimate item parameters in the generalized graded unfolding model (GGUM). Estimates from the MMAP method were compared with those derived from marginal maximum likelihood (MML) and Markov chain Monte Carlo (MCMC) procedures in a recovery simulation that varied sample size,…
ArrayNinja: An Open Source Platform for Unified Planning and Analysis of Microarray Experiments.
Dickson, B M; Cornett, E M; Ramjan, Z; Rothbart, S B
2016-01-01
Microarray-based proteomic platforms have emerged as valuable tools for studying various aspects of protein function, particularly in the field of chromatin biochemistry. Microarray technology itself is largely unrestricted in regard to printable material and platform design, and efficient multidimensional optimization of assay parameters requires fluidity in the design and analysis of custom print layouts. This motivates the need for streamlined software infrastructure that facilitates the combined planning and analysis of custom microarray experiments. To this end, we have developed ArrayNinja as a portable, open source, and interactive application that unifies the planning and visualization of microarray experiments and provides maximum flexibility to end users. Array experiments can be planned, stored to a private database, and merged with the imaged results for a level of data interaction and centralization that is not currently attainable with available microarray informatics tools. © 2016 Elsevier Inc. All rights reserved.
Thermodynamically optimal whole-genome tiling microarray design and validation.
Cho, Hyejin; Chou, Hui-Hsien
2016-06-13
The microarray is an efficient apparatus for interrogating the whole transcriptome of a species. Microarrays can be designed according to annotated gene sets, but the resulting microarrays cannot be used to identify novel transcripts, and this design method is not applicable to unannotated species. Alternatively, a whole-genome tiling microarray can be designed using only genomic sequences without gene annotations, and it can be used to detect novel RNA transcripts as well as known genes. The difficulty with tiling microarray design lies in the tradeoff between probe specificity and coverage of the genome. Sequence comparison methods based on BLAST or similar software are commonly employed in microarray design, but they cannot precisely determine the subtle thermodynamic competition between probe targets and partially matched probe nontargets during hybridizations. Using the whole-genome thermodynamic analysis software PICKY to design tiling microarrays, we can achieve the maximum whole-genome coverage allowable under the thermodynamic constraints of each target genome. The resulting tiling microarrays are thermodynamically optimal in the sense that all selected probes share the same melting temperature separation range between their targets and closest nontargets, and no additional probes can be added without violating the specificity of the microarray to the target genome. This new design method was used to create two whole-genome tiling microarrays for Escherichia coli MG1655 and Agrobacterium tumefaciens C58, and the experimental results validated the design.
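The abstract does not give PICKY's thermodynamic model (which is based on nearest-neighbor hybridization energetics), but the melting-temperature-separation idea can be illustrated with the much cruder Wallace rule. The function names and the 10 °C separation threshold below are assumptions for illustration only, not PICKY's actual criteria:

```python
def wallace_tm(probe):
    """Wallace-rule melting temperature: 2 degC per A/T base, 4 degC per G/C base."""
    probe = probe.upper()
    return 2 * (probe.count("A") + probe.count("T")) + 4 * (probe.count("G") + probe.count("C"))

def passes_separation(target_tm, nontarget_tms, min_sep=10):
    """Keep a probe only if its target Tm exceeds the Tm against every
    partially matched nontarget by at least min_sep degrees (a crude
    stand-in for a thermodynamic specificity constraint)."""
    return all(target_tm - tm >= min_sep for tm in nontarget_tms)
```

In a real design tool, the separation check would use nearest-neighbor free energies rather than the Wallace rule, but the selection logic (accept a probe only when its target binds measurably more stably than its closest nontarget) is the same.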
Research of facial feature extraction based on MMC
NASA Astrophysics Data System (ADS)
Xue, Donglin; Zhao, Jiufen; Tang, Qinhong; Shi, Shaokun
2017-07-01
Based on the maximum margin criterion (MMC), a new algorithm of statistically uncorrelated optimal discriminant vectors and a new algorithm of orthogonal optimal discriminant vectors for feature extraction were proposed. The purpose of the maximum margin criterion is to maximize the inter-class scatter while simultaneously minimizing the intra-class scatter after the projection. Compared with the original MMC method and the principal component analysis (PCA) method, the proposed methods are better in terms of reducing or eliminating the statistical correlation between features and improving the recognition rate. The experimental results on the Olivetti Research Laboratory (ORL) face database show that the new statistically uncorrelated maximum margin criterion (SUMMC) feature extraction method is better in terms of recognition rate and stability. Besides, the relations between the maximum margin criterion and the Fisher criterion for feature extraction were revealed.
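The basic criterion can be sketched numerically: MMC projects onto the top eigenvectors of the difference between the between-class and within-class scatter matrices. This is a minimal illustration of plain MMC, not the authors' SUMMC or orthogonal variants; all function names are ours:

```python
import numpy as np

def mmc_projection(X, y, n_components=1):
    """Maximum margin criterion: project onto the leading eigenvectors
    of (between-class scatter) - (within-class scatter)."""
    X = np.asarray(X, dtype=float)
    labels = np.asarray(y)
    mean = X.mean(axis=0)
    Sb = np.zeros((X.shape[1], X.shape[1]))
    Sw = np.zeros_like(Sb)
    for c in np.unique(labels):
        Xc = X[labels == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)   # between-class scatter
        Sw += (Xc - mc).T @ (Xc - mc)                    # within-class scatter
    # Sb - Sw is symmetric; eigh returns eigenvalues in ascending order.
    vals, vecs = np.linalg.eigh(Sb - Sw)
    W = vecs[:, np.argsort(vals)[::-1][:n_components]]
    return X @ W, W
```

Unlike the Fisher criterion, no matrix inverse of Sw is needed, which is why MMC remains well defined when Sw is singular (the usual situation in small-sample face or microarray data).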
Smith, Maria W.; Herfort, Lydie; Tyrol, Kaitlin; Suciu, Dominic; Campbell, Victoria; Crump, Byron C.; Peterson, Tawnya D.; Zuber, Peter; Baptista, Antonio M.; Simon, Holly M.
2010-01-01
Through their metabolic activities, microbial populations mediate the impact of high gradient regions on ecological function and productivity of the highly dynamic Columbia River coastal margin (CRCM). A 2226-probe oligonucleotide DNA microarray was developed to investigate expression patterns for microbial genes involved in nitrogen and carbon metabolism in the CRCM. Initial experiments with the environmental microarrays were directed toward validation of the platform and yielded high reproducibility in multiple tests. Bioinformatic and experimental validation also indicated that >85% of the microarray probes were specific for their corresponding target genes and for a few homologs within the same microbial family. The validated probe set was used to query gene expression responses by microbial assemblages to environmental variability. Sixty-four samples from the river, estuary, plume, and adjacent ocean were collected in different seasons and analyzed to correlate the measured variability in chemical, physical and biological water parameters to differences in global gene expression profiles. The method produced robust seasonal profiles corresponding to pre-freshet spring (April) and late summer (August). Overall relative gene expression was high in both seasons and was consistent with high microbial abundance measured by total RNA, heterotrophic bacterial production, and chlorophyll a. Both seasonal patterns involved large numbers of genes that were highly expressed relative to background, yet each produced very different gene expression profiles. April patterns revealed high differential gene expression in the coastal margin samples (estuary, plume and adjacent ocean) relative to freshwater, while little differential gene expression was observed along the river-to-ocean transition in August. Microbial gene expression profiles appeared to relate, in part, to seasonal differences in nutrient availability and potential resource competition. 
Furthermore, our results suggest that highly active particle-attached microbiota in the Columbia River water column may perform dissimilatory nitrate reduction (both denitrification and DNRA) within anoxic particle microniches. PMID:20967204
Probe classification of on-off type DNA microarray images with a nonlinear matching measure
NASA Astrophysics Data System (ADS)
Ryu, Munho; Kim, Jong Dae; Min, Byoung Goo; Kim, Jongwon; Kim, Y. Y.
2006-01-01
We propose a nonlinear matching measure, called counting measure, as a signal detection measure that is defined as the number of on pixels in the spot area. It is applied to classify probes for an on-off type DNA microarray, where each probe spot is classified as hybridized or not. The counting measure also incorporates the maximum response search method, where the expected signal is obtained by taking the maximum among the measured responses of the various positions and sizes of the spot template. The counting measure was compared to existing signal detection measures such as the normalized covariance and the median for 2390 patient samples tested on the human papillomavirus (HPV) DNA chip. The counting measure performed the best regardless of whether or not the maximum response search method was used. The experimental results showed that the counting measure combined with the positional search was the most preferable.
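The counting measure and the maximum response search can be reconstructed from the description above; the sketch below is our illustration, not the authors' implementation, and the threshold parameter is an assumption:

```python
import numpy as np

def counting_measure(image, top, left, size, threshold):
    """Counting measure: the number of 'on' pixels inside a square spot template."""
    patch = image[top:top + size, left:left + size]
    return int((patch > threshold).sum())

def max_response(image, positions, sizes, threshold):
    """Maximum response search: take the best counting measure over
    candidate template positions and sizes."""
    best = 0
    for (top, left) in positions:
        for size in sizes:
            best = max(best, counting_measure(image, top, left, size, threshold))
    return best
```

A spot would then be classified as hybridized when the maximum response exceeds a calibrated cutoff, which is where the measure's robustness to spot misalignment comes from.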
Zhu, Yuerong; Zhu, Yuelin; Xu, Wei
2008-01-01
Background Though microarray experiments are very popular in life science research, managing and analyzing microarray data are still challenging tasks for many biologists. Most microarray programs require users to have sophisticated knowledge of mathematics, statistics and computer skills for usage. With accumulating microarray data deposited in public databases, easy-to-use programs to re-analyze previously published microarray data are in high demand. Results EzArray is a web-based Affymetrix expression array data management and analysis system for researchers who need to organize microarray data efficiently and get data analyzed instantly. EzArray organizes microarray data into projects that can be analyzed online with predefined or custom procedures. EzArray performs data preprocessing and detection of differentially expressed genes with statistical methods. All analysis procedures are optimized and highly automated so that even novice users with limited prior knowledge of microarray data analysis can complete initial analysis quickly. Since all input files, analysis parameters, and executed scripts can be downloaded, EzArray provides maximum reproducibility for each analysis. In addition, EzArray integrates with Gene Expression Omnibus (GEO) and allows instantaneous re-analysis of published array data. Conclusion EzArray is a novel Affymetrix expression array data analysis and sharing system. EzArray provides easy-to-use tools for re-analyzing published microarray data and will help both novice and experienced users perform initial analysis of their microarray data from the location of data storage. We believe EzArray will be a useful system for facilities with microarray services and laboratories with multiple members involved in microarray data analysis. EzArray is freely available from . PMID:18218103
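The abstract does not specify EzArray's statistical methods beyond "detection of differentially expressed genes." As a generic illustration of the kind of two-group screen such systems automate, here is a per-gene Welch t statistic with ranking; every name is ours and this is not EzArray's code:

```python
import numpy as np

def welch_t(group_a, group_b):
    """Per-gene Welch t statistic for two expression matrices (genes x samples)."""
    a, b = np.asarray(group_a, float), np.asarray(group_b, float)
    va = a.var(axis=1, ddof=1) / a.shape[1]
    vb = b.var(axis=1, ddof=1) / b.shape[1]
    return (a.mean(axis=1) - b.mean(axis=1)) / np.sqrt(va + vb)

def top_genes(group_a, group_b, k=10):
    """Rank genes by absolute t statistic and return the indices of the top k."""
    t = np.abs(welch_t(group_a, group_b))
    return np.argsort(t)[::-1][:k]
```

A production pipeline would add variance moderation and multiple-testing correction, but the ranking step above is the common core.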
Gillet, Jean-Pierre; Molina, Thierry Jo; Jamart, Jacques; Gaulard, Philippe; Leroy, Karen; Briere, Josette; Theate, Ivan; Thieblemont, Catherine; Bosly, Andre; Herin, Michel; Hamels, Jacques; Remacle, Jose
2009-03-01
Lymphomas are classified according to the World Health Organisation (WHO) classification, which defines subtypes on the basis of clinical, morphological, immunophenotypic, molecular and cytogenetic criteria. Differential diagnosis of the subtypes is sometimes difficult, especially for small B-cell lymphoma (SBCL). Standardisation of molecular genetic assays using multiple gene expression analysis by microarrays could be a useful complement to the current diagnosis. The aim of the present study was to develop a low density DNA microarray for the analysis of 107 genes associated with B-cell non-Hodgkin lymphoma and to evaluate its performance in the diagnosis of SBCL. A predictive tool based on Fisher discriminant analysis using a training set of 40 patients including four different subtypes (follicular lymphoma n = 15, mantle cell lymphoma n = 7, B-cell chronic lymphocytic leukemia n = 6 and splenic marginal zone lymphoma n = 12) was designed. A short additional preliminary analysis to gauge the accuracy of this signature was then performed on an external set of nine patients. Using this model, eight of the nine samples were classified successfully. This pilot study demonstrates that such a microarray tool may be a promising diagnostic approach for small B-cell non-Hodgkin lymphoma.
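The predictive tool rests on Fisher discriminant analysis. A minimal two-class sketch of that idea (not the authors' four-subtype, 107-gene model; data shapes and names are ours) is:

```python
import numpy as np

def fisher_direction(X1, X2):
    """Two-class Fisher discriminant direction w = Sw^{-1} (m1 - m2)."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)  # pooled within-class scatter
    return np.linalg.solve(Sw, m1 - m2)

def classify(x, w, X1, X2):
    """Assign x to the class whose projected mean is nearer on the Fisher axis."""
    p = x @ w
    d1 = abs(p - X1.mean(axis=0) @ w)
    d2 = abs(p - X2.mean(axis=0) @ w)
    return 1 if d1 < d2 else 2
```

The multiclass version used in the study extends this to several discriminant axes, but the projection-then-nearest-mean logic is the same.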
SimArray: a user-friendly and user-configurable microarray design tool
Auburn, Richard P; Russell, Roslin R; Fischer, Bettina; Meadows, Lisa A; Sevillano Matilla, Santiago; Russell, Steven
2006-01-01
Background Microarrays were first developed to assess gene expression but are now also used to map protein-binding sites and to assess allelic variation between individuals. Regardless of the intended application, efficient production and appropriate array design are key determinants of experimental success. Inefficient production can make larger-scale studies prohibitively expensive, whereas poor array design makes normalisation and data analysis problematic. Results We have developed a user-friendly tool, SimArray, which generates a randomised spot layout, computes a maximum meta-grid area, and estimates the print time, in response to user-specified design decisions. Selected parameters include: the number of probes to be printed; the microtitre plate format; the printing pin configuration, and the achievable spot density. SimArray is compatible with all current robotic spotters that employ 96-, 384- or 1536-well microtitre plates, and can be configured to reflect most production environments. Print time and maximum meta-grid area estimates facilitate evaluation of each array design for its suitability. Randomisation of the spot layout facilitates correction of systematic biases by normalisation. Conclusion SimArray is intended to help both established researchers and those new to the microarray field to develop microarray designs with randomised spot layouts that are compatible with their specific production environment. SimArray is an open-source program and is available from . PMID:16509966
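Two of the computations SimArray performs, a randomised spot layout and a print-time estimate, can be sketched in a toy form. The parameter names and the one-spot-per-pin-per-cycle timing model below are illustrative assumptions, not SimArray's actual algorithm:

```python
import random

def randomized_layout(probes, rows, cols, seed=0):
    """Assign probes to grid positions in random order; remaining wells stay empty.
    Randomising the layout helps normalisation correct systematic spatial biases."""
    assert len(probes) <= rows * cols
    rng = random.Random(seed)
    positions = [(r, c) for r in range(rows) for c in range(cols)]
    rng.shuffle(positions)
    return dict(zip(positions, probes))

def print_time_estimate(n_probes, pins, seconds_per_cycle):
    """Rough print time: each cycle deposits one spot per pin."""
    cycles = -(-n_probes // pins)  # ceiling division
    return cycles * seconds_per_cycle
```

A real estimate would also model plate changes and wash steps, which is why tools like SimArray take the plate format and pin configuration as explicit inputs.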
Unc, Adrian; Zurek, Ludek; Peterson, Greg; Narayanan, Sanjeev; Springthorpe, Susan V; Sattar, Syed A
2012-01-01
Potential risks associated with impaired surface water quality have commonly been evaluated by indirect description of potential sources using various fecal microbial indicators and derived source-tracking methods. These approaches are valuable for assessing and monitoring the impacts of land-use changes and changes in management practices at the source of contamination. A more detailed evaluation of putative etiologically significant genetic determinants can add value to these assessments. We evaluated the utility of using a microarray that integrates virulence genes with antibiotic and heavy metal resistance genes to describe and discriminate among spatially and seasonally distinct water samples from an agricultural watershed creek in Eastern Ontario. Because microarray signals may be analyzed as binomial distributions, the significance of ambiguous signals can be easily evaluated by using available off-the-shelf software. The FAMD software was used to evaluate uncertainties in the signal data. Analysis of multilocus fingerprinting data sets containing missing data has shown that, for the tested system, any variability in microarray signals had a marginal effect on data interpretation. For the tested watershed, results suggest that in general the wet fall season increased the downstream detection of virulence and resistance genes. Thus, the tested microarray technique has the potential to rapidly describe the quality of surface waters and thus to provide a qualitative tool to augment quantitative microbial risk assessments. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
Aksu, Yaman; Miller, David J; Kesidis, George; Yang, Qing X
2010-05-01
Feature selection for classification in high-dimensional spaces can improve generalization, reduce classifier complexity, and identify important, discriminating feature "markers." For support vector machine (SVM) classification, a widely used technique is recursive feature elimination (RFE). We demonstrate that RFE is not consistent with margin maximization, central to the SVM learning approach. We thus propose explicit margin-based feature elimination (MFE) for SVMs and demonstrate both improved margin and improved generalization, compared with RFE. Moreover, for the case of a nonlinear kernel, we show that RFE assumes that the squared weight vector 2-norm is strictly decreasing as features are eliminated. We demonstrate this is not true for the Gaussian kernel and, consequently, RFE may give poor results in this case. MFE for nonlinear kernels gives better margin and generalization. We also present an extension which achieves further margin gains, by optimizing only two degrees of freedom (the hyperplane's intercept and its squared 2-norm) with the weight vector orientation fixed. We finally introduce an extension that allows margin slackness. We compare against several alternatives, including RFE and a linear programming method that embeds feature selection within the classifier design. On high-dimensional gene microarray data sets, University of California at Irvine (UCI) repository data sets, and Alzheimer's disease brain image data, MFE methods give promising results.
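The RFE/MFE contrast can be illustrated for a fixed, already-trained linear decision function: RFE would drop the feature with the smallest weight magnitude, whereas a margin-based step drops the feature whose removal leaves the largest margin. The sketch below is our illustration of the idea for a linear classifier, not the authors' MFE implementation:

```python
import numpy as np

def geometric_margin(w, b, X, y):
    """Smallest signed distance of any training point to the hyperplane w.x + b = 0."""
    return np.min(y * (X @ w + b)) / np.linalg.norm(w)

def mfe_step(w, b, X, y, active):
    """One margin-based elimination step: among the active features, drop the
    one whose removal leaves the largest margin (weights held fixed)."""
    best_j, best_margin = None, -np.inf
    for j in active:
        keep = [k for k in active if k != j]
        m = geometric_margin(w[keep], b, X[:, keep], y)
        if m > best_margin:
            best_j, best_margin = j, m
    return best_j, best_margin
```

Note that because the margin is renormalized by ||w|| after each removal, the feature chosen here can differ from the smallest-|w_j| feature RFE would pick, which is the inconsistency the paper exploits.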
ERIC Educational Resources Information Center
Casabianca, Jodi M.; Lewis, Charles
2015-01-01
Loglinear smoothing (LLS) estimates the latent trait distribution while making fewer assumptions about its form and maintaining parsimony, thus leading to more precise item response theory (IRT) item parameter estimates than standard marginal maximum likelihood (MML). This article provides the expectation-maximization algorithm for MML estimation…
ERIC Educational Resources Information Center
Kieftenbeld, Vincent; Natesan, Prathiba
2012-01-01
Markov chain Monte Carlo (MCMC) methods enable a fully Bayesian approach to parameter estimation of item response models. In this simulation study, the authors compared the recovery of graded response model parameters using marginal maximum likelihood (MML) and Gibbs sampling (MCMC) under various latent trait distributions, test lengths, and…
ERIC Educational Resources Information Center
Wollack, James A.; Bolt, Daniel M.; Cohen, Allan S.; Lee, Young-Sun
2002-01-01
Compared the quality of item parameter estimates for marginal maximum likelihood (MML) and Markov Chain Monte Carlo (MCMC) with the nominal response model using simulation. The quality of item parameter recovery was nearly identical for MML and MCMC, and both methods tended to produce good estimates. (SLD)
ERIC Educational Resources Information Center
Paek, Insu; Wilson, Mark
2011-01-01
This study elaborates the Rasch differential item functioning (DIF) model formulation under the marginal maximum likelihood estimation context. Also, the Rasch DIF model performance was examined and compared with the Mantel-Haenszel (MH) procedure in small sample and short test length conditions through simulations. The theoretically known…
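The marginal maximum likelihood machinery shared by the item response theory studies above rests on integrating the latent trait out of the item response function. A minimal sketch for the Rasch model, using Gauss-Hermite quadrature under an assumed standard-normal trait distribution (our illustration, not any of the cited estimation programs):

```python
import numpy as np

def rasch_marginal_prob(pattern, difficulties, n_quad=41):
    """Marginal probability of a 0/1 response pattern under the Rasch model,
    integrating a standard-normal latent trait by Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_quad)
    theta = np.sqrt(2.0) * nodes           # change of variable for N(0, 1)
    w = weights / np.sqrt(np.pi)
    # Item response probabilities P(correct | theta) for every node and item.
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - np.asarray(difficulties))))
    pattern = np.asarray(pattern)
    lik = np.prod(np.where(pattern == 1, p, 1 - p), axis=1)
    return float(np.sum(w * lik))
```

MML estimation then maximizes the sum of log marginal probabilities over examinees with respect to the item difficulties, typically via an EM algorithm built on exactly this quadrature.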
Attig, J.W.; Hanson, P.R.; Rawling, J.E.; Young, A.R.; Carson, E.C.
2011-01-01
Samples for optical dating were collected to estimate the time of sediment deposition in small ice-marginal lakes in the Baraboo Hills of Wisconsin. These lakes formed high in the Baraboo Hills when drainage was blocked by the Green Bay Lobe when it was at or very near its maximum extent. Therefore, these optical ages provide control for the timing of the thinning and recession of the Green Bay Lobe from its maximum position. Sediment that accumulated in four small ice-marginal lakes was sampled and dated. Difficulties with field sampling and estimating dose rates made the interpretation of optical ages derived from samples from two of the lake basins problematic. Samples from the other two lake basins, South Bluff and Feltz basins, responded well during laboratory analysis and showed reasonably good agreement between the multiple ages produced at each site. These ages averaged 18.2 ka (n = 6) and 18.6 ka (n = 6), respectively. The optical ages from these two lake basins where we could carefully select sediment samples provide firm evidence that the Green Bay Lobe stood at or very near its maximum extent until about 18.5 ka. The persistence of ice-marginal lakes in these basins high in the Baraboo Hills indicates that the ice of the Green Bay Lobe had not experienced significant thinning near its margin prior to about 18.5 ka. These ages are the first to directly constrain the timing of the maximum extent of the Green Bay Lobe and the onset of deglaciation in the area for which the Wisconsin Glaciation was named. © 2011 Elsevier B.V.
Greenland ice sheet retreat since the Little Ice Age
NASA Astrophysics Data System (ADS)
Beitch, Marci J.
Late 20th century and 21st century satellite imagery of the perimeter of the Greenland Ice Sheet (GrIS) provide high resolution observations of the ice sheet margins. Examining changes in ice margin positions over time yields measurements of GrIS area change and rates of margin retreat. However, longer records of ice sheet margin change are needed to establish more accurate predictions of the ice sheet's future response to global conditions. In this study, the trimzone, the area of deglaciated terrain along the ice sheet edge that lacks mature vegetation cover, is used as a marker of the maximum extent of the ice from its most recent major advance during the Little Ice Age. We compile recently acquired Landsat ETM+ scenes covering the perimeter of the GrIS on which we map area loss on land-, lake-, and marine-terminating margins. We measure an area loss of 13,327 ± 830 km², which corresponds to 0.8% shrinkage of the ice sheet. This equates to an averaged horizontal retreat of 363 ± 69 m across the entire GrIS margin. Mapping the areas exposed since the Little Ice Age maximum, circa 1900 C.E., yields a century-scale rate of change. On average the ice sheet lost an area of 120 ± 16 km²/yr, or retreated at a rate of 3.3 ± 0.7 m/yr since the LIA maximum.
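The reported figures are mutually consistent, as a quick check shows. The implied margin length and elapsed time below are inferences from the abstract's central values, not quantities the study reports:

```python
# Central values from the abstract:
area_loss_km2 = 13327.0     # area exposed since the LIA maximum (± 830 km²)
mean_retreat_km = 0.363     # averaged horizontal retreat, 363 m (± 69 m)
rate_km2_per_yr = 120.0     # century-scale loss rate (± 16 km²/yr)

# Implied total margin length (area / retreat), an inference, not a reported value:
margin_length_km = area_loss_km2 / mean_retreat_km   # roughly 3.7e4 km
# Implied elapsed time (area / rate), consistent with a circa-1900 C.E. LIA maximum:
elapsed_yr = area_loss_km2 / rate_km2_per_yr         # roughly a century
```

Dividing the area loss by the mean retreat recovers a margin length of order tens of thousands of kilometres, and dividing it by the annual rate recovers roughly the century elapsed since circa 1900, so the three numbers hang together.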
Zhang, Xiaomeng; Shao, Bin; Wu, Yangle; Qi, Ouyang
2013-01-01
One of the major objectives in systems biology is to understand the relation between the topological structures and the dynamics of biological regulatory networks. In this context, various mathematical tools have been developed to deduce the structures of regulatory networks from microarray expression data. In general, one cannot deduce the whole network structure from a single data set; additional expression data are usually needed. Thus, how to design a microarray expression experiment in order to get the most information is a practical problem in systems biology. Here we propose three methods, namely, the maximum distance method, the trajectory entropy method, and the sampling method, to derive the optimal initial conditions for experiments. The performance of these methods is tested and evaluated in three well-known regulatory networks (budding yeast cell cycle, fission yeast cell cycle, and E. coli SOS network). Based on the evaluation, we propose an efficient strategy for the design of microarray expression experiments.
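The maximum distance method can be sketched for a toy Boolean network: among candidate initial conditions, choose the one whose simulated trajectory lies farthest from the trajectories already sampled, so the new experiment adds the most information. This is our reconstruction of the idea; the paper's network models and distance measure may differ:

```python
def trajectory(update, state, steps=10):
    """Iterate a Boolean network update rule from an initial state."""
    traj = [state]
    for _ in range(steps):
        state = update(state)
        traj.append(state)
    return traj

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def max_distance_initial(update, candidates, used, steps=10):
    """Pick the candidate initial condition whose trajectory is farthest
    (summed Hamming distance) from the trajectories already sampled."""
    used_trajs = [trajectory(update, s, steps) for s in used]
    def score(s):
        t = trajectory(update, s, steps)
        return sum(hamming(a, b) for ut in used_trajs for a, b in zip(t, ut))
    return max(candidates, key=score)
```

With real gene networks the update rule comes from the regulatory logic (e.g. the budding yeast cell-cycle model), and candidates are the experimentally realizable perturbations.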
Impact of abutment rotation and angulation on marginal fit: theoretical considerations.
Semper, Wiebke; Kraft, Silvan; Mehrhof, Jurgen; Nelson, Katja
2010-01-01
Rotational freedom of various implant positional index designs has been previously calculated. To investigate its clinical relevance, a three-dimensional simulation was performed to demonstrate the influence of rotational displacements of the abutment on the marginal fit of prosthetic superstructures. Idealized abutments with different angulations (0, 5, 10, 15, and 20 degrees) were virtually constructed (SolidWorks Office Premium 2007). Then, rotational displacement was simulated with various degrees of rotational freedom (0.7, 0.95, 1.5, 1.65, and 1.85 degrees). The resulting horizontal displacement of the abutment from the original position was quantified in microns, followed by a simulated pressure-less positioning of superstructures with defined internal gaps (5 µm, 60 µm, and 100 µm). The resulting marginal gap between the abutment and the superstructure was measured vertically with the SolidWorks measurement tool. Rotation resulted in a displacement of the abutment of up to 157 µm at maximum rotation and angulation. Interference of a superstructure with a defined internal gap of 5 µm placed on the abutment resulted in marginal gaps up to 2.33 mm at maximum rotation and angulation; with a 60-µm internal gap, the marginal gaps reached a maximum of 802 µm. Simulation using a superstructure with an internal gap of 100 µm revealed a marginal gap of 162 µm at abutment angulation of 20 degrees and rotation of 1.85 degrees. The marginal gaps increased with the degree of abutment angulation and the extent of rotational freedom. Rotational displacement of the abutment influenced prosthesis misfit. The marginal gaps between the abutment and the superstructure increased with the rotational freedom of the index and the angulation of the abutment.
Hu, Wenjun; Chung, Fu-Lai; Wang, Shitong
2012-03-01
Although pattern classification has been extensively studied in the past decades, how to effectively solve the corresponding training on large datasets is a problem that still requires particular attention. Many kernelized classification methods, such as SVM and SVDD, can be formulated as the corresponding quadratic programming (QP) problems, but computing the associated kernel matrices requires O(n²) (or even up to O(n³)) computational complexity, where n is the size of the training patterns, which heavily limits the applicability of these methods for large datasets. In this paper, a new classification method called the maximum vector-angular margin classifier (MAMC) is first proposed based on the vector-angular margin to find an optimal vector c in the pattern feature space, and all the testing patterns can be classified in terms of the maximum vector-angular margin ρ between the vector c and all the training data points. Accordingly, it is proved that the kernelized MAMC can be equivalently formulated as the kernelized Minimum Enclosing Ball (MEB), which leads to a distinctive merit of MAMC, i.e., it has the flexibility of controlling the sum of support vectors like v-SVC and may be extended to a maximum vector-angular margin core vector machine (MAMCVM) by connecting the core vector machine (CVM) method with MAMC such that the corresponding fast training on large datasets can be effectively achieved. Experimental results on artificial and real datasets are provided to validate the power of the proposed methods. Copyright © 2011 Elsevier Ltd. All rights reserved.
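The MEB equivalence is what enables fast core-set training. The classical Badoiu-Clarkson iteration for the (non-kernelized) minimum enclosing ball can be sketched as follows; this illustrates MEB itself, not the MAMC formulation:

```python
import numpy as np

def minimum_enclosing_ball(points, iters=2000):
    """Badoiu-Clarkson iteration: repeatedly step the center toward the
    farthest point with a shrinking step size; after k steps the radius
    is within a (1 + 1/k) factor of optimal."""
    P = np.asarray(points, dtype=float)
    c = P[0].copy()
    for k in range(1, iters + 1):
        far = P[np.argmax(np.linalg.norm(P - c, axis=1))]
        c += (far - c) / (k + 1)
    radius = np.max(np.linalg.norm(P - c, axis=1))
    return c, radius
```

Core vector machines exploit the same idea in kernel feature space, touching only a small core set of points per iteration, which is how the quadratic kernel-matrix cost is avoided on large datasets.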
A New Distribution Family for Microarray Data
Kelmansky, Diana Mabel; Ricci, Lila
2017-01-01
The traditional approach with microarray data has been to apply transformations that approximately normalize them, with the drawback of losing the original scale. The alternative standpoint taken here is to search for models that fit the data, characterized by the presence of negative values, preserving their scale; one advantage of this strategy is that it facilitates a direct interpretation of the results. A new family of distributions named gpower-normal indexed by p∈R is introduced and it is proven that these variables become normal or truncated normal when a suitable gpower transformation is applied. Expressions are given for moments and quantiles, in terms of the truncated normal density. This new family can be used to model asymmetric data that include non-positive values, as required for microarray analysis. Moreover, it has been proven that the gpower-normal family is a special case of pseudo-dispersion models, inheriting all the good properties of these models, such as asymptotic normality for small variances. A combined maximum likelihood method is proposed to estimate the model parameters, and it is applied to microarray and contamination data. R codes are available from the authors upon request. PMID:28208652
Models and analysis for multivariate failure time data
NASA Astrophysics Data System (ADS)
Shih, Joanna Huang
The goal of this research is to develop and investigate models and analytic methods for multivariate failure time data. We compare models in terms of direct modeling of the margins, flexibility of dependency structure, local vs. global measures of association, and ease of implementation. In particular, we study copula models, and models produced by right neutral cumulative hazard functions and right neutral hazard functions. We examine the changes of association over time for families of bivariate distributions induced from these models by displaying their density contour plots, conditional density plots, correlation curves of Doksum et al., and local cross ratios of Oakes. We know that bivariate distributions with the same margins might exhibit quite different dependency structures. In addition to modeling, we study estimation procedures. For copula models, we investigate three estimation procedures. The first is full maximum likelihood. The second is two-stage maximum likelihood: at stage 1, we estimate the parameters in the margins by maximizing the marginal likelihood; at stage 2, we estimate the dependency structure with the margins fixed at their estimates. The third is two-stage partially parametric maximum likelihood; it is similar to the second procedure, but we estimate the margins by the Kaplan-Meier estimate. We derive asymptotic properties for these three estimation procedures and compare their efficiency by Monte Carlo simulations and direct computations. For models produced by right neutral cumulative hazards and right neutral hazards, we derive the likelihood and investigate the properties of the maximum likelihood estimates. Finally, we develop goodness-of-fit tests for the dependency structure in the copula models. We derive a test statistic and its asymptotic properties based on the test of homogeneity of Zelterman and Chen (1988), and a graphical diagnostic procedure based on the empirical Bayes approach.
We study the performance of these two methods using actual and computer-generated data.
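The two-stage copula estimation described above can be illustrated numerically. The following is a minimal sketch, not the thesis's method: it assumes a Clayton copula with exponential margins and complete (uncensored) data, whereas the thesis treats general copulas and censored failure times. All function names are ours.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import expon

def clayton_logpdf(u, v, theta):
    # log-density of the Clayton copula, theta > 0
    return (np.log(1 + theta)
            - (theta + 1) * (np.log(u) + np.log(v))
            - (2 + 1 / theta) * np.log(u ** -theta + v ** -theta - 1))

def two_stage_fit(x, y):
    """Two-stage ML: fit the margins first, then the dependence."""
    # Stage 1: marginal ML for exponential margins (rate = 1/mean)
    lam_x, lam_y = 1 / np.mean(x), 1 / np.mean(y)
    u = expon.cdf(x, scale=1 / lam_x)
    v = expon.cdf(y, scale=1 / lam_y)
    # Stage 2: maximize the copula log-likelihood with margins held fixed
    nll = lambda t: -np.sum(clayton_logpdf(u, v, t))
    res = minimize_scalar(nll, bounds=(0.01, 20.0), method="bounded")
    return lam_x, lam_y, res.x
```

Replacing the parametric stage-1 fit with Kaplan-Meier estimates of the margins would give the spirit of the third, partially parametric procedure.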
Goel, Meenal; Verma, Abhishek; Gupta, Shalini
2018-07-15
Microarray technology to isolate living cells using external fields is a facile way to perform phenotypic analysis at the cellular level. We have used alternating current dielectrophoresis (AC-DEP) to drive the assembly of live pathogenic Salmonella typhi (S. typhi) and Escherichia coli (E. coli) bacteria into miniaturized single-cell microarrays. The effects of voltage and frequency were optimized to identify the conditions for maximum cell capture, which gave an entrapment efficiency of 90% in 60 min. The chip was used for calibration-free estimation of cellular loads in binary mixtures and further applied for rapid and enhanced testing of cell viability in the presence of a drug via impedance spectroscopy. Our results using a model antimicrobial sushi peptide showed that cell viability could be tested down to a 5 μg/mL drug concentration in under an hour, thus establishing the utility of our system for ultrafast and sensitive detection. Copyright © 2018 Elsevier B.V. All rights reserved.
Vafaee Sharbaf, Fatemeh; Mosafer, Sara; Moattar, Mohammad Hossein
2016-06-01
This paper proposes an approach for gene selection in microarray data. The proposed approach consists of a primary filter stage using the Fisher criterion, which reduces the initial genes and hence the search space and time complexity. Then, a wrapper approach based on cellular learning automata (CLA) optimized with the ant colony method (ACO) is used to find the set of features which improve the classification accuracy. CLA is applied due to its capability to learn and model complicated relationships. The features selected in the last phase are evaluated using ROC curves, and the smallest, most effective feature subset is determined. The classifiers evaluated in the proposed framework are K-nearest neighbor, support vector machine, and naïve Bayes. The proposed approach is evaluated on four microarray datasets. The evaluations confirm that the proposed approach can find the smallest subset of genes while approaching the maximum accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
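The Fisher-criterion filter stage can be sketched generically. This is the standard two-class Fisher score, not the authors' code; the function names are ours:

```python
import numpy as np

def fisher_scores(X, y):
    """Fisher criterion per gene for a two-class problem:
    (mean difference)^2 / (sum of within-class variances).
    X: (n_samples, n_genes) expression matrix; y: 0/1 labels."""
    X0, X1 = X[y == 0], X[y == 1]
    num = (X0.mean(axis=0) - X1.mean(axis=0)) ** 2
    den = X0.var(axis=0) + X1.var(axis=0) + 1e-12  # guard against zero variance
    return num / den

def filter_top_genes(X, y, k):
    """Keep the k genes with the largest Fisher scores."""
    return np.sort(np.argsort(fisher_scores(X, y))[::-1][:k])
```

The surviving genes would then be passed to the CLA/ACO wrapper for the final search.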
Turkec, Aydin; Lucas, Stuart J; Karacanli, Burçin; Baykut, Aykut; Yuksel, Hakki
2016-03-01
Detection of GMO material in crop and food samples is the primary step in GMO monitoring and regulation, with the increasing number of GM events on the world market requiring detection solutions with high multiplexing capacity. In this study, we test the suitability of a high-density oligonucleotide microarray platform for direct, quantitative detection of GMOs found in the Turkish feed market. We tested 1830 different 60-nt probes designed to cover the GM cassettes from 12 different GM cultivars (3 soya, 9 maize), as well as plant species-specific and contamination controls, and developed a data analysis method aiming to provide maximum throughput and sensitivity. The system was able to specifically identify each cultivar, and in 10/12 cases was sensitive enough to detect GMO DNA at concentrations of ⩽1%. These GMOs could also be quantified using the microarray, as their fluorescence signals increased linearly with GMO concentration. Copyright © 2015 Elsevier Ltd. All rights reserved.
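The quantification step rests on the reported linear relation between fluorescence signal and GMO concentration. A minimal, hypothetical calibration sketch (the names and numbers are illustrative; the study's probe-level analysis is more involved):

```python
import numpy as np

def fit_calibration(conc, signal):
    """Least-squares line signal = a * conc + b from known standards."""
    a, b = np.polyfit(conc, signal, 1)
    return a, b

def quantify(signal, a, b):
    """Invert the calibration line to estimate GMO concentration (%)."""
    return (signal - b) / a
```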
Martins, Diogo; Wei, Xi; Levicky, Rastislav; Song, Yong-Ak
2016-04-05
We describe a microfluidic concentration device to accelerate the surface hybridization reaction between DNA and morpholinos (MOs) for enhanced detection. The microfluidic concentrator comprises a single polydimethylsiloxane (PDMS) microchannel onto which an ion-selective layer of the conductive polymer poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate) (PEDOT:PSS) was directly printed and then reversibly surface bonded onto a morpholino microarray for hybridization. Using this electrokinetic trapping concentrator, we could achieve a maximum concentration factor of ∼800 for DNA and a limit of detection of 10 nM within 15 min. In terms of detection speed, it enabled around 10-fold faster hybridization when compared to conventional diffusion-based hybridization. A significant advantage of our approach is that the fabrication of the microfluidic concentrator is completely decoupled from the microarray; by eliminating the need to deposit an ion-selective layer on the microarray surface prior to device integration, interfacing between the two modules, the PDMS chip for electrokinetic concentration and the substrate for DNA sensing, is easier and applicable to any microarray platform. Furthermore, this fabrication strategy facilitates multiplexing of concentrators. We have demonstrated proof-of-concept for multiplexing by building a device with 5 parallel concentrators connected to a single inlet/outlet and applying it to parallel concentration and hybridization. Such a device yielded concentration and hybridization efficiency similar to that of a single-channel device without adding any complexity to the fabrication and setup. These results demonstrate that our concentrator concept can be applied to the development of a highly multiplexed concentrator-enhanced microarray detection system for either genetic analysis or other diagnostic assays.
Reuse of imputed data in microarray analysis increases imputation efficiency
Kim, Ki-Yeol; Kim, Byoung-Jin; Yi, Gwan-Su
2004-01-01
Background The imputation of missing values is necessary for the efficient use of DNA microarray data, because many clustering algorithms and some statistical analyses require a complete data set. A few imputation methods for DNA microarray data have been introduced, but their efficiency was low and the validity of the imputed values had not been fully checked. Results We developed a new cluster-based imputation method called the sequential K-nearest neighbor (SKNN) method. It imputes the missing values sequentially, starting from the gene with the fewest missing values, and uses the imputed values for later imputation. Although it reuses imputed values, the new method greatly improves on the conventional KNN-based method and on methods based on maximum likelihood estimation in both accuracy and computational complexity. The performance of SKNN was particularly high relative to other imputation methods for data with high missing rates and a large number of experiments. Application of Expectation Maximization (EM) to the SKNN method improved the accuracy, but increased computational time in proportion to the number of iterations. The Multiple Imputation (MI) method, which is well known but had not previously been applied to microarray data, showed a similarly high accuracy as the SKNN method, with a slightly higher dependency on the types of data sets. Conclusions Sequential reuse of imputed data in KNN-based imputation greatly increases the efficiency of imputation. The SKNN method should be practically useful to save the data of microarray experiments which have high amounts of missing entries. The SKNN method generates reliable imputed values which can be used for further cluster-based analysis of microarray data. PMID:15504240
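The sequential-reuse idea behind SKNN can be sketched as follows. This is a simplified reconstruction from the abstract, not the authors' implementation: it assumes at least one fully observed gene exists and uses inverse-distance weighting over the observed columns.

```python
import numpy as np

def sknn_impute(X, k=5):
    """Sequential KNN imputation sketch: genes (rows) are imputed in
    order of increasing missingness, and each imputed gene joins the
    reference set used for later genes."""
    X = X.astype(float).copy()
    n_missing = np.isnan(X).sum(axis=1)
    order = np.argsort(n_missing)
    # start the reference set from the fully observed genes
    complete = [g for g in order if n_missing[g] == 0]
    for g in order:
        miss = np.isnan(X[g])
        if not miss.any():
            continue
        obs = ~miss
        # distances to reference genes over the observed columns only
        d = np.sqrt(((X[complete][:, obs] - X[g, obs]) ** 2).sum(axis=1))
        nearest = np.argsort(d)[:k]
        w = 1.0 / (d[nearest] + 1e-12)          # inverse-distance weights
        ref = X[complete][nearest][:, miss]
        X[g, miss] = (w[:, None] * ref).sum(axis=0) / w.sum()
        complete.append(g)                       # reuse the imputed gene
    return X
```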
Vigneron, Adrien; Cruaud, Perrine; Roussel, Erwan G.; Pignet, Patricia; Caprais, Jean-Claude; Callac, Nolwenn; Ciobanu, Maria-Cristina; Godfroy, Anne; Cragg, Barry A.; Parkes, John R.; Van Nostrand, Joy D.; He, Zhili; Zhou, Jizhong; Toffin, Laurent
2014-01-01
Subsurface sediments of the Sonora Margin (Guaymas Basin), located in proximity to active cold seep sites, were explored. The taxonomic and functional diversity of bacterial and archaeal communities was investigated from 1 to 10 meters below the seafloor. Microbial community structure and the abundance and distribution of dominant populations were assessed using complementary molecular approaches (Ribosomal Intergenic Spacer Analysis, 16S rRNA libraries and quantitative PCR with an extensive primer set) and correlated to comprehensive geochemical data. Moreover, the metabolic potentials and functional traits of the microbial community were also identified using the GeoChip functional gene microarray and metabolic rates. The active microbial community structure in the Sonora Margin sediments was related to deep subsurface ecosystems (Marine Benthic Groups B and D, Miscellaneous Crenarchaeotal Group, Chloroflexi and Candidate divisions) and remained relatively similar throughout the sediment section, despite defined biogeochemical gradients. However, the relative abundances of dominant bacterial and archaeal lineages were significantly correlated with organic carbon quantity and origin. Consistently, metabolic pathways for the degradation and assimilation of this organic carbon, as well as genetic potentials for the transformation of detrital organic matter, hydrocarbons and recalcitrant substrates, were detected, suggesting that chemoorganotrophic microorganisms may dominate the microbial community of the Sonora Margin subsurface sediments. PMID:25099369
Patil, Abhijit; Singh, Kishan; Sahoo, Sukant; Suvarna, Suraj; Kumar, Prince; Singh, Anupam
2013-01-01
Objective: The aims of the study were to assess the marginal accuracy of base metal and titanium alloy castings and to evaluate the effect of repeated ceramic firing on their marginal accuracy. Materials and Methods: Twenty metal copings were fabricated with each casting material. Specimens were divided into 4 groups of 10 each, representing base metal alloy castings without (Group A) and with a metal shoulder margin (Group B), and titanium castings without (Group C) and with a metal shoulder margin (Group D). The fit of the metal copings was measured before ceramic firing at four different points, and the same was done after porcelain build-up. Results: A significant difference was found when Ni–Cr alloy samples were compared with Grade II titanium samples both before and after ceramic firings. The titanium castings with a metal shoulder margin showed the highest microgap among all the materials tested. Conclusions: Within the limitations of the study design, it can be concluded that there is marginal discrepancy in copings made from Ni–Cr and Grade II titanium, and that this discrepancy increases after ceramic firing cycles for both materials. The comparative statistical analysis for copings with a metal collar showed maximum discrepancy for Group D; for copings without a metal collar, maximum discrepancy was found for Group C. PMID:24926205
Bioeconomic Sustainability of Cellulosic Biofuel Production on Marginal Lands
ERIC Educational Resources Information Center
Gutierrez, Andrew Paul; Ponti, Luigi
2009-01-01
The use of marginal land (ML) for lignocellulosic biofuel production is examined for system stability, resilience, and eco-social sustainability. A North American prairie grass system and its industrialization for maximum biomass production using biotechnology and agro-technical inputs is the focus of the analysis. Demographic models of ML biomass…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turley, Jessica; Claridge Mackonis, Elizabeth
To evaluate in-field megavoltage (MV) imaging of simultaneously integrated boost (SIB) breast fields to determine its feasibility in treatment verification for the SIB breast radiotherapy technique, and to assess whether the current imaging protocol and treatment margins are sufficient. For nine patients undergoing SIB breast radiotherapy, in-field MV images of the SIB fields were acquired on days that regular treatment verification imaging was performed. The in-field images were matched offline according to the scar wire on digitally reconstructed radiographs. The offline image correction results were then applied to a margin recipe formula to calculate safe margins that account for random and systematic uncertainties in the position of the boost volume when an offline correction protocol has been applied. After offline assessment of the acquired images, 96% were within the tolerance set in the current departmental imaging protocol. Retrospectively applying the maximum positional deviations in the Eclipse™ treatment planning system demonstrated that the clinical target volume (CTV) boost received a minimum dose difference of 0.4% and a maximum dose difference of 1.4% less than planned. Furthermore, applying our results to the Van Herk margin formula to ensure that 90% of patients receive 95% of the prescribed dose, the calculated CTV margins were comparable to the current departmental procedure. Based on the in-field boost images acquired and the feasible application of these results to the margin formula, the current CTV-planning target volume margins used are appropriate for the accurate treatment of the SIB boost volume without additional imaging.
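The Van Herk margin recipe referred to above is commonly quoted as M = 2.5Σ + 0.7σ, where Σ and σ are the standard deviations of the systematic and random setup errors; the 0.7 coefficient is an approximation that depends on the penumbra width. As a one-line helper (the function name is ours):

```python
def van_herk_margin(sigma_sys, sigma_rand):
    """CTV-to-PTV margin recipe M = 2.5*Sigma + 0.7*sigma (all in mm):
    aims to cover the CTV with at least 95% of the prescribed dose for
    90% of patients. sigma_sys: SD of systematic setup errors;
    sigma_rand: SD of random setup errors."""
    return 2.5 * sigma_sys + 0.7 * sigma_rand
```

For example, systematic and random SDs of 2 mm and 3 mm give a margin of about 7.1 mm.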
Alshamlan, Hala; Badr, Ghada; Alohali, Yousef
2015-01-01
An artificial bee colony (ABC) is a relatively recent swarm intelligence optimization approach. In this paper, we propose the first attempt at applying the ABC algorithm to analyzing microarray gene expression profiles. In addition, we combine the minimum redundancy maximum relevance (mRMR) feature selection algorithm with an ABC algorithm, mRMR-ABC, to select informative genes from microarray profiles. The new approach uses a support vector machine (SVM) algorithm to measure the classification accuracy for selected genes. We evaluate the performance of the proposed mRMR-ABC algorithm by conducting extensive experiments on six binary and multiclass gene expression microarray datasets. Furthermore, we compare our proposed mRMR-ABC algorithm with previously known techniques. We reimplemented two of these techniques, mRMR combined with a genetic algorithm (mRMR-GA) and mRMR combined with a particle swarm optimization algorithm (mRMR-PSO), for the sake of a fair comparison using the same parameters. The experimental results show that the proposed mRMR-ABC algorithm achieves accurate classification performance using a small number of predictive genes across the tested datasets when compared to previously suggested methods. This shows that mRMR-ABC is a promising approach for solving gene selection and cancer classification problems. PMID:25961028
NASA Astrophysics Data System (ADS)
Hwang, Taejin; Kang, Sei-Kwon; Cheong, Kwang-Ho; Park, Soah; Yoon, Jai-Woong; Han, Taejin; Kim, Haeyoung; Lee, Meyeon; Kim, Kyoung-Joo; Bae, Hoonsik; Suh, Tae-Suk
2015-07-01
The aim of this study was to quantitatively estimate the dosimetric benefits of the image-guided radiation therapy (IGRT) system for prostate intensity-modulated radiation therapy (IMRT) delivery. The cases of eleven patients who underwent IMRT for prostate cancer without a prostatectomy at our institution between October 2012 and April 2014 were retrospectively analyzed. For every patient, uniform clinical target volume (CTV) to planning target volume (PTV) margins were used: 3 mm, 5 mm, 7 mm, 10 mm, 12 mm, and 15 mm. For each margin size, the IMRT plans were independently optimized by one medical physicist using Pinnacle3 (ver. 8.0.d, Philips Medical System, Madison, WI) in order to maintain the plan quality. The maximum geometrical margin (MGM) for every CT image set, defined as the smallest margin encompassing the rectum at least at one slice, was between 13 mm and 26 mm. The percentage rectum overlapping PTV (%VROV), the rectal normal tissue complication probability (NTCP) and the mean rectal dose (%RDmean) increased in proportion to the increase of PTV margin. However, the bladder NTCP remained around zero regardless of the increase of PTV margin, while the percentage bladder overlapping PTV (%VBOV) and the mean bladder dose (%BDmean) increased in proportion to the increase of PTV margin. For patients without a relatively large rectum or a small bladder, the increases observed for rectal NTCP, %RDmean and %BDmean per 1-mm PTV margin were 1.84%, 2.44% and 2.90%, respectively. Unlike the rectum and the bladder, the maximum dose to each femoral head was little affected by the PTV margin. This quantitative study of PTV margin reduction supports that IG-IMRT enhances the clinical outcome of prostate cancer treatment by reducing normal organ complications at a similar level of PTV control.
NASA Astrophysics Data System (ADS)
Patton, Henry; Hubbard, Alun; Andreassen, Karin; Winsborrow, Monica; Stroeven, Arjen P.
2016-12-01
The Eurasian ice-sheet complex (EISC) was the third largest ice mass during the Last Glacial Maximum (LGM), after the Antarctic and North American ice sheets. Despite its global significance, a comprehensive account of its evolution from independent nucleation centres to its maximum extent is conspicuously lacking. Here, a first-order, thermomechanical model, robustly constrained by empirical evidence, is used to investigate the dynamics of the EISC throughout its build-up to its maximum configuration. The ice flow model is coupled to a reference climate and applied at 10 km spatial resolution across a domain that includes the three main spreading centres of the Celtic, Fennoscandian and Barents Sea ice sheets. The model is forced with the NGRIP palaeo-isotope curve from 37 ka BP onwards and model skill is assessed against collated flowsets, marginal moraines, exposure ages and relative sea-level history. The evolution of the EISC to its LGM configuration was complex and asynchronous; the western, maritime margins of the Fennoscandian and Celtic ice sheets responded rapidly and advanced across their continental shelves by 29 ka BP, yet the maximum areal extent (5.48 × 106 km2) and volume (7.18 × 106 km3) of the ice complex was attained some 6 ka later at c. 22.7 ka BP. This maximum stand was short-lived as the North Sea and Atlantic margins were already in retreat whilst eastern margins were still advancing up until c. 20 ka BP. High rates of basal erosion are modelled beneath ice streams and outlet glaciers draining the Celtic and Fennoscandian ice sheets with extensive preservation elsewhere due to frozen subglacial conditions, including much of the Barents and Kara seas. Here, and elsewhere across the Norwegian shelf and North Sea, high pressure subglacial conditions would have promoted localised gas hydrate formation.
Estimation of the Nonlinear Random Coefficient Model when Some Random Effects Are Separable
ERIC Educational Resources Information Center
du Toit, Stephen H. C.; Cudeck, Robert
2009-01-01
A method is presented for marginal maximum likelihood estimation of the nonlinear random coefficient model when the response function has some linear parameters. This is done by writing the marginal distribution of the repeated measures as a conditional distribution of the response given the nonlinear random effects. The resulting distribution…
Modeling and analysis of energy quantization effects on single electron inverter performance
NASA Astrophysics Data System (ADS)
Dan, Surya Shankar; Mahapatra, Santanu
2009-08-01
In this paper, for the first time, the effects of energy quantization on single electron transistor (SET) inverter performance are analyzed through analytical modeling and Monte Carlo simulations. It is shown that energy quantization mainly changes the Coulomb blockade region and drain current of SET devices and thus affects the noise margin, power dissipation, and propagation delay of the SET inverter. A new analytical model for the noise margin of the SET inverter is proposed which includes the energy quantization effects. Using the noise margin as a metric, the robustness of the SET inverter is studied against the effects of energy quantization. A compact expression is developed for a novel parameter, the quantization threshold, which is introduced for the first time in this paper. The quantization threshold explicitly defines the maximum energy quantization that an SET inverter logic circuit can withstand before its noise margin falls below a specified tolerance level. It is found that an SET inverter designed with CT:CG = 1/3 (where CT and CG are the tunnel junction and gate capacitances, respectively) offers maximum robustness against energy quantization.
Guo, Jing; Wang, Xiao-Yu; Li, Xue-Sheng; Sun, Hai-Yang; Liu, Lin; Li, Hong-Bo
2016-02-01
To evaluate the effect of different designs of marginal preparation on stress distribution in a mandibular premolar restored with an endocrown, using the three-dimensional finite element method. Four models with different designs of marginal preparation, including a flat margin, 90° shoulder, 135° shoulder and chamfer shoulder, were established to imitate a mandibular first premolar restored with an endocrown. A load of 100 N was applied to the intersection of the long axis and the occlusal surface, either parallel or at an angle of 45° to the long axis of the tooth. The maximum von Mises stress values and the stress distribution around the cervical region of the abutment and the endocrown with different designs of marginal preparation were analyzed. The load parallel to the long axis of the tooth caused obvious stress concentration in the lingual portions of both the cervical region of the tooth tissue and the restoration. The stress distribution on the cervical region of the models with a flat margin and a 90° shoulder was more uniform than in the models with a 135° shoulder and a chamfer shoulder. Loading at 45° to the long axis caused stress concentration mainly on the buccal portion of the cervical region, and the model with a flat margin showed the most favorable stress distribution pattern, with a greater maximum von Mises stress than under parallel loading. Irrespective of the loading direction, the stress value was lowest in the flat margin model, where the stress value in the cervical region of the endocrown was greater than in the counterpart of the tooth tissue. The stress level on the enamel was higher than on the nearby dentin in the flat margin model. From the stress distribution point of view, endocrowns with a flat margin, followed by a 90° shoulder, are recommended.
Minervini, Andrea; Campi, Riccardo; Kutikov, Alexander; Montagnani, Ilaria; Sessa, Francesco; Serni, Sergio; Raspollini, Maria Rosaria; Carini, Marco
2015-10-01
The surface-intermediate-base margin score is a novel standardized reporting system for resection techniques during nephron sparing surgery. We validated the surgeon-assessed surface-intermediate-base score against microscopic histopathological assessment of partial nephrectomy specimens. Between June and August 2014 data were prospectively collected from 40 consecutive patients undergoing nephron sparing surgery. The surface-intermediate-base score was assigned in all cases. The score specific areas were color coded with tissue margin ink and sectioned for histological evaluation of healthy renal margin thickness. Maximum, minimum and mean thickness of healthy renal margin was reported for each score specific area grade (surface [S] = 0, S = 1; intermediate [I] or base [B] = 0, I or B = 1, I or B = 2). The Mann-Whitney U and Kruskal-Wallis tests were used to compare the thickness of healthy renal margin in S = 0 vs 1 and I or B = 0 vs 1 vs 2 grades, respectively. Maximum, minimum and mean thickness of healthy renal margin was significantly different among score specific area grades S = 0 vs 1, and I or B = 0 vs 1, 0 vs 2 and 1 vs 2 (p <0.001). The main limitations of the study are the low number of I or B = 1 and I or B = 2 samples and the assumption that each microscopic slide reflects the entire score specific area for histological analysis. The surface-intermediate-base scoring method can be readily harnessed in real-world clinical practice and accurately mirrors histopathological analysis for quantification and reporting of healthy renal margin thickness removed during tumor excision. Copyright © 2015 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
Mining subspace clusters from DNA microarray data using large itemset techniques.
Chang, Ye-In; Chen, Jiun-Rung; Tsai, Yueh-Chi
2009-05-01
Mining subspace clusters from DNA microarrays could help researchers identify genes which commonly contribute to a disease, where a subspace cluster indicates a subset of genes whose expression levels are similar under a subset of conditions. Since in a DNA microarray the number of genes is far larger than the number of conditions, previously proposed algorithms which compute the maximum dimension sets (MDSs) for any two genes take a long time to mine subspace clusters. In this article, we propose the Large Itemset-Based Clustering (LISC) algorithm for mining subspace clusters. Instead of constructing MDSs for any two genes, we construct only MDSs for any two conditions. Then, we transform the task of finding the maximal possible gene sets into the problem of mining large itemsets from the condition-pair MDSs. Since we are only interested in subspace clusters with gene sets as large as possible, it is desirable to pay attention to those gene sets which have reasonably large support values in the condition-pair MDSs. Our simulation results show that the proposed algorithm needs shorter processing time than previously proposed algorithms which construct gene-pair MDSs.
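The condition-pair MDS construction at the heart of LISC can be sketched as follows. This is a simplified reconstruction from the abstract, not the published algorithm: the similarity criterion is reduced to a fixed difference threshold, and the itemset-mining step is shown only for single genes (a real miner would extend to larger itemsets).

```python
import numpy as np
from itertools import combinations

def condition_pair_mds(D, delta=1.0):
    """For each pair of conditions (columns of D), collect the genes
    (rows) whose expression difference between the two conditions is
    within delta -- a pCluster-style similarity criterion."""
    mds = {}
    for a, b in combinations(range(D.shape[1]), 2):
        mds[(a, b)] = frozenset(np.flatnonzero(np.abs(D[:, a] - D[:, b]) <= delta))
    return mds

def large_gene_sets(mds, min_support):
    """Genes appearing in at least min_support condition-pair MDSs
    (the 1-itemset case of the large-itemset mining step)."""
    counts = {}
    for genes in mds.values():
        for g in genes:
            counts[g] = counts.get(g, 0) + 1
    return {g for g, c in counts.items() if c >= min_support}
```

Because only condition pairs are enumerated, the number of MDSs grows with the (small) number of conditions rather than the (large) number of genes, which is the source of the claimed speed-up.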
Reddy, S Srikanth; Revathi, Kakkirala; Reddy, S Kranthikumar
2013-01-01
Conventional casting technique is time consuming when compared to the accelerated casting technique. In this study, the marginal accuracy of castings fabricated using accelerated and conventional casting techniques was compared. Twenty wax patterns were fabricated, and the marginal discrepancy between the die and the patterns was measured using an optical stereomicroscope. Ten wax patterns were used for conventional casting and the rest for accelerated casting. A nickel-chromium alloy was used for the casting. The castings were measured for marginal discrepancies and compared. Castings fabricated using the conventional casting technique showed less vertical marginal discrepancy than castings fabricated by the accelerated casting technique, and the difference was statistically highly significant. Conventional casting produced better marginal accuracy than accelerated casting; however, the vertical marginal discrepancy produced by the accelerated technique was well within the maximum clinical tolerance limits, so accelerated casting can be used to save laboratory time and fabricate clinical crowns with acceptable vertical marginal discrepancy.
A new and efficient theoretical model to analyze chirped grating distributed feedback lasers
NASA Astrophysics Data System (ADS)
Arif, Muhammad
Threshold conditions of a distributed feedback (DFB) laser with a linearly chirped grating are investigated using a new and efficient method. The chirped grating is found to have significant effects on the lasing characteristics. The coupled wave equations for these lasers are derived and solved using a power series method to obtain the threshold condition. A Newton-Raphson routine is used to solve the threshold conditions numerically to obtain the threshold gain and lasing wavelengths. To prove the validity of this model, it is applied to both conventional index-coupled and complex-coupled DFB lasers. The threshold gain margins are calculated as functions of the ratio of the gain coupling to index coupling (|κg|/|κn|), and the phase difference between the index and gain gratings. It was found that for coupling coefficient |κ|l < 0.9, the laser shows a mode degeneracy at particular values of the ratio |κg|/|κn| for cleaved facets. We found that at phase differences of π/2 and 3π/2 between the gain and index gratings, for an AR-coated complex-coupled laser, the laser becomes multimode and a different mode starts to lase. We also studied the effect of the facet reflectivity (both magnitude and phase) on the gain margin of a complex-coupled DFB laser. Although the gain margin varies slowly with the magnitude of the facet reflectivity, it shows large variations as a function of the phase. Spatial hole burning was found to be minimum at phase differences nπ, n = 0, 1, ..., and maximum at phase differences π/2 and 3π/2. The single mode gain margin of an index-coupled linearly chirped CG-DFB laser is calculated for different chirping factors and coupling constants. We found that there is clearly an optimum chirping for which the single mode gain margin is maximum. The gain margins were also calculated for different positions of the cavity center. The effect of the facet reflectivities and their phases on the gain margin was investigated.
We found the gain margin is maximum and the Spatial Hole Burning (SHB) is minimum for the cavity center at the middle of the laser cavity. Effect of chirping on the threshold gain, gain margin and spatial hole burning (SHB) for different parameters, such as the coupling coefficients, facet reflectivities, etc., of these lasers are studied. Single mode yield of these lasers are calculated and compared with that of a uniform grating DFB laser.
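The numerical core of the method, solving a transcendental threshold condition by Newton-Raphson, can be sketched in a few lines. The function used below is an illustrative stand-in, not the paper's coupled-wave threshold condition; the same complex-root iteration applies:

```python
import cmath

def newton_raphson(f, z0, tol=1e-12, max_iter=100, h=1e-7):
    """Find a complex root of f(z) = 0 starting from z0.

    The derivative is approximated by a central difference, so only f
    itself is needed, which is convenient when f comes from a power-series
    solution of the coupled wave equations.
    """
    z = z0
    for _ in range(max_iter):
        df = (f(z + h) - f(z - h)) / (2 * h)
        step = f(z) / df
        z -= step
        if abs(step) < tol:
            return z
    raise RuntimeError("Newton-Raphson did not converge")

# Illustrative transcendental equation (NOT the DFB threshold condition):
# z * exp(z) = 1, whose real root is about 0.567143.
root = newton_raphson(lambda z: z * cmath.exp(z) - 1, 0.5 + 0.0j)
```

In practice one would seed the iteration with several starting points in the complex (gain, detuning) plane to pick up each longitudinal mode, then rank modes by threshold gain to obtain the gain margin.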
ten Brink, Uri S.; Lee, H.J.; Geist, E.L.; Twichell, D.
2009-01-01
Submarine landslides along the continental slope of the U.S. Atlantic margin are potential sources for tsunamis along the U.S. East Coast. The magnitude of potential tsunamis depends on the volume and location of the landslides, and tsunami frequency depends on their recurrence interval. However, the size and recurrence interval of submarine landslides along the U.S. Atlantic margin are poorly known. Well-studied landslide-generated tsunamis in other parts of the world have been shown to be associated with earthquakes. Because the size distribution and recurrence interval of earthquakes is generally better known than those of submarine landslides, we propose here to estimate the size and recurrence interval of submarine landslides from the size and recurrence interval of earthquakes in the vicinity of those landslides. To do so, we calculate the maximum expected landslide size for a given earthquake magnitude, use the recurrence interval of earthquakes to estimate the recurrence interval of landslides, and assume a threshold landslide size that can generate a destructive tsunami. The maximum expected landslide size for a given earthquake magnitude is calculated in three ways: by slope stability analysis for catastrophic slope failure on the Atlantic continental margin, by using a land-based compilation of the maximum observed distance from earthquake to liquefaction, and by using a land-based compilation of the maximum observed area of earthquake-induced landslides. We find that the calculated distances and failure areas from the slope stability analysis are similar to, or slightly smaller than, the maximum triggering distances and failure areas in subaerial observations. The results from all three methods compare well with the slope failure observations of the Mw = 7.2, 1929 Grand Banks earthquake, the only historical tsunamigenic earthquake along the North American Atlantic margin. The results further suggest that a Mw = 7.5 earthquake (the largest expected earthquake in the eastern U.S.)
must be located offshore and within 100 km of the continental slope to induce a catastrophic slope failure. Thus, repeats of the 1755 Cape Ann and 1886 Charleston earthquakes are not expected to cause landslides on the continental slope. The observed rate of seismicity offshore the U.S. Atlantic coast is very low, with the exception of New England, where some microseismicity is observed. An extrapolation of annual strain rates from the Canadian Atlantic continental margin suggests that the New England margin may experience the equivalent of a magnitude 7 earthquake on average every 600-3000 yr. A minimum triggering earthquake magnitude of 5.5 is suggested for a sufficiently large submarine failure to generate a devastating tsunami, and only if the epicenter is located within the continental slope.
Segmentation and intensity estimation of microarray images using a gamma-t mixture model.
Baek, Jangsun; Son, Young Sook; McLachlan, Geoffrey J
2007-02-15
We present a new approach to the analysis of images for complementary DNA microarray experiments. The image segmentation and intensity estimation are performed simultaneously by adopting a two-component mixture model. One component of this mixture corresponds to the distribution of the background intensity, while the other corresponds to the distribution of the foreground intensity. The intensity measurement is a bivariate vector consisting of red and green intensities. The background intensity component is modeled by the bivariate gamma distribution, whose marginal densities for the red and green intensities are independent three-parameter gamma distributions with different parameters. The foreground intensity component is taken to be the bivariate t distribution, with the constraint that the mean of the foreground is greater than that of the background for each of the two colors. The degrees of freedom of this t distribution are inferred from the data, but they could be specified in advance to reduce the computation time. Also, the covariance matrix is not restricted to being diagonal, and so it allows for nonzero correlation between R and G foreground intensities. This gamma-t mixture model is fitted by maximum likelihood via the EM algorithm. A final step is executed whereby nonparametric (kernel) smoothing of the posterior probabilities of component membership is undertaken.
The main advantages of this approach are: (1) it enjoys the well-known strengths of a mixture model, namely flexibility and adaptability to the data; (2) it considers segmentation and intensity estimation simultaneously, not separately as in commonly used existing software, and it works with the red and green intensities in a bivariate framework as opposed to their separate estimation via univariate methods; (3) the use of the three-parameter gamma distribution for the background red and green intensities provides a much better fit than the normal (log normal) or t distributions; (4) the use of the bivariate t distribution for the foreground intensity provides a model that is less sensitive to extreme observations; (5) as a consequence of the aforementioned properties, it allows segmentation to be undertaken for a wide range of spot shapes, including doughnut, sickle shape, and artifacts. We apply our method for gridding, segmentation, and estimation to real cDNA microarray images and artificial data. Our method provides better segmentation results in spot shapes, as well as better intensity estimation, than the Spot and spotSegmentation R packages. It detected blank spots as well as bright artifacts in the real data, and estimated spot intensities with high accuracy for the synthetic data. The algorithms were implemented in Matlab. The Matlab code implementing both the gridding and the segmentation/estimation is available upon request. Supplementary material is available at Bioinformatics online.
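The modelling idea can be sketched in one dimension. This is a hedged simplification of the paper's method: a univariate gamma background plus t foreground (degrees of freedom fixed at 5), with weighted method-of-moments updates standing in for the full maximum-likelihood M-step, and no bivariate structure or spatial smoothing:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic pixel intensities: gamma background plus shifted t foreground.
bg = rng.gamma(shape=2.0, scale=1.0, size=600)
fg = 8.0 + 0.5 * rng.standard_t(df=5, size=400)
x = np.concatenate([bg, fg])

# EM for a two-component gamma + t mixture (df of the t fixed at 5).
w, k, th, mu, sc = 0.5, 1.0, np.mean(x), np.max(x), 1.0
for _ in range(50):
    # E-step: posterior probability that a pixel is background.
    pb = w * stats.gamma.pdf(x, a=k, scale=th)
    pf = (1 - w) * stats.t.pdf(x, df=5, loc=mu, scale=sc)
    r = pb / (pb + pf)
    # M-step: weighted method-of-moments updates (a simplification of
    # full maximum likelihood, which needs a numerical solve for gamma).
    w = r.mean()
    m1 = np.average(x, weights=r)
    v1 = np.average((x - m1) ** 2, weights=r)
    k, th = m1 ** 2 / v1, v1 / m1
    mu = np.average(x, weights=1 - r)
    sc = np.sqrt(np.average((x - mu) ** 2, weights=1 - r))

labels = r < 0.5  # True = foreground pixel
```

Segmentation then amounts to thresholding (or smoothing) the posterior probabilities r, in the spirit of the kernel-smoothing step described above.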
Mori, Hirohito; Kobara, Hideki; Nishiyama, Noriko; Fujihara, Shintaro; Kobayashi, Nobuya; Ayaki, Maki; Masaki, Tsutomu
2016-11-01
Although endoscopic mucosal resection is an established colorectal polyp treatment, local recurrence occurs in 13 % of cases due to inadequate snaring. We evaluated whether pre-clipping to the muscularis propria resulted in resected specimens with negative surgical margins without thermal denaturation. Of 245 polyps from 114 patients with colorectal polyps under 20 mm, we included 188 polyps from 81 patients. We randomly allocated polyps to the conventional injection group (CG; 97 polyps) or the pre-clipping injection group (PG; 91 polyps). The PG received three-point pre-clipping to ensure ample grip on the muscle layer, on the oral side and on both sides of the tumor, with 4 mL local injection. Endoscopic ultrasonography was performed to measure the resulting bulge. Outcomes included the number of instances of thermal denaturation of the horizontal/vertical margin (HMX/VMX) or positive horizontal/vertical margins (HM+/VM+), the shortest distance from tumor margins to resected edges, and the maximum bulge distances from the tumor surface to the muscularis propria. The numbers of HMX and HM+ were 27 and 6 in the CG versus 9 and 2 in the PG (P = 0.001); VMX and VM+ were 8 and 5 versus 0 and 0 (P = 0.057). The shortest distance from tumor margin to resected edge [median (range), mm] was 0.6 (0-2.7) in the CG and 4.7 (2.1-8.9) in the PG (P = 0.018). The maximum bulge distances were 4.6 (3.0-8.0) and 11.0 (6.8-17.0), respectively (P = 0.005). Pre-clipping enabled surgical margin-negative resection without thermal denaturation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Georgiades, Christos, E-mail: g_christos@hotmail.com; Rodriguez, Ronald, E-mail: rrodrig@jhmi.edu; Azene, Ezana, E-mail: eazene1@jhmi.edu
2013-06-15
Objective. The study was designed to determine the distance between the visible 'ice-ball' and the lethal temperature isotherm for normal renal tissue during cryoablation. Methods. The Animal Care Committee approved the study. Nine adult swine were used: three to determine the optimum tissue stain and six to test the hypotheses. They were anesthetized and the left renal artery was catheterized under fluoroscopy. Under MR guidance, the kidney was ablated and (at the end of a complete ablation) the nonfrozen renal tissue (surrounding the 'ice-ball') was stained via the renal artery catheter. Kidneys were explanted and sent for slide preparation and examination. From each slide, we measured the maximum, minimum, and an in-between distance from the stained to the lethal tissue boundaries (margin). We examined each slide for evidence of a 'heat pump' effect. Results. A total of 126 measurements of the margin (visible 'ice-ball' to lethal margin) were made. These measurements were obtained from 29 slides prepared from the 6 test animals. Mean width was 0.75 ± 0.44 mm (maximum 1.15 ± 0.51 mm). It was found to increase adjacent to large blood vessels. No 'heat pump' effect was noted within the lethal zone. Data are limited to normal swine renal tissue. Conclusions. Considering the effects of the 'heat pump' phenomenon for normal renal tissue, the margin was measured to be 1.15 ± 0.51 mm. To approximate the efficacy of the 'gold standard' (partial nephrectomy, ≈98 %), a minimum margin of 3 mm is recommended (3 × SD). Given these assumptions and extrapolating for renal cancer, which reportedly is more cryoresistant with a lethal temperature of -40 °C, the recommended margin is 6 mm.
Development of microarray device for functional evaluation of PC12D cell axonal extension ability
NASA Astrophysics Data System (ADS)
Nakamachi, Eiji; Yanagimoto, Junpei; Murakami, Shinya; Morita, Yusuke
2014-04-01
In this study, we developed a microarray bio-MEMS device that could trap PC12D cells (a rat pheochromocytoma line) to examine the effect of intercellular interaction on cell activation and axonal extension ability. Assigning particular patterns of PC12D cells is needed to establish a cell-function evaluation technique. This experimental, observation-based technique can be used for the design of cell sheets and scaffolds for peripheral and central nerve regeneration. We fabricated a micropillar-array bio-MEMS device, with pillar diameters of approximately 10 μm, using the thick photoresist SU-8 on a glass slide substrate. A maximum trapped PC12D cell ratio of 48.5% was achieved. Through experimental observation of patterned PC12D "bi-cell" activation, we obtained the following results. Most PC12D "bi-cells" separated by 40 to 100 μm were connected after 24 h with high probability, whereas "bi-cells" separated by 110 to 200 μm were not connected. In addition, we measured axonal extension velocities for intercellular distances between 40 and 100 μm. A maximum axonal extension velocity of 86.4 μm/h was obtained at an intercellular distance of 40 μm.
Limitations and tradeoffs in synchronization of large-scale networks with uncertain links
Diwadkar, Amit; Vaidya, Umesh
2016-01-01
The synchronization of nonlinear systems connected over large-scale networks has gained popularity in a variety of applications, such as power grids, sensor networks, and biology. Stochastic uncertainty in the interconnections is a ubiquitous phenomenon observed in these physical and biological networks. We provide a size-independent network sufficient condition for the synchronization of scalar nonlinear systems with stochastic linear interactions over large-scale networks. This sufficient condition, expressed in terms of nonlinear dynamics, the Laplacian eigenvalues of the nominal interconnections, and the variance and location of the stochastic uncertainty, allows us to define a synchronization margin. We provide an analytical characterization of important trade-offs between the internal nonlinear dynamics, network topology, and uncertainty in synchronization. For nearest neighbour networks, the existence of an optimal number of neighbours with a maximum synchronization margin is demonstrated. An analytical formula for the optimal gain that produces the maximum synchronization margin allows us to compare the synchronization properties of various complex network topologies. PMID:27067994
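The trade-off described above can be explored numerically through the Laplacian spectrum of a nearest-neighbour network. A minimal sketch (the eigenvalue scan below is only a proxy: the paper's synchronization margin also involves the nonlinear dynamics and the uncertainty variance, which are omitted here):

```python
import numpy as np

def ring_knn_laplacian(n, k):
    """Graph Laplacian of a ring of n nodes, each coupled to its k
    nearest neighbours on either side."""
    A = np.zeros((n, n))
    for i in range(n):
        for d in range(1, k + 1):
            A[i, (i + d) % n] = A[i, (i - d) % n] = 1.0
    return np.diag(A.sum(axis=1)) - A

# Scan the number of neighbours: both the algebraic connectivity
# (lambda_2, which aids synchronization) and the spectral radius
# (lambda_n) grow with k.  With per-link stochastic uncertainty these
# trends trade off, which is how an optimal neighbour count can arise.
lam2, lamN = [], []
for k in (1, 2, 4, 8):
    lam = np.sort(np.linalg.eigvalsh(ring_knn_laplacian(30, k)))
    lam2.append(lam[1])
    lamN.append(lam[-1])
```

The sufficient condition in the paper is stated in terms of exactly these Laplacian eigenvalues of the nominal interconnection, together with the nonlinearity and the variance and location of the uncertainty.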
Design of an ultrasonic micro-array for near field sensing during retinal microsurgery.
Clarke, Clyde; Etienne-Cummings, Ralph
2006-01-01
A method for obtaining the optimal sensor parameters for a tool-tip-mountable ultrasonic transducer micro-array is presented. The array parameters, such as frequency of operation, element size, inter-element spacing, number of elements, and transducer geometry, are obtained using a quadratic programming method that maximizes directivity while constrained to a total array size of 4 mm² and the resolution required for retinal imaging. The technique is used to design a uniformly spaced N×N transducer array that is capable of resolving structures in the retina as small as 2 μm from a distance of 100 μm. The resultant 37×37 array of 16 μm transducers with 26 μm spacing will be realized as a Capacitive Micromachined Ultrasonic Transducer (CMUT) array and used for imaging and robotic guidance during retinal microsurgery.
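The directivity objective in such a design can be illustrated numerically. A hedged sketch of a far-field array-factor calculation for a uniform linear array (parameter values are illustrative; the paper's quadratic program, 2-D geometry, and element-size constraints are omitted):

```python
import numpy as np

def array_factor(n, d_over_lambda, theta):
    """Magnitude of the far-field array factor of an n-element uniform
    linear array with element spacing d (in wavelengths); theta is
    measured from broadside."""
    k_d = 2 * np.pi * d_over_lambda
    phases = np.arange(n)[:, None] * k_d * np.sin(theta)[None, :]
    return np.abs(np.exp(1j * phases).sum(axis=0))

theta = np.linspace(-np.pi / 2, np.pi / 2, 2001)
af_small = array_factor(8, 0.5, theta)
af_large = array_factor(37, 0.5, theta)

def directivity(af, theta):
    """Crude directivity estimate: peak power over average radiated power
    (cos-weighted to approximate the solid-angle average)."""
    p = af ** 2
    return p.max() / np.average(p, weights=np.cos(theta))

d_small, d_large = directivity(af_small, theta), directivity(af_large, theta)
```

More elements sharpen the main lobe and raise directivity, which is exactly the quantity the quadratic program maximizes subject to the aperture and resolution constraints.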
High-Dimensional Exploratory Item Factor Analysis by a Metropolis-Hastings Robbins-Monro Algorithm
ERIC Educational Resources Information Center
Cai, Li
2010-01-01
A Metropolis-Hastings Robbins-Monro (MH-RM) algorithm for high-dimensional maximum marginal likelihood exploratory item factor analysis is proposed. The sequence of estimates from the MH-RM algorithm converges with probability one to the maximum likelihood solution. Details on the computer implementation of this algorithm are provided. The…
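The Robbins-Monro half of MH-RM is a stochastic-approximation iteration. A minimal sketch of Robbins-Monro alone (not Cai's full algorithm): finding the root of g(theta) = E[Y] - theta from noisy draws, with decreasing gains a_n = 1/n, which is what gives convergence with probability one:

```python
import random

random.seed(1)

# Robbins-Monro stochastic approximation: solve g(theta) = E[Y] - theta = 0
# using only noisy evaluations.  In MH-RM the noisy quantity is instead a
# Metropolis-Hastings estimate of the complete-data score function.
theta = 0.0
true_mean = 3.0
for n in range(1, 20001):
    y = random.gauss(true_mean, 2.0)   # noisy observation of E[Y]
    theta += (1.0 / n) * (y - theta)   # stochastic approximation step
```

With these gains the iterate is exactly the running sample mean, so it settles near 3.0; MH-RM replaces the simple draw with an MCMC imputation of the latent factor scores.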
Raman mediated all-optical cascadable inverter using silicon-on-insulator waveguides.
Sen, Mrinal; Das, Mukul K
2013-12-01
In this Letter, we propose an all-optical circuit for a cascadable and integrable logic inverter based on stimulated Raman scattering. A maximum-product criterion for the noise margin is used to analyze the cascadability of the inverter. The variation of the noise margin with different model parameters is also studied. Finally, the time-domain response of the inverter is analyzed for different widths of input pulses.
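Noise margins of an inverter are read off its transfer curve: NM_L = V_IL - V_OL and NM_H = V_OH - V_IH, where V_IL and V_IH are the unity-gain (slope = -1) points, and a maximum-product criterion scores the pair by NM_L * NM_H. A sketch with an illustrative sigmoid transfer curve, not the Raman inverter's actual characteristic:

```python
import numpy as np

# Illustrative inverting transfer curve (normalized units).
vin = np.linspace(0.0, 1.0, 10001)
vout = 1.0 / (1.0 + np.exp(12 * (vin - 0.5)))

gain = np.gradient(vout, vin)
# Unity-gain points: where the slope crosses -1.
idx = np.where(np.diff(np.sign(gain + 1.0)))[0]
v_il, v_ih = vin[idx[0]], vin[idx[-1]]
v_oh, v_ol = np.interp([v_il, v_ih], vin, vout)

nm_l = v_il - v_ol        # low noise margin
nm_h = v_oh - v_ih        # high noise margin
nm_product = nm_l * nm_h  # maximum-product criterion scores this quantity
```

For cascadability both margins must stay positive after each stage, which is what maximizing the product over the model parameters is meant to guarantee.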
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittauer, K; Geurts, M; Toya, R
Purpose: Radiotherapy for gastric and gastroesophageal junction (GEJ) tumors commonly requires large margins due to deformation, motion, and variable changes of the stomach anatomy, at the risk of increased normal tissue toxicities. This work quantifies the interfraction variation of stomach deformation from daily MRI-guided radiotherapy to allow for a more targeted determination of margin expansion in the treatment of gastric and GEJ tumors. Methods: Five patients treated for gastric (n=3) and gastroesophageal junction (n=2) cancers with conventionally fractionated radiotherapy underwent daily MR imaging on a clinical MR-IGRT system. Treatment planning and contours were performed based on the MR simulation. The stomach was re-contoured on each daily volumetric setup MR. Dice similarity coefficients (DSC) of the daily stomach were computed to evaluate the interfraction stomach deformation. To evaluate the stomach margin, the maximum Hausdorff distance (HD) between the initial and fractional stomach surface was measured for each fraction. The margin expansion needed to encompass all fractions was evaluated from the union of all fractional stomachs. Results: In total, 94 fractions with daily stomach contours were evaluated. For the interfraction stomach differences, the average DSC was 0.67±0.1 for gastric and 0.62±0.1 for GEJ cases. The maximum HD of each fraction was 3.5±2.0 cm (n=94) with a mean HD of 0.8±0.4 cm (across all surface voxels for all fractions). The margin expansion required to encompass all individual fractions (averaged across 5 patients) was 1.4 cm (superior), 2.3 cm (inferior), 2.5 cm (right), 3.2 cm (left), 3.7 cm (anterior), 3.4 cm (posterior). The maximum observed difference for margin expansion was 8.7 cm (posterior) in one patient. Conclusion: We observed a notable interfractional change in daily stomach shape (i.e., mean DSC of 0.67, p<0.0001) in both gastric and GEJ patients, for which adaptive radiotherapy is indicated.
A minimum PTV margin of 3 cm is indicated to account for interfraction stomach changes when adaptive radiotherapy is not available. M. Bassetti: Travel funding from ViewRay, Inc.
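The two shape metrics used in this study, the Dice similarity coefficient and the Hausdorff distance, can be computed directly from binary masks. A minimal 2-D sketch on synthetic discs standing in for the planning and daily stomach contours (real use would be on 3-D contours):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two boolean masks."""
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff(a, b):
    """Symmetric Hausdorff distance between the pixel sets of two masks."""
    pa = np.argwhere(a).astype(float)
    pb = np.argwhere(b).astype(float)
    d = np.sqrt(((pa[:, None, :] - pb[None, :, :]) ** 2).sum(-1))
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Two overlapping discs: a radius-20 "planning" contour and the same
# contour shifted by 10 pixels as the "daily" contour.
yy, xx = np.mgrid[0:100, 0:100]
planning = (yy - 50) ** 2 + (xx - 45) ** 2 <= 20 ** 2
daily = (yy - 50) ** 2 + (xx - 55) ** 2 <= 20 ** 2

dsc = dice(planning, daily)
hd = hausdorff(planning, daily)
```

For this 10-pixel shift the DSC lands near 0.69, close to the mean interfraction DSC reported above, and the Hausdorff distance recovers the shift.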
ERIC Educational Resources Information Center
Kelderman, Henk
1992-01-01
Describes algorithms used in the computer program LOGIMO for obtaining maximum likelihood estimates of the parameters in loglinear models. These algorithms are also useful for the analysis of loglinear item-response theory models. Presents modified versions of the iterative proportional fitting and Newton-Raphson algorithms. Simulated data…
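A minimal sketch of the iterative proportional fitting step for a two-way table (LOGIMO handles general loglinear and IRT models; only the core margin-matching iteration is shown, with illustrative targets):

```python
import numpy as np

def ipf(seed, row_targets, col_targets, tol=1e-10, max_iter=1000):
    """Scale a positive seed table until its margins match the targets:
    the core step of iterative proportional fitting for loglinear models."""
    t = seed.astype(float).copy()
    for _ in range(max_iter):
        t *= (row_targets / t.sum(axis=1))[:, None]   # match row margins
        t *= col_targets / t.sum(axis=0)              # match column margins
        if np.abs(t.sum(axis=1) - row_targets).max() < tol:
            return t
    raise RuntimeError("IPF did not converge")

seed = np.array([[1.0, 2.0], [3.0, 4.0]])
fitted = ipf(seed, row_targets=np.array([30.0, 70.0]),
             col_targets=np.array([40.0, 60.0]))
```

IPF preserves the seed's interaction structure while matching the fitted margins; Newton-Raphson achieves the same maximum likelihood solution with faster, second-order convergence at the cost of forming derivative matrices.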
NASA Astrophysics Data System (ADS)
Finzel, Emily S.; Enkelmann, Eva
2017-04-01
The Cook Inlet in south-central Alaska contains the early Oligocene to Recent stratigraphic record of a fore-arc basin adjacent to a shallowly subducting oceanic plateau. Our new measured stratigraphic sections and detrital zircon U-Pb geochronology and Hf isotopes from Neogene strata and modern rivers illustrate the effects of flat-slab subduction on the depositional environments, provenance, and subsidence in fore-arc sedimentary systems. During the middle Miocene, fluvial systems emerged from the eastern, western, and northern margins of the basin. The axis of maximum subsidence was near the center of the basin, suggesting equal contributions from subsidence drivers on both margins. By the late Miocene, the axis of maximum subsidence had shifted westward and fluvial systems originating on the eastern margin of the basin above the flat-slab traversed the entire width of the basin. These mud-dominated systems reflect increased sediment flux from recycling of accretionary prism strata. Fluvial systems with headwaters above the flat-slab region continued to cross the basin during Pliocene time, but a change to sandstone-dominated strata with abundant volcanogenic grains signals a reactivation of the volcanic arc. The axis of maximum basin subsidence during late Miocene to Pliocene time is parallel to the strike of the subducting slab. Our data suggest that the character and strike-orientation of the down-going slab may provide a fundamental control on the nature of depositional systems, location of dominant provenance regions, and areas of maximum subsidence in fore-arc basins.
The use of functionally graded dental crowns to improve biocompatibility: a finite element analysis.
Mahmoudi, Mojtaba; Saidi, Ali Reza; Hashemipour, Maryam Alsadat; Amini, Parviz
2018-02-01
In post-core crown restorations, the significant mismatch between the stiffness of artificial crowns and that of dental tissues leads to stress concentration at the interfaces. The aim of the present study was to reduce these destructive stresses by using a class of inhomogeneous materials called functionally graded materials (FGMs). For the purpose of the study, a 3-dimensional computer model of a premolar tooth and its surrounding tissues was generated. A post-core crown restoration with various crown materials, homogeneous and FGM, was simulated and analyzed by the finite element method. Finite element and statistical analysis showed that, in the case of oblique loading, a significant difference (p < 0.05) was found in the maximum von Mises stresses at the crown margin between FGM and homogeneous crowns. The maximum von Mises stresses at the crown margin generated by FGM crowns were lower than those generated by homogeneous crowns (46.3 vs. 70.8 MPa), and the alumina crown resulted in the highest von Mises stress at the crown margin (77.7 MPa). Crown materials of high modulus of elasticity produced high stresses at the cervical region. FGM crowns may reduce the stress concentration at the cervical margins and consequently reduce the possibility of fracture.
Maximum Margin Clustering of Hyperspectral Data
NASA Astrophysics Data System (ADS)
Niazmardi, S.; Safari, A.; Homayouni, S.
2013-09-01
In recent decades, large margin methods such as Support Vector Machines (SVMs) have been considered the state of the art among supervised learning methods for classification of hyperspectral data. However, the results of these algorithms mainly depend on the quality and quantity of available training data. To address the problems associated with the training data, researchers have put effort into extending the capability of large margin algorithms to unsupervised learning. One recently proposed algorithm is Maximum Margin Clustering (MMC). MMC is an unsupervised SVM algorithm that simultaneously estimates both the labels and the hyperplane parameters. Nevertheless, the optimization of the MMC algorithm is a non-convex problem. Most existing MMC methods rely on reformulating and relaxing the non-convex optimization problem as semi-definite programs (SDP), which are computationally very expensive and can only handle small data sets. Moreover, most of these algorithms address only two-class classification and so cannot be used for classification of remotely sensed data. In this paper, a new MMC algorithm is used that solves the original non-convex problem using an alternating optimization method. This algorithm is also extended to multi-class classification and its performance is evaluated. The results show that the algorithm performs acceptably for hyperspectral data clustering.
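The alternating idea behind such MMC solvers can be caricatured in a few lines. A toy sketch, not the paper's algorithm: alternately fit a linear scoring function to the current labels (ridge least squares stands in for the SVM step) and relabel by thresholding at the median score, a balance constraint that rules out the trivial single-cluster solution:

```python
import numpy as np

rng = np.random.default_rng(42)
# Two well-separated clusters in 5 dimensions (stand-ins for two
# spectral classes).
X = np.vstack([rng.normal(0, 0.5, (60, 5)) + 2,
               rng.normal(0, 0.5, (60, 5)) - 2])

labels = rng.choice([-1.0, 1.0], size=len(X))  # random initial labels
for _ in range(20):
    # Step 1: fit a linear scoring function to the current labels
    # (ridge-regularized least squares in place of the SVM training step).
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.linalg.solve(Xb.T @ Xb + 1e-3 * np.eye(6), Xb.T @ labels)
    scores = Xb @ w
    # Step 2: relabel by thresholding at the median score; this keeps the
    # clusters balanced and rules out the all-one-label solution.
    new_labels = np.where(scores > np.median(scores), 1.0, -1.0)
    if np.array_equal(new_labels, labels):
        break
    labels = new_labels
```

On this well-separated toy set the iteration typically settles on the cluster partition; real MMC replaces the least-squares fit with a margin-maximizing SVM and handles the multi-class extension.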
Cost-Effectiveness of Old and New Technologies for Aneuploidy Screening.
Sinkey, Rachel G; Odibo, Anthony O
2016-06-01
Cost-effectiveness analyses allow assessment of whether marginal gains from new technology are worth the increased costs. Several studies have examined the cost-effectiveness of Down syndrome (DS) screening and found it to be cost-effective. Noninvasive prenatal screening also appears to be cost-effective for DS screening among high-risk women, but not for the general population. Chromosomal microarray (CMA) is a genetic sequencing method superior to, but more expensive than, karyotype. In light of CMA's greater ability to detect genetic abnormalities, it is cost-effective when used for prenatal diagnosis of an anomalous fetus. This article covers the methodology and salient issues of cost-effectiveness. Copyright © 2016 Elsevier Inc. All rights reserved.
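The comparisons above reduce to the incremental cost-effectiveness ratio, ICER = (change in cost) / (change in effect), judged against a willingness-to-pay threshold. A sketch with illustrative numbers, not taken from the studies reviewed:

```python
# Incremental cost-effectiveness ratio (ICER) with illustrative values
# (NOT from the reviewed studies): a new screening strategy versus the
# standard one, effects measured in quality-adjusted life years (QALYs).
cost_standard, qaly_standard = 500.0, 20.000
cost_new, qaly_new = 1400.0, 20.003

icer = (cost_new - cost_standard) / (qaly_new - qaly_standard)
# Compare against an assumed willingness-to-pay threshold of $100,000/QALY.
cost_effective = icer <= 100_000
```

Here the marginal gain costs $300,000 per QALY, so the new strategy would not be deemed cost-effective at that threshold; the reviewed analyses perform the same calculation with modeled population-level costs and outcomes.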
NASA Astrophysics Data System (ADS)
Lane, Timothy; Roberts, David; Rea, Brice; Cofaigh, Colm Ó.; Vieli, Andreas
2013-04-01
At the Last Glacial Maximum (LGM), the Uummannaq Ice Stream System (UISS) comprised a series of coalescent outlet glaciers which extended along the trough to the shelf edge, draining a large proportion of the West Greenland Ice Sheet. Geomorphological mapping, terrestrial cosmogenic nuclide (TCN) exposure dating, and radiocarbon dating constrain warm-based ice stream activity in the north of the system to 1400 m a.s.l. during the LGM. Intervening plateau areas (~ 2000 m a.s.l.) either remained ice free or were covered by cold-based icefields, preventing diffluent or confluent flow throughout the inner to outer fjord region. Beyond the fjords, a topographic sill north of Ubekendt Ejland prevented most westward ice flow, forcing it south through Igdlorssuit Sund and into the Uummannaq Trough. Here it coalesced with ice from the south, forming the trunk zone of the UISS. Deglaciation of the UISS began at 14.9 cal. ka BP, with rapid retreat through the overdeepened Uummannaq Trough. Once beyond Ubekendt Ejland, the northern UISS retreated northwards, separating from the south. Retreat continued, and ice reached the present fjord confines in northern Uummannaq by 11.6 kyr. Both geomorphological (termino-lateral moraines) and geochronological (14C and TCN) data provide evidence for an ice-marginal stabilisation within Karrat-Rink Fjord, at Karrat Island, from 11.6-6.9 kyr. The Karrat moraines appear similar in both fjord position and form to 'Fjord Stade' moraines identified throughout West Greenland. Though the chronologies constraining moraine formation overlap (Fjord Stade moraines: 9.3-8.2 kyr; Karrat moraines: 11.6-6.9 kyr), these moraines have not been correlated. This ice-margin stabilisation persisted during the Holocene Thermal Maximum (~7.2-5 kyr).
It overrode climatic and oceanic forcings, remaining at Karrat Island throughout peaks of air temperature and relative sea level, and during the influx of the warm West Greenland Current into the Uummannaq region. Based upon analysis of fjord bathymetry and width, this ice-marginal stabilisation is shown to have been caused by increased topographic constriction at Karrat Island. The location of the marginal stillstand coincides with a dramatic narrowing of fjord width and a shallowing of the bed. These increases in local lateral resistance reduce the ice flux necessary to maintain a stable grounding line, leading to ice-margin stabilisation, which acted to negate the effects of the Holocene Thermal Maximum. Following this stabilisation, retreat within Rink-Karrat Fjord continued, driven by calving into the overdeepened Rink Fjord. Rink Isbræ reached its present ice margin, or beyond, after 5 kyr, during the Neoglacial. In contrast, the southern UISS reached its present margin at 8.7 kyr and Jakobshavn Isbræ reached its margin by 7 kyr. This work therefore provides compelling evidence for topographically forced, asynchronous, non-linear ice stream retreat between outlet glaciers in West Greenland. In addition, it has major implications for our understanding and reconstruction of mid-Holocene ice sheet extent, and of ice sheet dynamics during the Holocene Thermal Maximum to Neoglacial switch.
A quantitative analysis of transtensional margin width
NASA Astrophysics Data System (ADS)
Jeanniot, Ludovic; Buiter, Susanne J. H.
2018-06-01
Continental rifted margins show variations between a few hundred and almost a thousand kilometres in their conjugate widths from the relatively undisturbed continent to the oceanic crust. Analogue and numerical modelling results suggest that the conjugate width of rifted margins may be related to their obliquity of divergence, with narrower margins occurring at higher obliquity. We here test this prediction by analysing the obliquity and rift width for 26 segments of transtensional conjugate rifted margins in the Atlantic and Indian Oceans. We use the plate reconstruction software GPlates (http://www.gplates.org) with different plate rotation models to estimate the direction and magnitude of rifting from the initial phases of continental rifting until breakup. Our rift width corresponds to the distance between the onshore maximum topography and the last identified continental crust. We find a weak positive correlation between the obliquity of rifting and rift width. Highly oblique margins are narrower than orthogonal margins, as expected from analogue and numerical models. We find no relationship between rift obliquity and rift duration, nor with the presence or absence of Large Igneous Provinces (LIPs).
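Obliquity can be quantified as the angle between the divergence direction and the normal to the margin segment. A small azimuth-arithmetic sketch with illustrative headings (the paper derives the actual directions from GPlates plate rotation models):

```python
def obliquity_deg(divergence_azimuth, margin_strike):
    """Angle (degrees) between the divergence direction and the normal to
    a margin segment: 0 = orthogonal rifting, 90 = pure strike-slip.
    Azimuths are in degrees clockwise from north."""
    normal = (margin_strike + 90.0) % 360.0
    diff = abs(divergence_azimuth - normal) % 360.0
    diff = min(diff, 360.0 - diff)   # wrap to [0, 180]
    return min(diff, 180.0 - diff)   # direction vs. line: wrap to [0, 90]

# Margin striking N30E with divergence toward 120 deg: orthogonal rifting.
ortho = obliquity_deg(120.0, 30.0)
# Same margin with divergence toward 165 deg: oblique rifting at 45 deg.
oblique = obliquity_deg(165.0, 30.0)
```

Applying such a per-segment obliquity measure alongside the measured conjugate widths is what allows the correlation test described above.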
A new measure for gene expression biclustering based on non-parametric correlation.
Flores, Jose L; Inza, Iñaki; Larrañaga, Pedro; Calvo, Borja
2013-12-01
One of the emerging techniques for the analysis of DNA microarray data, known as biclustering, is the search for subsets of genes and conditions which are coherently expressed. These subgroups provide clues about the main biological processes. Until now, different approaches to this problem have been proposed. Most of them use the mean squared residue as a quality measure, but relevant and interesting patterns, such as shifting or scaling patterns, cannot be detected. Furthermore, recent papers show that there exist new coherence patterns involved in different kinds of cancer and tumors, such as inverse relationships between genes, which cannot be captured. The proposed measure, called Spearman's biclustering measure (SBM), estimates the quality of a bicluster based on the non-linear correlation among genes and conditions simultaneously. The search for biclusters is performed by an evolutionary technique called estimation of distribution algorithms, which uses the SBM measure as its fitness function. This approach has been examined from different points of view using artificial and real microarrays. The assessment process involved quality indexes, a set of reference bicluster patterns including the new patterns, and a set of statistical tests. Performance was also examined using real microarrays and compared with different algorithmic approaches such as Bimax, CC, OPSM, Plaid, and xMotifs. SBM shows several advantages, such as the ability to recognize more complex coherence patterns (shifting, scaling, and inversion) and the capability to selectively marginalize genes and conditions depending on their statistical significance. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
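A simplified, SBM-like row score based on Spearman correlation shows why rank correlation captures shifting, scaling, and inverted patterns that the mean squared residue misses. This is a sketch of the idea only; the published SBM also scores columns and handles statistical significance:

```python
import numpy as np
from scipy.stats import spearmanr

def sbm_like_score(sub):
    """Average absolute pairwise Spearman correlation of the rows of a
    candidate bicluster.  Taking abs() lets inverted (negatively
    correlated) gene patterns score as high as direct ones."""
    n = sub.shape[0]
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            rho, _ = spearmanr(sub[i], sub[j])
            total += abs(rho)
            pairs += 1
    return total / pairs

rng = np.random.default_rng(7)
base = rng.normal(size=8)
# Shifting, scaling, and inversion patterns: monotone transforms of base.
coherent = np.vstack([2 * base + 1, 0.5 * base - 3, -base + 4, base])
noise = rng.normal(size=(4, 8))

s_coherent = sbm_like_score(coherent)
s_noise = sbm_like_score(noise)
```

The coherent block scores a perfect 1.0 because every row has identical (or exactly reversed) ranks, while a random block scores much lower; an evolutionary search can then use such a score as its fitness function.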
Intra-fraction motion of larynx radiotherapy
NASA Astrophysics Data System (ADS)
Durmus, Ismail Faruk; Tas, Bora
2018-02-01
In early-stage laryngeal radiotherapy, movement is an important factor: the thyroid cartilage can move with swallowing, breathing, phonation, and reflexes. We examined the effect of this motion on the planning target volume (PTV) during treatment, so that setup margins can be re-evaluated and patient-specific PTV margins determined. Intrafraction CBCT was acquired in 246 fractions for 14 patients, and the amount of deviation along the lateral, longitudinal, and vertical axes was determined during treatment. Deviations of ≤ ±0.1 cm occurred in 237 fractions in the lateral direction, 202 fractions in the longitudinal direction, and 185 fractions in the vertical direction; the maximum deviation values were found in the longitudinal direction. With intrafraction guidance in laryngeal radiotherapy we can verify the accuracy of the treatment, adjust the target-volume margin and dose more precisely, and check the maximum deviation of the target volume for each fraction. Although the image quality of intrafraction CBCT scans was lower than that of the planning CT, they showed sufficient contrast for this work.
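One widely used recipe for turning measured shifts into a PTV margin, not cited in this abstract but standard in the field, is the van Herk formula margin = 2.5 * Sigma + 0.7 * sigma, where Sigma is the standard deviation of the per-patient systematic errors and sigma is the mean random-error standard deviation. A sketch with illustrative per-fraction shifts, not the study's data:

```python
import math

# Per-fraction longitudinal shifts (cm) for three illustrative patients.
shifts = {
    "p1": [0.05, 0.10, -0.02, 0.08, 0.12],
    "p2": [-0.10, -0.05, -0.12, -0.08, -0.15],
    "p3": [0.02, -0.03, 0.05, 0.00, 0.01],
}

def mean(xs):
    return sum(xs) / len(xs)

def sd(xs):
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

patient_means = [mean(v) for v in shifts.values()]  # systematic component
Sigma = sd(patient_means)          # SD of systematic errors over patients
sigma = mean([sd(v) for v in shifts.values()])      # mean random error
ptv_margin = 2.5 * Sigma + 0.7 * sigma              # van Herk recipe
```

Per-fraction deviation data like those reported above feed directly into such a recipe when deriving patient-specific PTV margins.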
On Determining the Rise, Size, and Duration Classes of a Sunspot Cycle
NASA Astrophysics Data System (ADS)
Wilson, Robert M.; Hathaway, David H.; Reichmann, Edwin J.
1996-09-01
The behavior of ascent duration, maximum amplitude, and period for cycles 1 to 21 suggests that they are not mutually independent. Analysis of the resultant three-dimensional contingency table for cycles divided according to rise time (ascent duration), size (maximum amplitude), and duration (period) yields a chi-square statistic (= 18.59) that is larger than the test statistic (= 9.49 for 4 degrees-of-freedom at the 5-percent level of significance), thereby, inferring that the null hypothesis (mutual independence) can be rejected. Analysis of individual 2 by 2 contingency tables (based on Fisher's exact test) for these parameters shows that, while ascent duration is strongly related to maximum amplitude in the negative sense (inverse correlation) - the Waldmeier effect, it also is related (marginally) to period, but in the positive sense (direct correlation). No significant (or marginally significant) correlation is found between period and maximum amplitude. Using cycle 22 as a test case, we show that by the 12th month following conventional onset, cycle 22 appeared highly likely to be a fast-rising, larger-than-average-size cycle. Because of the inferred correlation between ascent duration and period, it also seems likely that it will have a period shorter than average length.
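The degrees of freedom and the tests quoted above can be reproduced mechanically. The 2 x 2 table below is illustrative, not the paper's actual cycle classification:

```python
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# Degrees of freedom for mutual independence in an r x c x l table:
# (r*c*l - 1) - [(r-1) + (c-1) + (l-1)] = 4 for a 2 x 2 x 2 table,
# matching the 4 degrees of freedom quoted in the abstract.
r = c = l = 2
dof_mutual = (r * c * l - 1) - ((r - 1) + (c - 1) + (l - 1))

# Fisher's exact test on an illustrative 2 x 2 table (NOT the paper's
# data): fast-rising cycles cross-classified against large-amplitude
# cycles, showing a strong association like the Waldmeier effect.
table = np.array([[8, 2],
                  [1, 10]])
odds_ratio, p_value = fisher_exact(table)
chi2, p_chi2, dof, expected = chi2_contingency(table)
```

Fisher's exact test is the right choice here because sunspot-cycle tables have small cell counts, where the chi-square approximation is unreliable.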
System Architecture of Small Unmanned Aerial System for Flight Beyond Visual Line-of-Sight
2015-09-17
Signal strength is determined from the link budget: PT = transmitter power (dBm), GT = transmitter antenna gain (dBi), LT = transmitter loss (dB), Lp = propagation loss (dB), GR = receiver antenna gain (dBi), LR = receiver losses (dB), and Lm = link margin (dB). The maximum range is determined by four components: 1) transmission, 2) propagation, 3) reception, and 4) link margin.
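The terms above combine in the usual dB-domain way (received power = PT + GT − LT − Lp + GR − LR; the link margin is what remains above the receiver sensitivity). A minimal sketch, where the numeric values and the sensitivity figure are illustrative assumptions rather than parameters from the thesis:

```python
def received_power_dbm(pt_dbm, gt_dbi, lt_db, lp_db, gr_dbi, lr_db):
    """Standard dB-domain link budget: gains add, losses subtract."""
    return pt_dbm + gt_dbi - lt_db - lp_db + gr_dbi - lr_db

def link_margin_db(pr_dbm, sensitivity_dbm):
    """Margin above the receiver sensitivity; positive means the link closes."""
    return pr_dbm - sensitivity_dbm

# Illustrative numbers (not from the source document):
pr = received_power_dbm(pt_dbm=30, gt_dbi=6, lt_db=2, lp_db=110, gr_dbi=9, lr_db=3)
lm = link_margin_db(pr, sensitivity_dbm=-95)
print(pr, lm)  # -70 dBm received, 25 dB of margin
```

Maximum range then falls out of this arithmetic: it is the distance at which the propagation loss Lp grows large enough to consume the entire link margin.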
Schwab, Joshua; Gruber, Susan; Blaser, Nello; Schomaker, Michael; van der Laan, Mark
2015-01-01
This paper describes a targeted maximum likelihood estimator (TMLE) for the parameters of longitudinal static and dynamic marginal structural models. We consider a longitudinal data structure consisting of baseline covariates, time-dependent intervention nodes, intermediate time-dependent covariates, and a possibly time-dependent outcome. The intervention nodes at each time point can include a binary treatment as well as a right-censoring indicator. Given a class of dynamic or static interventions, a marginal structural model is used to model the mean of the intervention-specific counterfactual outcome as a function of the intervention, time point, and possibly a subset of baseline covariates. Because the true shape of this function is rarely known, the marginal structural model is used as a working model. The causal quantity of interest is defined as the projection of the true function onto this working model. Iterated conditional expectation double robust estimators for marginal structural model parameters were previously proposed by Robins (2000, 2002) and Bang and Robins (2005). Here we build on this work and present a pooled TMLE for the parameters of marginal structural working models. We compare this pooled estimator to a stratified TMLE (Schnitzer et al. 2014) that is based on estimating the intervention-specific mean separately for each intervention of interest. The performance of the pooled TMLE is compared to the performance of the stratified TMLE and the performance of inverse probability weighted (IPW) estimators using simulations. Concepts are illustrated using an example in which the aim is to estimate the causal effect of delayed switch following immunological failure of first line antiretroviral therapy among HIV-infected patients. Data from the International Epidemiological Databases to Evaluate AIDS, Southern Africa are analyzed to investigate this question using both TML and IPW estimators. 
Our results demonstrate practical advantages of the pooled TMLE over an IPW estimator for working marginal structural models for survival, as well as cases in which the pooled TMLE is superior to its stratified counterpart. PMID:25909047
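The IPW estimators compared against the TMLE above weight each observed outcome by the inverse probability of the treatment actually received. A deliberately simplified point-treatment sketch (one time point, known treatment mechanism, simulated data; this is an illustration of the IPW idea, not the longitudinal pooled estimator of the paper):

```python
import random
random.seed(0)

# Simulated data: W = binary confounder, A = treatment, Y = binary outcome.
n = 20000
data = []
for _ in range(n):
    w = random.random() < 0.5            # confounder
    p_a = 0.8 if w else 0.2              # treatment assignment depends on W
    a = random.random() < p_a
    p_y = 0.1 + 0.3 * a + 0.2 * w        # true causal effect of A on Y is +0.3
    y = random.random() < p_y
    data.append((w, a, y))

def ipw_mean(data, a_star):
    """Horvitz-Thompson IPW estimate of E[Y(a_star)], known g(A|W)."""
    total = 0.0
    for w, a, y in data:
        if a != a_star:
            continue
        g = (0.8 if w else 0.2) if a_star else (0.2 if w else 0.8)
        total += y / g                   # weight by 1 / P(A = a_star | W)
    return total / len(data)

effect = ipw_mean(data, True) - ipw_mean(data, False)
print(round(effect, 2))  # close to the true causal effect of 0.3
```

A naive unweighted comparison of treated and untreated means would be confounded by W; the inverse-probability weights remove that bias, at the cost of higher variance when some weights are large, which is one practical motivation for the TMLE alternatives.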
Novel maximum-margin training algorithms for supervised neural networks.
Ludwig, Oswaldo; Nunes, Urbano
2010-06-01
This paper proposes three novel training methods, two of them based on the backpropagation approach and a third one based on information theory, for multilayer perceptron (MLP) binary classifiers. Both backpropagation methods are based on the maximal-margin (MM) principle. The first one, based on the gradient descent with adaptive learning rate algorithm (GDX) and named maximum-margin GDX (MMGDX), directly increases the margin of the MLP output-layer hyperplane. The proposed method jointly optimizes both MLP layers in a single process, backpropagating the gradient of an MM-based objective function through the output and hidden layers, in order to create a hidden-layer space that enables a higher margin for the output-layer hyperplane, avoiding the testing of many arbitrary kernels, as occurs in support vector machine (SVM) training. The proposed MM-based objective function aims to stretch out the margin to its limit. An objective function based on the Lp-norm is also proposed in order to take into account the idea of support vectors, while overcoming the complexity involved in solving the constrained optimization problem usually required in SVM training. In fact, all the training methods proposed in this paper have time and space complexities O(N), while usual SVM training methods have time complexity O(N^3) and space complexity O(N^2), where N is the training-data-set size. The second approach, named minimization of interclass interference (MICI), has an objective function inspired by Fisher discriminant analysis. This algorithm aims to create an MLP hidden-layer output where the patterns have a desirable statistical distribution. In both training methods, the maximum area under the ROC curve (AUC) is applied as the stopping criterion. The third approach offers a robust training framework able to take the best of each proposed training method. The main idea is to compose a neural model by using neurons extracted from three other neural networks, each one previously trained by MICI, MMGDX, and Levenberg-Marquardt (LM), respectively. The resulting neural network was named assembled neural network (ASNN). Benchmark data sets of real-world problems have been used in experiments that enable a comparison with other state-of-the-art classifiers. The results provide evidence of the effectiveness of our methods regarding accuracy, AUC, and balanced error rate.
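The margin quantity that MMGDX stretches out is, in the simplest linear case, the signed distance of each pattern to the separating hyperplane. A toy NumPy illustration of that quantity (a linear hyperplane only, not the authors' full two-layer algorithm):

```python
import numpy as np

# Toy linearly separable data with labels in {-1, +1}.
X = np.array([[2.0, 2.0], [3.0, 1.0], [-2.0, -1.0], [-1.0, -3.0]])
y = np.array([1, 1, -1, -1])

def geometric_margins(w, b, X, y):
    """Signed distance of each pattern to the hyperplane w.x + b = 0;
    positive values mean correct classification."""
    return y * (X @ w + b) / np.linalg.norm(w)

w = np.array([1.0, 1.0])
b = 0.0
m = geometric_margins(w, b, X, y)
print(m.min())  # the margin: smallest signed distance over all patterns (about 2.121)
```

A maximum-margin objective pushes this minimum upward; the Lp-norm soft-min objective mentioned in the abstract is one differentiable surrogate for it that keeps gradient-based training tractable.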
NASA Astrophysics Data System (ADS)
Phillips, Emrys; Cotterill, Carol; Johnson, Kirstin; Crombie, Kirstin; James, Leo; Carr, Simon; Ruiter, Astrid
2018-01-01
High resolution seismic data from the Dogger Bank in the central southern North Sea has revealed that the Dogger Bank Formation records a complex history of sedimentation and penecontemporaneous, large-scale, ice-marginal to proglacial glacitectonic deformation. These processes led to the development of a large thrust-block moraine complex which is buried beneath a thin sequence of Holocene sediments. This buried glacitectonic landsystem comprises a series of elongate, arcuate moraine ridges (200 m up to > 15 km across; over 40-50 km long) separated by low-lying ice marginal to proglacial sedimentary basins and/or meltwater channels, preserving the shape of the margin of this former ice sheet. The moraines are composed of highly deformed (folded and thrust) Dogger Bank Formation with the lower boundary of the deformed sequence (up to 40-50 m thick) being marked by a laterally extensive décollement. The ice-distal parts of the thrust moraine complex are interpreted as a "forward" propagating imbricate thrust stack developed in response to S/SE-directed ice-push. The more complex folding and thrusting within the more ice-proximal parts of the thrust-block moraines record the accretion of thrust slices of highly deformed sediment as the ice repeatedly reoccupied this ice marginal position. Consequently, the internal structure of the Dogger Bank thrust-moraine complexes can be directly related to ice sheet dynamics, recording the former positions of a highly dynamic, oscillating Weichselian ice sheet margin as it retreated northwards at the end of the Last Glacial Maximum.
Severino, Patricia; Alvares, Adriana M; Michaluart, Pedro; Okamoto, Oswaldo K; Nunes, Fabio D; Moreira-Filho, Carlos A; Tajara, Eloiza H
2008-01-01
Background Oral squamous cell carcinoma (OSCC) is a frequent neoplasm, which is usually aggressive and has unpredictable biological behavior and unfavorable prognosis. The comprehension of the molecular basis of this variability should lead to the development of targeted therapies as well as to improvements in specificity and sensitivity of diagnosis. Results Samples of primary OSCCs and their corresponding surgical margins were obtained from male patients during surgery and their gene expression profiles were screened using whole-genome microarray technology. Hierarchical clustering and Principal Components Analysis were used for data visualization and One-way Analysis of Variance was used to identify differentially expressed genes. Samples clustered mostly according to disease subsite, suggesting molecular heterogeneity within tumor stages. In order to corroborate our results, two publicly available datasets of microarray experiments were assessed. We found significant molecular differences between OSCC anatomic subsites concerning groups of genes presently or potentially important for drug development, including mRNA processing, cytoskeleton organization and biogenesis, metabolic process, cell cycle and apoptosis. Conclusion Our results corroborate literature data on molecular heterogeneity of OSCCs. Differences between disease subsites and among samples belonging to the same TNM class highlight the importance of gene expression-based classification and challenge the development of targeted therapies. PMID:19014556
[Effect of nasal CPAP on human diaphragm position and lung volume].
Yoshimura, N; Abe, T; Kusuhara, N; Tomita, T
1994-11-01
The cephalic margin of the zone of apposition (ZOA) was observed with ultrasonography at ambient pressure and during nasal continuous positive airway pressure (nasal CPAP) in nine awake healthy males in a supine position. In a relaxed state at ambient pressure, there was a significant (p < 0.001) linear relationship between lung volume and the movement of the cephalic margin of the ZOA over the range from maximum expiratory position (MEP) to maximum inspiratory position (MIP). With nasal CPAP, functional residual capacity increased significantly (p < 0.01) in proportion to the increase in CPAP. At 20 cmH2O CPAP, the mean increase in volume at end expiration was 36% of the vital capacity measured at ambient pressure. The cephalic margin of the ZOA moved significantly (p < 0.01) in a caudal direction as CPAP was increased. At 20 cmH2O CPAP, the cephalic margin of the ZOA at end expiratory position (EEP) had moved 55% of the difference from MIP to MEP measured at ambient pressure. The end expiratory diaphragm position during nasal CPAP was lower than the diaphragm position at ambient pressure when lung volumes were equal. These results suggest that during nasal CPAP the chest wall is distorted from its relaxed configuration, with a decrease in rib cage expansion and an increase in outward displacement of the abdominal wall.
Overcoming confounded controls in the analysis of gene expression data from microarray experiments.
Bhattacharya, Soumyaroop; Long, Dang; Lyons-Weiler, James
2003-01-01
A potential limitation of data from microarray experiments exists when improper control samples are used. In cancer research, comparison of tumour expression profiles to those from normal samples is challenging due to tissue heterogeneity (mixed cell populations). A specific example exists in a published colon cancer dataset, in which tissue heterogeneity was reported among the normal samples. In this paper, we show how to overcome or avoid the problem of using normal samples that do not derive from the same tissue of origin as the tumour. We advocate an exploratory unsupervised bootstrap analysis that can reveal unexpected and undesired, but strongly supported, clusters of samples that reflect tissue differences instead of tumour versus normal differences. All of the algorithms used in the analysis, including the maximum difference subset algorithm, unsupervised bootstrap analysis, pooled variance t-test for finding differentially expressed genes and the jackknife to reduce false positives, are incorporated into our online Gene Expression Data Analyzer (http://bioinformatics.upmc.edu/GE2/GEDA.html).
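The pooled variance t-test mentioned above is standard and compact enough to sketch per gene. A minimal implementation (the expression values are illustrative, not from the colon cancer dataset):

```python
import math

def pooled_t(x, y):
    """Two-sample t statistic with pooled variance (equal-variance assumption)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    ssx = sum((v - mx) ** 2 for v in x)
    ssy = sum((v - my) ** 2 for v in y)
    sp2 = (ssx + ssy) / (nx + ny - 2)          # pooled variance estimate
    se = math.sqrt(sp2 * (1 / nx + 1 / ny))
    return (mx - my) / se

# Log-expression of one gene in tumour vs. normal samples (illustrative values):
tumour = [5.1, 5.4, 5.0, 5.6]
normal = [4.1, 4.0, 4.3, 3.9]
print(round(pooled_t(tumour, normal), 2))  # large |t| suggests differential expression
```

In the full analysis this statistic is computed for every gene and the resulting p-values are then screened (here via the jackknife) to control false positives.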
Identification of consensus biomarkers for predicting non-genotoxic hepatocarcinogens
Huang, Shan-Han; Tung, Chun-Wei
2017-01-01
The assessment of non-genotoxic hepatocarcinogens (NGHCs) currently relies on two-year rodent bioassays. Toxicogenomics biomarkers provide a potential alternative method for the prioritization of NGHCs that could be useful for risk assessment. However, previous studies that used inconsistently classified chemicals as the training set and a single microarray dataset identified no consensus biomarkers. In this study, four consensus biomarkers, A2m, Ca3, Cxcl1, and Cyp8b1, were identified from four large-scale microarray datasets of the one-day single maximum tolerated dose and a large set of chemicals without inconsistent classifications. Machine learning techniques were subsequently applied to develop prediction models for NGHCs. The final bagging decision tree models were constructed with an average AUC performance of 0.803 for an independent test. A set of 16 chemicals with controversial classifications were reclassified according to the consensus biomarkers. The developed prediction models and identified consensus biomarkers are expected to be potential alternative methods for prioritization of NGHCs for further experimental validation. PMID:28117354
Viray, Hollis; Bradley, William R; Schalper, Kurt A; Rimm, David L; Gould Rothberg, Bonnie E
2013-08-01
The distribution of the standard melanoma antibodies S100, HMB-45, and Melan-A has been extensively studied. Yet, the overlap in their expression is less well characterized. To determine the joint distributions of the classic melanoma markers and to determine if classification according to joint antigen expression has prognostic relevance. S100, HMB-45, and Melan-A were assayed by immunofluorescence-based immunohistochemistry on a large tissue microarray of 212 cutaneous melanoma primary tumors and 341 metastases. Positive expression for each antigen required display of immunoreactivity for at least 25% of melanoma cells. Marginal and joint distributions were determined across all markers. Bivariate associations with established clinicopathologic covariates and melanoma-specific survival analyses were conducted. Of 322 assayable melanomas, 295 (91.6%), 203 (63.0%), and 236 (73.3%) stained with S100, HMB-45, and Melan-A, respectively. Twenty-seven melanomas, representing a diverse set of histopathologic profiles, were S100 negative. Coexpression of all 3 antibodies was observed in 160 melanomas (49.7%). Intensity of endogenous melanin pigment did not confound immunolabeling. Among primary tumors, associations with clinicopathologic parameters revealed a significant relationship only between HMB-45 and microsatellitosis (P = .02). No significant differences among clinicopathologic criteria were observed across the HMB-45/Melan-A joint distribution categories. Neither marginal HMB-45 (P = .56) nor Melan-A (P = .81), or their joint distributions (P = .88), was associated with melanoma-specific survival. Comprehensive characterization of the marginal and joint distributions for S100, HMB-45, and Melan-A across a large series of cutaneous melanomas revealed diversity of expression across this group of antigens. However, these immunohistochemically defined subclasses of melanomas do not significantly differ according to clinicopathologic correlates or outcome.
Multilayer Disk Reduced Interlayer Crosstalk with Wide Disk-Fabrication Margin
NASA Astrophysics Data System (ADS)
Hirotsune, Akemi; Miyauchi, Yasushi; Endo, Nobumasa; Onuma, Tsuyoshi; Anzai, Yumiko; Kurokawa, Takahiro; Ushiyama, Junko; Shintani, Toshimichi; Sugiyama, Toshinori; Miyamoto, Harukazu
2008-07-01
To reduce the interlayer crosstalk caused by the ghost spot that appears in a multilayer optical disk with more than three information layers, we proposed a multilayer disk structure, with sufficiently low backward reflectivity of the information layers, that reduces interlayer crosstalk with a wide disk-fabrication margin. It was confirmed that the interlayer crosstalk caused by the ghost spot was reduced to less than the crosstalk from the adjacent layer by controlling the backward reflectivity. The wide disk-fabrication margin of the proposed disk structure was demonstrated by experimentally confirming that the tolerance of the maximum deviation of the spacer-layer thickness is four times larger than that in the previous multilayer disk.
Resistor-logic demultiplexers for nanoelectronics based on constant-weight codes.
Kuekes, Philip J; Robinett, Warren; Roth, Ron M; Seroussi, Gadiel; Snider, Gregory S; Stanley Williams, R
2006-02-28
The voltage margin of a resistor-logic demultiplexer can be improved significantly by basing its connection pattern on a constant-weight code. Each distinct code determines a unique demultiplexer, and therefore a large family of circuits is defined. We consider using these demultiplexers for building nanoscale crossbar memories, and determine the voltage margin of the memory system based on a particular code. We determine a purely code-theoretic criterion for selecting codes that will yield memories with large voltage margins, which is to minimize the ratio of the maximum to the minimum Hamming distance between distinct codewords. For the specific example of a 64 × 64 crossbar, we discuss what codes provide optimal performance for a memory.
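The code-theoretic selection criterion described above (minimize the ratio of the maximum to the minimum Hamming distance between distinct codewords) can be evaluated directly by enumeration. A small sketch over all constant-weight codewords of a given length and weight, with illustrative parameters:

```python
from itertools import combinations

def constant_weight_code(n, w):
    """All binary codewords of length n with exactly w ones."""
    words = []
    for ones in combinations(range(n), w):
        words.append(tuple(1 if i in ones else 0 for i in range(n)))
    return words

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def distance_ratio(code):
    """max/min Hamming distance over distinct codeword pairs (smaller is better
    under the criterion stated in the abstract)."""
    dists = [hamming(a, b) for a, b in combinations(code, 2)]
    return max(dists) / min(dists)

code = constant_weight_code(4, 2)        # the 6 weight-2 words of length 4
print(len(code), distance_ratio(code))   # ratio 2.0: distances are 2 or 4
```

Candidate codes for a demultiplexer connection pattern can then be ranked by this ratio before the voltage-margin analysis of the full crossbar.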
NASA Astrophysics Data System (ADS)
Somoza, Luis; Medialdea, Teresa; Vázquez, Juan T.; González, Francisco J.; León, Ricardo; Palomino, Desiree; Fernández-Salas, Luis M.; Rengel, Juan
2017-04-01
Spain presented on 11 May 2009 a partial submission for delimiting the extended continental shelf in respect of the area of Galicia to the Commission on the Limits of the Continental Shelf (CLCS). The Galicia margin represents an example of the transition between two different types of continental margins (CM): a western hyper-extended margin and a northern convergent margin in the Bay of Biscay. The western Galicia Margin (wGM, 41° to 43° N) corresponds to a hyper-extended rifted margin resulting from the poly-phase development of the Iberian-Newfoundland conjugate margin during the Mesozoic. In contrast, the north Galicia Margin (nGM) is the western end of the Cenozoic subduction of the Bay of Biscay along the north Iberian Margin (NIM), linked to the Pyrenean-Mediterranean collisional belt. Following the procedure established by the CLCS Scientific and Technical Guidelines (CLCS/11), the points of the Foot of Slope (FoS) have to be determined as the points of maximum change in gradient in the region defined as the Base of the continental Slope (BoS). Moreover, the CLCS guidelines specify that the BoS should be contained within the continental margin (CM). To this end, full-coverage multibeam bathymetry and an extensive dataset of up to 4,736 km of multichannel seismic profiles were expressly obtained during two oceanographic surveys (Breogham-2005 and Espor-2008), aboard the Spanish research vessel Hespérides, to map the outer limit of the CM. In order to follow the criteria of the CLCS guidelines, two types of models reported in the CLCS Guidelines were applied to the Galicia Margin. For passive margins, the Commission's guidelines establish that the natural prolongation is based on the principle that "the natural process by which a continent breaks up prior to the separation by seafloor spreading involves thinning, extension and rifting of the continental crust…" (para. 7.3, CLCS/11).
The seaward extension of the wGM should therefore include crustal continental blocks and the so-called Peridotite Ridge (PR), composed of serpentinized exhumed continental mantle. Thus, the PR should be regarded as a natural component of the continental margin, since these seafloor highs were formed by hyper-extension of the margin. Regarding convergent margins, the architecture of the nGM can be classified according to CLCS/11 as a "poor- or non-accretionary convergent continental margin" characterized by a poorly developed accretionary wedge, composed of a large sedimentary apron mainly formed by large slumps and thrust wedges of an igneous (ophiolitic/continental) body overlying the subducting oceanic crust (Fig. 6.1B, CLCS/11). According to para. 6.3.6 (CLCS/11), the seaward extent of this type of convergent continental margin is defined by the seaward edge of the accretionary wedge. Applying this definition, the seaward extent of the margin is defined by the outer limit of the ophiolitic deformed body that marks the edge of the accretionary wedge. These geological criteria were strictly applied for mapping the BoS region, where the FoS points were determined by using the maximum change in gradient within this mapped region. Acknowledgments: Project for the Extension of the Spanish Continental Shelf according to UNCLOS (CTM2010-09496-E) and Project CTM2016-75947-R.
2013-01-01
Background Zirconia materials are known for their optimal aesthetics, but they are brittle, and concerns remain about whether their mechanical properties are sufficient for withstanding the forces exerted in the oral cavity. Therefore, this study compared the maximum deformation and failure forces of titanium implants between titanium-alloy and zirconia abutments under oblique compressive forces in the presence of two levels of marginal bone loss. Methods Twenty implants were divided into Groups A and B, with simulated bone losses of 3.0 and 1.5 mm, respectively. Groups A and B were also each divided into two subgroups with five implants each: (1) titanium implants connected to titanium-alloy abutments and (2) titanium implants connected to zirconia abutments. The maximum deformation and failure forces of each sample were determined using a universal testing machine. The data were analyzed using the nonparametric Mann–Whitney test. Results The mean maximum deformation and failure forces obtained for the subgroups were as follows: A1 (simulated bone loss of 3.0 mm, titanium-alloy abutment) = 540.6 N and 656.9 N, respectively; A2 (simulated bone loss of 3.0 mm, zirconia abutment) = 531.8 N and 852.7 N; B1 (simulated bone loss of 1.5 mm, titanium-alloy abutment) = 1070.9 N and 1260.2 N; and B2 (simulated bone loss of 1.5 mm, zirconia abutment) = 907.3 N and 1182.8 N. The maximum deformation force differed significantly between Groups B1 and B2 but not between Groups A1 and A2. The failure force did not differ between Groups A1 and A2 or between Groups B1 and B2. The maximum deformation and failure forces differed significantly between Groups A1 and B1 and between Groups A2 and B2. Conclusions Based on this experimental study, the maximum deformation and failure forces are lower for implants with a marginal bone loss of 3.0 mm than of 1.5 mm. Zirconia abutments can withstand physiological occlusal forces applied in the anterior region. PMID:23688204
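The Mann–Whitney U statistic used above reduces to counting, over all pairs, how often a value from one group exceeds a value from the other. A minimal sketch (the force values below are made up for illustration, not the study's measurements):

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic via pairwise comparisons (ties count 0.5)."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Failure forces (N) for two hypothetical subgroups:
group_a = [656.9, 640.0, 700.2, 610.5, 677.3]
group_b = [1260.2, 1198.7, 1305.1, 1240.9, 1275.6]
print(mann_whitney_u(group_a, group_b))  # 0.0: every A value is below every B value
```

An extreme U (near 0 or near len(x) * len(y)) indicates a large group difference; the p-value then comes from the U distribution for the given sample sizes.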
NASA Astrophysics Data System (ADS)
Lakeman, Thomas R.; England, John H.
2013-07-01
The study revises the maximum extent of the northwest Laurentide Ice Sheet (LIS) in the western Canadian Arctic Archipelago (CAA) during the last glaciation and documents subsequent ice sheet retreat and glacioisostatic adjustments across western Banks Island. New geomorphological mapping and maximum-limiting radiocarbon ages indicate that the northwest LIS inundated western Banks Island after ~ 31 14C ka BP and reached a terminal ice margin west of the present coastline. The onset of deglaciation and the age of the marine limit (22-40 m asl) are unresolved. Ice sheet retreat across western Banks Island was characterized by the withdrawal of a thin, cold-based ice margin that reached the central interior of the island by ~ 14 cal ka BP. The elevation of the marine limit is greater than previously recognized and consistent with greater glacioisostatic crustal unloading by a more expansive LIS. These results complement emerging bathymetric observations from the Arctic Ocean, which indicate glacial erosion during the Last Glacial Maximum (LGM) to depths of up to 450 m.
Learning monopolies with delayed feedback on price expectations
NASA Astrophysics Data System (ADS)
Matsumoto, Akio; Szidarovszky, Ferenc
2015-11-01
We call the intercept of the price function with the vertical axis the maximum price and the slope of the price function the marginal price. In this paper it is assumed that a monopolistic firm has full information about the marginal price and its own cost function but is uncertain about the maximum price. However, by repeated interaction with the market, the obtained price observations give a basis for an adaptive learning process of the maximum price. It is also assumed that the price observations have fixed delays, so the learning process can be described by a delayed differential equation. In the cases of one or two delays, the asymptotic behavior of the resulting dynamic process is examined and stability conditions are derived. Three main results are demonstrated in the two-delay learning processes. First, it is possible to stabilize the equilibrium which is unstable in the one-delay model. Second, complex dynamics involving chaos, which is impossible in the one-delay model, can emerge. Third, alternations of stability and instability (i.e., stability switches) occur repeatedly.
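A one-delay adaptive learning process of this general kind can be sketched as the delay differential equation x'(t) = k(m − x(t − τ)), where m is the true maximum price and x the firm's belief, integrated with a simple Euler scheme. The equation and parameter values are an illustrative stand-in, not the authors' model:

```python
def simulate(k, tau, m, x0, dt=0.01, T=60.0):
    """Euler integration of x'(t) = k*(m - x(t - tau)); x(t) = x0 for t <= 0."""
    steps = int(T / dt)
    lag = int(tau / dt)
    xs = [x0] * (lag + 1)          # history buffer supplying the delayed term
    for _ in range(steps):
        x_delayed = xs[-(lag + 1)]
        xs.append(xs[-1] + dt * k * (m - x_delayed))
    return xs

# Small delay (k*tau well below pi/2): the belief converges to the true m = 10.
xs = simulate(k=0.5, tau=0.5, m=10.0, x0=2.0)
print(abs(xs[-1] - 10.0) < 0.01)  # True: stable learning
```

For this linear equation the equilibrium is stable when k·τ < π/2; pushing k·τ past that threshold makes the same simulation oscillate with growing amplitude, which is the kind of stability boundary the paper characterizes for one and two delays.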
Salem Milani, Amin; Rahimi, Saeed; Froughreyhani, Mohammad; Vahid Pakdel, Mahdi
2013-01-01
In various clinical situations, mineral trioxide aggregate (MTA) may come into direct contact or even be mixed with blood. The aim of the present study was to evaluate the effect of exposure to blood on marginal adaptation and surface microstructure of MTA. Thirty extracted human single-rooted teeth were used. Standard root canal treatment was carried out. Root-ends were resected, and retrocavities were prepared. The teeth were randomly divided into two groups (n = 15): in group 1, the internal surface of the cavities was coated with fresh blood. Then, the cavities were filled with MTA. The roots were immersed in molds containing fresh blood. In group 2, the aforementioned procedures were performed except that synthetic tissue fluid (STF) was used instead of blood. To assess the marginal adaptation, "gap perimeter" and "maximum gap width" were measured under scanning electron microscope. The surface microstructure was also examined. Independent samples t-test and Mann-Whitney U test were used to analyze the data. Maximum gap width and gap perimeter in the blood-exposed group were significantly larger than those in the STF-exposed group (p < 0.01). In the blood-exposed group, the crystals tended to be more rounded and less angular compared with the STF-exposed group, and there was a general lack of needle-like crystals. Exposure to blood during setting has a negative effect on marginal adaptation of MTA, and blood-exposed MTA has a different surface microstructure compared to STF-exposed MTA.
Adaptive Quadrature for Item Response Models. Research Report. ETS RR-06-29
ERIC Educational Resources Information Center
Haberman, Shelby J.
2006-01-01
Adaptive quadrature is applied to marginal maximum likelihood estimation for item response models with normal ability distributions. Even in one dimension, significant gains in speed and accuracy of computation may be achieved.
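Marginal maximum likelihood for an item response model integrates the conditional response probability over the normal ability distribution; Gauss-Hermite quadrature (the fixed-node baseline that adaptive quadrature improves on) approximates that integral with a handful of well-placed points. A minimal sketch for a single 2PL item, with illustrative parameter values:

```python
import numpy as np

def marginal_prob_correct(a, b, n_points=21):
    """P(correct response) for a 2PL item with ability theta ~ N(0, 1),
    approximated by Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_points)
    theta = np.sqrt(2.0) * nodes          # change of variables for the N(0,1) density
    w = weights / np.sqrt(np.pi)          # normalized weights (sum to 1)
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return float(np.sum(w * p))

print(round(marginal_prob_correct(a=1.0, b=0.0), 3))  # symmetric item: 0.5
```

Adaptive quadrature goes one step further by centering and scaling the nodes around each respondent's posterior ability, so the same accuracy is reached with far fewer points.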
Maximum margin multiple instance clustering with applications to image and text clustering.
Zhang, Dan; Wang, Fei; Si, Luo; Li, Tao
2011-05-01
In multiple instance learning problems, patterns are often given as bags and each bag consists of some instances. Most of existing research in the area focuses on multiple instance classification and multiple instance regression, while very limited work has been conducted for multiple instance clustering (MIC). This paper formulates a novel framework, maximum margin multiple instance clustering (M³IC), for MIC. However, it is impractical to directly solve the optimization problem of M³IC. Therefore, M³IC is relaxed in this paper to enable an efficient optimization solution with a combination of the constrained concave-convex procedure and the cutting plane method. Furthermore, this paper presents some important properties of the proposed method and discusses the relationship between the proposed method and some other related ones. An extensive set of empirical results are shown to demonstrate the advantages of the proposed method against existing research for both effectiveness and efficiency.
MIXOR: a computer program for mixed-effects ordinal regression analysis.
Hedeker, D; Gibbons, R D
1996-03-01
MIXOR provides maximum marginal likelihood estimates for mixed-effects ordinal probit, logistic, and complementary log-log regression models. These models can be used for analysis of dichotomous and ordinal outcomes from either a clustered or longitudinal design. For clustered data, the mixed-effects model assumes that data within clusters are dependent. The degree of dependency is jointly estimated with the usual model parameters, thus adjusting for dependence resulting from clustering of the data. Similarly, for longitudinal data, the mixed-effects approach can allow for individual-varying intercepts and slopes across time, and can estimate the degree to which these time-related effects vary in the population of individuals. MIXOR uses marginal maximum likelihood estimation, utilizing a Fisher-scoring solution. For the scoring solution, the Cholesky factor of the random-effects variance-covariance matrix is estimated, along with the effects of model covariates. Examples illustrating usage and features of MIXOR are provided.
Falk, Carl F; Cai, Li
2016-06-01
We present a semi-parametric approach to estimating item response functions (IRF) useful when the true IRF does not strictly follow commonly used functions. Our approach replaces the linear predictor of the generalized partial credit model with a monotonic polynomial. The model includes the regular generalized partial credit model at the lowest order polynomial. Our approach extends Liang's (A semi-parametric approach to estimate IRFs, Unpublished doctoral dissertation, 2007) method for dichotomous item responses to the case of polytomous data. Furthermore, item parameter estimation is implemented with maximum marginal likelihood using the Bock-Aitkin EM algorithm, thereby facilitating multiple group analyses useful in operational settings. Our approach is demonstrated on both educational and psychological data. We present simulation results comparing our approach to more standard IRF estimation approaches and other non-parametric and semi-parametric alternatives.
Zhu, Bohui; Ding, Yongsheng; Hao, Kuangrong
2013-01-01
This paper presents a novel maximum margin clustering method with immune evolution (IEMMC) for automatic diagnosis of electrocardiogram (ECG) arrhythmias. This diagnostic system consists of signal processing, feature extraction, and the IEMMC algorithm for clustering of ECG arrhythmias. First, the raw ECG signal is processed by an adaptive ECG filter based on wavelet transforms, and the waveform of the ECG signal is detected; then, features are extracted from the ECG signal to cluster different types of arrhythmias by the IEMMC algorithm. Three types of performance evaluation indicators are used to assess the effect of the IEMMC method for ECG arrhythmias: sensitivity, specificity, and accuracy. Compared with the K-means and iterSVR algorithms, the IEMMC algorithm shows better performance not only in clustering results but also in global search and convergence ability, which demonstrates its effectiveness for the detection of ECG arrhythmias. PMID:23690875
Support Vector Machines for Differential Prediction
Kuusisto, Finn; Santos Costa, Vitor; Nassif, Houssam; Burnside, Elizabeth; Page, David; Shavlik, Jude
2015-01-01
Machine learning is continually being applied to a growing set of fields, including the social sciences, business, and medicine. Some fields present problems that are not easily addressed using standard machine learning approaches and, in particular, there is growing interest in differential prediction. In this type of task we are interested in producing a classifier that specifically characterizes a subgroup of interest by maximizing the difference in predictive performance for some outcome between subgroups in a population. We discuss adapting maximum margin classifiers for differential prediction. We first introduce multiple approaches that do not affect the key properties of maximum margin classifiers, but which also do not directly attempt to optimize a standard measure of differential prediction. We next propose a model that directly optimizes a standard measure in this field, the uplift measure. We evaluate our models on real data from two medical applications and show excellent results. PMID:26158123
ERIC Educational Resources Information Center
Kelderman, Henk
In this paper, algorithms are described for obtaining the maximum likelihood estimates of the parameters in log-linear models. Modified versions of the iterative proportional fitting and Newton-Raphson algorithms are described that work on the minimal sufficient statistics rather than on the usual counts in the full contingency table. This is…
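As background for the modified algorithms described above, a generic iterative proportional fitting step for a simple two-way contingency table can be sketched as follows (a textbook illustration, not the minimal-sufficient-statistics version of the paper; names and targets are illustrative):

```python
import numpy as np

def ipf_2way(table, row_targets, col_targets, tol=1e-10, max_iter=1000):
    """Iterative proportional fitting: rescale a contingency table until its
    row and column sums match the given marginal totals."""
    x = np.asarray(table, dtype=float)
    for _ in range(max_iter):
        x *= (np.asarray(row_targets) / x.sum(axis=1))[:, None]  # fit row margins
        x *= np.asarray(col_targets) / x.sum(axis=0)             # fit column margins
        if np.allclose(x.sum(axis=1), row_targets, atol=tol):
            break
    return x

# uniform seed table fitted to fixed margins
fitted = ipf_2way([[1.0, 1.0], [1.0, 1.0]], row_targets=[30, 70], col_targets=[40, 60])
```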
An Observational and Analytical Study of Marginal Ice Zone Atmospheric Jets
2016-12-01
… layer or in the capping temperature inversion just above. The three strongest jets had maximum wind speeds at elevations near 350 m to 400 m; one of these jets had a … geostrophic wind due to horizontal temperature changes in the atmospheric boundary layer and capping inversion. The jets were detected using …
Design of simplified maximum-likelihood receivers for multiuser CPM systems.
Bing, Li; Bai, Baoming
2014-01-01
A class of simplified maximum-likelihood receivers designed for continuous phase modulation based multiuser systems is proposed. The presented receiver is built upon a front end employing mismatched filters and a maximum-likelihood detector defined in a low-dimensional signal space. The performance of the proposed receivers is analyzed and compared to some existing receivers. Some schemes are designed to implement the proposed receivers and to reveal the roles of different system parameters. Analysis and numerical results show that the proposed receivers can approach the optimum multiuser receivers with significantly (even exponentially in some cases) reduced complexity and marginal performance degradation.
Ice-Sheet Glaciation of the Puget lowland, Washington, during the Vashon Stade (late pleistocene)
Thorson, R.M.
1980-01-01
During the Vashon Stade of the Fraser Glaciation, about 15,000-13,000 yr B.P., a lobe of the Cordilleran Ice Sheet occupied the Puget lowland of western Washington. At its maximum extent about 14,000 yr ago, the ice sheet extended across the Puget lowland between the Cascade Range and Olympic Mountains and terminated about 80 km south of Seattle. Meltwater streams drained southwest to the Pacific Ocean and built broad outwash trains south of the ice margin. Reconstructed longitudinal profiles for the Puget lobe at its maximum extent are similar to the modern profile of Malaspina Glacier, Alaska, suggesting that the ice sheet may have been in a near-equilibrium state at the glacial maximum. Progressive northward retreat from the terminal zone was accompanied by the development of ice-marginal streams and proglacial lakes that drained southward during initial retreat, but northward during late Vashon time. Relatively rapid retreat of the Juan de Fuca lobe may have contributed to partial stagnation of the northwestern part of the Puget lobe. Final destruction of the Puget lobe occurred when the ice retreated north of Admiralty Inlet. The sea entered the Puget lowland at this time, allowing the deposition of glacial-marine sediments which now occur as high as 50 m altitude. These deposits, together with ice-marginal meltwater channels presumed to have formed above sea level during deglaciation, suggest that a significant amount of postglacial isostatic and/or tectonic deformation has occurred in the Puget lowland since deglaciation. © 1980.
NASA Astrophysics Data System (ADS)
Ranero, C. R.; Phipps Morgan, J.
2006-12-01
The existence of sudden along-strike transitions between volcanic and non-volcanic rifted margins is an important constraint for conceptual models of rifting and continental breakup. We think there is a promising indirect approach to infer the maximum width of the region of upwelling that exists beneath a rifted margin during the transition from rifting to seafloor-spreading. We infer this width of ~30 km from the minimum length of the ridge-offsets that mark the limits of the `region of influence' of on-ridge plumes on the axial relief, axial morphology, and crustal thickness along the ridge and at the terminations of fossil volcanic rifted margins. We adopt Vogt's [1972] hypothesis for along-ridge asthenospheric flow in a narrow vertical slot beneath the axis of plume-influenced `macro-segments' and volcanic rifted margins. We find that: (1) There is a threshold distance to the lateral offsets that bound plume-influenced macrosegments; all such `barrier offsets' are greater than ~30 km, while smaller offsets do not appear to be a barrier to along-axis flow. This pattern is seen in the often abrupt transitions between volcanic and non-volcanic rifted margins; these transitions coincide with >30 km ridge offsets that mark the boundary between the smooth seafloor morphology and thick crust of a plume-influenced volcanic margin and a neighboring non-volcanic margin, as recorded in the 180 Ma rifting of the early N. Atlantic, the 42 Ma rifting of the Kerguelen-Broken Ridge, and the 66 Ma Seychelles-Indian rifting in the Indian Ocean. (2) A similar pattern is seen in the often abrupt transitions between `normal' and plume-influenced mid-ocean ridge segments, which is discussed in a companion presentation by Phipps Morgan and Ranero (this meeting). (3) The coexistence of adjacent volcanic and non-volcanic rifted margin segments is readily explained in this conceptual framework.
If the volcanic margin macrosegment is plume-fed by hot asthenosphere along an axial ridge slot, while adjacent non-volcanic margin segments stretch and upwell ambient cooler subcontinental mantle, then there will be a sudden transition from volcanic to non-volcanic margins across a transform offset. (4) A 30 km width for the region of ridge upwelling and melting offers a simple conceptual explanation for the apparent 30 km threshold length for the existence of strike-slip transform faults and the occurrence of non-transform offsets at smaller ridge offset-distances. (5) The conceptual model leads to the interpretation of the observed characteristic ~1000-2000 km width of plume-influenced macro-segments as a measure of the maximum potential plume supply into a subaxial slot of 5-10 cubic km per yr. (6) If asthenosphere consumption by plate-spreading is less than plume-supply into a macro-segment, then the shallow seafloor and excess gravitational spreading stresses associated with a plume-influenced ridge can lead to growth of the axial slot by ridge propagation. We think this is a promising conceptual framework with which to understand the differences between volcanic and non-volcanic rifted margins.
Quantitative measurement of marginal disintegration of ceramic inlays.
Hayashi, Mikako; Tsubakimoto, Yuko; Takeshige, Fumio; Ebisu, Shigeyuki
2004-01-01
The objectives of this study were to establish a method for quantitative measurement of marginal change in ceramic inlays and to clarify their marginal disintegration in vivo. An accurate CCD optical laser scanner system was used for morphological measurement of the marginal change of ceramic inlays. The accuracy of the CCD measurement was assessed by comparing it with microscopic measurement. Replicas of 15 premolars restored with Class II ceramic inlays at the time of placement and eight years after restoration were used for morphological measurement by means of the CCD laser scanner system. Occlusal surfaces of the restored teeth were scanned and cross-sections of marginal areas were computed with software. Marginal change was defined as the area enclosed by two profiles obtained by superimposing two cross-sections of the same location at two different times, and was expressed as the maximum depth and mean area of the enclosed region. The accuracy of this method of measurement was 4.3 +/- 3.2 microm in distance and 2.0 +/- 0.6% in area. Quantitative marginal changes for the eight-year period were 10 x 10 microm in depth and 50 x 10^3 microm^2 in area at the functional cusp area, and 7 x 10 microm in depth and 28 x 10^3 microm^2 in area at the non-functional cusp area. Marginal disintegration at the functional cusp area was significantly greater than at the non-functional cusp area (Wilcoxon signed-ranks test, p < 0.05). This study constitutes a quantitative measurement of in vivo deterioration in marginal adaptation of ceramic inlays and indicates that occlusal force may accelerate marginal disintegration.
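The depth-and-area measure described above can be illustrated numerically: given two superimposed cross-section profiles, the maximum depth is the largest gap between them and the area follows from trapezoidal integration (a sketch with synthetic profiles; units and shapes are illustrative, not the scanner data):

```python
import numpy as np

def marginal_change(x, profile_before, profile_after):
    """Given two superimposed cross-section profiles sampled at positions x
    (micrometres), return the maximum depth of the gap between them and the
    enclosed area via trapezoidal integration."""
    gap = np.abs(np.asarray(profile_before) - np.asarray(profile_after))
    max_depth = gap.max()
    area = np.sum((gap[1:] + gap[:-1]) / 2.0 * np.diff(x))  # trapezoid rule
    return max_depth, area

x = np.linspace(0.0, 100.0, 101)                 # scan positions, micrometres
before = np.zeros_like(x)                        # profile at placement
after = -10.0 * np.exp(-((x - 50) / 10) ** 2)    # worn profile years later (synthetic)
depth, area = marginal_change(x, before, after)
```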
Sediment Flux, East Greenland Margin
1991-09-17
09/17/91. Final report, Oct. 1988 - Sept. 1991. Title: Sediment Flux … Approved for public release and sale; its distribution is unlimited. ABSTRACT (Maximum 200 words): We investigated sediment flux across an ice-dominated, high latitude … investigated an area off the East Greenland margin where the world's second largest ice sheet still exists and where information on the extent of glaciation on …
Maximum margin semi-supervised learning with irrelevant data.
Yang, Haiqin; Huang, Kaizhu; King, Irwin; Lyu, Michael R
2015-10-01
Semi-supervised learning (SSL) is a typical learning paradigm that trains a model from both labeled and unlabeled data. Traditional SSL models usually assume the unlabeled data are relevant to the labeled data, i.e., follow the same distribution as the targeted labeled data. In this paper, we address a different, yet formidable, scenario in semi-supervised classification, where the unlabeled data may contain data irrelevant to the labeled data. To tackle this problem, we develop a maximum margin model, named the tri-class support vector machine (3C-SVM), that utilizes the available training data while seeking a hyperplane that separates the targeted data well. Our 3C-SVM exhibits several characteristics and advantages. First, it does not need any prior knowledge or explicit assumption on the data relatedness. On the contrary, it can relieve the effect of irrelevant unlabeled data based on the logistic principle and the maximum entropy principle. That is, 3C-SVM approaches an ideal classifier: one that relies heavily on labeled data and is confident on the relevant data lying far away from the decision hyperplane, while maximally ignoring the irrelevant data, which are hardly distinguishable. Second, theoretical analysis is provided to establish under what conditions the irrelevant data can help to seek the hyperplane. Third, 3C-SVM is a generalized model that unifies several popular maximum margin models, including standard SVMs, semi-supervised SVMs (S(3)VMs), and SVMs learned from the universum (U-SVMs), as its special cases. More importantly, we deploy a concave-convex procedure to solve the proposed 3C-SVM, transforming the original mixed integer program to a semi-definite programming relaxation, and finally to a sequence of quadratic programming subproblems, which yields the same worst-case time complexity as that of S(3)VMs. Finally, we demonstrate the effectiveness and efficiency of our proposed 3C-SVM through systematic experimental comparisons.
Copyright © 2015 Elsevier Ltd. All rights reserved.
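For readers unfamiliar with the maximum margin objective that 3C-SVM generalizes, below is a minimal linear SVM trained by subgradient descent on the regularized hinge loss (this is the standard SVM baseline, not the 3C-SVM; all names, data, and parameters are illustrative):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1):
    """Subgradient descent on the L2-regularized hinge loss: the standard
    maximum-margin (SVM) objective."""
    rng = np.random.default_rng(0)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                          # inside the margin: push it out
                w = (1 - lr * lam) * w + lr * y[i] * X[i]
                b += lr * y[i]
            else:                                   # only shrink w (widen the margin)
                w = (1 - lr * lam) * w
    return w, b

# two well-separated clouds with labels -1/+1
X = np.vstack([np.random.default_rng(1).normal(-2, 0.3, (20, 2)),
               np.random.default_rng(2).normal(+2, 0.3, (20, 2))])
y = np.array([-1] * 20 + [+1] * 20)
w, b = train_linear_svm(X, y)
preds = np.sign(X @ w + b)
```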
An effective fuzzy kernel clustering analysis approach for gene expression data.
Sun, Lin; Xu, Jiucheng; Yin, Jiaojiao
2015-01-01
Fuzzy clustering is an important tool for analyzing microarray data. A major problem in applying fuzzy clustering methods to microarray gene expression data is the choice of parameters, namely the cluster number and centers. This paper proposes a new approach to fuzzy kernel clustering analysis (FKCA) that identifies the desired cluster number and obtains more stable results for gene expression data. First, to optimize characteristic differences and estimate the optimal cluster number, a Gaussian kernel function is introduced to improve the spectrum analysis method (SAM). By combining subtractive clustering with the max-min distance mean, a maximum distance method (MDM) is proposed to determine cluster centers. The corresponding steps of the improved SAM (ISAM) and MDM are then given, and their superiority and stability are illustrated through experimental comparisons on gene expression data. Finally, by introducing ISAM and MDM into FKCA, an effective improved FKCA algorithm is proposed. Experimental results from public gene expression data and the UCI database show that the proposed algorithms are feasible for cluster analysis, and that their clustering accuracy is higher than that of other related clustering algorithms.
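Two ingredients named above, the Gaussian kernel and max-min distance center selection, can be sketched generically (a farthest-first illustration under our own assumptions, not the paper's exact ISAM/MDM procedures):

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    """Pairwise Gaussian (RBF) kernel matrix."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2 * sigma ** 2))

def max_min_centers(X, k):
    """Farthest-first (max-min distance) selection of k cluster centers:
    each new center maximizes its minimum distance to those already chosen."""
    centers = [0]                                   # start from the first point
    d = np.linalg.norm(X - X[0], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(d))                     # farthest from current centers
        centers.append(nxt)
        d = np.minimum(d, np.linalg.norm(X - X[nxt], axis=1))
    return centers

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0], [0.0, 5.0]])
idx = max_min_centers(X, 3)    # picks one point from each well-separated group
K = gaussian_kernel(X)
```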
Zhong, Qing; Guo, Tiannan; Rechsteiner, Markus; Rüschoff, Jan H.; Rupp, Niels; Fankhauser, Christian; Saba, Karim; Mortezavi, Ashkan; Poyet, Cédric; Hermanns, Thomas; Zhu, Yi; Moch, Holger; Aebersold, Ruedi; Wild, Peter J.
2017-01-01
Microscopy image data of human cancers provide detailed phenotypes of spatially and morphologically intact tissues at single-cell resolution, thus complementing large-scale molecular analyses, e.g., next generation sequencing or proteomic profiling. Here we describe a high-resolution tissue microarray (TMA) image dataset from a cohort of 71 prostate tissue samples, which was hybridized with bright-field dual colour chromogenic and silver in situ hybridization probes for the tumour suppressor gene PTEN. These tissue samples were digitized and supplemented with expert annotations, clinical information, statistical models of PTEN genetic status, and computer source codes. For validation, we constructed an additional TMA dataset for 424 prostate tissues, hybridized with FISH probes for PTEN, and performed survival analysis on a subset of 339 radical prostatectomy specimens with overall, disease-specific and recurrence-free survival (maximum 167 months). For application, we further produced 6,036 image patches derived from two whole slides. Our curated collection of prostate cancer data sets provides reuse potential for both biomedical and computational studies. PMID:28291248
Polymerization Behavior and Polymer Properties of Eosin-Mediated Surface Modification Reactions.
Avens, Heather J; Randle, Thomas James; Bowman, Christopher N
2008-10-17
Surface modification by surface-mediated polymerization necessitates control of the grafted polymer film thicknesses to achieve the desired property changes. Here, a microarray format is used to assess a range of reaction conditions and formulations rapidly in regards to the film thicknesses achieved and the polymerization behavior. Monomer formulations initiated by eosin conjugates with varying concentrations of poly(ethylene glycol) diacrylate (PEGDA), N-methyldiethanolamine (MDEA), and 1-vinyl-2-pyrrolidone (VP) were evaluated. Acrylamide with MDEA or ascorbic acid as a coinitiator was also investigated. The best formulation was found to be 40 wt% acrylamide with MDEA which yielded four to eight fold thicker films (maximum polymer thickness increased from 180 nm to 1420 nm) and generated visible films from 5-fold lower eosin surface densities (2.8 vs. 14 eosins/µm(2)) compared to a corresponding PEGDA formulation. Using a microarray format to assess multiple initiator surface densities enabled facile identification of a monomer formulation that yields the desired polymer properties and polymerization behavior across the requisite range of initiator surface densities.
Application of the quantum spin glass theory to image restoration.
Inoue, J I
2001-04-01
Quantum fluctuation is introduced into the Markov random-field model for image restoration in the context of a Bayesian approach. We investigate the dependence of the quantum fluctuation on the quality of black-and-white image restoration by making use of statistical mechanics. We find that the maximum posterior marginal (MPM) estimate based on the quantum fluctuation gives a fine restoration in comparison with the maximum a posteriori estimate or the thermal-fluctuation-based MPM estimate.
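The difference between the MAP and MPM estimates can be made concrete by brute force on a tiny one-dimensional binary "image" with a classical (thermal) smoothness prior; the flip-noise model and all parameters below are illustrative, not the paper's quantum formulation:

```python
import itertools
import numpy as np

def posterior(x, y, beta=1.0, p_flip=0.2):
    """Unnormalized posterior of a binary (+/-1) signal x given noisy
    observation y: Ising-style smoothness prior times a flip-noise likelihood."""
    prior = np.exp(beta * np.sum(x[:-1] * x[1:]))          # neighbours agree
    lik = np.prod(np.where(x == y, 1 - p_flip, p_flip))    # independent flips
    return prior * lik

y = np.array([1, -1, 1, 1])                                # observed noisy pixels
configs = [np.array(c) for c in itertools.product([-1, 1], repeat=4)]
post = np.array([posterior(c, y) for c in configs])
post /= post.sum()

map_est = configs[int(np.argmax(post))]                    # joint posterior mode
# MPM: maximize each pixel's posterior marginal separately
marg = np.array([sum(p for c, p in zip(configs, post) if c[i] == 1)
                 for i in range(4)])
mpm_est = np.where(marg > 0.5, 1, -1)
```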
NASA Astrophysics Data System (ADS)
Eckert, Andreas; Zhang, Weicheng
2016-02-01
The offshore Nile Delta displays sharply contrasting orientations of the maximum horizontal stress, SH, in regions above Messinian evaporites (suprasalt) and regions below Messinian evaporites (subsalt). Published stress orientation data predominantly show margin-normal suprasalt SH orientations but a margin-parallel subsalt SH orientation. While these data sets provide the first major evidence that evaporite sequences can act as mechanical detachment horizons, the cause for the stress orientation contrast remains unclear. In this study, 3D finite element analysis is used to investigate the causes for stress re-orientation based on two different hypotheses. The modeling study evaluates the influence of different likely salt geometries and whether stress reorientations are the result of basal drag forces induced by gravitational gliding or whether they represent localized variations due to mechanical property contrasts. The modeling results show that when salt is present as a continuous layer, gravitational gliding occurs and basal drag forces induced in the suprasalt layers result in the margin-normal principal stress becoming the maximum horizontal stress. With the margin-normal stress increase being confined to the suprasalt layers, the salt acts as a mechanical detachment horizon, resulting in different SH orientations in the suprasalt compared to the subsalt layers. When salt is present as isolated bodies localized stress variations occur due to the mechanical property contrasts imposed by the salt, also resulting in different SH orientations in the suprasalt compared to the subsalt layers. The modeling results provide additional quantitative evidence to confirm the role of evaporite sequences as mechanical detachment horizons.
Importing MAGE-ML format microarray data into BioConductor.
Durinck, Steffen; Allemeersch, Joke; Carey, Vincent J; Moreau, Yves; De Moor, Bart
2004-12-12
The microarray gene expression markup language (MAGE-ML) is a widely used XML (eXtensible Markup Language) standard for describing and exchanging information about microarray experiments. It can describe microarray designs, microarray experiment designs, gene expression data and data analysis results. We describe RMAGEML, a new Bioconductor package that provides a link between cDNA microarray data stored in MAGE-ML format and the Bioconductor framework for preprocessing, visualization and analysis of microarray experiments. http://www.bioconductor.org. Open Source.
Yiu, Sean; Tom, Brian Dm
2017-01-01
Several researchers have described two-part models with patient-specific stochastic processes for analysing longitudinal semicontinuous data. In theory, such models can offer greater flexibility than the standard two-part model with patient-specific random effects. However, in practice, the high dimensional integrations involved in the marginal likelihood (i.e. integrated over the stochastic processes) significantly complicates model fitting. Thus, non-standard computationally intensive procedures based on simulating the marginal likelihood have so far only been proposed. In this paper, we describe an efficient method of implementation by demonstrating how the high dimensional integrations involved in the marginal likelihood can be computed efficiently. Specifically, by using a property of the multivariate normal distribution and the standard marginal cumulative distribution function identity, we transform the marginal likelihood so that the high dimensional integrations are contained in the cumulative distribution function of a multivariate normal distribution, which can then be efficiently evaluated. Hence, maximum likelihood estimation can be used to obtain parameter estimates and asymptotic standard errors (from the observed information matrix) of model parameters. We describe our proposed efficient implementation procedure for the standard two-part model parameterisation and when it is of interest to directly model the overall marginal mean. The methodology is applied on a psoriatic arthritis data set concerning functional disability.
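The key computational point above, that a high-dimensional rectangle probability can be evaluated as a multivariate normal CDF, can be sketched with SciPy (the dimension and equicorrelated covariance are illustrative, not the paper's model):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Evaluate the orthant probability P(Z_1 < 0, ..., Z_d < 0) as a multivariate
# normal CDF, the kind of quantity the transformed marginal likelihood reduces to.
dim = 5
p_indep = multivariate_normal(mean=np.zeros(dim), cov=np.eye(dim)).cdf(np.zeros(dim))

rho = 0.3
cov_eq = np.full((dim, dim), rho) + (1 - rho) * np.eye(dim)   # equicorrelated
p_corr = multivariate_normal(mean=np.zeros(dim), cov=cov_eq).cdf(np.zeros(dim))
```

Under independence the orthant probability is exactly 0.5**dim; positive correlation raises it toward 0.5, which gives a quick sanity check on the numerical evaluation.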
Extracting volatility signal using maximum a posteriori estimation
NASA Astrophysics Data System (ADS)
Neto, David
2016-11-01
This paper outlines a methodology to estimate a denoised volatility signal for foreign exchange rates using a hidden Markov model (HMM). For this purpose a maximum a posteriori (MAP) estimation is performed. A double exponential prior is used for the state variable (the log-volatility) in order to allow sharp jumps in realizations, and hence log-return marginal distributions with heavy tails. We consider two routes to choose the regularization, and we compare our MAP estimate to a realized volatility measure for three exchange rates.
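A minimal sketch of the MAP idea, assuming a Gaussian likelihood for returns conditional on log-volatility plus a double-exponential (L1) prior on volatility increments, which is what allows sharp jumps in the estimated path; the paper's HMM-specific regularization choices are not reproduced, and all data and parameters are synthetic:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_posterior(h, returns, lam):
    """-log posterior of log-volatility h: returns ~ N(0, exp(2h)) pointwise,
    plus a Laplace (double exponential) prior on increments of h."""
    lik = np.sum(h + 0.5 * returns ** 2 * np.exp(-2 * h))
    prior = lam * np.sum(np.abs(np.diff(h)))
    return lik + prior

rng = np.random.default_rng(0)
true_h = np.where(np.arange(30) < 15, -2.0, -1.0)     # one volatility jump
returns = np.exp(true_h) * rng.standard_normal(30)

res = minimize(neg_log_posterior, x0=np.full(30, -1.5),
               args=(returns, 5.0), method="Powell")  # derivative-free, handles |.| kinks
h_map = res.x
```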
Dai, Yilin; Guo, Ling; Li, Meng; Chen, Yi-Bu
2012-06-08
Microarray data analysis presents a significant challenge to researchers who are unable to use the powerful Bioconductor and its numerous tools due to their lack of knowledge of the R language. Among the few existing software programs that offer a graphic user interface to Bioconductor packages, none have implemented a comprehensive strategy to address the accuracy and reliability issues of microarray data analysis arising from the well-known probe design problems associated with many widely used microarray chips. There is also a lack of tools that would expedite the functional analysis of microarray results. We present Microarray Я US, an R-based graphical user interface that implements over a dozen popular Bioconductor packages to offer researchers a streamlined workflow for routine differential microarray expression data analysis without the need to learn the R language. In order to enable a more accurate analysis and interpretation of microarray data, we incorporated the latest custom probe re-definition and re-annotation for Affymetrix and Illumina chips. A versatile microarray results output utility tool was also implemented for easy and fast generation of input files for over 20 of the most widely used functional analysis software programs. Coupled with a well-designed user interface, Microarray Я US leverages cutting-edge Bioconductor packages for researchers with no knowledge of the R language. It also enables a more reliable and accurate microarray data analysis and expedites downstream functional analysis of microarray results.
A Very Stable High Throughput Taylor Cone-jet in Electrohydrodynamics
Morad, M. R.; Rajabi, A.; Razavi, M.; Sereshkeh, S. R. Pejman
2016-01-01
A stable capillary liquid jet formed by an electric field is an important physical phenomenon for formation of controllable small droplets, power generation and chemical reactions, printing and patterning, and chemical-biological investigations. In electrohydrodynamics, the well-known Taylor cone-jet has a stability margin within a certain range of the liquid flow rate (Q) and the applied voltage (V). Here, we introduce a simple mechanism to greatly extend the Taylor cone-jet stability margin and produce a very high throughput. For an ethanol cone-jet emitting from a simple nozzle, the stability margin is obtained within 1 kV for low flow rates, decaying with flow rate up to 2 ml/h. By installing a hemispherical cap above the nozzle, we demonstrate that the stability margin could increase to 5 kV for low flow rates, decaying to zero for a maximum flow rate of 65 ml/h. The governing borders of stability margins are discussed and obtained for three other liquids: methanol, 1-propanol and 1-butanol. For a gravity-directed nozzle, the produced cone-jet is more stable against perturbations and the axis of the spray remains in the same direction through the whole stability margin, unlike the cone-jet of conventional simple nozzles. PMID:27917956
Marginal and Random Intercepts Models for Longitudinal Binary Data With Examples From Criminology.
Long, Jeffrey D; Loeber, Rolf; Farrington, David P
2009-01-01
Two models for the analysis of longitudinal binary data are discussed: the marginal model and the random intercepts model. In contrast to the linear mixed model (LMM), the two models for binary data are not subsumed under a single hierarchical model. The marginal model provides group-level information whereas the random intercepts model provides individual-level information including information about heterogeneity of growth. It is shown how a type of numerical averaging can be used with the random intercepts model to obtain group-level information, thus approximating individual and marginal aspects of the LMM. The types of inferences associated with each model are illustrated with longitudinal criminal offending data based on N = 506 males followed over a 22-year period. Violent offending indexed by official records and self-report were analyzed, with the marginal model estimated using generalized estimating equations and the random intercepts model estimated using maximum likelihood. The results show that the numerical averaging based on the random intercepts can produce prediction curves almost identical to those obtained directly from the marginal model parameter estimates. The results provide a basis for contrasting the models and the estimation procedures and key features are discussed to aid in selecting a method for empirical analysis.
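The "numerical averaging" used above to recover group-level information from the random intercepts model can be sketched with Gauss-Hermite quadrature: average the subject-specific probability over the intercept distribution (parameter values are illustrative, not estimates from the offending data):

```python
import numpy as np

def expit(z):
    return 1.0 / (1.0 + np.exp(-z))

def marginal_probability(beta0, sigma_u, n_nodes=40):
    """Average the subject-specific (random-intercepts) probability
    expit(beta0 + u), u ~ N(0, sigma_u^2), over the intercept distribution
    using Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    u = np.sqrt(2.0) * sigma_u * nodes      # change of variables for N(0, sigma^2)
    return np.sum(weights * expit(beta0 + u)) / np.sqrt(np.pi)

p_conditional = expit(1.0)                  # probability for the median subject
p_marginal = marginal_probability(1.0, sigma_u=2.0)
```

The averaging attenuates the curve toward 0.5, which is why the marginal and subject-specific (conditional) probabilities differ in nonlinear models even though they coincide in the LMM.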
Understanding Peripheral Bat Populations Using Maximum-Entropy Suitability Modeling
Barnhart, Paul R.; Gillam, Erin H.
2016-01-01
Individuals along the periphery of a species distribution regularly encounter more challenging environmental and climatic conditions than conspecifics near the center of the distribution. Due to these potential constraints, individuals in peripheral margins are expected to change their habitat and behavioral characteristics. Managers typically rely on species distribution maps when developing adequate management practices. However, these range maps are often too simplistic and do not provide adequate information as to what fine-scale biotic and abiotic factors are driving a species occurrence. In the last decade, habitat suitability modelling has become widely used as a substitute for simplistic distribution mapping which allows regional managers the ability to fine-tune management resources. The objectives of this study were to use maximum-entropy modeling to produce habitat suitability models for seven species that have a peripheral margin intersecting the state of North Dakota, according to current IUCN distributions, and determine the vegetative and climatic characteristics driving these models. Mistnetting resulted in the documentation of five species outside the IUCN distribution in North Dakota, indicating that current range maps for North Dakota, and potentially the northern Great Plains, are in need of update. Maximum-entropy modeling showed that temperature and not precipitation were the variables most important for model production. This fine-scale result highlights the importance of habitat suitability modelling as this information cannot be extracted from distribution maps. Our results provide baseline information needed for future research about how and why individuals residing in the peripheral margins of a species’ distribution may show marked differences in habitat use as a result of urban expansion, habitat loss, and climate change compared to more centralized populations. PMID:27935936
Huerta, Mario; Munyi, Marc; Expósito, David; Querol, Enric; Cedano, Juan
2014-06-15
The microarrays performed by scientific teams grow exponentially. These microarray data could be useful for researchers around the world, but unfortunately they are underused. To fully exploit these data, it is necessary (i) to extract these data from a repository of the high-throughput gene expression data like Gene Expression Omnibus (GEO) and (ii) to make the data from different microarrays comparable with tools easy to use for scientists. We have developed these two solutions in our server, implementing a database of microarray marker genes (Marker Genes Data Base). This database contains the marker genes of all GEO microarray datasets and it is updated monthly with the new microarrays from GEO. Thus, researchers can see whether the marker genes of their microarray are marker genes in other microarrays in the database, expanding the analysis of their microarray to the rest of the public microarrays. This solution helps not only to corroborate the conclusions regarding a researcher's microarray but also to identify the phenotype of different subsets of individuals under investigation, to frame the results with microarray experiments from other species, pathologies or tissues, to search for drugs that promote the transition between the studied phenotypes, to detect undesirable side effects of the treatment applied, etc. Thus, the researcher can quickly add relevant information to his/her studies from all of the previous analyses performed in other studies as long as they have been deposited in public repositories. Marker-gene database tool: http://ibb.uab.es/mgdb © The Author 2014. Published by Oxford University Press.
Anisimov, Sergey V; Khavinson, Vladimir Kh; Anisimov, Vladimir N
2004-01-01
Aging is associated with significant alterations in gene expression in numerous organs and tissues. Anti-aging therapy with peptide bioregulators holds much promise for the correction of age-associated changes, making screening for their molecular targets in tissues an important question of modern gerontology. The synthetic tetrapeptide Cortagen (Ala-Glu-Asp-Pro) was obtained by directed synthesis based on amino acid analysis of the natural brain cortex peptide preparation Cortexin. In humans, Cortagen demonstrated a pronounced therapeutic effect on the structural and functional post-traumatic recovery of peripheral nerve tissue. Importantly, effects were also observed on cardiovascular and cerebrovascular parameters. Based on these latter observations, we hypothesized that an acute course of Cortagen treatment, followed by large-scale transcriptome analysis and identification of transcripts with altered expression in the heart, would facilitate our understanding of the mechanisms responsible for this peptide's biological effects. We therefore analyzed the expression of 15,247 transcripts by cDNA microarray in the hearts of 6-month-old female CBA mice that received injections of Cortagen for 5 consecutive days. Comparative analysis of cDNA microarray hybridisation with heart samples from the control and experimental groups revealed 234 clones (1.53% of the total number of clones) with significant changes of expression, matching 110 known genes belonging to various functional categories. Maximum up- and down-regulation was +5.42-fold and -2.86-fold, respectively. Comparison of the changes in cardiac expression profile induced by synthetic peptides (Cortagen, Vilon, Epitalon) and the pineal peptide hormone melatonin revealed both common and specific effects of Cortagen upon gene expression in the heart.
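The reported figures above can be checked with a line of arithmetic: 234 of 15,247 clones is indeed 1.53%, and the fold-change extremes convert to log2 scale (a common convention for comparing up- and down-regulation symmetrically):

```python
import math

total_clones = 15247
changed = 234
pct = 100 * changed / total_clones  # percentage of clones with significant change

# Reported extremes: +5.42-fold up, -2.86-fold down.
# Converting to log2 fold change for symmetric comparison.
max_up, max_down = 5.42, 2.86
log2_up = math.log2(max_up)
log2_down = -math.log2(max_down)
```

On log2 scale the up-regulation extreme (~+2.44) is noticeably larger in magnitude than the down-regulation extreme (~-1.52).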
Investigation of acceleration characteristics of a single-spool turbojet engine
NASA Technical Reports Server (NTRS)
Oppenheimer, Frank L; Pack, George J
1953-01-01
Operation of a single-spool turbojet engine with constant exhaust-nozzle area was investigated at one flight condition. Data were obtained by subjecting the engine to approximate step changes in fuel flow, yielding the information necessary to show the relations of acceleration to the sensed engine variables. These data show that maximum acceleration occurred prior to stall and surge. At the low end of the engine-speed range the margin was appreciable; at the high end it was smaller but had not been completely defined by these data. Data relating acceleration to speed, fuel flow, turbine-discharge temperature, compressor-discharge pressure, and thrust are presented, and an effort has been made to show how a basic control system could be improved by the addition of an override in which the acceleration characteristic is used not only to prevent the engine from entering the surge region but also to obtain acceleration along the maximum-acceleration line during throttle bursts.
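The override concept described above is, in control terms, a min-select: the throttle's fuel command is limited by a schedule of maximum allowable fuel flow versus spool speed so the engine tracks the maximum-acceleration line without crossing into surge. A minimal sketch, with an entirely hypothetical linear schedule:

```python
def fuel_command(throttle_cmd, speed, accel_schedule):
    """Min-select acceleration override: pass the throttle command through
    unless it exceeds the schedule's surge-margin fuel-flow limit at the
    current spool speed. The schedule below is illustrative only."""
    limit = accel_schedule(speed)
    return min(throttle_cmd, limit)

# Hypothetical schedule: allowable fuel flow grows with spool speed.
schedule = lambda n: 500 + 0.04 * n

burst = fuel_command(throttle_cmd=2000, speed=10000, accel_schedule=schedule)  # limited
cruise = fuel_command(throttle_cmd=700, speed=10000, accel_schedule=schedule)  # passes through
```

During a throttle burst the limiter holds the command on the schedule; at steady throttle settings below the limit, the override is transparent.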
NASA Astrophysics Data System (ADS)
Sluijs, A.; van Roij, L.; Harrington, G. J.; Schouten, S.; Sessa, J. A.; LeVay, L. J.; Reichart, G.-J.; Slomp, C. P.
2013-12-01
The Paleocene/Eocene Thermal Maximum (PETM, ~56 Ma) was a ~200 kyr episode of global warming, associated with massive injections of 13C-depleted carbon into the ocean-atmosphere system. Although climate change during the PETM is relatively well constrained, effects on marine oxygen and nutrient cycling remain largely unclear. We identify the PETM in a sediment core from the US margin of the Gulf of Mexico. Biomarker-based paleotemperature proxies (MBT/CBT and TEX86) indicate that continental air and sea surface temperatures warmed from 27-29 °C to ~35 °C, although variations in the relative abundances of terrestrial and marine biomarkers may have influenced the record. Vegetation changes recorded in pollen assemblages support profound warming. Lithology, relative abundances of terrestrial vs. marine palynomorphs, and dinoflagellate cyst and biomarker assemblages indicate sea level rise during the PETM, consistent with previously recognized eustatic rise. The recognition of a maximum flooding surface during the PETM changes regional sequence stratigraphic interpretations, which allows us to exclude the previously posed hypothesis that a nearby fossil found in PETM deposits represents the first North American primate. Within the PETM we record the biomarker isorenieratane, diagnostic of euxinic photic zone conditions. A global data compilation indicates that deoxygenation occurred in large regions of the global ocean in response to warming, hydrological change, and carbon cycle feedbacks, particularly along continental margins, analogous to modern trends. Seafloor deoxygenation and widespread anoxia likely caused phosphorus regeneration from suboxic and anoxic sediments. We argue that this fuelled shelf eutrophication, as widely recorded in microfossil studies, increasing organic carbon burial along continental margins as a negative feedback to carbon input and global warming.
If properly quantified with future work, the PETM offers the opportunity to assess the biogeochemical effects of enhanced phosphorus regeneration, as well as the time-scales on which this feedback operates in view of modern and future ocean deoxygenation.
2008 Microarray Research Group (MARG Survey): Sensing the State of Microarray Technology
Over the past several years, the field of microarrays has grown and evolved drastically. In its continued efforts to track this evolution and transformation, the ABRF-MARG has once again conducted a survey of international microarray facilities and individual microarray users. Th...
THE ABRF-MARG MICROARRAY SURVEY 2004: TAKING THE PULSE OF THE MICROARRAY FIELD
Over the past several years, the field of microarrays has grown and evolved drastically. In its continued efforts to track this evolution, the ABRF-MARG has once again conducted a survey of international microarray facilities and individual microarray users. The goal of the surve...
Contributions to Statistical Problems Related to Microarray Data
ERIC Educational Resources Information Center
Hong, Feng
2009-01-01
Microarray is a high-throughput technology to measure gene expression. Analysis of microarray data raises many interesting and challenging problems. This thesis consists of three studies related to microarray data. First, we propose a Bayesian model for microarray data and use Bayes factors to identify differentially expressed genes. Second, we…
NASA Astrophysics Data System (ADS)
Bogdanov, Valery L.; Boyce-Jacino, Michael
1999-05-01
Confined arrays of biochemical probes deposited on a solid support surface (an analytical microarray, or 'chip') provide an opportunity to analyze multiple reactions simultaneously. Microarrays are increasingly used in genetics, medicine, and environmental screening as research and analytical instruments. The power of microarray technology comes from its parallelism, which grows with array miniaturization, minimization of reagent volume per reaction site, and reaction multiplexing. An optical detector of microarray signals should combine high sensitivity with spatial and spectral resolution. Additionally, low cost and a high processing rate are needed to transfer microarray technology into biomedical practice. We designed an imager that provides confocal, complete-spectrum detection of an entire fluorescently labeled microarray in parallel. The imager uses a microlens array, a non-slit spectral decomposer, and a highly sensitive detector (cooled CCD). Two imaging channels provide simultaneous detection of localization, integrated intensity, and spectral intensity for each reaction site in the microarray. Dimensional matching between the microarray and the imager's optics eliminates all moving parts in the instrumentation, enabling highly informative, fast, and low-cost microarray detection. We report the theory of confocal hyperspectral imaging with a microlens array and experimental data for the implementation of the developed imager to detect a fluorescently labeled microarray with a density of approximately 10³ sites per cm².
NASA Technical Reports Server (NTRS)
Gagliano, J. A.; Mcsheehy, J. J.; Cavalieri, D. J.
1983-01-01
An airborne imaging 92/183 GHz radiometer was recently flown onboard NASA's Convair 990 research aircraft during the February 1983 Bering Sea Marginal Ice Zone Experiment (MIZEX-WEST). The 92 GHz portion of the radiometer was used to gather ice signature data and to generate real-time millimeter wave images of the marginal ice zone. Dry atmospheric conditions in the Arctic resulted in good surface ice signature data for the 183 GHz double sideband (DSB) channel situated + or - 8.75 GHz away from the water vapor absorption line. The radiometer's beam scanner imaged the marginal ice zone over a + or - 45 degrees swath angle about the aircraft nadir position. The aircraft altitude was 30,000 feet (9.20 km) maximum and 3,000 feet (0.92 km) minimum during the various data runs. Calculations of the minimum detectable target (ice) size for the radiometer as a function of aircraft altitude were performed. In addition, the change in the atmospheric attenuation at 92 GHz under varying weather conditions was incorporated into the target size calculations. A radiometric image of surface ice at 92 GHz in the marginal ice zone is included.
Thermal margin protection system for a nuclear reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Musick, C.R.
1974-02-12
A thermal margin protection system for a nuclear reactor is described in which the coolant flow trip point and the calculated thermal margin trip point are switched simultaneously, and the thermal limit locus is made more restrictive as the allowable flow rate is decreased. The invention is characterized by calculation of the thermal limit locus in response to applied signals that accurately represent reactor cold-leg temperature and core power; the cold-leg temperature is corrected for stratification before being utilized, and reactor power signals commensurate with power as a function of measured neutron flux and thermal energy added to the coolant are auctioneered to select the more conservative measure of power. The invention further comprises compensation of the selected core power signal for the effects of the core radial peaking factor under maximum coolant flow conditions. (Official Gazette)
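"Auctioneering" in the description above means selecting the more conservative (here, larger) of two redundant power signals before applying corrections. A minimal sketch with illustrative values; the signal names and the multiplicative peaking correction are assumptions for the example, not the patent's exact signal processing:

```python
def auctioneered_power(flux_power, thermal_power, radial_peaking=1.0):
    """Select the more conservative (larger) of the neutron-flux-based and
    thermal-energy-based core power signals, then apply a radial peaking
    factor correction. Values and correction form are illustrative."""
    return max(flux_power, thermal_power) * radial_peaking

# Thermal calorimetric signal reads higher, so it wins the auction.
p = auctioneered_power(flux_power=98.0, thermal_power=101.5, radial_peaking=1.05)
```

Using the larger of the two signals ensures the trip calculation always errs on the safe side of any instrument disagreement.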
Teke, Memik; Teke, Fatma; Alan, Bircan; Türkoğlu, Ahmet; Hamidi, Cihad; Göya, Cemil; Hattapoğlu, Salih; Gumus, Metehan
2017-01-01
Differentiation of idiopathic granulomatous mastitis (IGM) from carcinoma with routine imaging methods, such as ultrasonography (US) and mammography, is difficult. Therefore, we evaluated the value of a newly developed noninvasive technique called acoustic radiation force impulse imaging in differentiating IGM versus malignant lesions in the breast. Four hundred and eighty-six patients, who were referred to us with a presumptive diagnosis of a mass, underwent Virtual Touch tissue imaging (VTI; Siemens) and Virtual Touch tissue quantification (VTQ; Siemens) after conventional gray-scale US. US-guided percutaneous needle biopsy was then performed on 276 lesions with clinically and radiologically suspicious features. Malignant lesions (n = 122) and IGM (n = 48) were included in the final study group. There was a statistically significant difference in shear wave velocity marginal and internal values between the IGM and malignant lesions. The median marginal velocity for IGM and malignant lesions was 3.19 m/s (minimum-maximum 2.49-5.82) and 5.05 m/s (minimum-maximum 2.09-8.46), respectively (p < 0.001). The median internal velocity for IGM and malignant lesions was 2.76 m/s (minimum-maximum 1.14-4.12) and 4.79 m/s (minimum-maximum 2.12-8.02), respectively (p < 0.001). The combination of VTI and VTQ as a complement to conventional US provides viscoelastic properties of tissues, and thus has the potential to increase the specificity of US.
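The significant separation between median shear wave velocities (IGM ~3.19 m/s vs. malignant ~5.05 m/s marginal) suggests a simple velocity cutoff could flag suspicious lesions. A hypothetical sketch; the 4.0 m/s cutoff and the decision rule are illustrative assumptions, not derived from the study:

```python
def classify_lesion(marginal_v, internal_v, cutoff=4.0):
    """Flag a lesion as suspicious when either shear wave velocity (m/s)
    exceeds the cutoff. The 4.0 m/s cutoff is hypothetical."""
    return "suspicious" if max(marginal_v, internal_v) > cutoff else "benign-range"

igm_like = classify_lesion(3.19, 2.76)        # median IGM values from the study
malignant_like = classify_lesion(5.05, 4.79)  # median malignant values
```

In practice a cutoff would have to be chosen from ROC analysis of the full velocity distributions, since the reported minimum-maximum ranges overlap substantially.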
Chemiluminescence microarrays in analytical chemistry: a critical review.
Seidel, Michael; Niessner, Reinhard
2014-09-01
Multi-analyte immunoassays on microarrays and on multiplex DNA microarrays have been described for quantitative analysis of small organic molecules (e.g., antibiotics, drugs of abuse, small molecule toxins), proteins (e.g., antibodies or protein toxins), and microorganisms, viruses, and eukaryotic cells. In analytical chemistry, multi-analyte detection by use of analytical microarrays has become an innovative research topic because of the possibility of generating several sets of quantitative data for different analyte classes in a short time. Chemiluminescence (CL) microarrays are powerful tools for rapid multiplex analysis of complex matrices. A wide range of applications for CL microarrays is described in the literature dealing with analytical microarrays. The motivation for this review is to summarize the current state of CL-based analytical microarrays. Combining analysis of different compound classes on CL microarrays reduces analysis time, cost of reagents, and use of laboratory space. Applications are discussed, with examples from food safety, water safety, environmental monitoring, diagnostics, forensics, toxicology, and biosecurity. The potential and limitations of research on multiplex analysis by use of CL microarrays are discussed in this review.
Code of Federal Regulations, 2014 CFR
2014-01-01
... General. (a) For oil systems and components that have been approved under the engine airworthiness...) Each engine must have an independent oil system that can supply it with an appropriate quantity of oil... the maximum oil consumption of the engine under the same conditions, plus a suitable margin to ensure...
Code of Federal Regulations, 2011 CFR
2011-01-01
... General. (a) For oil systems and components that have been approved under the engine airworthiness...) Each engine must have an independent oil system that can supply it with an appropriate quantity of oil... the maximum oil consumption of the engine under the same conditions, plus a suitable margin to ensure...
Code of Federal Regulations, 2012 CFR
2012-01-01
... General. (a) For oil systems and components that have been approved under the engine airworthiness...) Each engine must have an independent oil system that can supply it with an appropriate quantity of oil... the maximum oil consumption of the engine under the same conditions, plus a suitable margin to ensure...
Code of Federal Regulations, 2013 CFR
2013-01-01
... General. (a) For oil systems and components that have been approved under the engine airworthiness...) Each engine must have an independent oil system that can supply it with an appropriate quantity of oil... the maximum oil consumption of the engine under the same conditions, plus a suitable margin to ensure...
Code of Federal Regulations, 2010 CFR
2010-01-01
... General. (a) For oil systems and components that have been approved under the engine airworthiness...) Each engine must have an independent oil system that can supply it with an appropriate quantity of oil... the maximum oil consumption of the engine under the same conditions, plus a suitable margin to ensure...
Comparing Three Estimation Methods for the Three-Parameter Logistic IRT Model
ERIC Educational Resources Information Center
Lamsal, Sunil
2015-01-01
Different estimation procedures have been developed for the unidimensional three-parameter item response theory (IRT) model. These techniques include marginal maximum likelihood estimation, fully Bayesian estimation using Markov chain Monte Carlo simulation techniques, and Metropolis-Hastings Robbins-Monro estimation. With each…
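All of the estimation procedures compared above target the same item response function, the three-parameter logistic (3PL) model. A minimal sketch of that function; the parameter values in the example are illustrative:

```python
import math

def p_correct(theta, a, b, c):
    """Three-parameter logistic (3PL) item response function:
    P(theta) = c + (1 - c) / (1 + exp(-a * (theta - b)))
    a: discrimination, b: difficulty, c: pseudo-guessing lower asymptote."""
    return c + (1 - c) / (1 + math.exp(-a * (theta - b)))

# At theta == b the curve sits exactly halfway between c and 1.
p_mid = p_correct(theta=0.0, a=1.2, b=0.0, c=0.2)
```

Estimation methods differ in how they integrate over the latent ability θ (marginal maximum likelihood integrates it out numerically; MCMC samples it), but the likelihood is built from this same curve.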
Analysis of High-Throughput ELISA Microarray Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Amanda M.; Daly, Don S.; Zangar, Richard C.
Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).
Phillips, P.J.; Shedlock, R.J.
1993-01-01
The hydrochemistry of small seasonal ponds was investigated by studying relations between ground-water and surface water in a forested Coastal Plain drainage basin. Observation of changes in the water table in a series of wells equipped with automatic water-level recorders showed that the relation between water-table configuration and basin topography changes seasonally, and particularly in response to spring recharge. Furthermore, in this study area the water table is not a subdued expression of the land surface topography, as is commonly assumed. During the summer and fall months, a water-table trough underlies sandy ridges separating the seasonal ponds, and maximum water-table altitudes prevail in the sediments beneath the dry pond bottoms. As the ponds fill with water during the winter, maximum water-table altitudes shift to the upland-margin zone adjacent to the seasonal ponds. Increases in pond stage are associated with the development of transient water-table mounds at the upland-margin wells during the spring. The importance of small local-flow systems adjacent to the seasonal ponds also is shown by the similarities in the chemistry of the shallow groundwater in the upland margin and water in the seasonal ponds. The upland margin and surface water samples have low pH (generally less than 5.0), large concentrations of dissolved aluminum (generally more than 100 µg l⁻¹), and low bicarbonate concentrations (2 mg l⁻¹ or less). In contrast, the parts of the surficial aquifer that do not experience transient mounding have higher pH and larger concentrations of bicarbonate. These results suggest that an understanding of the hydrochemistry of seasonally ponded wetlands requires intensive study of the adjacent shallow groundwater-flow system. © 1993.
NASA Astrophysics Data System (ADS)
Tierney, J.; Cleaveland, L.; Herbert, T.; Altabet, M.
2004-12-01
The Peru Margin upwelling zone plays a key role in regulating marine biogeochemical cycles, particularly the fate of nitrate. High biological productivity and low oxygen waters fed into the oxygen minimum zone result in intense denitrification in the modern system, the consequences of which are global in nature. It has been very difficult, however, to study the paleoclimatic history of this region because of the poor preservation of carbonate in Peru Margin sediments. Here we present records of trace metal accumulation from two cores located in the heart of the suboxic zone off the central Peru coast. Chronology comes from multiple AMS 14C dates on the alkenone fraction of the sediment, as well as correlation using major features of the δ15N record in each core. ODP Site 1228 provides a high resolution, continuous sediment record from the Recent to about 14 ka, while gravity core W7706-41k extends the record to the Last Glacial Maximum. Both cores were sampled at a 100 yr resolution, then analyzed for % N, δ15N, alkenones, and trace metal concentration. Analysis of redox-sensitive metals (Mo and V) alongside metals associated with changes in productivity (Ni and Zn) provides perspective on the evolution of the upwelling system and distinguishes the two major factors controlling the intensity of the oxygen minimum zone. The trace metal record exhibits a notable increase in the intensity and variability of low oxygen waters and productivity beginning around 6 ka and extending to the present. Within this most recent 6 ka interval, the data suggest fluctuations in oxygenation and productivity occur on 1000 yr timescales. Our core records, therefore, suggest that the Peru Margin upwelling system strengthened significantly during the mid to late Holocene.
Max-margin weight learning for medical knowledge network.
Jiang, Jingchi; Xie, Jing; Zhao, Chao; Su, Jia; Guan, Yi; Yu, Qiubin
2018-03-01
The application of medical knowledge strongly affects the performance of intelligent diagnosis, and the method of learning the weights of medical knowledge plays a substantial role in probabilistic graphical models (PGMs). The purpose of this study is to investigate a discriminative weight-learning method based on a medical knowledge network (MKN). We propose a training model called the maximum margin medical knowledge network (M³KN), which is strictly derived for calculating the weight of medical knowledge. Using the definition of a reasonable margin, the weight learning can be transformed into a margin optimization problem. To solve the optimization problem, we adopt a sequential minimal optimization (SMO) algorithm and the clique property of a Markov network. Ultimately, M³KN not only incorporates the inference ability of PGMs but also deals with high-dimensional logic knowledge. The experimental results indicate that M³KN obtains a higher F-measure score than the maximum likelihood learning algorithm of MKN for both Chinese Electronic Medical Records (CEMRs) and Blood Examination Records (BERs). Furthermore, the proposed approach is clearly superior to some classical machine learning algorithms for medical diagnosis. To adequately demonstrate the importance of domain knowledge, we numerically verify that the diagnostic accuracy of M³KN gradually improves as the number of learned CEMRs, which contain important medical knowledge, increases. Our experimental results show that the proposed method performs reliably for learning the weights of medical knowledge. M³KN outperforms other existing methods by achieving an F-measure of 0.731 for CEMRs and 0.4538 for BERs. This further illustrates that M³KN can facilitate investigations of intelligent healthcare. Copyright © 2018 Elsevier B.V. All rights reserved.
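The F-measure used to evaluate M³KN above is the standard precision-recall summary. A minimal sketch of its definition; the precision/recall inputs in the example are illustrative, since the paper reports only the resulting F-measures (0.731 and 0.4538):

```python
def f_measure(precision, recall, beta=1.0):
    """F-measure: weighted harmonic mean of precision and recall.
    beta=1 gives the standard F1 score; beta>1 favors recall."""
    if precision + recall == 0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Illustrative values only, not taken from the paper.
f1 = f_measure(precision=0.75, recall=0.7)
```

Because it is a harmonic-style mean, the F-measure is dragged toward the weaker of the two components, which is why it is preferred over accuracy for imbalanced diagnosis tasks.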
Dynamical resonance shift and unification of resonances in short-pulse laser-cluster interaction
NASA Astrophysics Data System (ADS)
Mahalik, S. S.; Kundu, M.
2018-06-01
Pronounced maximum absorption of laser light irradiating a rare-gas or metal cluster is widely expected during the linear resonance (LR), when the Mie-plasma wavelength λM of the electrons equals the laser wavelength λ. On the contrary, molecular dynamics (MD) simulations of an argon cluster irradiated by short 5-fs (FWHM) laser pulses reveal that, for a given laser pulse energy and cluster, at each peak intensity there exists a λ, shifted from the expected λM, that corresponds to a unified dynamical LR. At this λ, the evolution of the cluster proceeds through very efficient unification of the possible resonances in its various stages, including (i) the LR in the initial period of plasma creation, (ii) the LR in the later Coulomb-expanding phase, and (iii) anharmonic resonance in the marginally overdense regime for relatively longer pulse durations, leading to maximum laser absorption accompanied by maximum removal of electrons from the cluster and the maximum allowed average charge states for the argon cluster. As the laser intensity increases, the absorption maximum is found to shift to a higher wavelength in the band of λ ≈ (1-1.5)λM rather than remaining at the expected λM. A naive rigid-sphere model also corroborates the wavelength shift of the absorption peak found in MD and unequivocally proves that maximum laser absorption in a cluster occurs at a shifted λ in the marginally overdense regime of λ ≈ (1-1.5)λM instead of at the λM of LR. The present study is important for guiding laser-cluster interaction experiments toward optimal conditions in the short-pulse regime.
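The linear-resonance condition above compares the laser wavelength with the Mie-plasma wavelength of the cluster, λM = 2πc/ωM, where for a spherical cluster the Mie frequency is ωM = ωp/√3 and ωp is the electron plasma frequency. A sketch of that computation (the sample density is illustrative):

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
E_MASS = 9.1093837015e-31    # electron mass, kg
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
C_LIGHT = 2.99792458e8       # speed of light, m/s

def mie_wavelength(n_e):
    """Mie-plasma wavelength (m) of a spherical cluster with electron
    density n_e (m^-3): omega_p = sqrt(n_e e^2 / (eps0 m_e)),
    omega_M = omega_p / sqrt(3), lambda_M = 2*pi*c / omega_M."""
    omega_p = math.sqrt(n_e * E_CHARGE**2 / (EPS0 * E_MASS))
    omega_mie = omega_p / math.sqrt(3)
    return 2 * math.pi * C_LIGHT / omega_mie

lam = mie_wavelength(1e27)  # representative overdense electron density
```

Since λM scales as 1/√n_e, the "marginally overdense" band λ ≈ (1-1.5)λM discussed in the abstract corresponds to densities within roughly a factor of two of the critical density for the shifted resonance.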
DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K
2012-04-05
We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.
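Marginalizing the most probable joint distribution, as described above, reduces to summing the joint probability over one axis at a time. A minimal standard-library sketch with a hypothetical 3×3 joint grid (photon-count bins by apparent-FRET-efficiency bins; values are illustrative):

```python
# Hypothetical joint probability P(photon_bin, fret_bin) on a small grid.
joint = [
    [0.05, 0.10, 0.05],
    [0.10, 0.30, 0.10],
    [0.05, 0.20, 0.05],
]

# Marginal over FRET bins -> distribution of total fluorescence photons.
p_photons = [sum(row) for row in joint]

# Marginal over photon bins -> apparent FRET efficiency distribution.
p_fret = [sum(col) for col in zip(*joint)]
```

Each marginal preserves total probability, so both recovered distributions sum to one, mirroring how the joint inference yields both the photon-count and FRET-efficiency distributions in the method described.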
Microleakage of Four Dental Cements in Metal Ceramic Restorations With Open Margins.
Eftekhar Ashtiani, Reza; Farzaneh, Babak; Azarsina, Mohadese; Aghdashi, Farzad; Dehghani, Nima; Afshari, Aisooda; Mahshid, Minu
2015-11-01
Fixed prosthodontics is a routine dental treatment and microleakage is a major cause of its failure. The aim of this study was to assess the marginal microleakage of four cements in metal ceramic restorations with adapted and open margins. Sixty sound human premolars were selected for this experimental study performed in Tehran, Iran and prepared for full-crown restorations. Wax patterns were formed leaving a 300 µm gap on one of the proximal margins. The crowns were cast and the samples were randomly divided into four groups based on the cement used. Copings were cemented using zinc phosphate cement (Fleck), Fuji Plus resin-modified glass ionomer, Panavia F2.0 resin cement, or G-Cem resin cement, according to the manufacturers' instructions. Samples were immersed in 2% methylene blue solution. After 24 hours, dye penetration was assessed under a stereomicroscope and analyzed using the respective software. Data were analyzed using ANOVA, paired t-tests, and Kruskal-Wallis, Wilcoxon, and Mann-Whitney tests. The least microleakage occurred in the Panavia F2.0 group (closed margin, 0.18 mm; open margin, 0.64 mm) and the maximum was observed in the Fleck group (closed margin, 1.92 mm; open margin, 3.32 mm). The Fleck group displayed significantly more microleakage compared to the Fuji Plus and Panavia F2.0 groups (P < 0.001) in both closed and open margins. In open margins, differences in microleakage between the Fuji Plus and G-Cem as well as between the G-Cem and Panavia F2.0 groups were significant (P < 0.001). In closed margins, only the G-Cem group displayed significantly more microleakage as compared to the Panavia F2.0 group (P < 0.05). Paired t-test results showed significantly more microleakage in open margins compared to closed margins, except in the Fuji Plus group (P = 0.539). Fuji Plus cement exhibited better sealing ability in closed and open margins compared to G-Cem and Fleck cements. 
When using G-Cem and Fleck cements for full metal ceramic restorations, clinicians should try to minimize marginal gaps in order to reduce restoration failure. In situations where there are doubts about perfect marginal adaptation, the use of Fuji Plus cement may be helpful.
In vitro marginal fit of three all-ceramic crown systems.
Yeo, In-Sung; Yang, Jae-Ho; Lee, Jai-Bong
2003-11-01
Studies on marginal discrepancies of single restorations using various systems and materials have resulted in statistical inferences that are ambiguous because of small sample sizes and limited numbers of measurements per specimen. The purpose of this study was to compare the marginal adaptation of single anterior restorations made using different systems. The in vitro marginal discrepancies of 3 different all-ceramic crown systems (Celay In-Ceram, conventional In-Ceram, and IPS Empress 2 layering technique) and a control group of metal ceramic restorations were evaluated and compared by measuring the gap dimension between the crowns and the prepared tooth at the marginal opening. The crowns were made for 1 extracted maxillary central incisor prepared with a 1-mm shoulder margin and 6-degree tapered walls by milling. Thirty crowns per system were fabricated. Crown measurements were recorded with an optical microscope, with an accuracy of ±0.1 µm, at 50 points spaced approximately 400 µm along the circumferential margin. The criterion of 120 µm was used as the maximum clinically acceptable marginal gap. Mean gap dimensions and standard deviations were calculated for the marginal opening. The data were analyzed with a 1-way analysis of variance (alpha=.05). Mean gap dimensions and standard deviations at the marginal opening for the incisor crowns were 87 ± 34 µm for the control, 83 ± 33 µm for Celay In-Ceram, 112 ± 55 µm for conventional In-Ceram, and 46 ± 16 µm for the IPS Empress 2 layering technique. Significant differences were found among the crown groups (P<.05). Compared with the control group, the IPS Empress 2 group had significantly smaller marginal discrepancies (P<.05), and the conventional In-Ceram group exhibited significantly greater marginal discrepancies (P<.05). There was no significant difference between the Celay In-Ceram and the control group. 
Within the limitations of this study, the marginal discrepancies were all within the clinically acceptable standard set at 120 µm. However, the IPS Empress 2 system showed the smallest and most homogeneous gap dimension, whereas the conventional In-Ceram system presented the largest and most variable gap dimension compared with the metal ceramic (control) restoration.
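The study's summary statistics (mean gap, standard deviation, and comparison against the 120 µm acceptability criterion) can be sketched in a few lines. The sample gap values below are illustrative, not the study's raw measurements:

```python
def summarize_gaps(gaps_um, criterion_um=120.0):
    """Mean marginal gap, sample standard deviation, and clinical
    acceptability against the 120 um criterion used in the study."""
    mean = sum(gaps_um) / len(gaps_um)
    var = sum((g - mean) ** 2 for g in gaps_um) / (len(gaps_um) - 1)
    return mean, var ** 0.5, mean <= criterion_um

# Illustrative gap measurements (um) for a single crown.
mean, sd, acceptable = summarize_gaps([40, 52, 47, 44, 49])
```

In the study itself, 50 such measurements per crown and 30 crowns per system fed the one-way ANOVA comparing group means.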
14 CFR 29.1011 - Engines: general.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Engines: general. 29.1011 Section 29.1011... STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Powerplant Oil System § 29.1011 Engines: general. (a) Each engine... the maximum allowable oil consumption of the engine under the same conditions, plus a suitable margin...
14 CFR 27.1011 - Engines: General.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Engines: General. 27.1011 Section 27.1011... STANDARDS: NORMAL CATEGORY ROTORCRAFT Powerplant Oil System § 27.1011 Engines: General. (a) Each engine must... maximum oil consumption of the engine under the same conditions, plus a suitable margin to ensure adequate...
Estimating the Parameters of the Beta-Binomial Distribution.
ERIC Educational Resources Information Center
Wilcox, Rand R.
1979-01-01
For some situations the beta-binomial distribution might be used to describe the marginal distribution of test scores for a particular population of examinees. Several different methods of approximating the maximum likelihood estimate were investigated, and it was found that the Newton-Raphson method should be used when it yields admissible…
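The Newton-Raphson approach referenced above maximizes a log-likelihood by driving its derivative (the score) to zero via the iteration x ← x − score(x)/score′(x). The sketch below uses a simple binomial log-likelihood as a stand-in toy; the beta-binomial case follows the same iteration with its own score (which involves digamma functions), and the names here are illustrative:

```python
def newton_raphson_mle(score, score_prime, x0, tol=1e-10, max_iter=100):
    """Newton-Raphson on the score function (derivative of the
    log-likelihood): iterate x <- x - score(x)/score'(x) to convergence."""
    x = x0
    for _ in range(max_iter):
        step = score(x) / score_prime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Toy check: binomial likelihood with k successes in n trials has MLE k/n.
k, n = 7, 10
score = lambda p: k / p - (n - k) / (1 - p)
score_prime = lambda p: -k / p**2 - (n - k) / (1 - p)**2
p_hat = newton_raphson_mle(score, score_prime, x0=0.5)
```

The "admissible" caveat in the abstract matters because Newton-Raphson can step outside the parameter space or diverge from poor starting values, in which case a safeguarded method is needed.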
ERIC Educational Resources Information Center
Prevost, A. Toby; Mason, Dan; Griffin, Simon; Kinmonth, Ann-Louise; Sutton, Stephen; Spiegelhalter, David
2007-01-01
Practical meta-analysis of correlation matrices generally ignores covariances (and hence correlations) between correlation estimates. The authors consider various methods for allowing for covariances, including generalized least squares, maximum marginal likelihood, and Bayesian approaches, illustrated using a 6-dimensional response in a series of…
Estimation Methods for One-Parameter Testlet Models
ERIC Educational Resources Information Center
Jiao, Hong; Wang, Shudong; He, Wei
2013-01-01
This study demonstrated the equivalence between the Rasch testlet model and the three-level one-parameter testlet model and explored the Markov Chain Monte Carlo (MCMC) method for model parameter estimation in WINBUGS. The estimation accuracy from the MCMC method was compared with those from the marginalized maximum likelihood estimation (MMLE)…
14 CFR 27.1011 - Engines: General.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Engines: General. 27.1011 Section 27.1011... STANDARDS: NORMAL CATEGORY ROTORCRAFT Powerplant Oil System § 27.1011 Engines: General. (a) Each engine must... maximum oil consumption of the engine under the same conditions, plus a suitable margin to ensure adequate...
14 CFR 27.1011 - Engines: General.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Engines: General. 27.1011 Section 27.1011... STANDARDS: NORMAL CATEGORY ROTORCRAFT Powerplant Oil System § 27.1011 Engines: General. (a) Each engine must... maximum oil consumption of the engine under the same conditions, plus a suitable margin to ensure adequate...
14 CFR 27.1011 - Engines: General.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Engines: General. 27.1011 Section 27.1011... STANDARDS: NORMAL CATEGORY ROTORCRAFT Powerplant Oil System § 27.1011 Engines: General. (a) Each engine must... maximum oil consumption of the engine under the same conditions, plus a suitable margin to ensure adequate...
14 CFR 29.1011 - Engines: general.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Engines: general. 29.1011 Section 29.1011... STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Powerplant Oil System § 29.1011 Engines: general. (a) Each engine... the maximum allowable oil consumption of the engine under the same conditions, plus a suitable margin...
14 CFR 29.1011 - Engines: general.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Engines: general. 29.1011 Section 29.1011... STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Powerplant Oil System § 29.1011 Engines: general. (a) Each engine... the maximum allowable oil consumption of the engine under the same conditions, plus a suitable margin...
14 CFR 27.1011 - Engines: General.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Engines: General. 27.1011 Section 27.1011... STANDARDS: NORMAL CATEGORY ROTORCRAFT Powerplant Oil System § 27.1011 Engines: General. (a) Each engine must... maximum oil consumption of the engine under the same conditions, plus a suitable margin to ensure adequate...
14 CFR 29.1011 - Engines: general.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Engines: general. 29.1011 Section 29.1011... STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Powerplant Oil System § 29.1011 Engines: general. (a) Each engine... the maximum allowable oil consumption of the engine under the same conditions, plus a suitable margin...
14 CFR 29.1011 - Engines: general.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Engines: general. 29.1011 Section 29.1011... STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Powerplant Oil System § 29.1011 Engines: general. (a) Each engine... the maximum allowable oil consumption of the engine under the same conditions, plus a suitable margin...
The CO2 laser frequency stability measurements
NASA Technical Reports Server (NTRS)
Johnson, E. H., Jr.
1973-01-01
Carbon dioxide laser frequency stability data are considered for a receiver design that relates to maximum Doppler frequency and its rate of change. Results show that an adequate margin exists in terms of data acquisition, Doppler tracking, and bit error rate as they relate to laser stability and transmitter power.
Intra-Platform Repeatability and Inter-Platform Comparability of MicroRNA Microarray Technology
Sato, Fumiaki; Tsuchiya, Soken; Terasawa, Kazuya; Tsujimoto, Gozoh
2009-01-01
Over the last decade, DNA microarray technology has made a great contribution to the life sciences. The MicroArray Quality Control (MAQC) project demonstrated how to analyze expression microarrays. Recently, microarray technology has been utilized for comprehensive microRNA expression profiling. Currently, several platforms of microRNA microarray chips are commercially available. Thus, we compared the repeatability and comparability of five different microRNA microarray platforms (Agilent, Ambion, Exiqon, Invitrogen and Toray) using 309 microRNA probes, and the Taqman microRNA system using 142 microRNA probes. This study demonstrated that microRNA microarrays have high intra-platform repeatability and good comparability to quantitative RT-PCR of microRNA. Among the five platforms, the Agilent and Toray arrays showed relatively better performance than the others. However, the current lineup of commercially available microRNA microarray systems fails to show good inter-platform concordance, probably because of the lack of an adequate normalization method and severe divergence in the stringency of detection call criteria between platforms. This study provides basic information about the performance and the problems specific to current microRNA microarray systems. PMID:19436744
Living Cell Microarrays: An Overview of Concepts
Jonczyk, Rebecca; Kurth, Tracy; Lavrentieva, Antonina; Walter, Johanna-Gabriela; Scheper, Thomas; Stahl, Frank
2016-01-01
Living cell microarrays are a highly efficient cellular screening system. Due to the low number of cells required per spot, cell microarrays enable the use of primary and stem cells and provide resolution close to the single-cell level. Apart from a variety of conventional static designs, microfluidic microarray systems have also been established. An alternative format is a microarray consisting of three-dimensional cell constructs ranging from cell spheroids to cells encapsulated in hydrogel. These systems provide an in vivo-like microenvironment and are preferably used for the investigation of cellular physiology, cytotoxicity, and drug screening. Thus, many different high-tech microarray platforms are currently available. Disadvantages of many systems include their high cost, the requirement of specialized equipment for their manufacture, and the poor comparability of results between different platforms. In this article, we provide an overview of static, microfluidic, and 3D cell microarrays. In addition, we describe a simple method for the printing of living cell microarrays on modified microscope glass slides using standard DNA microarray equipment available in most laboratories. Applications in research and diagnostics are discussed, e.g., the selective and sensitive detection of biomarkers. Finally, we highlight current limitations and the future prospects of living cell microarrays. PMID:27600077
Microarray analysis of miRNA expression profiles following whole body irradiation in a mouse model.
Aryankalayil, Molykutty J; Chopra, Sunita; Makinde, Adeola; Eke, Iris; Levin, Joel; Shankavaram, Uma; MacMillan, Laurel; Vanpouille-Box, Claire; Demaria, Sandra; Coleman, C Norman
2018-06-19
Accidental exposure to life-threatening radiation in a nuclear event is a major concern; there is an enormous need to identify biomarkers for radiation biodosimetry to triage populations and treat critically exposed individuals. The objective was to identify dose-differentiating miRNA signatures from whole blood samples of whole-body irradiated mice. Mice were whole-body irradiated with X-rays (2 Gy-15 Gy); blood was collected at various time-points post-exposure; total RNA was isolated; miRNA microarrays were performed; miRNAs differentially expressed in irradiated vs. unirradiated controls were identified; and feature extraction and classification models were applied to predict a dose-differentiating miRNA signature. We observed time- and dose-responsive alterations in the expression levels of miRNAs. The largest numbers of miRNAs were altered at the 24-h and 48-h time-points post-irradiation. A 23-miRNA signature was identified using feature selection algorithms and classifier models. An inverse correlation between the expression-level changes of miR-17 members and their targets was observed in whole-body irradiated mice and non-human primates. Whole blood-based miRNA expression signatures might be used for predicting radiation exposures in a mass casualty nuclear incident.
Clevert, Djork-Arné; Mitterecker, Andreas; Mayr, Andreas; Klambauer, Günter; Tuefferd, Marianne; De Bondt, An; Talloen, Willem; Göhlmann, Hinrich; Hochreiter, Sepp
2011-07-01
Cost-effective oligonucleotide genotyping arrays like the Affymetrix SNP 6.0 are still the predominant technique for measuring DNA copy number variations (CNVs). However, CNV detection methods for microarrays overestimate both the number and the size of CNV regions and, consequently, suffer from a high false discovery rate (FDR). A high FDR means that many CNVs are wrongly detected and therefore not associated with a disease in a clinical study, yet correction for multiple testing must take them into account and thereby decreases the study's discovery power. For controlling the FDR, we propose a probabilistic latent variable model, 'cn.FARMS', which is optimized by a Bayesian maximum a posteriori approach. cn.FARMS controls the FDR through the information gain of the posterior over the prior. The prior represents the null hypothesis of copy number 2 for all samples, from which the posterior can deviate only upon strong and consistent signals in the data. On HapMap data, cn.FARMS clearly outperformed the two most prevalent methods with respect to sensitivity and FDR. The software cn.FARMS is publicly available as an R package at http://www.bioinf.jku.at/software/cnfarms/cnfarms.html.
Optimal moment determination in POME-copula based hydrometeorological dependence modelling
NASA Astrophysics Data System (ADS)
Liu, Dengfeng; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi
2017-07-01
Copulas have been commonly applied to multivariate modelling in various fields where marginal distribution inference is a key element. To develop a flexible, unbiased mathematical inference framework for hydrometeorological multivariate applications, the principle of maximum entropy (POME) is being increasingly coupled with copulas. However, previous POME-based studies have generally not considered the determination of optimal moment constraints. The main contribution of this study is the determination of optimal moments for POME, yielding a coupled optimal moment-POME-copula framework for modelling hydrometeorological multivariate events. In this framework, margins (marginal distributions) are derived with POME, subject to optimal moment constraints. Various candidate copulas are then constructed from the derived margins, and the most probable one is selected on the basis of goodness-of-fit statistics. This optimal moment-POME-copula framework is applied to model the dependence patterns of three types of hydrometeorological events: (i) single-site streamflow-water level; (ii) multi-site streamflow; and (iii) multi-site precipitation, with data collected from Yichang and Hankou in the Yangtze River basin, China. Results indicate that the optimal-moment POME is more accurate in margin fitting and that the corresponding copulas show good statistical performance in correlation simulation. The derived copulas also capture patterns that traditional correlation coefficients cannot reflect, providing an efficient approach for other applied scenarios in hydrometeorological multivariate modelling.
Design and analytical study of a rotor airfoil
NASA Technical Reports Server (NTRS)
Dadone, L. U.
1978-01-01
An airfoil section for use on helicopter rotor blades was defined and analyzed by means of potential flow/boundary layer interaction and viscous transonic flow methods to meet as closely as possible a set of advanced airfoil design objectives. The design efforts showed that the first priority objectives, including selected low speed pitching moment, maximum lift and drag divergence requirements can be met, though marginally. The maximum lift requirement at M = 0.5 and most of the profile drag objectives cannot be met without some compromise of at least one of the higher order priorities.
ELISA-BASE: An Integrated Bioinformatics Tool for Analyzing and Tracking ELISA Microarray Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Amanda M.; Collett, James L.; Seurynck-Servoss, Shannon L.
ELISA-BASE is an open-source database for capturing, organizing and analyzing protein enzyme-linked immunosorbent assay (ELISA) microarray data. ELISA-BASE is an extension of the BioArray Software Environment (BASE) database system, which was developed for DNA microarrays. In order to make BASE suitable for protein microarray experiments, we developed several plugins for importing and analyzing quantitative ELISA microarray data. Most notably, our Protein Microarray Analysis Tool (ProMAT) for processing quantitative ELISA data is now available as a plugin to the database.
Marginalized zero-inflated negative binomial regression with application to dental caries
Preisser, John S.; Das, Kalyan; Long, D. Leann; Divaris, Kimon
2015-01-01
The zero-inflated negative binomial regression model (ZINB) is often employed in diverse fields such as dentistry, health care utilization, highway safety, and medicine to examine relationships between exposures of interest and overdispersed count outcomes exhibiting many zeros. The regression coefficients of ZINB have latent class interpretations for a susceptible subpopulation at risk for the disease/condition under study with counts generated from a negative binomial distribution and for a non-susceptible subpopulation that provides only zero counts. The ZINB parameters, however, are not well-suited for estimating overall exposure effects, specifically, in quantifying the effect of an explanatory variable in the overall mixture population. In this paper, a marginalized zero-inflated negative binomial regression (MZINB) model for independent responses is proposed to model the population marginal mean count directly, providing straightforward inference for overall exposure effects based on maximum likelihood estimation. Through simulation studies, the finite sample performance of MZINB is compared to marginalized zero-inflated Poisson, Poisson, and negative binomial regression. The MZINB model is applied in the evaluation of a school-based fluoride mouthrinse program on dental caries in 677 children. PMID:26568034
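The distinction the abstract draws can be made concrete: in a ZINB model the negative binomial mean mu applies only to the susceptible latent class, so the population marginal mean is (1 - psi) * mu, the quantity MZINB parameterizes directly. A small numerical check of this identity (the parameter values below are arbitrary choices for illustration):

```python
import math

def zinb_pmf(y, psi, mu, k):
    # Zero-inflated negative binomial: with probability psi a structural zero,
    # otherwise a negative binomial count with latent-class mean mu and
    # dispersion k (success probability k / (k + mu)).
    nb = (math.exp(math.lgamma(y + k) - math.lgamma(k) - math.lgamma(y + 1))
          * (k / (k + mu)) ** k * (mu / (k + mu)) ** y)
    return (psi + (1 - psi) * nb) if y == 0 else (1 - psi) * nb

psi, mu, k = 0.3, 2.0, 1.5          # illustrative values only
support = range(0, 500)             # truncation; the NB tail is negligible here
total = sum(zinb_pmf(y, psi, mu, k) for y in support)
marginal_mean = sum(y * zinb_pmf(y, psi, mu, k) for y in support)
# The overall mixture mean is (1 - psi) * mu = 1.4, not the latent-class mu = 2.0.
```

This is why the latent-class coefficients of ZINB do not quantify overall exposure effects, while a marginalized parameterization does.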
Study on casing treatment and stator matching on multistage fan
NASA Astrophysics Data System (ADS)
Wu, Chuangliang; Yuan, Wei; Deng, Zhe
2017-10-01
Casing treatments are required for expanding the stall margin of multi-stage high-load turbofans designed with high blade-tip Mach numbers and high leakage flow. In the case of a low mass flow, the casing treatment effectively reduces the blockages caused by the leakage flow and enlarges the stall margin. However, in the case of a high mass flow, the casing treatment affects the overall flow capacity of the fan, and hence the thrust, when operating at the high speeds usually required by design-point specifications. Herein, we study a two-stage high-load fan with three-dimensional numerical simulations. We use the simulation results to propose a scheme that enlarges the stall margin of multistage high-load fans without sacrificing flow capacity when operating with a large mass flow. Specifically, a circumferential groove casing treatment is used and the upstream stator angle is adjusted to match the casing treatment. The stall margin is thus increased to 16.3%, with no reduction in the maximum mass flow rate or the design thrust performance.
Patankar, Anuya; Kheur, Mohit; Kheur, Supriya; Lakha, Tabrez; Burhanpurwala, Murtuza
2016-12-01
This in vitro study evaluated the effect of different levels of preparation of an implant abutment on its fracture resistance. The study evaluated abutments that incorporated a platform switch (Myriad Plus Abutments, Morse Taper Connection) and standard abutments (BioHorizons Standard Abutment, BioHorizons Inc). Each abutment was connected to an appropriate implant and mounted in a self-cured resin base. Based on abutment preparation depth, three groups were created for each abutment type: as manufactured, abutment prepared 1 mm apical to the original margin, and abutment prepared 1.5 mm apical to the original margin. All abutments were prepared in a standardized manner to incorporate a uniform 0.5 mm chamfer margin. All abutments were torqued to 30 Ncm on their respective implants and then loaded until failure in a universal testing machine. Unprepared abutments showed the maximum resistance to fracture for both abutment types. As the preparation depth increased, the fracture resistance decreased. The fracture resistance of the implant-abutment junction decreases as the preparation depth increases.
Welter, S; Stöcker, C; Dicken, V; Kühl, H; Krass, S; Stamatis, G
2012-03-01
Segmental resection in stage I non-small cell lung cancer (NSCLC) has been well described and is considered to have survival rates similar to lobectomy, but with increased rates of local tumour recurrence due to inadequate parenchymal margins. In consequence, segmentectomy is today performed only when the tumour is smaller than 2 cm. Three-dimensional reconstructions of bronchopulmonary segments were generated from 11 thin-slice CT scans, and virtual spherical tumours were placed over the segments, respecting all segmental borders. As a next step, virtual parenchymal safety margins of 2 cm and 3 cm were subtracted and the size of the remaining tumour calculated. The maximum tumour diameters with a 30-mm parenchymal safety margin ranged from 26.1 mm in right-sided segments 7 + 8 to 59.8 mm in the left apical segments 1-3. Using three-dimensional reconstructions of lung CT scans, we demonstrated that segmentectomy or resection of segmental groups should be feasible with adequate margins, even for larger tumours in selected cases.
The Northern Appalachian Anomaly: A modern asthenospheric upwelling
NASA Astrophysics Data System (ADS)
Menke, William; Skryzalin, Peter; Levin, Vadim; Harper, Thomas; Darbyshire, Fiona; Dong, Ted
2016-10-01
The Northern Appalachian Anomaly (NAA) is an intense, laterally localized (400 km diameter) low-velocity anomaly centered in the asthenosphere beneath southern New England. Its maximum shear velocity contrast, at 200 km depth, is about 10%, and its compressional-to-shear velocity perturbation ratio is about unity, values compatible with it being a modern thermal anomaly. Although centered close to the track of the Great Meteor hot spot, it is not elongated parallel to it and does not crosscut the cratonic margin. In contrast to previous explanations, we argue that the NAA's spatial association with the hot spot track is coincidental and that it is caused by small-scale upwelling associated with an eddy in the asthenospheric flow field at the continental margin. That the NAA is just one of several low-velocity features along the eastern margin of North America suggests that this process may be globally ubiquitous.
NASA Astrophysics Data System (ADS)
La Femina, P. C.; Geirsson, H.; Saballos, A.; Mattioli, G. S.
2017-12-01
A long-standing paradigm in plate tectonics is that oblique convergence results in strain partitioning and the formation of migrating fore-arc terranes accommodated on margin-parallel strike-slip faults within or in close proximity to active volcanic arcs (e.g., the Sumatran fault). Some convergent margins, however, are segmented by margin-normal faults, and margin-parallel shear is accommodated by motion on these faults and by vertical-axis block rotation. Furthermore, geologic and geophysical observations of active and extinct margins where strain partitioning has occurred indicate the emplacement of magmas within the shear zones or extensional step-overs. Characterizing the mechanism of accommodation is important for understanding short-term (decadal) seismogenesis, long-term (millions of years) fore-arc migration, and the formation of continental lithosphere. We investigate the geometry and kinematics of Quaternary faulting and magmatism along the Nicaraguan convergent margin, where historical upper-crustal earthquakes have been located on margin-normal, strike-slip faults within the fore arc and arc. Using new GPS time series and other geophysical and geologic data, we: 1) determine the location of the maximum gradient in fore-arc motion; 2) estimate displacement rates on margin-normal faults; and 3) constrain the geometric moment rate for the fault system. We find that: 1) fore-arc motion is 11 mm/yr; 2) deformation is accommodated within the active volcanic arc; and 3) margin-normal faults can have rates of 10 mm/yr, in agreement with geologic estimates from paleoseismology. The minimum geometric moment rate for the margin-normal fault system is 2.62 × 10^7 m^3/yr, whereas the geometric moment rate for historical (1931-2006) earthquakes is 1.01 × 10^7 m^3/yr. The discrepancy between fore-arc migration and historical seismicity may be due to aseismic accommodation of fore-arc motion by magmatic intrusion along north-trending volcanic alignments within the volcanic arc.
Is Ki67 prognostic for aggressive prostate cancer? A multicenter real-world study.
Fantony, Joseph J; Howard, Lauren E; Csizmadi, Ilona; Armstrong, Andrew J; Lark, Amy L; Galet, Colette; Aronson, William J; Freedland, Stephen J
2018-06-15
To test if Ki67 expression is prognostic for biochemical recurrence (BCR) after radical prostatectomy (RP), Ki67 immunohistochemistry was performed on tissue microarrays constructed from specimens obtained from 464 men undergoing RP at the Durham and West LA Veterans Affairs Hospitals. Hazard ratios (HR) for Ki67 expression and time to BCR were estimated using Cox regression. Ki67 was associated with more recent surgery year (p < 0.001), positive margins (p = 0.001) and extracapsular extension (p < 0.001). In center-stratified analyses, the adjusted HR for Ki67 expression and BCR approached statistical significance for West LA (HR: 1.54; p = 0.06), but not Durham (HR: 1.10; p = 0.74). This multi-institutional 'real-world' study provides limited evidence for the prognostic role of Ki67 in predicting outcome after RP.
cDNA microarray analysis of esophageal cancer: discoveries and prospects.
Shimada, Yutaka; Sato, Fumiaki; Shimizu, Kazuharu; Tsujimoto, Gozoh; Tsukada, Kazuhiro
2009-07-01
Recent progress in molecular biology has revealed many genetic and epigenetic alterations that are involved in the development and progression of esophageal cancer. Microarray analysis has also revealed several genetic networks that are involved in esophageal cancer. However, clinical application of microarray techniques and use of microarray data have not yet occurred. In this review, we focus on the recent developments and problems with microarray analysis of esophageal cancer.
Petersen, David W; Kawasaki, Ernest S
2007-01-01
DNA microarray technology has become a powerful tool in the arsenal of the molecular biologist. Capitalizing on high precision robotics and the wealth of DNA sequences annotated from the genomes of a large number of organisms, the manufacture of microarrays is now possible for the average academic laboratory with the funds and motivation. Microarray production requires attention to both biological and physical resources, including DNA libraries, robotics, and qualified personnel. While the fabrication of microarrays is a very labor-intensive process, production of quality microarrays individually tailored on a project-by-project basis will help researchers shed light on future scientific questions.
Killion, Patrick J; Sherlock, Gavin; Iyer, Vishwanath R
2003-01-01
Background: The power of microarray analysis can be realized only if data are systematically archived and linked to biological annotations as well as analysis algorithms. Description: The Longhorn Array Database (LAD) is a MIAME-compliant microarray database that operates on PostgreSQL and Linux. It is a fully open-source version of the Stanford Microarray Database (SMD), one of the largest microarray databases. LAD is available at… Conclusions: Our development of LAD provides a simple, free, open, reliable and proven solution for storage and analysis of two-color microarray data. PMID:12930545
Clinical study on natural gingival color.
Gómez-Polo, Cristina; Montero, Javier; Gómez-Polo, Miguel; Martín Casado, Ana María
2018-05-29
The aims of the study were: to describe the gingival color surrounding the upper incisors at three sites in the keratinized gingiva, analyzing the effect of possible socio-demographic and behavioral factors modulating intersubject variability; to study whether the gingival color is the same at all three locations; and to describe intrasubject color differences in the keratinized gingiva band. Using the CIELAB color system, three reference areas (free gingival margin, keratinized gingival body, and the upper part of the keratinized gingiva) were studied in 259 individuals, as well as the related socio-demographic factors, oral habits and chronic intake of medication. A Shadepilot™ spectrophotometer was used, and descriptive and inferential statistical analysis was performed. There are statistically significant differences between males and females for coordinates L* and a* in the middle and free gingival margin. For the b* coordinate, there are differences between males and females at all three locations studied (p < 0.05). The natural gingival CIELAB color space is delimited by L* minimum 28.3, L* maximum 65.4, a* minimum 11.1, a* maximum 37.2, b* minimum 6.9, and b* maximum 25.2. Age, smoking, and chronic intake of medication had no significant effect on gum color. There are perceptible color differences within the keratinized gingiva band. These chromatic differences must be taken into account if the prosthetic characterization of gingival tissue is to be considered acceptable. There are significant differences between the color coordinates of the three sites studied in the keratinized gingiva of men and women.
Margins of safety provided by COSHH Essentials and the ILO Chemical Control Toolkit.
Jones, Rachael M; Nicas, Mark
2006-03-01
COSHH Essentials, developed by the UK Health and Safety Executive, and the Chemical Control Toolkit (Toolkit) proposed by the International Labor Organization, are 'control banding' approaches to workplace risk management intended for use by proprietors of small and medium-sized businesses. Both systems group chemical substances into hazard bands based on toxicological endpoint and potency. COSHH Essentials uses the European Union's Risk-phrases (R-phrases), whereas the Toolkit uses R-phrases and the Globally Harmonized System (GHS) of Classification and Labeling of Chemicals. Each hazard band is associated with a range of airborne concentrations, termed exposure bands, which are to be attained by the implementation of recommended control technologies. Here we analyze the margin of safety afforded by the systems and, for each hazard band, define the minimal margin as the ratio of the minimum airborne concentration that produced the toxicological endpoint of interest in experimental animals to the maximum concentration in workplace air permitted by the exposure band. We found that the minimal margins were always <100, with some <1, and inversely related to molecular weight. The Toolkit-GHS system generally produced margins equal to or larger than COSHH Essentials, suggesting that it is more protective of worker health. Although these systems predict exposures comparable with current occupational exposure limits, we argue that the minimal margins are better indicators of health protection. Further, given the small margins observed, we feel it is important that revisions of these systems provide the exposure bands to users, so as to permit evaluation of control technology capture efficiency.
A Java-based tool for the design of classification microarrays.
Meng, Da; Broschat, Shira L; Call, Douglas R
2008-08-04
Classification microarrays are used for purposes such as identifying strains of bacteria and determining genetic relationships to understand the epidemiology of an infectious disease. For these cases, mixed microarrays, which are composed of DNA from more than one organism, are more effective than conventional microarrays composed of DNA from a single organism. Selection of probes is a key factor in designing successful mixed microarrays, because redundant sequences are inefficient and limited representation of diversity can restrict application of the microarray. We have developed a Java-based software tool, called PLASMID, for use in selecting the minimum set of probe sequences needed to classify different groups of plasmids or bacteria. The software program was successfully applied to several different sets of data. The utility of PLASMID was illustrated using existing mixed-plasmid microarray data as well as data from a virtual mixed-genome microarray constructed from different strains of Streptococcus. Moreover, use of data from expression microarray experiments demonstrated the generality of PLASMID. In this paper we describe a new software tool for selecting a set of probes for a classification microarray. While the tool was developed for the design of mixed microarrays (and mixed-plasmid microarrays in particular), it can also be used to design expression arrays. The user can choose from several clustering methods (including hierarchical, non-hierarchical, and a model-based genetic algorithm), several probe ranking methods, and several different display methods. A novel approach is used for probe redundancy reduction, and probe selection is accomplished via stepwise discriminant analysis. Data can be entered in different formats (including Excel and comma-delimited text), and dendrogram, heat map, and scatter plot images can be saved in several different formats (including jpeg and tiff). Weights generated using stepwise discriminant analysis can be stored for analysis of subsequent experimental data. Additionally, PLASMID can be used to construct virtual microarrays with genomes from public databases, which can then be used to identify an optimal set of probes.
THE ABRF MARG MICROARRAY SURVEY 2005: TAKING THE PULSE ON THE MICROARRAY FIELD
Over the past several years microarray technology has evolved into a critical component of any discovery based program. Since 1999, the Association of Biomolecular Resource Facilities (ABRF) Microarray Research Group (MARG) has conducted biennial surveys designed to generate a pr...
Development of a Digital Microarray with Interferometric Reflectance Imaging
NASA Astrophysics Data System (ADS)
Sevenler, Derin
This dissertation describes a new type of molecular assay for nucleic acids and proteins. We call this technique a digital microarray since it is conceptually similar to conventional fluorescence microarrays, yet it performs enumerative ('digital') counting of the number captured molecules. Digital microarrays are approximately 10,000-fold more sensitive than fluorescence microarrays, yet maintain all of the strengths of the platform including low cost and high multiplexing (i.e., many different tests on the same sample simultaneously). Digital microarrays use gold nanorods to label the captured target molecules. Each gold nanorod on the array is individually detected based on its light scattering, with an interferometric microscopy technique called SP-IRIS. Our optimized high-throughput version of SP-IRIS is able to scan a typical array of 500 spots in less than 10 minutes. Digital DNA microarrays may have utility in applications where sequencing is prohibitively expensive or slow. As an example, we describe a digital microarray assay for gene expression markers of bacterial drug resistance.
Implementation of mutual information and bayes theorem for classification microarray data
NASA Astrophysics Data System (ADS)
Dwifebri Purbolaksono, Mahendra; Widiastuti, Kurnia C.; Syahrul Mubarok, Mohamad; Adiwijaya; Aminy Ma’ruf, Firda
2018-03-01
Microarray technology is able to read the structure of genes, and analysis of such data is important for deciding which attributes are more important than others. Microarray data can provide cancer information for diagnosing a person's genes. However, preparing microarray data is a major problem that takes a long time, because the data contain a large number of insignificant and irrelevant attributes. A method is therefore needed to reduce the dimensionality of microarray data without eliminating the important information in each attribute. This research uses Mutual Information for dimensionality reduction. The system is built with a machine learning approach, specifically Bayes' theorem, which takes a statistical and probabilistic view. Combining the two methods yields a powerful approach to microarray data classification. The experimental results show that the system classifies microarray data well, with the highest F1-scores of 91.06% using a Bayesian Network and 88.85% using Naïve Bayes.
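The two-step pipeline this abstract describes (mutual-information gene selection followed by Bayesian classification) can be sketched on a toy expression matrix. Everything below is illustrative only: the data, the median-based binarization, and the naive Bayes variant are hypothetical choices, not the paper's actual pipeline or dataset.

```python
import math

def mutual_information(feature, labels):
    """MI (in bits) between a discrete feature and class labels."""
    n = len(labels)
    mi = 0.0
    for f in set(feature):
        for c in set(labels):
            p_fc = sum(1 for x, y in zip(feature, labels) if x == f and y == c) / n
            if p_fc == 0:
                continue
            mi += p_fc * math.log2(p_fc / ((feature.count(f) / n) * (labels.count(c) / n)))
    return mi

def binarize(values):
    """Threshold a gene's expression at its median (one simple discretization)."""
    med = sorted(values)[len(values) // 2]
    return [1 if v >= med else 0 for v in values]

def nb_predict(train_feats, train_labels, x):
    """Naive Bayes over selected binary features, with Laplace smoothing."""
    best, best_lp = None, -math.inf
    for c in sorted(set(train_labels)):
        idx = [i for i, y in enumerate(train_labels) if y == c]
        lp = math.log(len(idx) / len(train_labels))
        for j, feats in enumerate(train_feats):
            p1 = (sum(feats[i] for i in idx) + 1) / (len(idx) + 2)
            lp += math.log(p1 if x[j] == 1 else 1 - p1)
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# Toy expression matrix: rows = samples, columns = genes (hypothetical data).
samples = [
    [5.1, 0.2, 3.3],
    [4.8, 0.1, 3.1],
    [1.2, 0.3, 3.2],
    [0.9, 0.2, 3.4],
]
labels = ['tumor', 'tumor', 'normal', 'normal']

genes = list(zip(*samples))                      # column-wise view
binary = [binarize(g) for g in genes]
scores = [mutual_information(b, labels) for b in binary]
k = 1                                            # keep the k most informative genes
top = sorted(range(len(scores)), key=lambda i: -scores[i])[:k]
print(top)                                       # [0]: gene 0 separates the classes
selected = [binary[i] for i in top]
print(nb_predict(selected, labels, [1]))         # 'tumor'
```

In practice one would use established implementations (e.g. scikit-learn's `mutual_info_classif` and its naive Bayes classifiers) rather than hand-rolled versions, but the division of labor is the same: filter thousands of genes down to an informative subset, then classify in the reduced space.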
Zhao, Yuanshun; Zhang, Yonghong; Lin, Dongdong; Li, Kang; Yin, Chengzeng; Liu, Xiuhong; Jin, Boxun; Sun, Libo; Liu, Jinhua; Zhang, Aiying; Li, Ning
2015-10-01
To develop and evaluate a protein microarray assay with horseradish peroxidase (HRP) chemiluminescence for quantification of α-fetoprotein (AFP) in serum from patients with hepatocellular carcinoma (HCC). A protein microarray assay for AFP was developed. Serum was collected from patients with HCC and healthy control subjects. AFP was quantified using protein microarray and enzyme-linked immunosorbent assay (ELISA). Serum AFP concentrations determined via protein microarray were positively correlated (r = 0.973) with those determined via ELISA in patients with HCC (n = 60) and healthy control subjects (n = 30). Protein microarray showed 80% sensitivity and 100% specificity for HCC diagnosis. ELISA had 83.3% sensitivity and 100% specificity. Protein microarray effectively distinguished between patients with HCC and healthy control subjects (area under ROC curve 0.974; 95% CI 0.000, 1.000). Protein microarray is a rapid, simple and low-cost alternative to ELISA for detecting AFP in human serum. © The Author(s) 2015.
DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.
2012-01-01
We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions. PMID:22338694
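The marginalization step described here (collapsing the joint distribution over total fluorescence photons and apparent FRET efficiency into its two one-dimensional distributions) is a simple sum over the other axis. The discretized joint distribution below is invented for illustration; the paper's actual distributions come from maximum-entropy inference on burst data.

```python
import numpy as np

# Hypothetical discretized joint probability P(N, E): rows index total photon
# counts N, columns index apparent FRET efficiency bins E.
joint = np.array([
    [0.05, 0.10, 0.05],
    [0.10, 0.30, 0.10],
    [0.05, 0.20, 0.05],
])
assert np.isclose(joint.sum(), 1.0)  # a valid joint distribution sums to 1

p_photons = joint.sum(axis=1)  # marginal over E: overall photon-count distribution
p_fret    = joint.sum(axis=0)  # marginal over N: apparent FRET efficiency distribution
print(p_photons)               # [0.2 0.5 0.3]
print(p_fret)                  # [0.2 0.6 0.2]
```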
Isehed, Catrine; Holmlund, Anders; Renvert, Stefan; Svenson, Björn; Johansson, Ingegerd; Lundberg, Pernilla
2016-10-01
This randomized clinical trial aimed at comparing radiological, clinical and microbial effects of surgical treatment of peri-implantitis alone or in combination with enamel matrix derivative (EMD). Twenty-six subjects were treated with open flap debridement and decontamination of the implant surfaces with gauze and saline preceding adjunctive EMD or no EMD. Bone level (BL) change was primary outcome and secondary outcomes were changes in pocket depth (PD), plaque, pus, bleeding and the microbiota of the peri-implant biofilm analyzed by the Human Oral Microbe Identification Microarray over a time period of 12 months. In multivariate modelling, increased marginal BL at implant site was significantly associated with EMD, the number of osseous walls in the peri-implant bone defect and a Gram+/aerobic microbial flora, whereas reduced BL was associated with a Gram-/anaerobic microbial flora and presence of bleeding and pus, with a cross-validated predictive capacity (Q²) of 36.4%. Similar, but statistically non-significant, trends were seen for BL, PD, plaque, pus and bleeding in univariate analysis. Adjunctive EMD to surgical treatment of peri-implantitis was associated with prevalence of Gram+/aerobic bacteria during the follow-up period and increased marginal BL 12 months after treatment. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
26 CFR 1.994-2 - Marginal costing rules.
Code of Federal Regulations, 2011 CFR
2011-04-01
... labor 20.00 (iii) Total deductions 60.00 (c) Maximum combined taxable income 25.00 (4) Overall profit... qualify as export promotion expenses may be so claimed as export promotion expenses. (3) Overall profit... (determined under § 1.993-6) of the DISC derived from such sales, multiplied by the overall profit percentage...
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
ERIC Educational Resources Information Center
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
Item Response Theory with Estimation of the Latent Density Using Davidian Curves
ERIC Educational Resources Information Center
Woods, Carol M.; Lin, Nan
2009-01-01
Davidian-curve item response theory (DC-IRT) is introduced, evaluated with simulations, and illustrated using data from the Schedule for Nonadaptive and Adaptive Personality Entitlement scale. DC-IRT is a method for fitting unidimensional IRT models with maximum marginal likelihood estimation, in which the latent density is estimated,…
Self-Reported Well-Being of Women and Men with Intellectual Disabilities in England
ERIC Educational Resources Information Center
Emerson, Eric; Hatton, Chris
2008-01-01
We investigated the association between indicators of subjective well-being and the personal characteristics, socioeconomic position, and social relationships of a sample of 1,273 English adults with intellectual disabilities. Mean overall happiness with life was 71% of the scale maximum, a figure only marginally lower than typically reported…
Semiparametric Item Response Functions in the Context of Guessing
ERIC Educational Resources Information Center
Falk, Carl F.; Cai, Li
2016-01-01
We present a logistic function of a monotonic polynomial with a lower asymptote, allowing additional flexibility beyond the three-parameter logistic model. We develop a maximum marginal likelihood-based approach to estimate the item parameters. The new item response model is demonstrated on math assessment data from a state, and a computationally…
Pre-gastrula expression of zebrafish extraembryonic genes
2010-01-01
Background Many species form extraembryonic tissues during embryogenesis, such as the placenta of humans and other viviparous mammals. Extraembryonic tissues have various roles in protecting, nourishing and patterning embryos. Prior to gastrulation in zebrafish, the yolk syncytial layer - an extraembryonic nuclear syncytium - produces signals that induce mesoderm and endoderm formation. Mesoderm and endoderm precursor cells are situated in the embryonic margin, an external ring of cells along the embryo-yolk interface. The yolk syncytial layer initially forms below the margin, in a domain called the external yolk syncytial layer (E-YSL). Results We hypothesize that key components of the yolk syncytial layer's mesoderm and endoderm inducing activity are expressed as mRNAs in the E-YSL. To identify genes expressed in the E-YSL, we used microarrays to compare the transcription profiles of intact pre-gastrula embryos with pre-gastrula embryonic cells that we had separated from the yolk and yolk syncytial layer. This identified a cohort of genes with enriched expression in intact embryos. Here we describe our whole mount in situ hybridization analysis of sixty-eight of them. This includes ten genes with E-YSL expression (camsap1l1, gata3, znf503, hnf1ba, slc26a1, slc40a1, gata6, gpr137bb, otop1 and cebpa), four genes with expression in the enveloping layer (EVL), a superficial epithelium that protects the embryo (zgc:136817, zgc:152778, slc14a2 and elovl6l), three EVL genes whose expression is transiently confined to the animal pole (elovl6l, zgc:136359 and clica), and six genes with transient maternal expression (mtf1, wu:fj59f04, mospd2, rftn2, arrdc1a and pho). We also assessed the requirement of Nodal signaling for the expression of selected genes in the E-YSL, EVL and margin. Margin expression was Nodal dependent for all genes we tested, including the concentrated margin expression of an EVL gene: zgc:110712. 
All other instances of EVL and E-YSL expression that we tested were Nodal independent. Conclusion We have devised an effective strategy for enriching and identifying genes expressed in the E-YSL of pre-gastrula embryos. To our surprise, maternal genes and genes expressed in the EVL were also enriched by this strategy. A number of these genes are promising candidates for future functional studies on early embryonic patterning. PMID:20423468
Blakely, Richard J.
1981-01-01
Estimations of the depth to magnetic sources using the power spectrum of magnetic anomalies generally require long magnetic profiles. The method developed here uses the maximum entropy power spectrum (MEPS) to calculate depth to source on short windows of magnetic data; resolution is thereby improved. The method operates by dividing a profile into overlapping windows, calculating a maximum entropy power spectrum for each window, linearizing the spectra, and calculating with least squares the various depth estimates. The assumptions of the method are that the source is two dimensional and that the intensity of magnetization includes random noise; knowledge of the direction of magnetization is not required. The method is applied to synthetic data and to observed marine anomalies over the Peru-Chile Trench. The analyses indicate a continuous magnetic basement extending from the eastern margin of the Nazca plate and into the subduction zone. The computed basement depths agree with acoustic basement seaward of the trench axis, but deepen as the plate approaches the inner trench wall. This apparent increase in the computed depths may result from the deterioration of magnetization in the upper part of the ocean crust, possibly caused by compressional disruption of the basaltic layer. Landward of the trench axis, the depth estimates indicate possible thrusting of the oceanic material into the lower slope of the continental margin.
Brouwers, E.M.; Jorgensen, N.O.; Cronin, T. M.
1991-01-01
The Kap Kobenhavn Formation crops out in Greenland at 80°N latitude and marks the most northerly onshore Pliocene locality known. The sands and silts that comprise the formation were deposited in marginal marine and shallow marine environments. An abundant and diverse vertebrate and invertebrate fauna and plant megafossil flora provide age and paleoclimatic constraints. The age estimated for the Kap Kobenhavn ranges from 2.0 to 3.0 million years old. Winter and summer bottom water paleotemperatures were estimated on the basis of the ostracode assemblages. The marine ostracode fauna in units B1 and B2 indicate a subfrigid to frigid marine climate, with estimated minimum sea bottom temperatures (SBT) of -2°C and estimated maximum SBT of 6-8°C. Sediments assigned to unit B2 at locality 72 contain a higher proportion of warm water genera, and the maximum SBT is estimated at 9-10°C. The marginal marine fauna in the uppermost unit B3 (locality 68) indicates a cold temperate to subfrigid marine climate, with an estimated minimum SBT of -2°C and an estimated maximum SBT ranging as high as 12-14°C. These temperatures indicated that, on the average, the Kap Kobenhavn winters in the late Pliocene were similar to or perhaps 1-2°C warmer than winters today and that summer temperatures were 7-8°C warmer than today. -from Authors
CT differentiation of 1-2-cm gallbladder polyps: benign vs malignant.
Song, E Rang; Chung, Woo-Suk; Jang, Hye Young; Yoon, Minjae; Cha, Eun Jung
2014-04-01
To evaluate MDCT findings of 1-2-cm sized gallbladder (GB) polyps for differentiation between benign and malignant polyps. Institutional review board approval was obtained, and informed consent was waived. Portal venous phase CT scans of 1-2-cm sized GB polyps caused by various pathologic conditions were retrospectively reviewed by two blinded observers. Among the 36 patients identified, 21 had benign polyps with the remaining 15 having malignant polyps. Size, margin, and shape of GB polyps were evaluated. Attenuation values of the polyps, including mean attenuation, maximum attenuation, and standard deviation, were recorded. As determined by visual inspection, the degree of polyp enhancement was evaluated. Using these CT findings, each of the two radiologists assessed and recorded individual diagnostic confidence for differentiating benign versus malignant polyps on a 5-point scale. The diagnostic performance of CT was evaluated using a receiver operating characteristic curve analysis. There was no significant difference in size between benign and malignant GB polyps. Ill-defined margin and sessile morphology were significantly associated with malignant polyp. There was a significant difference in mean and maximum attenuation values between benign and malignant GB polyps. Mean standard deviation value of malignant polyps was significantly higher than that of benign polyps. All malignant polyps showed either hyperenhancement or marked hyperenhancement. The Az value (area under the ROC curve) for the diagnosis of malignant GB polyps was 0.905. Margin, shape, and enhancement degree are helpful in differentiating between benign and malignant polyps of 1-2-cm sizes.
Polgar, Gianluca; Khang, Tsung Fei; Chua, Teddy; Marshall, David J
2015-01-01
The relationship between acute thermal tolerance and habitat temperature in ectotherm animals informs about their thermal adaptation and is used to assess thermal safety margins and sensitivity to climate warming. We studied this relationship in an equatorial freshwater snail (Clea nigricans), belonging to a predominantly marine gastropod lineage (Neogastropoda, Buccinidae). We found that tolerance of heating and cooling exceeded average daily maximum and minimum temperatures, by roughly 20°C in each case. Because habitat temperature is generally assumed to be the main selective factor acting on the fundamental thermal niche, the discordance between thermal tolerance and environmental temperature implies trait conservation following 'in situ' environmental change, or following novel colonisation of a thermally less-variable habitat. Whereas heat tolerance could relate to an historical association with the thermally variable and extreme marine intertidal fringe zone, cold tolerance could associate with either an ancestral life at higher latitudes, or represent adaptation to cooler, higher-altitudinal, tropical lotic systems. The broad upper thermal safety margin (difference between heat tolerance and maximum environmental temperature) observed in this snail is grossly incompatible with the very narrow safety margins typically found in most terrestrial tropical ectotherms (insects and lizards), and hence with the emerging prediction that tropical ectotherms are especially vulnerable to environmental warming. A more comprehensive understanding of climatic vulnerability of animal ectotherms thus requires greater consideration of taxonomic diversity, ecological transition and evolutionary history. Copyright © 2014 Elsevier Ltd. All rights reserved.
The Microarray Revolution: Perspectives from Educators
ERIC Educational Resources Information Center
Brewster, Jay L.; Beason, K. Beth; Eckdahl, Todd T.; Evans, Irene M.
2004-01-01
In recent years, microarray analysis has become a key experimental tool, enabling the analysis of genome-wide patterns of gene expression. This review approaches the microarray revolution with a focus upon four topics: 1) the early development of this technology and its application to cancer diagnostics; 2) a primer of microarray research,…
The Seismicity of Two Hyperextended Margins
NASA Astrophysics Data System (ADS)
Redfield, Tim; Terje Osmundsen, Per
2013-04-01
A seismic belt marks the outermost edge of Scandinavia's proximal margin, inboard of and roughly parallel to the Taper Break. A similar near- to onshore seismic belt runs along its inner edge, roughly parallel to and outboard of the asymmetric, seaward-facing escarpment. The belts converge at both the northern and southern ends of Scandinavia, where crustal taper is sharp and the proximal margin is narrow. Very few seismic events have been recorded on the intervening, gently-tapering Trøndelag Platform. Norway's distribution of seismicity is systematically ordered with respect to 1) the structural templates of high-beta extension that shaped the thinning gradient during Late Jurassic or Early Cretaceous time, and 2) the topographically resurgent Cretaceous-Cenozoic "accommodation phase" family of escarpments that approximate the innermost limit of crustal thinning [See Redfield and Osmundsen (2012) for diagrams, definitions, discussion, and supporting citations.] Landwards from the belt of earthquake epicenters that mark the Taper Break the crust consistently thickens, and large fault arrays tend to sole out at mid crustal levels. Towards the sea the crystalline continental crust is hyperextended, pervasively faulted, and generally very thin. Also, faulting and serpentinization may have affected the uppermost parts of the distal margin's lithospheric mantle. Such contrasting structural conditions may generate a contrasting stiffness: for a given stress, more strain can be accommodated in the distal margin than in the less faulted proximal margin. By way of comparison, inboard of the Taper Break on the gently-tapered Trøndelag Platform, faulting was not penetrative. There, similar structural conditions prevail and proximal margin seismicity is negligible. Because stress concentration can occur where material properties undergo significant contrast, the necking zone may constitute a natural localization point for post-thinning phase earthquakes. 
In Scandinavia, loads generated by escarpment erosion, offshore sedimentary deposition, and post-glacial rebound have been periodically superimposed throughout the Neogene. Their vertical stress patterns are mutually-reinforcing during deglaciation. However, compared to the post-glacial dome the pattern of maximum uplift/unloading generated by escarpment erosion will be longer, more linear, and located atop the emergent proximal margin. The pattern of offshore maximum deposition/loading will be similar. This may help explain the asymmetric expenditure of Fennoscandia's annual seismic energy budget. It may also help explain the obvious Conundrum: if stress generated by erosion and deposition is sufficiently great, fault reactivation and consequent seismicity can occur at any hyperextended passive margin sector regardless of its glacial history. Onshore Scandinavia, episodic footwall uplift and escarpment rejuvenation may have been driven by just such a mechanism throughout much of the later Cretaceous and Cenozoic. SE Brasil offers a glimpse of how Norway's hyperextended margin might manifest itself seismically in the absence of post-glacial rebound. Compilations suggest two seismic belts may exist. One, offshore, follows the thinned crust of the ultra-deep, hyperextended Campos and Santos basins. Onshore, earthquakes occur more commonly in the elevated highlands of the escarpments, and track especially the long, linear ranges such as the Serra de Mantiquiera and Serra do Espinhaço. Seismicity is more rare in the coastal lowlands, and largely absent in the Brasilian hinterland. Although never glaciated since the time of hyperextension and characterized by significantly fewer earthquakes in toto, SE Brasil's pattern of seismicity closely mimics Scandinavia. Commencing after perhaps just a few tens of millions of years of 'sag' basin infill, accommodation phase fault reactivation and footwall uplift at passive margins is the inexorable product of hyperextension. 
Citation: Redfield, T.F. and Osmundsen, P.T., 2012, GSA Bulletin, doi: 10.1130/B30691.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qi, X; Yang, Y; Jack, N
Purpose: On-board MRI provides superior soft-tissue contrast, allowing patient alignment using the tumor or nearby critical structures. This study analyzes inter-fraction patient setup variations in H&N MRI-guided IGRT using soft-tissue targets, in order to design an appropriate CTV-to-PTV margin and assess its clinical implications. Methods: 282 MR images for 10 H&N IMRT patients treated on a ViewRay system were retrospectively analyzed. Patients were immobilized using a thermoplastic mask on a customized headrest fitted in a radiofrequency coil and positioned to soft-tissue targets. The inter-fraction patient displacements were recorded to compute the PTV margins using the recipe 2.5Σ + 0.7σ. New IMRT plans optimized on the revised PTVs were generated to evaluate the delivered dose distributions. An in-house dose deformation registration tool was used to assess the resulting dosimetric consequences when margin adaptation is performed based on weekly MR images. The cumulative doses were compared to the reduced-margin plans for targets and critical structures. Results: The inter-fraction displacements (and standard deviations), Σ and σ, were tabulated for MRI and compared to kV-CBCT. The computed CTV-to-PTV margin was 3.5 mm for soft-tissue based registration. There were minimal differences between the planned and delivered doses when comparing the clinical and the reduced-margin PTV plans: the paired t-tests yielded p = 0.38 and 0.66 between the planned and delivered doses for the adapted margin plans for the maximum cord and mean parotid dose, respectively. Target V95 received comparable doses as planned for the reduced-margin plans. Conclusion: The 0.35 T MRI offers acceptable soft-tissue contrast and good spatial resolution for patient alignment and target visualization. Better tumor conspicuity from MRI allows soft-tissue based alignments with potentially improved accuracy, suggesting a benefit of margin reduction for H&N radiotherapy. The reduced-margin plans (i.e., 2 mm) resulted in improved normal-structure sparing and accurate dose delivery to achieve the intended treatment goal under MR guidance.
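The margin recipe quoted in this abstract, 2.5Σ + 0.7σ (van Herk's population-based formula), is a one-line computation once the setup-error statistics are known. The input values below are hypothetical, chosen only to show how statistics of this magnitude yield a margin near the reported 3.5 mm.

```python
def ctv_to_ptv_margin(Sigma_mm, sigma_mm):
    """Population-based CTV-to-PTV margin recipe from the abstract: 2.5*Sigma + 0.7*sigma,
    where Sigma is the SD of systematic setup errors and sigma the SD of random
    setup errors, both in mm along one axis."""
    return 2.5 * Sigma_mm + 0.7 * sigma_mm

# Hypothetical per-axis setup-error SDs (mm), not the study's tabulated data.
print(round(ctv_to_ptv_margin(1.2, 0.7), 2))  # 3.49 mm, consistent with ~3.5 mm
```

Note the asymmetry of the recipe: systematic errors (Σ) are weighted more than three times as heavily as random errors (σ), which is why reducing systematic setup uncertainty, e.g. via soft-tissue alignment, pays off disproportionately in margin reduction.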
Recent progress in making protein microarray through BioLP
NASA Astrophysics Data System (ADS)
Yang, Rusong; Wei, Lian; Feng, Ying; Li, Xiujian; Zhou, Quan
2017-02-01
Biological laser printing (BioLP) is a promising biomaterial printing technique. It has the advantages of high resolution, high bioactivity, high printing frequency and small transported liquid volume. In this paper, a BioLP device is designed and built, and protein microarrays are printed with it. It is found that both laser intensity and fluid layer thickness influence the microarrays acquired. In addition, two fluid layer coating methods are compared, and the results show that the blade coating method outperforms the well-coating method in BioLP. A protein microarray with 0.76 pL spots and a "NUDT"-patterned microarray are printed to demonstrate the printing capability of BioLP.
A comparison of pay-as-bid and marginal pricing in electricity markets
NASA Astrophysics Data System (ADS)
Ren, Yongjun
This thesis investigates the behaviour of electricity markets under marginal and pay-as-bid pricing. Marginal pricing is believed to yield the maximum social welfare and is currently implemented by most electricity markets. However, in view of recent electricity market failures, pay-as-bid has been extensively discussed as a possible alternative to marginal pricing. In this research, marginal and pay-as-bid pricing have been analyzed in electricity markets with both perfect and imperfect competition. The perfect competition case is studied under both exact and uncertain system marginal cost prediction. The comparison of the two pricing methods is conducted through two steps: (i) identify the best offer strategy of the generating companies (gencos); (ii) analyze the market performance under these optimum genco strategies. The analysis results together with numerical simulations show that pay-as-bid and marginal pricing are equivalent in a perfect market with exact system marginal cost prediction. In perfect markets with uncertain demand prediction, the two pricing methods are also equivalent but in an expected value sense. If we compare from the perspective of second order statistics, all market performance measures exhibit much lower values under pay-as-bid than under marginal pricing. The risk of deviating from the mean is therefore much higher under marginal pricing than under pay-as-bid. In an imperfect competition market with exact demand prediction, the research shows that pay-as-bid pricing yields lower consumer payments and lower genco profits. This research provides quantitative evidence that challenges some common claims about pay-as-bid pricing. One is that under pay-as-bid, participants would soon learn how to offer so as to obtain the same or higher profits than what they would have obtained under marginal pricing. 
This research however shows that, under pay-as-bid, participants can at best earn the same profit or expected profit as under marginal pricing. A second common claim refuted by this research is that pay-as-bid does not provide correct price signals if there is a scarcity of generation resources. We show that pay-as-bid does provide a price signal with such characteristics and furthermore argue that the price signal under marginal pricing with gaming may not necessarily be correct since it would then not reflect a lack of generation capacity but a desire to increase profit.
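The mechanical difference between the two settlement rules can be illustrated with a fixed offer stack: for the same merit-order dispatch, pay-as-bid settles each accepted offer at its own price, while marginal pricing settles all accepted energy at the clearing price. This sketch deliberately holds the offers fixed; the thesis's point is precisely that gencos re-optimize their offers under pay-as-bid, after which the outcomes become equivalent (exactly, or in expectation under demand uncertainty). The offer stack and demand are hypothetical.

```python
def clear_market(offers, demand_mw):
    """Dispatch (price, quantity) offers in merit order until demand is met.
    Returns the accepted (price, quantity) pairs and the marginal clearing price."""
    accepted, remaining = [], demand_mw
    for price, qty in sorted(offers):
        if remaining <= 0:
            break
        take = min(qty, remaining)
        accepted.append((price, take))
        remaining -= take
    clearing_price = accepted[-1][0]   # price of the last (marginal) accepted offer
    return accepted, clearing_price

# Hypothetical offer stack (USD/MWh, MW) and a 150 MW demand.
offers = [(20.0, 50), (35.0, 60), (50.0, 80), (70.0, 40)]
accepted, smp = clear_market(offers, 150)

pay_as_bid = sum(p * q for p, q in accepted)          # each offer paid its own bid
marginal   = smp * sum(q for _, q in accepted)        # everything paid the clearing price
print(smp, pay_as_bid, marginal)                      # 50.0 5100.0 7500.0
```

For this fixed stack, consumer payments are lower under pay-as-bid (5100 vs. 7500); the gap is exactly the infra-marginal rent that marginal pricing pays to the cheaper units, and it is this rent that rational gencos try to recapture by raising their offers toward the expected clearing price.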
The second phase of the MicroArray Quality Control (MAQC-II) project evaluated common practices for developing and validating microarray-based models aimed at predicting toxicological and clinical endpoints. Thirty-six teams developed classifiers for 13 endpoints - some easy, som...
Flow-pattern Guided Fabrication of High-density Barcode Antibody Microarray
Ramirez, Lisa S.; Wang, Jun
2016-01-01
Antibody microarray as a well-developed technology is currently challenged by a few other established or emerging high-throughput technologies. In this report, we renovate the antibody microarray technology by using a novel approach for manufacturing and by introducing new features. The fabrication of our high-density antibody microarray is accomplished through perpendicularly oriented flow-patterning of single stranded DNAs and subsequent conversion mediated by DNA-antibody conjugates. This protocol outlines the critical steps in flow-patterning DNA, producing and purifying DNA-antibody conjugates, and assessing the quality of the fabricated microarray. The uniformity and sensitivity are comparable with conventional microarrays, while our microarray fabrication does not require the assistance of an array printer and can be performed in most research laboratories. The other major advantage is that the size of our microarray units is 10 times smaller than that of printed arrays, offering the unique capability of analyzing functional proteins from single cells when interfacing with generic microchip designs. This barcode technology can be widely employed in biomarker detection, cell signaling studies, tissue engineering, and a variety of clinical applications. PMID:26780370
Microarray platform for omics analysis
NASA Astrophysics Data System (ADS)
Mecklenburg, Michael; Xie, Bin
2001-09-01
Microarray technology has revolutionized genetic analysis. However, limitations in genome analysis have led to renewed interest in establishing 'omic' strategies. As we enter the post-genomic era, new microarray technologies are needed to address these new classes of 'omic' targets, such as proteins, as well as lipids and carbohydrates. We have developed a microarray platform that combines self-assembling monolayers with the biotin-streptavidin system to provide a robust, versatile immobilization scheme. A hydrophobic film is patterned on the surface, creating an array of tension wells that eliminates evaporation effects, thereby reducing the shear stress to which biomolecules are exposed during immobilization. The streptavidin linker layer makes it possible to adapt and/or develop microarray-based assays using virtually any class of biomolecules, including carbohydrates, peptides, antibodies, and receptors, as well as the more traditional DNA-based arrays. Our microarray technology is designed to furnish seamless compatibility across the various 'omic' platforms by providing a common blueprint for fabricating and analyzing arrays. The prototype microarray uses a microscope slide footprint patterned with 2 by 96 flat wells. Data on the microarray platform will be presented.
A class of optimum digital phase locked loops
NASA Technical Reports Server (NTRS)
Kumar, R.; Hurd, W. J.
1986-01-01
This paper presents a class of optimum digital filters for digital phase locked loops, for the important case in which the maximum update rate of the loop filter and numerically controlled oscillator (NCO) is limited. This case is typical when the loop filter is implemented in a microprocessor. In these situations, pure delay is encountered in the loop transfer function and thus the stability and gain margin of the loop are of crucial interest. The optimum filters designed for such situations are evaluated in terms of their gain margin for stability, dynamic error, and steady-state error performance. For situations involving considerably high phase dynamics an adaptive and programmable implementation is also proposed to obtain an overall optimum strategy.
Overdentures on natural teeth: a new approach.
Previgliano, V; Barone Monfrin, S; Santià, G; Preti, G
2004-01-01
The study presents a new type of copings for overdentures on natural teeth. A new type of custom-made copings was prepared on 10 extracted teeth and their marginal fit was observed microscopically by means of a mechanical device, and software was employed to measure the gap. The marginal fit evaluation gave satisfactory values, with mean gap measurements below the clinically accepted limits (mean gap: 25.3 µm; minimum 7.3 µm, maximum 56.5 µm). The advantages of these new copings are: the rapidity of their preparation; the protection of the root canal treatment, because the coping with this chair-side method is prepared and cemented in one session; and the low costs.
Seefeld, Ting H.; Halpern, Aaron R.; Corn, Robert M.
2012-01-01
Protein microarrays are fabricated from double-stranded DNA (dsDNA) microarrays by a one-step, multiplexed enzymatic synthesis in an on-chip microfluidic format and then employed for antibody biosensing measurements with surface plasmon resonance imaging (SPRI). A microarray of dsDNA elements (denoted as generator elements) that encode either a His-tagged green fluorescent protein (GFP) or a His-tagged luciferase protein is utilized to create multiple copies of messenger RNA (mRNA) in a surface RNA polymerase reaction; the mRNA transcripts are then translated into proteins by cell-free protein synthesis in a microfluidic format. The His-tagged proteins diffuse to adjacent Cu(II)-NTA microarray elements (denoted as detector elements) and are specifically adsorbed. The net result is the on-chip, cell-free synthesis of a protein microarray that can be used immediately for SPRI protein biosensing. The dual element format greatly reduces any interference from the nonspecific adsorption of enzyme or proteins. SPRI measurements for the detection of the antibodies anti-GFP and anti-luciferase were used to verify the formation of the protein microarray. This convenient on-chip protein microarray fabrication method can be implemented for multiplexed SPRI biosensing measurements in both clinical and research applications. PMID:22793370
Fully Automated Complementary DNA Microarray Segmentation using a Novel Fuzzy-based Algorithm.
Saberkari, Hamidreza; Bahrami, Sheyda; Shamsi, Mousa; Amoshahy, Mohammad Javad; Ghavifekr, Habib Badri; Sedaaghi, Mohammad Hossein
2015-01-01
DNA microarrays are a powerful approach for studying the expression of thousands of genes simultaneously in a single experiment. The average fluorescent intensity calculated in a microarray experiment closely tracks the expression level of a particular gene. However, determining the correct position of every spot in microarray images is a major challenge, and doing so accurately underpins the classification of normal versus abnormal (cancer) cells. In this paper, a preprocessing step first eliminates the noise and artifacts in microarray cells using nonlinear anisotropic diffusion filtering. The coordinate center of each spot is then located using mathematical morphology operations. Finally, the position of each spot is determined exactly by a novel hybrid model based on principal component analysis and the spatial fuzzy c-means (SFCM) clustering algorithm. Using a Gaussian kernel in the SFCM algorithm improves the quality of complementary DNA microarray segmentation. The performance of the proposed algorithm was evaluated on real microarray images available in the Stanford Microarray Database. Results show that the segmentation accuracy of the proposed algorithm reaches 100% and 98% for noiseless and noisy cells, respectively.
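The denoising step can be illustrated with a minimal Perona-Malik nonlinear anisotropic diffusion filter. This is a sketch only: the paper's exact parameters and boundary handling are not specified, so the conductance constant `kappa`, the step size `gamma`, and the periodic border implied by `np.roll` are assumptions.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, gamma=0.2):
    """Perona-Malik nonlinear anisotropic diffusion: smooths noise while
    preserving spot edges, because the conductance shrinks where the
    local gradient is large. Border is periodic (np.roll), a
    simplification adequate for a sketch."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # finite differences to the four neighbours
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # exponential conductance: near 0 at strong edges, near 1 in flat areas
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return u
```

With `gamma <= 0.25` the explicit update is stable for the 4-neighbour stencil, which is why 0.2 is used above.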
Zhang, Aiying; Yin, Chengzeng; Wang, Zhenshun; Zhang, Yonghong; Zhao, Yuanshun; Li, Ang; Sun, Huanqin; Lin, Dongdong; Li, Ning
2016-12-01
Objective To develop a simple, effective, time-saving and low-cost fluorescence protein microarray method for detecting serum alpha-fetoprotein (AFP) in patients with hepatocellular carcinoma (HCC). Method Non-contact piezoelectric print techniques were applied to fluorescence protein microarray to reduce the cost of prey antibody. Serum samples from patients with HCC and healthy control subjects were collected and evaluated for the presence of AFP using a novel fluorescence protein microarray. To validate the fluorescence protein microarray, serum samples were tested for AFP using an enzyme-linked immunosorbent assay (ELISA). Results A total of 110 serum samples from patients with HCC (n = 65) and healthy control subjects (n = 45) were analysed. When the AFP cut-off value was set at 20 ng/ml, the fluorescence protein microarray had a sensitivity of 91.67% and a specificity of 93.24% for detecting serum AFP. Serum AFP quantified via fluorescence protein microarray had a similar diagnostic performance compared with ELISA in distinguishing patients with HCC from healthy control subjects (area under receiver operating characteristic curve: 0.906 for fluorescence protein microarray; 0.880 for ELISA). Conclusion A fluorescence protein microarray method was developed for detecting serum AFP in patients with HCC. PMID:27885040
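The operating characteristics reported here (sensitivity and specificity at a fixed cut-off, area under the ROC curve) can be computed from raw marker values and disease labels in a few lines. The following is a minimal sketch, not the authors' analysis code; any sample values used with it are hypothetical illustrations, not the study's data.

```python
def sens_spec(values, labels, cutoff):
    """Sensitivity/specificity of the rule 'positive if value >= cutoff'.
    labels: 1 = diseased (e.g. HCC), 0 = healthy control."""
    tp = sum(v >= cutoff and y == 1 for v, y in zip(values, labels))
    fn = sum(v < cutoff and y == 1 for v, y in zip(values, labels))
    tn = sum(v < cutoff and y == 0 for v, y in zip(values, labels))
    fp = sum(v >= cutoff and y == 0 for v, y in zip(values, labels))
    return tp / (tp + fn), tn / (tn + fp)

def auc(values, labels):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    the probability that a random diseased sample scores above a random
    healthy one, with ties counted as half."""
    pos = [v for v, y in zip(values, labels) if y == 1]
    neg = [v for v, y in zip(values, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```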
García-Hoyos, María; Cortón, Marta; Ávila-Fernández, Almudena; Riveiro-Álvarez, Rosa; Giménez, Ascensión; Hernan, Inma; Carballo, Miguel; Ayuso, Carmen
2012-01-01
Purpose Presently, 22 genes have been described in association with autosomal dominant retinitis pigmentosa (adRP); however, they explain only 50% of all cases, making genetic diagnosis of this disease difficult and costly. The aim of this study was to evaluate a specific genotyping microarray for its application to the molecular diagnosis of adRP in Spanish patients. Methods We analyzed 139 unrelated Spanish families with adRP. Samples were studied by using a genotyping microarray (adRP). All mutations found were further confirmed with automatic sequencing. Rhodopsin (RHO) sequencing was performed in all negative samples for the genotyping microarray. Results The adRP genotyping microarray detected the mutation associated with the disease in 20 of the 139 families with adRP. As in other populations, RHO was found to be the most frequently mutated gene in these families (7.9% of the microarray genotyped families). The rate of false positives (microarray results not confirmed with sequencing) and false negatives (mutations in RHO detected with sequencing but not with the genotyping microarray) were established, and high levels of analytical sensitivity (95%) and specificity (100%) were found. Diagnostic accuracy was 15.1%. Conclusions The adRP genotyping microarray is a quick, cost-efficient first step in the molecular diagnosis of Spanish patients with adRP. PMID:22736939
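Analytical sensitivity figures like the 95% reported here come from modest counts of confirmed mutations, so an interval estimate is informative. A minimal Wilson score interval sketch follows; this is our illustrative addition, not part of the paper's statistical analysis, and the counts in any example are hypothetical.

```python
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score confidence interval for a proportion of k
    successes out of n trials (better small-sample behaviour than the
    normal approximation, especially near 0 or 1)."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half
```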
Honoré, Paul; Granjeaud, Samuel; Tagett, Rebecca; Deraco, Stéphane; Beaudoing, Emmanuel; Rougemont, Jacques; Debono, Stéphane; Hingamp, Pascal
2006-09-20
High throughput gene expression profiling (GEP) is becoming a routine technique in life science laboratories. With experimental designs that repeatedly span thousands of genes and hundreds of samples, relying on a dedicated database infrastructure is no longer an option. GEP technology is a fast moving target, with new approaches constantly broadening the field diversity. This technology heterogeneity, compounded by the informatics complexity of GEP databases, means that software developments have so far focused on mainstream techniques, leaving less typical yet established techniques such as Nylon microarrays at best partially supported. MAF (MicroArray Facility) is the laboratory database system we have developed for managing the design, production and hybridization of spotted microarrays. Although it can support the widely used glass microarrays and oligo-chips, MAF was designed with the specific idiosyncrasies of Nylon based microarrays in mind. Notably, single channel radioactive probes, microarray stripping and reuse, vector control hybridizations and spike-in controls are all natively supported by the software suite. MicroArray Facility is MIAME supportive and dynamically provides feedback on missing annotations to help users estimate effective MIAME compliance. Genomic data such as clone identifiers and gene symbols are also directly annotated by MAF software using standard public resources. The MAGE-ML data format is implemented for full data export. Journalized database operations (audit tracking), data anonymization, material traceability and user/project level confidentiality policies are also managed by MAF. MicroArray Facility is a complete data management system for microarray producers and end-users. Particular care has been devoted to adequately model Nylon based microarrays. The MAF system, developed and implemented in both private and academic environments, has proved a robust solution for shared facilities and industry service providers alike. PMID:16987406
Microarrays in brain research: the good, the bad and the ugly.
Mirnics, K
2001-06-01
Making sense of microarray data is a complex process, in which the interpretation of findings will depend on the overall experimental design and the judgement of the investigator performing the analysis. As a result, differences in tissue harvesting, microarray types, sample labelling and data analysis procedures make post hoc sharing of microarray data a great challenge. To ensure rapid and meaningful data exchange, we need to create some order out of the existing chaos. In these ground-breaking microarray standardization and data sharing efforts, NIH agencies should take a leading role.
Ramsay-Curve Item Response Theory for the Three-Parameter Logistic Item Response Model
ERIC Educational Resources Information Center
Woods, Carol M.
2008-01-01
In Ramsay-curve item response theory (RC-IRT), the latent variable distribution is estimated simultaneously with the item parameters of a unidimensional item response model using marginal maximum likelihood estimation. This study evaluates RC-IRT for the three-parameter logistic (3PL) model with comparisons to the normal model and to the empirical…
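The 3PL response function and the marginal likelihood that MML estimation maximizes can be sketched as follows. This sketch assumes a standard-normal latent trait (as in the normal-distribution comparison condition) and approximates the integral with Gauss-Hermite quadrature; the item parameters in any example are illustrative, not estimates from the study.

```python
import numpy as np

def p3pl(theta, a, b, c):
    """3PL item response function: guessing floor c, slope a, difficulty b."""
    return c + (1 - c) / (1 + np.exp(-a * (theta - b)))

def marginal_likelihood(u, a, b, c, n_quad=41):
    """Marginal probability of a 0/1 response pattern u under a
    standard-normal latent trait, via Gauss-Hermite quadrature.
    Change of variables: theta = x*sqrt(2), weight = w/sqrt(pi)."""
    x, w = np.polynomial.hermite.hermgauss(n_quad)
    theta = x * np.sqrt(2.0)
    weights = w / np.sqrt(np.pi)
    P = p3pl(theta[:, None], a, b, c)          # (n_quad, n_items)
    like = np.prod(np.where(u, P, 1 - P), axis=1)
    return float(weights @ like)
```

A quick sanity check is that the marginal probabilities of all response patterns sum to one, since the integrand telescopes to the latent density.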
Semi-Parametric Item Response Functions in the Context of Guessing. CRESST Report 844
ERIC Educational Resources Information Center
Falk, Carl F.; Cai, Li
2015-01-01
We present a logistic function of a monotonic polynomial with a lower asymptote, allowing additional flexibility beyond the three-parameter logistic model. We develop a maximum marginal likelihood based approach to estimate the item parameters. The new item response model is demonstrated on math assessment data from a state, and a computationally…
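The shape of this item response model, a lower asymptote plus a logistic of a monotonic polynomial, can be sketched directly. The cubic coefficients below are arbitrary illustrations; the paper enforces monotonicity through its parameterization, whereas this sketch merely verifies it numerically on the evaluation grid.

```python
import numpy as np

def logistic_mono_poly(theta, coefs, c):
    """Item response function P(theta) = c + (1-c) * logistic(m(theta)),
    where m is a polynomial (highest-degree coefficient first, as in
    np.polyval). theta must be an increasing grid so that monotonicity
    of m can be checked."""
    m = np.polyval(coefs, theta)
    assert np.all(np.diff(m) >= 0), "m(theta) must be nondecreasing"
    return c + (1 - c) / (1 + np.exp(-m))
```

With `coefs = [0.1, 0.0, 1.2, 0.0]` the derivative of m is 0.3*theta**2 + 1.2 > 0 everywhere, so the curve rises from the guessing floor c toward 1.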
Markov Chain Monte Carlo Estimation of Item Parameters for the Generalized Graded Unfolding Model
ERIC Educational Resources Information Center
de la Torre, Jimmy; Stark, Stephen; Chernyshenko, Oleksandr S.
2006-01-01
The authors present a Markov Chain Monte Carlo (MCMC) parameter estimation procedure for the generalized graded unfolding model (GGUM) and compare it to the marginal maximum likelihood (MML) approach implemented in the GGUM2000 computer program, using simulated and real personality data. In the simulation study, test length, number of response…
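The MCMC idea can be illustrated with a random-walk Metropolis chain for a single item parameter. As a deliberate simplification, the sketch uses a 2PL likelihood as a stand-in for the GGUM's unfolding response function (which is considerably more involved); all data are simulated and the step size and prior (flat) are assumptions.

```python
import math, random

def loglik(b, thetas, responses, a=1.0):
    """2PL log-likelihood in the item difficulty b (simplified stand-in
    for the GGUM likelihood, which unfolds rather than cumulates)."""
    ll = 0.0
    for th, u in zip(thetas, responses):
        p = 1.0 / (1.0 + math.exp(-a * (th - b)))
        ll += math.log(p) if u else math.log(1.0 - p)
    return ll

def metropolis_b(thetas, responses, n_iter=3000, step=0.3, seed=1):
    """Random-walk Metropolis chain for b with a flat prior: propose a
    Gaussian step, accept with probability min(1, likelihood ratio)."""
    rng = random.Random(seed)
    b, chain = 0.0, []
    ll = loglik(b, thetas, responses)
    for _ in range(n_iter):
        cand = b + rng.gauss(0.0, step)
        ll_cand = loglik(cand, thetas, responses)
        if math.log(rng.random()) < ll_cand - ll:   # accept/reject
            b, ll = cand, ll_cand
        chain.append(b)
    return chain
```

Discarding the first half of the chain as burn-in, the posterior mean of the remainder estimates b.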
ERIC Educational Resources Information Center
Lee, Soo; Suh, Youngsuk
2018-01-01
Lord's Wald test for differential item functioning (DIF) has not been studied extensively in the context of the multidimensional item response theory (MIRT) framework. In this article, Lord's Wald test was implemented using two estimation approaches, marginal maximum likelihood estimation and Bayesian Markov chain Monte Carlo estimation, to detect…
Paiva, Thiago da Silva; Shao, Chen; Fernandes, Noemi Mendes; Borges, Bárbara do Nascimento; da Silva-Neto, Inácio Domingos
2016-01-01
Interphase specimens, aspects of physiological reorganization and divisional morphogenesis were investigated in a strain of a hypotrichous ciliate highly similar to Urostyla grandis Ehrenberg (type species of Urostyla), collected from a mangrove area in the estuary of the Paraíba do Sul river (Rio de Janeiro, Brazil). The results revealed that although interphase specimens match the known morphologic variability of U. grandis, the morphogenetic processes show conspicuous differences. The parental adoral zone is entirely renewed during morphogenesis, and marginal cirri exhibit a unique combination of developmental modes, in which left marginal rows originate from multiple anlagen arising from the innermost left marginal cirral row, whereas the right marginal ciliature originates from individual within-row anlagen. Based on these characteristics, a new subspecies, U. grandis wiackowskii subsp. nov., is proposed, and consequently U. grandis grandis Ehrenberg, stat. nov. is established. Bayesian and maximum-likelihood analyses of the 18S rDNA unambiguously placed U. grandis wiackowskii as the adelphotaxon of a cluster formed by the other U. grandis sequences. The implications of these findings for the systematics of Urostyla are discussed. © 2015 The Author(s) Journal of Eukaryotic Microbiology © 2015 International Society of Protistologists.
Irie, M; Suzuki, K; Watts, D C
2004-11-01
The purpose of this study was to evaluate the performance of both single and double applications of (Adper Prompt L-Pop) self-etching dental adhesive, when used with three classes of light-activated restorative materials, in comparison to the performance of each restorative system adhesive. Evaluation parameters to be considered for the adhesive systems were (a) immediate marginal adaptation (or gap formation) in tooth cavities, (b) free setting shrinkage-strain determined by the immediate marginal gap-width in a non-bonding Teflon cavity, and (c) their immediate shear bond-strengths to enamel and to dentin. The maximum marginal gap-width and the opposing-width (if any) in the tooth cavities and in the Teflon cavities were measured immediately (3 min) after light-activation. The shear bond-strengths to enamel and to dentin were also measured at 3 min. For light-activated restorative materials during early setting (<3 min), application of Adper Prompt L-Pop exhibited generally superior marginal adaptation to most system adhesives. But there was no additional benefit from double application. The marginal-gaps in tooth cavities and the marginal-gaps in Teflon cavities were highly correlated (r = 0.86-0.89, p < 0.02-0.01). For enamel and dentin shear bond-strengths, there were no significant differences between single and double applications, for all materials tested except Toughwell and Z 250 with enamel. Single application of a self-etch adhesive was a feasible and beneficial alternative to system adhesives for several classes of restorative. Marginal gap-widths in tooth cavities correlated more strongly with free shrinkage-strain magnitudes than with bond-strengths to tooth structure.
NASA Astrophysics Data System (ADS)
Tiberi, C.; Leroy, S.; d'Acremont, E.; Bellahsen, N.; Ebinger, C.; Al-Lazki, A.; Pointu, A.
2007-03-01
Here we use receiver function analysis to retrieve crustal thickness and crustal composition along the 35-My-old passive margin of the eastern Gulf of Aden. Our aims are to use results from the 3-D seismic array to map crustal stretching across and along the Aden margin in southern Oman. The array recorded local and teleseismic events between 2003 March and 2004 March. Seventy-eight events were used in our joint inversions for Vp/Vs ratio and depth. The major results are: (1) Crustal thickness decreases from the uplifted rift flank of the margin towards the Sheba mid-ocean ridge. We found a crustal thickness of about 35 km beneath the northern rift flank. This value decreases sharply to 26 km beneath the post-rift subsidence zone on the Salalah coastal plain. This 10 km of crustal thinning occurs across a horizontal distance of less than 30 km, showing localization of the crustal thinning below the first known rifted block of the margin. (2) A second rift margin transect located about 50 km to the east shows no thinning from the coast to 50 km onshore. The lack of crustal thickness variation indicates that the maximum crustal stretching could be restricted to offshore regions. (3) The along-strike variations in crustal structure demonstrate the scale and longevity of the regular along-axis rift segmentation. (4) Extension is still observed north of the rifted domain, 70 km onshore from the coast, making the margin wider than first expected from geology. (5) The crust has a felsic to normal composition, with a probably strong effect of the sedimentary layer on the Vp/Vs ratio (between 1.67 and 1.91).
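In receiver-function (H-kappa) analysis, the delay time of the Moho Ps conversion maps to crustal thickness through a standard relation. The sketch below uses that textbook relation with representative values; the ray parameter (0.06 s/km) and velocities are our assumptions for illustration, not parameters from this study.

```python
import math

def moho_depth(t_ps, vp, vp_vs, p=0.06):
    """Crustal thickness H (km) from the Ps-P delay time t_ps (s), the
    mean crustal Vp (km/s), the Vp/Vs ratio, and the ray parameter p
    (s/km): H = t_ps / (sqrt(1/Vs^2 - p^2) - sqrt(1/Vp^2 - p^2))."""
    vs = vp / vp_vs
    return t_ps / (math.sqrt(1.0 / vs ** 2 - p ** 2)
                   - math.sqrt(1.0 / vp ** 2 - p ** 2))
```

For Vp = 6.5 km/s and Vp/Vs = 1.75, a delay of roughly 4.2 s corresponds to the ~35 km thickness reported for the northern rift flank.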
Spinelli, G.A.; Field, M.E.
2003-01-01
We identify two surfaces in the shallow subsurface on the Eel River margin offshore northern California, a lowstand erosion surface, likely formed during the last glacial maximum, and an overlying surface likely formed during the most recent transgression of the shoreline. The lowstand erosion surface, which extends from the inner shelf to near the shelfbreak and from the Eel River to Trinidad Head (~80 km), truncates underlying strata on the shelf. Above the surface, inferred transgressive coastal and estuarine sedimentary units separate it from the transgressive surface on the shelf. Early in the transgression, Eel River sediment was likely both transported down the Eel Canyon and dispersed on the slope, allowing transgressive coastal sediment from the smaller Mad River to accumulate in a recognizable deposit on the shelf. The location of coastal Mad River sediment accumulation was controlled by the location of the paleo-Mad River. Throughout the remainder of the transgression, dispersed sediment from the Eel River accumulated an average of 20 m of onlapping shelf deposits. The distribution and thickness of these transgressive marine units was strongly modified by northwest-southeast trending folds. Thick sediment packages accumulated over structural lows in the lowstand surface. The thinnest sediment accumulations (0-10 m) were deposited over structural highs along faults and uplifting anticlines. The Eel margin, an active margin with steep, high sediment-load streams, has developed a thick transgressive systems tract. On this margin sediment accumulates as rapidly as the processes of uplift and downwarp locally create and destroy accommodation space. Sequence stratigraphic models of tectonically active margins should account for variations in accommodation space along margins as well as across them. © 2003 Elsevier Science B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Brandon T., E-mail: Brandon.Nguyen@act.gov.au; Canberra Hospital, Radiation Oncology Department, Garran, ACT; Deb, Siddhartha
Purpose: To determine an appropriate clinical target volume for partial breast radiation therapy (PBRT) based on the spatial distribution of residual invasive and in situ carcinoma after wide local excision (WLE) for early breast cancer or ductal carcinoma in situ (DCIS). Methods and Materials: We performed a prospective pathologic study of women potentially eligible for PBRT who had re-excision and/or completion mastectomy after WLE for early breast cancer or DCIS. A pathologic assessment protocol was used to determine the maximum radial extension (MRE) of residual carcinoma from the margin of the initial surgical cavity. Women were stratified by the closest initial radial margin width: negative (>1 mm), close (>0 mm and ≤1 mm), or involved. Results: The study population was composed of 133 women with a median age of 59 years (range, 27-82 years) and the following stage groups: 0 (13.5%), I (40.6%), II (38.3%), and III (7.5%). The histologic subtypes of the primary tumor were invasive ductal carcinoma (74.4%), invasive lobular carcinoma (12.0%), and DCIS alone (13.5%). Residual carcinoma was present in the re-excision and completion mastectomy specimens in 55.4%, 14.3%, and 7.2% of women with an involved, close, and negative margin, respectively. In the 77 women with a noninvolved radial margin, the MRE of residual disease, if present, was ≤10 mm in 97.4% (95% confidence interval 91.6-99.5) of cases. Larger MRE measurements were significantly associated with an involved margin (P<.001), tumor size >30 mm (P=.03), premenopausal status (P=.03), and negative progesterone receptor status (P=.05). Conclusions: A clinical target volume margin of 10 mm would encompass microscopic residual disease in >90% of women potentially eligible for PBRT after WLE with noninvolved resection margins.
Limitations of the planning organ at risk volume (PRV) concept.
Stroom, Joep C; Heijmen, Ben J M
2006-09-01
Previously, we determined a planning target volume (PTV) margin recipe for geometrical errors in radiotherapy equal to M(T) = 2Σ + 0.7σ, with Σ and σ the standard deviations describing systematic and random errors, respectively. In this paper, we investigated margins for organs at risk (OAR), yielding the so-called planning organ at risk volume (PRV). For critical organs with a maximum dose (D(max)) constraint, we calculated margins such that D(max) in the PRV is equal to the motion-averaged D(max) in the (moving) clinical target volume (CTV). We studied margins for the spinal cord in 10 head-and-neck cases and 10 lung cases, each with two different clinical plans. For critical organs with a dose-volume constraint, we also investigated whether a margin recipe was feasible. For the 20 spinal cords considered, the average margin recipe found was M(R) = 1.6Σ + 0.2σ, with variations for systematic and random errors of 1.2Σ to 1.8Σ and −0.2σ to 0.6σ, respectively. The variations were due to differences in the shape and position of the dose distributions with respect to the cords. The recipe also depended significantly on the volume definition of D(max). For critical organs with a dose-volume constraint, the PRV concept appears even less useful, because a margin around, e.g., the rectum changes the volume in such a manner that dose-volume constraints stop making sense. The concept of the PRV for planning of radiotherapy is of limited use. Therefore, alternative ways should be developed to include geometric uncertainties of OARs in radiotherapy planning.
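The two recipes quoted in this abstract, the PTV margin M(T) = 2Σ + 0.7σ and the average spinal-cord PRV margin M(R) = 1.6Σ + 0.2σ, can be compared directly; a minimal sketch, with Σ and σ in millimetres (any input values are illustrative):

```python
def ptv_margin(Sigma, sigma):
    """PTV margin recipe quoted in the paper: M(T) = 2*Sigma + 0.7*sigma,
    where Sigma and sigma are the systematic and random error SDs (mm)."""
    return 2.0 * Sigma + 0.7 * sigma

def prv_margin(Sigma, sigma):
    """Average spinal-cord PRV recipe found: M(R) = 1.6*Sigma + 0.2*sigma."""
    return 1.6 * Sigma + 0.2 * sigma
```

For equal Σ and σ the PRV margin is always the smaller of the two, reflecting the weaker dependence of a maximum-dose constraint on random blurring.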
NASA Astrophysics Data System (ADS)
Fierro, Elisa; Capitanio, Fabio A.; Schettino, Antonio; Morena Salerno, V.
2017-04-01
We use numerical modeling to investigate the coupling of mantle instabilities and surface tectonics along lithospheric steps developing during rifting. We address whether edge-driven convection (EDC) beneath rifted continental margins and shear flow during the rift-drift transition can play a role in the observed post-rift compressive tectonic evolution of the divergent continental margins along the Red Sea. We run a series of 2D simulations to examine the relationship between the maximum compression and key geometrical parameters of the step beneath continental margins, such as the step height due to lithosphere thickness variation and the width of the margins, and test the effect of rheology by varying the temperature- and stress-dependent viscosity in the lithosphere and asthenosphere. The development of instabilities is initially illustrated as a function of these parameters, to show the controls on the lithosphere strain distribution and magnitude. We then address the transient evolution of the instabilities to characterize their duration. In an additional suite of models, we address the development of EDC during plate motions, thus accounting for the mantle shearing due to spreading. Our results show an increase of strain with the step height as well as with the margin width up to 200 km; beyond this value, the influence of the ridge margin can be neglected. Strain rates are then quantified for a range of laboratory-constrained constitutive laws for mantle- and lithosphere-forming minerals. These models propose a viable mechanism to explain the post-rift tectonic inversion observed along the Arabian continental margin and the episodic ultra-fast seafloor spreading in the central Red Sea, where the role of EDC has been invoked.
NASA Astrophysics Data System (ADS)
Tuck-Martin, Amy; Adam, Jürgen; Eagles, Graeme
2015-04-01
Starting with the break up of Gondwana, the northwest Indian Ocean and its continental margins in Madagascar, East Africa and western India formed by divergence of the African and Indian plates and were shaped by a complicated sequence of plate boundary relocations, ridge propagation events, and the independent movement of the Seychelles microplate. As a result, attempts to reconcile the different plate-tectonic components and processes into a coherent kinematic model have so far been unsatisfactory. A new high-resolution plate kinematic model has been produced in an attempt to solve these problems, using seafloor spreading data and rotation parameters generated by a mixture of visual fitting of magnetic isochron data and iterative joint inversion of magnetic isochron and fracture zone data. Using plate motion vectors and plate boundary geometries derived from this model, the first-order regional stress pattern was modelled for distinct phases of margin formation. The stress pattern is correlated with the tectono-stratigraphic history of related sedimentary basins. The plate kinematic model identifies three phases of spreading, from the Jurassic to the Paleogene, which resulted in the formation of three main oceanic basins. Prior to these phases, intracontinental 'Karoo' rifting episodes in the late Carboniferous to late Triassic had failed to break up Gondwana, but initiated the formation of sedimentary basins along the East African and West Madagascan margins. At the start of the first phase of spreading (183 to 133 Ma) predominantly NW - SE extension caused continental rifting that separated Madagascar/India/Antarctica from Africa. Maximum horizontal stresses trended perpendicular to the local plate-kinematic vector, and parallel to the rift axes. During and after continental break-up and subsequent spreading, the regional stress regime changed drastically. 
The extensional stress regime became restricted to the active spreading ridges, which in turn adopted trends normal to the plate divergence vector. Away from the active ridges, compressional horizontal stresses caused by ridge-push forces were transmitted through the subsiding oceanic lithosphere, with an SHmax orientation parallel to plate divergence vectors. These changes are documented by the lower Bajocian continental breakup unconformity, which can be traced throughout East African basins. At 133 Ma, the plate boundary moved from north to south of Madagascar, incorporating it into the African plate and initiating its separation from Antarctica. The orientation of the plate divergence vector however did not change markedly. The second phase (89 - 61 Ma) led to the separation of India from Madagascar, initiating a new and dramatic change in stress orientation from N-S to ENE-WSW. This led to renewed tectonic activity in the sedimentary basins of western Madagascar. In the third phase (61 Ma to present) asymmetric spreading of the Carlsberg Ridge separated India from the Seychelles and the Mascarene Plateau via the southward propagation of the Carlsberg Ridge to form the Central Indian Ridge. The anti-clockwise rotation of the independent Seychelles microplate between chrons 28n (64.13 Ma) and 26n (58.38 Ma) and the opening of the short-lived Laxmi Basin (67 Ma to abandonment within chron 28n (64.13 - 63.10 Ma)) have been further constrained by the new plate kinematic model. Along the East African margin, SHmax remained in a NE - SW orientation and the sedimentary basins experienced continued thick, deep-water sediment deposition. Contemporaneously, in the sedimentary basins along the East African passive margin, ridge-push related maximum horizontal stresses became progressively outweighed by local gravity-driven NE-SW maximum horizontal stresses trending parallel to the margin.
These stress regimes are caused by sediment loading and extensional collapse of thick sediment wedges, predominantly controlled by margin geometry. Our study successfully integrates an interpretation of paleo-stress regimes constrained by the new high resolution plate kinematic and basin history to produce a margin scale tectono-stratigraphic framework that highlights the important interplay of plate boundary forces and basin formation events along the East African margin.
Identification of lethal cluster of genes in the yeast transcription network
NASA Astrophysics Data System (ADS)
Rho, K.; Jeong, H.; Kahng, B.
2006-05-01
Identification of essential or lethal genes would be one of the ultimate goals in drug design. Here we introduce an in silico method to select the cluster with a high population of lethal genes, called the lethal cluster, through microarray assays. We construct a gene transcription network based on the microarray expression levels. Links are added one by one in descending order of the Pearson correlation coefficients between two genes. As the link density p increases, two meaningful link densities p_m and p_s are observed. At p_m, which is smaller than the percolation threshold, the number of disconnected clusters is maximum, and the lethal genes are highly concentrated in a certain cluster that needs to be identified. Thus the deletion of all genes in that cluster could efficiently lead to a lethal inviable mutant. This lethal cluster can be identified by an in silico method. As p increases further beyond the percolation threshold, power-law behavior in the degree distribution of the giant cluster appears at p_s. We measure the degree of each gene at p_s. With this information, we return to the point p_m and calculate the mean degree of the genes of each cluster. We find that the lethal cluster has the largest mean degree.
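The construction described, adding links in descending order of pairwise Pearson correlation and inspecting the resulting clusters, can be sketched as follows. This is a toy-scale illustration (the paper works with genome-scale microarray profiles), using a simple union-find for the connected components.

```python
import numpy as np

def pearson_links(expr):
    """All gene pairs ranked by descending Pearson correlation of their
    expression profiles (rows of expr: genes x conditions)."""
    c = np.corrcoef(expr)
    n = len(c)
    pairs = [(c[i, j], i, j) for i in range(n) for j in range(i + 1, n)]
    return sorted(pairs, reverse=True)

def clusters_at_density(expr, n_links):
    """Connected components (clusters) after adding the top n_links
    links, via union-find with path halving."""
    parent = list(range(len(expr)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for _, i, j in pearson_links(expr)[:n_links]:
        parent[find(i)] = find(j)
    comps = {}
    for g in range(len(expr)):
        comps.setdefault(find(g), []).append(g)
    return list(comps.values())
```

Sweeping `n_links` upward traces the same link-density axis as the paper's p, from many small clusters below percolation to a single giant cluster above it.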
Ranjbar, Reza; Behzadi, Payam; Najafi, Ali; Roudi, Raheleh
2017-01-01
A rapid, accurate, flexible and reliable diagnostic method may significantly decrease the costs of diagnosis and treatment. Designing an appropriate microarray chip reduces noise and probable biases in the final result. The aim of this study was to design and construct a DNA microarray chip for rapid detection and identification of 10 important bacterial agents. In the present survey, 10 unique genomic regions relating to 10 pathogenic bacterial agents, including Escherichia coli (E. coli), Shigella boydii, Sh. dysenteriae, Sh. flexneri, Sh. sonnei, Salmonella typhi, S. typhimurium, Brucella sp., Legionella pneumophila, and Vibrio cholerae, were selected for designing specific long-oligo microarray probes. For this purpose, the in silico operations, including use of the NCBI RefSeq database, the PanSeq and GView servers, AlleleID 7.7 and OligoAnalyzer 3.1, were performed. The in vitro part of the study comprised robotic microarray chip probe spotting, bacterial DNA extraction, DNA labeling, hybridization and microarray chip scanning. In the wet-lab section, tools and apparatus such as Nexterion® Slide E, the QArray mini spotter, a NimbleGen kit, TrayMix™ S4, and InnoScan 710 were used. A DNA microarray chip including 10 long-oligo microarray probes was designed and constructed for detection and identification of the 10 pathogenic bacteria. The DNA microarray chip was capable of identifying all 10 bacterial agents tested simultaneously. A professional bioinformatician is needed as probe designer to create appropriate multifunctional microarray probes and increase the accuracy of the outcomes.
Richard, Arianne C; Lyons, Paul A; Peters, James E; Biasci, Daniele; Flint, Shaun M; Lee, James C; McKinney, Eoin F; Siegel, Richard M; Smith, Kenneth G C
2014-08-04
Although numerous investigations have compared gene expression microarray platforms, preprocessing methods and batch correction algorithms using constructed spike-in or dilution datasets, there remains a paucity of studies examining the properties of microarray data using diverse biological samples. Most microarray experiments seek to identify subtle differences between samples with variable background noise, a scenario poorly represented by constructed datasets. Thus, microarray users lack important information regarding the complexities introduced in real-world experimental settings. The recent development of a multiplexed, digital technology for nucleic acid measurement enables counting of individual RNA molecules without amplification and, for the first time, permits such a study. Using a set of human leukocyte subset RNA samples, we compared previously acquired microarray expression values with RNA molecule counts determined by the nCounter Analysis System (NanoString Technologies) in selected genes. We found that gene measurements across samples correlated well between the two platforms, particularly for high-variance genes, while genes deemed unexpressed by the nCounter generally had both low expression and low variance on the microarray. Confirming previous findings from spike-in and dilution datasets, this "gold-standard" comparison demonstrated signal compression that varied dramatically by expression level and, to a lesser extent, by dataset. Most importantly, examination of three different cell types revealed that noise levels differed across tissues. Microarray measurements generally correlate with relative RNA molecule counts within optimal ranges but suffer from expression-dependent accuracy bias and precision that varies across datasets. We urge microarray users to consider expression-level effects in signal interpretation and to evaluate noise properties in each dataset independently.
A non-stationary cost-benefit based bivariate extreme flood estimation approach
NASA Astrophysics Data System (ADS)
Qi, Wei; Liu, Junguo
2018-02-01
Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation relies on stationarity assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities, in both the dependence of flood variables and the marginal distributions, on extreme flood estimation. The dependence is modeled using copula functions. Previous design flood selection criteria are not suitable for NSCOBE since they ignore the time-varying dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both the marginal probability distributions and the copula functions. A case study with 54 years of observed data is used to illustrate the application of NSCOBE. Results show that NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probabilities of exceedance calculated from the copula functions and from the marginal distributions. This study provides, for the first time, a new approach toward a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could benefit cost-benefit based non-stationary bivariate design flood estimation across the world.
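The copula-based risk calculation can be illustrated with a Gumbel copula, a common choice for upper-tail-dependent flood variables (e.g. peak and volume). The abstract does not name the copula family, so this sketch is an assumption and the function names are hypothetical.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1, with theta = 1 giving independence
    and larger theta giving stronger upper-tail dependence."""
    return math.exp(-((-math.log(u)) ** theta
                      + (-math.log(v)) ** theta) ** (1.0 / theta))

def joint_exceedance(u, v, theta):
    """For a design pair with marginal non-exceedance probabilities u and v,
    return P(U > u or V > v) (the OR case) and P(U > u and V > v) (the AND
    case), the two bivariate exceedance probabilities used in design flood
    risk calculations."""
    c = gumbel_copula(u, v, theta)
    p_or = 1.0 - c           # at least one variable exceeds its design value
    p_and = 1.0 - u - v + c  # both exceed simultaneously
    return p_or, p_and
```

In a non-stationary setting both theta and the marginal probabilities u, v would be functions of time, which is what creates the trade-off noted in the abstract.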
Vertical mercury distributions in the oceans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gill, G.A.; Fitzgerald, W.F.
1988-06-01
The vertical distribution of mercury (Hg) was determined at coastal and open ocean sites in the northwest Atlantic and Pacific Oceans. Reliable and diagnostic Hg distributions were obtained, permitting major processes governing the marine biogeochemistry of Hg to be identified. The northwest Atlantic near Bermuda showed surface water Hg concentrations near 4 pM, a maximum of 10 pM within the main thermocline, and concentrations less than or equal to surface water values below the depth of the maximum. The maximum appears to result from lateral transport of Hg-enriched waters from higher latitudes. In the central North Pacific, surface waters (to 940 m) were slightly elevated (1.9 ± 0.7 pM) compared to deeper waters (1.4 ± 0.4 pM), but no thermocline Hg maximum was observed. At similar depths, Hg concentrations near Bermuda were elevated compared to the central North Pacific Ocean. The authors hypothesize that the source of this Hg is diagenetic reactions in oxic margin sediments, which release dissolved Hg to the overlying water. Geochemical steady-state box modeling arguments predict a relatively short (~350 years) mean residence time for Hg in the oceans, demonstrating the reactive nature of Hg in seawater and precluding significant involvement in nutrient-type recycling. Mercury's distributional features and reactive nature suggest that interactions of Hg with settling particulate matter and margin sediments play important roles in regulating oceanic Hg concentrations. Oceanic Hg distributions are governed by an external cycling process, in which water column distributions reflect a rapid competition between the magnitude of the input source and the intensity of the (water column) removal process.
Gong, Wei; He, Kun; Covington, Mike; Dinesh-Kumar, S. P.; Snyder, Michael; Harmer, Stacey L.; Zhu, Yu-Xian; Deng, Xing Wang
2009-01-01
We used our collection of Arabidopsis transcription factor (TF) ORFeome clones to construct protein microarrays containing as many as 802 TF proteins. These protein microarrays were used for both protein-DNA and protein-protein interaction analyses. For protein-DNA interaction studies, we examined AP2/ERF family TFs and their cognate cis-elements. By careful comparison of the DNA-binding specificity of 13 TFs on the protein microarray with previous non-microarray data, we showed that protein microarrays provide an efficient and high throughput tool for genome-wide analysis of TF-DNA interactions. This microarray protein-DNA interaction analysis allowed us to derive a comprehensive view of DNA-binding profiles of AP2/ERF family proteins in Arabidopsis. It also revealed four TFs that bound the EE (evening element) and had the expected phased gene expression under clock-regulation, thus providing a basis for further functional analysis of their roles in clock regulation of gene expression. We also developed procedures for detecting protein interactions using this TF protein microarray and discovered four novel partners that interact with HY5, which can be validated by yeast two-hybrid assays. Thus, plant TF protein microarrays offer an attractive high-throughput alternative to traditional techniques for TF functional characterization on a global scale. PMID:19802365
Zhao, Zhengshan; Peytavi, Régis; Diaz-Quijada, Gerardo A.; Picard, Francois J.; Huletsky, Ann; Leblanc, Éric; Frenette, Johanne; Boivin, Guy; Veres, Teodor; Dumoulin, Michel M.; Bergeron, Michel G.
2008-01-01
Fabrication of microarray devices using traditional glass slides is not easily adaptable to integration into microfluidic systems. There is thus a need for the development of polymeric materials showing a high hybridization signal-to-background ratio, enabling sensitive detection of microbial pathogens. We have developed such plastic supports suitable for highly sensitive DNA microarray hybridizations. The proof of concept of this microarray technology was done through the detection of four human respiratory viruses that were amplified and labeled with a fluorescent dye via a sensitive reverse transcriptase PCR (RT-PCR) assay. The performance of the microarray hybridization with plastic supports made of PMMA [poly(methylmethacrylate)]-VSUVT or Zeonor 1060R was compared to that with high-quality glass slide microarrays by using both passive and microfluidic hybridization systems. Specific hybridization signal-to-background ratios comparable to that obtained with high-quality commercial glass slides were achieved with both polymeric substrates. Microarray hybridizations demonstrated an analytical sensitivity equivalent to approximately 100 viral genome copies per RT-PCR, which is at least 100-fold higher than the sensitivities of previously reported DNA hybridizations on plastic supports. Testing of these plastic polymers using a microfluidic microarray hybridization platform also showed results that were comparable to those with glass supports. In conclusion, PMMA-VSUVT and Zeonor 1060R are both suitable for highly sensitive microarray hybridizations. PMID:18784318
Development and application of a microarray meter tool to optimize microarray experiments
Rouse, Richard JD; Field, Katrine; Lapira, Jennifer; Lee, Allen; Wick, Ivan; Eckhardt, Colleen; Bhasker, C Ramana; Soverchia, Laura; Hardiman, Gary
2008-01-01
Background Successful microarray experimentation requires a complex interplay between the slide chemistry, the printing pins, the nucleic acid probes and targets, and the hybridization milieu. Optimization of these parameters and a careful evaluation of emerging slide chemistries are a prerequisite to any large scale array fabrication effort. We have developed a 'microarray meter' tool which assesses the inherent variations associated with microarray measurement prior to embarking on large scale projects. Findings The microarray meter consists of nucleic acid targets (reference and dynamic range control) and probe components. Different plate designs containing identical probe material were formulated to accommodate different robotic and pin designs. We examined the variability in probe quality and quantity (as judged by the amount of DNA printed and remaining post-hybridization) using three robots equipped with capillary printing pins. Discussion The generation of microarray data with minimal variation requires consistent quality control of the (DNA microarray) manufacturing and experimental processes. Spot reproducibility is a measure primarily of the variations associated with printing. The microarray meter assesses array quality by measuring the DNA content for every feature. It provides a post-hybridization analysis of array quality by scoring probe performance using three metrics, a) a measure of variability in the signal intensities, b) a measure of the signal dynamic range and c) a measure of variability of the spot morphologies. PMID:18710498
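The three post-hybridization quality metrics can be computed directly from replicate spot measurements. The exact definitions used by the microarray meter are not given in the abstract, so the coefficient-of-variation and log2 dynamic range below are assumptions, and the function name is hypothetical.

```python
import numpy as np

def spot_metrics(intensities, morphologies):
    """Score probe performance with the three metrics described above:
    (a) variability of the signal intensities (coefficient of variation),
    (b) signal dynamic range (log2 of max/min intensity), and
    (c) variability of the spot morphologies (CV of, e.g., spot diameters).
    `intensities` and `morphologies` are 1-D sequences over replicate spots."""
    x = np.asarray(intensities, float)
    m = np.asarray(morphologies, float)
    cv_signal = x.std(ddof=1) / x.mean()
    dynamic_range = np.log2(x.max() / x.min())
    cv_morphology = m.std(ddof=1) / m.mean()
    return cv_signal, dynamic_range, cv_morphology
```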
Burgarella, Sarah; Cattaneo, Dario; Masseroli, Marco
2006-01-01
We developed MicroGen, a multi-database Web based system for managing all the information characterizing spotted microarray experiments. It supports information gathering and storing according to the Minimum Information About Microarray Experiments (MIAME) standard. It also allows easy sharing of information and data among all multidisciplinary actors involved in spotted microarray experiments. PMID:17238488
2006-04-27
polysaccharide microarray platform was prepared by immobilizing Burkholderia pseudomallei and Burkholderia mallei polysaccharides. This... polysaccharide array was tested with success for detecting B. pseudomallei and B. mallei serum (human and animal) antibodies. The advantages of this microarray... Polysaccharide microarrays; Burkholderia pseudomallei; Burkholderia mallei; Glanders; Melioidosis. 1. Introduction There has been a great deal of emphasis on the
Microarray-integrated optoelectrofluidic immunoassay system
Han, Dongsik
2016-01-01
A microarray-based analytical platform has been utilized as a powerful tool in biological assay fields. However, an analyte depletion problem due to the slow mass transport based on molecular diffusion causes low reaction efficiency, resulting in a limitation for practical applications. This paper presents a novel method to improve the efficiency of microarray-based immunoassay via an optically induced electrokinetic phenomenon by integrating an optoelectrofluidic device with a conventional glass slide-based microarray format. A sample droplet was loaded between the microarray slide and the optoelectrofluidic device on which a photoconductive layer was deposited. Under the application of an AC voltage, optically induced AC electroosmotic flows caused by a microarray-patterned light actively enhanced the mass transport of target molecules at the multiple assay spots of the microarray simultaneously, which reduced tedious reaction time from more than 30 min to 10 min. Based on this enhancing effect, a heterogeneous immunoassay with a tiny volume of sample (5 μl) was successfully performed in the microarray-integrated optoelectrofluidic system using immunoglobulin G (IgG) and anti-IgG, resulting in improved efficiency compared to the static environment. Furthermore, the application of multiplex assays was also demonstrated by multiple protein detection. PMID:27190571
Advances in cell-free protein array methods.
Yu, Xiaobo; Petritis, Brianne; Duan, Hu; Xu, Danke; LaBaer, Joshua
2018-01-01
Cell-free protein microarrays represent a special form of protein microarray which display proteins made fresh at the time of the experiment, avoiding storage and denaturation. They have been used increasingly in basic and translational research over the past decade to study protein-protein interactions, the pathogen-host relationship, post-translational modifications, and antibody biomarkers of different human diseases. Their role in the first blood-based diagnostic test for early stage breast cancer highlights their value in managing human health. Cell-free protein microarrays will continue to evolve to become widespread tools for research and clinical management. Areas covered: We review the advantages and disadvantages of different cell-free protein arrays, with an emphasis on the methods that have been studied in the last five years. We also discuss the applications of each microarray method. Expert commentary: Given the growing roles and impact of cell-free protein microarrays in research and medicine, we discuss: 1) the current technical and practical limitations of cell-free protein microarrays; 2) the biomarker discovery and verification pipeline using protein microarrays; and 3) how cell-free protein microarrays will advance over the next five years, both in their technology and applications.
Kračun, Stjepan Krešimir; Fangel, Jonatan Ulrik; Rydahl, Maja Gro; Pedersen, Henriette Lodberg; Vidal-Melgosa, Silvia; Willats, William George Tycho
2017-01-01
Cell walls are an important feature of plant cells and a major component of the plant glycome. They have both structural and physiological functions and are critical for plant growth and development. The diversity and complexity of these structures demand advanced high-throughput techniques to answer questions about their structure, functions and roles in both fundamental and applied scientific fields. Microarray technology provides both the high-throughput and the feasibility aspects required to meet that demand. In this chapter, some of the most recent microarray-based techniques relating to plant cell walls are described together with an overview of related contemporary techniques applied to carbohydrate microarrays and their general potential in glycoscience. A detailed experimental procedure for high-throughput mapping of plant cell wall glycans using the comprehensive microarray polymer profiling (CoMPP) technique is included in the chapter and provides a good example of both the robust and high-throughput nature of microarrays as well as their applicability to plant glycomics.
NASA Astrophysics Data System (ADS)
Slamet, Isnandar; Mardiana Putri Carissa, Siska; Pratiwi, Hasih
2017-10-01
Investors always seek an efficient portfolio, that is, a portfolio with maximum return for a given risk or minimum risk for a given return. The almost marginal conditional stochastic dominance (AMCSD) criterion can be used to form an efficient portfolio. The aim of this research is to apply the AMCSD criterion to form efficient portfolios of bank shares listed in the LQ-45. This criterion is used when there are areas that do not meet the marginal conditional stochastic dominance (MCSD) criterion. In other words, the criterion is derived from the quotient of the area that violates the MCSD criterion to the total area that does and does not violate it. Based on data for bank stocks listed on the LQ-45, there are 38 efficient portfolios among the 420 portfolios of 4 stocks each, and 315 efficient portfolios among the 1710 portfolios of 3 stocks each.
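The "quotient of areas" idea can be illustrated numerically. This is a heavily simplified, hypothetical sketch using unconditional empirical CDFs, whereas MCSD proper conditions on the market return; the function names are assumptions.

```python
import numpy as np

def _trapz(y, x):
    # Trapezoidal rule (avoids NumPy-version differences around np.trapz).
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def almost_dominance_ratio(returns_a, returns_b, grid_size=200):
    """Quotient of the area where portfolio A violates (first-order)
    dominance over B to the total area between the two empirical CDFs.
    A small ratio means A 'almost' dominates B. `returns_a` and
    `returns_b` are 1-D NumPy arrays of returns."""
    lo = min(returns_a.min(), returns_b.min())
    hi = max(returns_a.max(), returns_b.max())
    grid = np.linspace(lo, hi, grid_size)
    f_a = np.searchsorted(np.sort(returns_a), grid, side="right") / len(returns_a)
    f_b = np.searchsorted(np.sort(returns_b), grid, side="right") / len(returns_b)
    diff = f_a - f_b              # > 0 where A's CDF lies above B's (a violation)
    violation = _trapz(np.clip(diff, 0.0, None), grid)
    total = _trapz(np.abs(diff), grid)
    return violation / total if total > 0 else 0.0
```

A portfolio would be retained as efficient when this ratio stays below a chosen tolerance against every competing portfolio.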
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nie, K; Yue, N; Chen, T
2014-06-15
Purpose: In lung radiation treatment, the PTV is formed with a margin around the GTV (or CTV/ITV). Although the GTV is most likely of water-equivalent density, the PTV margin may be formed from the surrounding low-density tissues, which may lead to an unrealistic dosimetric plan. This study evaluates whether the concern about dose calculation inside a PTV with an only low-density margin is justified in lung treatment. Methods: Three SBRT cases were analyzed. The PTV in the original plan (Plan-O) was created with a 5-10 mm margin outside the ITV to incorporate setup errors and all mobility from 10 respiratory phases. Test plans were generated with the GTV shifted to the PTV edge to simulate the extreme situations with maximum setup uncertainties. Two representative positions, at the very posterior-superior (Plan-PS) and anterior-inferior (Plan-AI) edges, were considered. The virtual GTV was assigned a density of 1.0 g.cm−3 and the surrounding lung, including the PTV margin, was assigned 0.25 g.cm−3. An additional plan with a 1 mm tissue margin instead of a full lung margin was created to evaluate whether a composite margin (Plan-Comp) gives a better approximation for dose calculation. All plans were generated on the average CT using the Analytical Anisotropic Algorithm with heterogeneity correction on, and all planning parameters/monitor units remained unchanged. DVH analyses were performed for comparisons. Results: Despite the non-static dose distribution, the high-dose region synchronized with tumor position. This might be due to scatter conditions, as greater doses were absorbed in the solid tumor than in the surrounding low-density lung tissue. However, the plans still showed missing target coverage in general. A certain level of composite margin might give a better approximation for the dose calculation. Conclusion: Our exploratory results suggest that with the lung margin only, the planning dose to the PTV might overestimate target coverage during treatment. The significance of this overestimation might warrant further investigation.
Interim report on updated microarray probes for the LLNL Burkholderia pseudomallei SNP array
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gardner, S; Jaing, C
2012-03-27
The overall goal of this project is to forensically characterize 100 unknown Burkholderia isolates in the US-Australia collaboration. We will identify genome-wide single nucleotide polymorphisms (SNPs) from B. pseudomallei and near-neighbor species including B. mallei, B. thailandensis and B. oklahomensis. We will design microarray probes to detect these SNP markers and analyze 100 Burkholderia genomic DNAs extracted from environmental, clinical and near-neighbor isolates from Australian collaborators on the Burkholderia SNP microarray. We will analyze the microarray genotyping results to characterize the genetic diversity of these new isolates and triage the samples for whole genome sequencing. In this interim report, we describe the SNP analysis and the microarray probe design for the Burkholderia SNP microarray.
2010-01-01
Background The development of DNA microarrays has facilitated the generation of hundreds of thousands of transcriptomic datasets. The use of a common reference microarray design allows existing transcriptomic data to be readily compared and re-analysed in the light of new data, and the combination of this design with large datasets is ideal for 'systems'-level analyses. One issue is that these datasets are typically collected over many years and may be heterogeneous in nature, containing different microarray file formats and gene array layouts, dye-swaps, and showing varying scales of log2-ratios of expression between microarrays. Excellent software exists for the normalisation and analysis of microarray data but many data have yet to be analysed as existing methods struggle with heterogeneous datasets; options include normalising microarrays on an individual or experimental group basis. Our solution was to develop the Batch Anti-Banana Algorithm in R (BABAR) algorithm and software package which uses cyclic loess to normalise across the complete dataset. We have already used BABAR to analyse the function of Salmonella genes involved in the process of infection of mammalian cells. Results The only input required by BABAR is unprocessed GenePix or BlueFuse microarray data files. BABAR provides a combination of 'within' and 'between' microarray normalisation steps and diagnostic boxplots. When applied to a real heterogeneous dataset, BABAR normalised the dataset to produce a comparable scaling between the microarrays, with the microarray data in excellent agreement with RT-PCR analysis. When applied to a real non-heterogeneous dataset and a simulated dataset, BABAR's performance in identifying differentially expressed genes showed some benefits over standard techniques. Conclusions BABAR is an easy-to-use software tool, simplifying the simultaneous normalisation of heterogeneous two-colour common reference design cDNA microarray-based transcriptomic datasets. 
We show BABAR transforms real and simulated datasets to allow for the correct interpretation of these data, and is the ideal tool to facilitate the identification of differentially expressed genes or network inference analysis from transcriptomic datasets. PMID:20128918
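Cyclic loess normalisation of the kind BABAR applies can be sketched as follows. This is not BABAR's R implementation: the binned-mean smoother is a crude stand-in for loess, and the function names are assumptions.

```python
import numpy as np

def _smooth(a, m, bins=10):
    # Crude loess stand-in: binned running mean of M as a function of A.
    order = np.argsort(a)
    fit = np.empty_like(m)
    for chunk in np.array_split(order, bins):
        fit[chunk] = m[chunk].mean()
    return fit

def cyclic_normalise(data, n_cycles=3, bins=10):
    """Cyclic (loess-style) normalisation across a complete dataset.
    `data` is an arrays-by-probes matrix of log2 intensities. For every
    pair of arrays, the intensity-dependent trend of M = x_i - x_j versus
    A = (x_i + x_j)/2 is estimated and half of it removed from each array,
    cycling over all pairs until the arrays are mutually comparable."""
    x = np.array(data, float)
    n = x.shape[0]
    for _ in range(n_cycles):
        for i in range(n):
            for j in range(i + 1, n):
                m = x[i] - x[j]
                a = (x[i] + x[j]) / 2.0
                trend = _smooth(a, m, bins)
                x[i] -= trend / 2.0
                x[j] += trend / 2.0
    return x
```

Because every pair of arrays is adjusted against each other, the whole heterogeneous dataset ends up on one comparable scale rather than being normalised per experimental group.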
A genome-wide 20 K citrus microarray for gene expression analysis
Martinez-Godoy, M Angeles; Mauri, Nuria; Juarez, Jose; Marques, M Carmen; Santiago, Julia; Forment, Javier; Gadea, Jose
2008-01-01
Background Understanding of genetic elements that contribute to key aspects of citrus biology will impact future improvements in this economically important crop. Global gene expression analysis demands microarray platforms with a high genome coverage. In the last years, genome-wide EST collections have been generated in citrus, opening the possibility to create new tools for functional genomics in this crop plant. Results We have designed and constructed a publicly available genome-wide cDNA microarray that includes 21,081 putative unigenes of citrus. As a functional companion to the microarray, a web-browsable database [1] was created and populated with information about the unigenes represented in the microarray, including cDNA libraries, isolated clones, raw and processed nucleotide and protein sequences, and results of all the structural and functional annotation of the unigenes, like general description, BLAST hits, putative Arabidopsis orthologs, microsatellites, putative SNPs, GO classification and PFAM domains. We have performed a Gene Ontology comparison with the full set of Arabidopsis proteins to estimate the genome coverage of the microarray. We have also performed microarray hybridizations to check its usability. Conclusion This new cDNA microarray replaces the first 7K microarray generated two years ago and allows gene expression analysis at a more global scale. We have followed a rational design to minimize cross-hybridization while maintaining its utility for different citrus species. Furthermore, we also provide access to a website with full structural and functional annotation of the unigenes represented in the microarray, along with the ability to use this site to directly perform gene expression analysis using standard tools at different publicly available servers. 
Furthermore, we show how this microarray offers a good representation of the citrus genome and present the usefulness of this genomic tool for global studies in citrus by using it to catalogue genes expressed in citrus globular embryos. PMID:18598343
An evaluation of two-channel ChIP-on-chip and DNA methylation microarray normalization strategies
2012-01-01
Background The combination of chromatin immunoprecipitation with two-channel microarray technology enables genome-wide mapping of binding sites of DNA-interacting proteins (ChIP-on-chip) or sites with methylated CpG di-nucleotides (DNA methylation microarray). These powerful tools are the gateway to understanding gene transcription regulation. Since the goals of such studies, the sample preparation procedures, the microarray content and study design are all different from transcriptomics microarrays, the data pre-processing strategies traditionally applied to transcriptomics microarrays may not be appropriate. Particularly, the main challenge of the normalization of "regulation microarrays" is (i) to make the data of individual microarrays quantitatively comparable and (ii) to keep the signals of the enriched probes, representing DNA sequences from the precipitate, as distinguishable as possible from the signals of the un-enriched probes, representing DNA sequences largely absent from the precipitate. Results We compare several widely used normalization approaches (VSN, LOWESS, quantile, T-quantile, Tukey's biweight scaling, Peng's method) applied to a selection of regulation microarray datasets, ranging from DNA methylation to transcription factor binding and histone modification studies. Through comparison of the data distributions of control probes and gene promoter probes before and after normalization, and assessment of the power to identify known enriched genomic regions after normalization, we demonstrate that there are clear differences in performance between normalization procedures. Conclusion T-quantile normalization applied separately on the channels and Tukey's biweight scaling outperform other methods in terms of the conservation of enriched and un-enriched signal separation, as well as in identification of genomic regions known to be enriched. T-quantile normalization is preferable as it additionally improves comparability between microarrays. 
In contrast, popular normalization approaches like quantile, LOWESS, Peng's method and VSN normalization alter the data distributions of regulation microarrays to such an extent that using these approaches will impact the reliability of the downstream analysis substantially. PMID:22276688
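For contrast with the T-quantile variant favoured above, plain quantile normalisation (which forces all arrays onto one common distribution, and which the abstract warns can blur enriched vs. un-enriched signals) can be written in a few lines. This is a generic sketch, not the code used in the study.

```python
import numpy as np

def quantile_normalise(x):
    """Plain quantile normalisation: make every array (row) share the same
    distribution, namely the per-rank mean of the sorted values across
    arrays. The T-quantile variant would instead normalise probe groups
    (e.g. per channel) separately; this sketch shows only the basic
    transform, assuming no tied values."""
    x = np.asarray(x, float)
    ranks = np.argsort(np.argsort(x, axis=1), axis=1)   # rank of each probe per array
    reference = np.sort(x, axis=1).mean(axis=0)         # mean distribution over arrays
    return reference[ranks]
```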
Rai, Muhammad Farooq; Tycksen, Eric D; Sandell, Linda J; Brophy, Robert H
2018-01-01
Microarrays and RNA-seq are at the forefront of high-throughput transcriptome analysis. Since these methodologies are based on different principles, there are concerns about the concordance of data between the two techniques. The concordance of RNA-seq and microarrays for genome-wide analysis of differential gene expression has not been rigorously assessed in clinically derived ligament tissues. To demonstrate the concordance between RNA-seq and microarrays and to assess potential benefits of RNA-seq over microarrays, we assessed differences in transcript expression in anterior cruciate ligament (ACL) tissues based on time from injury. ACL remnants were collected from patients with an ACL tear at the time of ACL reconstruction. RNA prepared from torn ACL remnants was subjected to Agilent microarrays (N = 24) and RNA-seq (N = 8). The correlation of biological replicates in the RNA-seq and microarray data was similar (0.98 vs. 0.97), demonstrating that each platform has high internal reproducibility. Correlations between the RNA-seq data and the individual microarrays were low, but correlations between the RNA-seq values and the geometric mean of the microarray values were moderate. The cross-platform concordance for differentially expressed transcripts or enriched pathways was linearly correlated (r = 0.64). RNA-seq was superior in detecting low-abundance transcripts and differentiating biologically critical isoforms. Additional independent validation of transcript expression was undertaken using microfluidic PCR for selected genes. The PCR data showed 100% concordance (in expression pattern) with the RNA-seq and microarray data. These findings demonstrate that RNA-seq has advantages over microarrays for transcriptome profiling of ligament tissues when available and affordable. Furthermore, these findings are likely transferable to other musculoskeletal tissues where tissue collection is challenging and cells are in low abundance. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 36:484-497, 2018.
The effect of column purification on cDNA indirect labelling for microarrays
Molas, M Lia; Kiss, John Z
2007-01-01
Background Microarray reproducibility depends upon the performance of standardized procedures. Since the introduction of microarray technology for the analysis of global gene expression, reproducibility of results among different laboratories has been a major problem. Two of the main contributors to this variability are the use of different microarray platforms and different laboratory practices. In this paper, we address the latter question in terms of how variation in one of the steps of a labelling procedure affects the cDNA product prior to microarray hybridization. Results We used a standard procedure to label cDNA for microarray hybridization and employed different types of column chromatography for cDNA purification. After purifying labelled cDNA, we used the Agilent 2100 Bioanalyzer and agarose gel electrophoresis to assess the quality of the labelled cDNA before its hybridization onto a microarray platform. There were major differences in the cDNA profile (i.e. cDNA fragment lengths and abundance) as a result of using four different columns for purification. In addition, different columns have different efficiencies in removing rRNA contamination. This study indicates that the appropriate column to use in this type of protocol has to be determined experimentally. Finally, we present new evidence establishing the importance of testing the method of purification used during an indirect labelling procedure. Our results confirm the importance of assessing the quality of the sample in the labelling procedure prior to hybridization onto a microarray platform. Conclusion Standardization of the column purification systems used in labelling procedures will improve the reproducibility of microarray results among different laboratories. In addition, implementation of a quality control checkpoint for labelled samples prior to microarray hybridization will prevent hybridizing a poor-quality sample to expensive microarrays. PMID:17597522
The effect of column purification on cDNA indirect labelling for microarrays.
Molas, M Lia; Kiss, John Z
2007-06-27
McCoy, Gary R; Touzet, Nicolas; Fleming, Gerard T A; Raine, Robin
2015-07-01
The toxic microalgal species Prymnesium parvum and Prymnesium polylepis are responsible for numerous fish kills, causing economic stress on the aquaculture industry, and, through the consumption of contaminated shellfish, can potentially impact human health. Monitoring of toxic phytoplankton is traditionally carried out by light microscopy. However, molecular methods of identification and quantification are becoming more commonplace. This study documents the optimisation of the novel Microarrays for the Detection of Toxic Algae (MIDTAL) microarray from its initial stages to the final commercial version now available from Microbia Environnement (France). Existing oligonucleotide probes for Prymnesium species used in whole-cell fluorescent in situ hybridisation (FISH), from higher-group to species-level probes, were adapted and tested on the first-generation microarray. The combination and interaction of numerous other probes, specific for a whole range of phytoplankton taxa and also spotted on the chip surface, caused high cross-reactivity, resulting in false-positive results on the microarray. The probe sequences were extended for the second-generation microarray, and further adaptations of the hybridisation protocol and incubation temperatures significantly reduced false-positive readings from the first- to the second-generation chip, thereby increasing the specificity of the MIDTAL microarray. Additional refinement of the third-generation microarray protocols, with the addition of a poly-T amino linker to the 5' end of each probe, further enhanced microarray performance but also highlighted the importance of optimising RNA labelling efficiency when testing natural seawater samples from Killary Harbour, Ireland.
The Glycan Microarray Story from Construction to Applications.
Hyun, Ji Young; Pai, Jaeyoung; Shin, Injae
2017-04-18
Not only are glycan-mediated binding processes in cells and organisms essential for a wide range of physiological processes, but they are also implicated in various pathological processes. As a result, elucidation of glycan-associated biomolecular interactions and their consequences is of great importance in basic biological research and biomedical applications. In 2002, we and others were the first to utilize glycan microarrays in efforts aimed at the rapid analysis of glycan-associated recognition events. Because they contain a number of glycans immobilized in a dense and orderly manner on a solid surface, glycan microarrays enable multiple parallel analyses of glycan-protein binding events while utilizing only small amounts of glycan samples. Therefore, this microarray technology has become a leading-edge tool in studies aimed at elucidating roles played by glycans and glycan binding proteins in biological systems. In this Account, we summarize our efforts on the construction of glycan microarrays and their applications in studies of glycan-associated interactions. Immobilization strategies of functionalized and unmodified glycans on derivatized glass surfaces are described. Although others have developed immobilization techniques, our efforts have focused on improving the efficiencies and operational simplicity of microarray construction. The microarray-based technology has been most extensively used for rapid analysis of the glycan binding properties of proteins. In addition, glycan microarrays have been employed to determine glycan-protein interactions quantitatively, detect pathogens, and rapidly assess substrate specificities of carbohydrate-processing enzymes. More recently, the microarrays have been employed to identify functional glycans that elicit cell surface lectin-mediated cellular responses. Owing to these efforts, it is now possible to use glycan microarrays to expand the understanding of roles played by glycans and glycan binding proteins in biological systems.
NASA Astrophysics Data System (ADS)
Lu, Q.; Amelung, F.; Wdowinski, S.
2017-12-01
The Greenland ice sheet is rapidly shrinking, with the fastest retreat and thinning occurring at the ice sheet margin and near the outlet glaciers. These changes in ice mass cause an elastic response of the bedrock. Theoretically, ice mass loss during the summer melting season is associated with bedrock uplift, whereas increasing ice mass during the winter months is associated with bedrock subsidence. Here we examine the annual changes of the vertical displacements measured at 37 GPS stations and compare the results with the gravity of Greenland's drainage basins from GRACE. We use both Fourier Series (FS) analysis and a Cubic Smoothing Spline (CSS) method to estimate the phases and amplitudes of the seasonal variations. Both methods show significantly different seasonal behavior in southern and northern Greenland. The average amplitude of bedrock displacement in south Greenland (3.29±0.02 mm) is about twice that in the north (1.65±0.02 mm). The phase of maximum bedrock uplift in south Greenland (November) is broadly consistent with the time of minimum ice mass load there (October). However, the phase of maximum bedrock uplift in north Greenland (February) lags the minimum ice mass load in the northern basins (October) by 4 months. In addition, we present ground deformation near several well-known Greenland glaciers, such as Petermann and Jakobshavn. We process InSAR data from the TerraSAR-X and Sentinel satellites using small-baseline interferograms. We observe rapid deglaciation-induced uplift and seasonal variations on exposed bedrock near the glacier ice margin.
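The Fourier-series estimation of seasonal amplitude and phase described above can be sketched as a least-squares fit of an annual sinusoid. This is a minimal illustration in Python on synthetic data, not the authors' processing chain; the amplitude, phase, and noise values are made up.

```python
import numpy as np

# Synthetic daily vertical-displacement series: an annual sinusoid plus noise.
# Amplitude/phase/noise values are made up for illustration.
rng = np.random.default_rng(0)
t = np.arange(4 * 365) / 365.25                  # time in years
true_amp, true_phase = 3.3, 0.8                  # mm, radians
u = true_amp * np.sin(2 * np.pi * t + true_phase) + 0.3 * rng.standard_normal(t.size)

# First-order Fourier fit: u ~ a*sin(2*pi*t) + b*cos(2*pi*t) + c
A = np.column_stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), np.ones_like(t)])
a, b, c = np.linalg.lstsq(A, u, rcond=None)[0]

amplitude = np.hypot(a, b)    # seasonal amplitude, mm
phase = np.arctan2(b, a)      # seasonal phase, radians
```

The recovered amplitude and phase can then be converted to the calendar month of peak uplift for comparison against the GRACE load minimum.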
Qualitative computer aided evaluation of dental impressions in vivo.
Luthardt, Ralph G; Koch, Rainer; Rudolph, Heike; Walter, Michael H
2006-01-01
Clinical investigations dealing with the precision of different impression techniques are rare. The objective of the present study was to develop and evaluate a procedure for the qualitative analysis of three-dimensional impression precision based on an established in-vitro procedure. The null hypothesis tested was that the precision of impressions does not differ with the impression technique used (single-step, monophase and two-step techniques) or with clinical variables. Digital surface data of patients' teeth prepared for crowns were gathered from standardized manufactured master casts after impressions were taken with three different techniques in randomized order. Data sets were analyzed for each patient in comparison with the one-step impression chosen as the reference. The qualitative analysis was limited to data points within the 99.5% range. Based on the color-coded representation, areas with maximum deviations were determined (preparation margin, mantle surface and occlusal surface). To qualitatively analyze the precision of the impression techniques, the hypothesis was tested in linear models for repeated-measures factors (p < 0.05). For the positive 99.5% deviations, no variables with significant influence were identified in the statistical analysis. In contrast, the impression technique and the position of the preparation margin significantly influenced the negative 99.5% deviations. The influence of clinical parameters on the deviations between impression techniques can be determined reliably using the 99.5th percentile of the deviations. An analysis of the areas with maximum deviations showed high clinical relevance. The preparation margin was identified as the weak spot of impression taking.
Two-Dimensional VO2 Mesoporous Microarrays for High-Performance Supercapacitor
NASA Astrophysics Data System (ADS)
Fan, Yuqi; Ouyang, Delong; Li, Bao-Wen; Dang, Feng; Ren, Zongming
2018-05-01
Two-dimensional (2D) mesoporous VO2 microarrays have been prepared using an organic-inorganic liquid interface. The units of the microarrays consist of needle-like VO2 particles with a mesoporous structure, in which crack-like pores with a pore size of about 2 nm and a depth of 20-100 nm are distributed over the particle surface. The liquid interface acts as a template for the formation of the 2D microarrays, as identified from kinetic observations. Due to the mesoporous structure of the units and the high conductivity of the microarray, these 2D VO2 microarrays exhibit a high specific capacitance of 265 F/g at 1 A/g, excellent rate capability (182 F/g at 10 A/g) and cycling stability, suggesting the role of this unique microstructure in improving electrochemical performance.
Plant-pathogen interactions: what microarray tells about it?
Lodha, T D; Basak, J
2012-01-01
Plant defence responses are mediated by elementary regulatory proteins that affect the expression of thousands of genes. Over the last decade, microarray technology has played a key role in deciphering the underlying networks of gene regulation in plants that lead to a wide variety of defence responses. The microarray is an important tool to quantify and profile the expression of thousands of genes simultaneously, with two main aims: (1) gene discovery and (2) global expression profiling. Several microarray technologies are currently in use; most include a glass slide platform with spotted cDNA or oligonucleotides. To date, microarray technology has been used in the identification of regulatory genes and end-point defence genes, and in understanding the signal transduction processes underlying disease resistance and its intimate links to other physiological pathways. Microarray technology can be used for in-depth, simultaneous profiling of host and pathogen genes as the disease progresses from infection to resistance or susceptibility at different developmental stages of the host, and in different environments, for a clearer understanding of the processes involved. A thorough knowledge of plant disease resistance, using a successful combination of microarray and other high-throughput techniques as well as biochemical, genetic, and cell biological experiments, is needed for practical application to secure and stabilize the yield of many crop plants. This review starts with a brief introduction to microarray technology, followed by the basics of plant-pathogen interaction, the use of DNA microarrays over the last decade to unravel the mysteries of plant-pathogen interaction, and ends with the future prospects of this technology.
A geostatistical extreme-value framework for fast simulation of natural hazard events
Stephenson, David B.
2016-01-01
We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768
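A generalized Pareto marginal of the kind used above can be illustrated by simulating threshold exceedances and computing a return level. The shape, scale, and threshold values below are made up for illustration, not parameters fitted in the study:

```python
import numpy as np
from scipy.stats import genpareto

# Illustrative generalized Pareto marginal for wind-gust exceedances over a
# threshold. xi (shape), sigma (scale) and the threshold are assumed values.
xi, sigma, threshold = 0.1, 8.0, 25.0            # m/s

exceedances = genpareto.rvs(xi, scale=sigma, size=100_000, random_state=1)
gusts = threshold + exceedances

# Return level: the gust magnitude exceeded with probability p per exceedance.
p = 1.0 / 1000
return_level = threshold + genpareto.ppf(1 - p, xi, scale=sigma)
```

In a full catastrophe-model setting, these marginals would be coupled through a spatial dependence model (a Student's t-process in the paper) rather than sampled independently.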
Lead nitrate induced unallied expression of liver and kidney functions in male albino rats.
Chougule, Priti; Patil, Bhagyashree; Kanase, Aruna
2005-06-01
To determine the effects of lead in the organs where it accumulates most (the liver, followed by the kidney), liver and kidney functions were studied using a low oral dose of lead nitrate for a prolonged duration. A dose of 20 mg lead nitrate/kg body wt/day was used in male albino rats. AST and ALT levels altered independently. While ALT remained unaltered after 7 and 21 days of treatment, it decreased by 13.21% after 14 days of treatment. AST was marginally lowered after 7 days, increased after 14 days and increased marginally after 21 days. Bilirubin (conjugated, unconjugated and total) decreased after 7 and 14 days and increased after 21 days. The increase in urea was directly proportional to treatment duration. Creatinine remained unaltered.
Clustering-based spot segmentation of cDNA microarray images.
Uslan, Volkan; Bucak, Ihsan Ömür
2010-01-01
Microarrays are utilized because they provide useful information about thousands of gene expression levels simultaneously. In this study, the segmentation step of microarray image processing has been implemented. Clustering-based methods, fuzzy c-means and k-means, have been applied in the segmentation step, which separates the spots from the background. The experiments show that fuzzy c-means segmented the spots of the microarray image more accurately than k-means.
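The clustering-based segmentation idea can be sketched with a two-cluster k-means (Lloyd's algorithm) on the pixel intensities of a synthetic spot image. This illustrates the k-means variant only, on made-up data, not the authors' implementation:

```python
import numpy as np

# Toy microarray spot: a bright circular spot on a darker background.
rng = np.random.default_rng(2)
y, x = np.mgrid[:32, :32]
spot = (x - 16) ** 2 + (y - 16) ** 2 <= 8 ** 2
img = np.where(spot, 200.0, 40.0) + 10.0 * rng.standard_normal((32, 32))

# Two-cluster k-means (Lloyd's algorithm) on the pixel intensities.
pix = img.ravel()
centers = np.array([pix.min(), pix.max()])       # initialise at the extremes
for _ in range(20):
    labels = np.abs(pix[:, None] - centers[None, :]).argmin(axis=1)
    centers = np.array([pix[labels == k].mean() for k in (0, 1)])

labels = np.abs(pix[:, None] - centers[None, :]).argmin(axis=1)
mask = (labels == centers.argmax()).reshape(img.shape)   # brighter cluster = spot
```

Fuzzy c-means differs only in assigning each pixel a graded membership to both clusters instead of a hard label, which is what gives it an edge on blurred spot boundaries.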
ERIC Educational Resources Information Center
Yang, Ji Seung; Cai, Li
2014-01-01
The main purpose of this study is to improve estimation efficiency in obtaining maximum marginal likelihood estimates of contextual effects in the framework of a nonlinear multilevel latent variable model by adopting the Metropolis-Hastings Robbins-Monro algorithm (MH-RM). Results indicate that the MH-RM algorithm can produce estimates and standard…
ERIC Educational Resources Information Center
Wu, Yi-Fang
2015-01-01
Item response theory (IRT) uses a family of statistical models for estimating stable characteristics of items and examinees and defining how these characteristics interact in describing item and test performance. With a focus on the three-parameter logistic IRT (Birnbaum, 1968; Lord, 1980) model, the current study examines the accuracy and…
ERIC Educational Resources Information Center
Monroe, Scott; Cai, Li
2013-01-01
In Ramsay curve item response theory (RC-IRT, Woods & Thissen, 2006) modeling, the shape of the latent trait distribution is estimated simultaneously with the item parameters. In its original implementation, RC-IRT is estimated via Bock and Aitkin's (1981) EM algorithm, which yields maximum marginal likelihood estimates. This method, however,…
ERIC Educational Resources Information Center
Monroe, Scott; Cai, Li
2014-01-01
In Ramsay curve item response theory (RC-IRT) modeling, the shape of the latent trait distribution is estimated simultaneously with the item parameters. In its original implementation, RC-IRT is estimated via Bock and Aitkin's EM algorithm, which yields maximum marginal likelihood estimates. This method, however, does not produce the…
A perspective on microarrays: current applications, pitfalls, and potential uses
Jaluria, Pratik; Konstantopoulos, Konstantinos; Betenbaugh, Michael; Shiloach, Joseph
2007-01-01
With advances in robotics, computational capabilities, and the fabrication of high-quality glass slides coinciding with increased genomic information being available on public databases, microarray technology is increasingly being used in laboratories around the world. In fact, fields as varied as toxicology, evolutionary biology, drug development and production, disease characterization, diagnostics development, cellular physiology and stress responses, and forensics have benefited from its use. However, for many researchers not familiar with microarrays, current articles and reviews often address neither the fundamental principles behind the technology nor the proper design of experiments. Although microarray technology is conceptually simple, its practice does require careful planning and a detailed understanding of its inherent limitations. Without these considerations, it can be exceedingly difficult to ascertain valuable information from microarray data. Therefore, this text aims to outline key features in microarray technology, paying particular attention to current applications as outlined in recent publications, experimental design, statistical methods, and potential uses. Furthermore, this review is not meant to be comprehensive, but rather substantive, highlighting important concepts and detailing the steps necessary to conduct and interpret microarray experiments. Collectively, the information included in this text will highlight the versatility of microarray technology and provide a glimpse of what the future may hold. PMID:17254338
A Platform for Combined DNA and Protein Microarrays Based on Total Internal Reflection Fluorescence
Asanov, Alexander; Zepeda, Angélica; Vaca, Luis
2012-01-01
We have developed a novel microarray technology based on total internal reflection fluorescence (TIRF) in combination with DNA and protein bioassays immobilized at the TIRF surface. Unlike conventional microarrays that exhibit reduced signal-to-background ratio, require several stages of incubation, rinsing and stringency control, and measure only end-point results, our TIRF microarray technology provides several orders of magnitude better signal-to-background ratio, performs analysis rapidly in one step, and measures the entire course of association and dissociation kinetics between target DNA and protein molecules and the bioassays. In many practical cases detection of only DNA or protein markers alone does not provide the necessary accuracy for diagnosing a disease or detecting a pathogen. Here we describe TIRF microarrays that detect DNA and protein markers simultaneously, which reduces the probabilities of false responses. Supersensitive and multiplexed TIRF DNA and protein microarray technology may provide a platform for accurate diagnosis or enhanced research studies. Our TIRF microarray system can be mounted on upright or inverted microscopes or interfaced directly with CCD cameras equipped with a single objective, facilitating the development of portable devices. As proof-of-concept we applied TIRF microarrays for detecting molecular markers from Bacillus anthracis, the pathogen responsible for anthrax. PMID:22438738
Motakis, E S; Nason, G P; Fryzlewicz, P; Rutter, G A
2006-10-15
Many standard statistical techniques are effective on data that are normally distributed with constant variance. Microarray data typically violate these assumptions since they come from non-Gaussian distributions with a non-trivial mean-variance relationship. Several methods have been proposed that transform microarray data to stabilize variance and draw its distribution towards the Gaussian. Some methods, such as log or generalized log, rely on an underlying model for the data. Others, such as the spread-versus-level plot, do not. We propose an alternative data-driven multiscale approach, called the Data-Driven Haar-Fisz for microarrays (DDHFm) with replicates. DDHFm has the advantage of being 'distribution-free' in the sense that no parametric model for the underlying microarray data is required to be specified or estimated; hence, DDHFm can be applied very generally, not just to microarray data. DDHFm achieves very good variance stabilization of microarray data with replicates and produces transformed intensities that are approximately normally distributed. Simulation studies show that it performs better than other existing methods. Application of DDHFm to real one-color cDNA data validates these results. The R package of the Data-Driven Haar-Fisz transform (DDHFm) for microarrays is available in Bioconductor and CRAN.
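The mean-variance problem and one model-based remedy mentioned above, the generalized log, can be illustrated on the common two-component noise model. (DDHFm itself is the authors' distribution-free alternative, available as an R package; the constants below are made up.)

```python
import numpy as np

# Two-component noise model often assumed for microarray intensities:
# y = mu + mu*eta + eps, so the standard deviation grows with the mean.
rng = np.random.default_rng(3)
mu = np.repeat([100.0, 10000.0], 5000)
y = mu + mu * 0.1 * rng.standard_normal(mu.size) + 20.0 * rng.standard_normal(mu.size)

def glog(y, c):
    """Generalized log: ~log(y) for large y, finite and smooth near zero."""
    return np.log((y + np.sqrt(y ** 2 + c ** 2)) / 2)

z = glog(y, c=200.0)

raw_sd = [y[mu == m].std() for m in (100.0, 10000.0)]    # very unequal
stab_sd = [z[mu == m].std() for m in (100.0, 10000.0)]   # roughly equal
```

Note the transform requires choosing (or estimating) the offset c from the assumed noise model, which is exactly the parametric dependence a distribution-free method like DDHFm avoids.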
Validation of MIMGO: a method to identify differentially expressed GO terms in a microarray dataset
2012-01-01
Background We previously proposed an algorithm for the identification of GO terms that commonly annotate genes whose expression is upregulated or downregulated in some microarray data compared with other microarray data. We call these “differentially expressed GO terms” and have named the algorithm “matrix-assisted identification method of differentially expressed GO terms” (MIMGO). MIMGO can also identify microarray data in which genes annotated with a differentially expressed GO term are upregulated or downregulated. However, MIMGO has not yet been validated on a real microarray dataset using all available GO terms. Findings We combined Gene Set Enrichment Analysis (GSEA) with MIMGO to identify differentially expressed GO terms in a yeast cell cycle microarray dataset. GSEA followed by MIMGO (GSEA + MIMGO) correctly identified (p < 0.05) microarray data in which genes annotated to differentially expressed GO terms are upregulated. We found that GSEA + MIMGO was slightly less effective than, or comparable to, GSEA (Pearson), a method that uses Pearson’s correlation as a metric, at detecting true differentially expressed GO terms. However, unlike other methods including GSEA (Pearson), GSEA + MIMGO can comprehensively identify the microarray data in which genes annotated with a differentially expressed GO term are upregulated or downregulated. Conclusions MIMGO is a reliable method to identify differentially expressed GO terms comprehensively. PMID:23232071
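A common way to score a candidate GO term against a set of upregulated genes, shown here only as a generic illustration (not the MIMGO or GSEA algorithm), is a hypergeometric over-representation test on the overlap:

```python
from math import comb

def enrichment_p(N, K, n, k):
    """Hypergeometric upper tail P(X >= k): probability of seeing at least k
    term-annotated genes among n upregulated genes, when K of the N genes
    carry the GO term."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Toy numbers, purely illustrative: 6000 genes, 40 annotated with the term,
# 300 upregulated, 10 of them annotated (expected overlap by chance is 2).
p = enrichment_p(6000, 40, 300, 10)
```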
Microintaglio Printing for Soft Lithography-Based in Situ Microarrays
Biyani, Manish; Ichiki, Takanori
2015-01-01
Advances in lithographic approaches to fabricating bio-microarrays have been extensively explored over the last two decades. However, the need for pattern flexibility, a high density, a high resolution, affordability and on-demand fabrication is promoting the development of unconventional routes for microarray fabrication. This review highlights the development and uses of a new molecular lithography approach, called “microintaglio printing technology”, for large-scale bio-microarray fabrication using a microreactor array (µRA)-based chip consisting of uniformly-arranged, femtoliter-size µRA molds. In this method, a single-molecule-amplified DNA microarray pattern is self-assembled onto a µRA mold and subsequently converted into a messenger RNA or protein microarray pattern by simultaneously producing and transferring (immobilizing) a messenger RNA or a protein from a µRA mold to a glass surface. Microintaglio printing allows the self-assembly and patterning of in situ-synthesized biomolecules into high-density (kilo-giga-density), ordered arrays on a chip surface with µm-order precision. Realizing this holistic aim, which is difficult using conventional printing and microarray approaches, is expected to revolutionize and reshape proteomics. This review is not meant to be comprehensive, but rather substantive, highlighting the versatility of microintaglio printing for developing a prerequisite platform for microarray technology for the postgenomic era. PMID:27600226
Element Load Data Processor (ELDAP) Users Manual
NASA Technical Reports Server (NTRS)
Ramsey, John K., Jr.; Ramsey, John K., Sr.
2015-01-01
Often, the shear and tensile forces and moments are extracted from finite element analyses to be used in off-line calculations for evaluating the integrity of structural connections involving bolts, rivets, and welds. Usually the maximum forces and moments are desired for use in the calculations. In situations where there are numerous structural connections of interest for numerous load cases, the effort in finding the true maximum force and/or moment combinations among all fasteners and welds and load cases becomes difficult. The Element Load Data Processor (ELDAP) software described herein makes this effort manageable. This software eliminates the possibility of overlooking the worst-case forces and moments that could result in erroneous positive margins of safety and/or selecting inconsistent combinations of forces and moments resulting in false negative margins of safety. In addition to forces and moments, any scalar quantity output in a PATRAN report file may be evaluated with this software. This software was originally written to fill an urgent need during the structural analysis of the Ares I-X Interstage segment. As such, this software was coded in a straightforward manner with no effort made to optimize or minimize code or to develop a graphical user interface.
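The core bookkeeping task ELDAP automates, keeping force components from the same load case together while scanning for the worst combination, can be sketched as follows. The record layout, the allowables, and the interaction criterion are illustrative assumptions, not the PATRAN report format or a specific joint criterion:

```python
# Hypothetical element-load records: (fastener_id, load_case, shear, tension).
records = [
    ("B1", "LC1", 1200.0, 300.0),
    ("B1", "LC2", 800.0, 900.0),
    ("B2", "LC1", 500.0, 100.0),
    ("B2", "LC2", 400.0, 120.0),
]

def interaction(shear, tension, shear_allow=2000.0, tension_allow=1000.0):
    """Illustrative shear/tension interaction ratio; real criteria vary by joint."""
    return (shear / shear_allow) ** 2 + (tension / tension_allow) ** 2

# Keep shear and tension from the SAME load case while hunting for the worst
# combination, so the resulting margin of safety stays self-consistent.
worst = {}
for fid, case, shear, tension in records:
    r = interaction(shear, tension)
    if fid not in worst or r > worst[fid][1]:
        worst[fid] = (case, r)
```

Taking the maximum shear and the maximum tension separately (possibly from different load cases) is exactly the inconsistent pairing the abstract warns can produce false negative margins.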
JointMMCC: Joint Maximum-Margin Classification and Clustering of Imaging Data
Filipovych, Roman; Resnick, Susan M.; Davatzikos, Christos
2012-01-01
A number of conditions are characterized by pathologies that form continuous or nearly-continuous spectra spanning from the absence of pathology to very pronounced pathological changes (e.g., normal aging, Mild Cognitive Impairment, Alzheimer's). Moreover, diseases are often highly heterogeneous with a number of diagnostic subcategories or subconditions lying within the spectra (e.g., Autism Spectrum Disorder, schizophrenia). Discovering coherent subpopulations of subjects within the spectrum of pathological changes may further our understanding of diseases, and potentially identify subconditions that require alternative or modified treatment options. In this paper, we propose an approach that aims at identifying coherent subpopulations with respect to the underlying MRI in the scenario where the condition is heterogeneous and pathological changes form a continuous spectrum. We describe a Joint Maximum-Margin Classification and Clustering (JointMMCC) approach that jointly detects the pathologic population via semi-supervised classification, as well as disentangles heterogeneity of the pathological cohort by solving a clustering subproblem. We propose an efficient solution to the non-convex optimization problem associated with JointMMCC. We apply our proposed approach to an MRI study of aging, and identify coherent subpopulations (i.e., clusters) of cognitively less stable adults. PMID:22328179
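The two ingredients of JointMMCC, max-margin separation of the pathological cohort and clustering within it, can be caricatured by a sequential two-step sketch on toy 2-D data. The actual method solves a joint non-convex problem; scikit-learn's LinearSVC and KMeans stand in here as assumed substitutes:

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.cluster import KMeans

# Toy 2-D "imaging features": one healthy cloud plus two pathological subtypes.
rng = np.random.default_rng(5)
healthy = rng.normal([0.0, 0.0], 0.5, size=(100, 2))
subtype_a = rng.normal([3.0, 0.0], 0.4, size=(50, 2))
subtype_b = rng.normal([0.0, 3.0], 0.4, size=(50, 2))
X = np.vstack([healthy, subtype_a, subtype_b])
y = np.array([0] * 100 + [1] * 100)

# Step 1: max-margin separation of pathological from healthy subjects.
clf = LinearSVC(C=1.0).fit(X, y)
patho = X[clf.predict(X) == 1]

# Step 2: cluster the pathological cohort to expose its heterogeneity.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(patho)
```

JointMMCC's point is that solving the two steps jointly (and semi-supervised) lets the cluster structure inform the classification boundary, which this sequential caricature cannot do.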
An Introduction to MAMA (Meta-Analysis of MicroArray data) System.
Zhang, Zhe; Fenstermacher, David
2005-01-01
Analyzing microarray data across multiple experiments has been proven advantageous. To support this kind of analysis, we are developing a software system called MAMA (Meta-Analysis of MicroArray data). MAMA utilizes a client-server architecture with a relational database on the server side for the storage of microarray datasets collected from various resources. The client side is an application running on the end user's computer that allows the user to manipulate microarray data and analytical results locally. The MAMA implementation will integrate several analytical methods, including meta-analysis, within an open-source framework offering other developers the flexibility to plug in additional statistical algorithms.
Methods to study legionella transcriptome in vitro and in vivo.
Faucher, Sebastien P; Shuman, Howard A
2013-01-01
The study of transcriptome responses can provide insight into the regulatory pathways and genetic factors that contribute to a specific phenotype. For bacterial pathogens, it can identify putative new virulence systems and shed light on the mechanisms underlying the regulation of virulence factors. Microarrays have been previously used to study gene regulation in Legionella pneumophila. In the past few years a sharp reduction of the costs associated with microarray experiments together with the availability of relatively inexpensive custom-designed commercial microarrays has made microarray technology an accessible tool for the majority of researchers. Here we describe the methodologies to conduct microarray experiments from in vitro and in vivo samples.
GeneXplorer: an interactive web application for microarray data visualization and analysis.
Rees, Christian A; Demeter, Janos; Matese, John C; Botstein, David; Sherlock, Gavin
2004-10-01
When publishing large-scale microarray datasets, it is of great value to create supplemental websites where either the full data, or selected subsets corresponding to figures within the paper, can be browsed. We set out to create a CGI application containing many of the features of some of the existing standalone software for the visualization of clustered microarray data. We present GeneXplorer, a web application for interactive microarray data visualization and analysis in a web environment. GeneXplorer allows users to browse a microarray dataset in an intuitive fashion. It provides simple access to microarray data over the Internet and uses only HTML and JavaScript to display graphic and annotation information. It provides radar and zoom views of the data, allows display of the nearest neighbors to a gene expression vector based on their Pearson correlations and provides the ability to search gene annotation fields. The software is released under the permissive MIT Open Source license, and the complete documentation and the entire source code are freely available for download from CPAN http://search.cpan.org/dist/Microarray-GeneXplorer/.
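The nearest-neighbor display GeneXplorer provides, genes ranked by Pearson correlation against a query expression vector, can be sketched directly. The matrix and the helper below are illustrative, not GeneXplorer's actual (Perl/CGI) internals:

```python
import numpy as np

# Toy expression matrix: 50 genes x 12 arrays; gene 1 tracks gene 0 closely.
rng = np.random.default_rng(4)
expr = rng.standard_normal((50, 12))
expr[1] = expr[0] + 0.1 * rng.standard_normal(12)

def pearson_neighbors(expr, gene, n=5):
    """Indices of the n genes whose profiles correlate best with `gene`."""
    z = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)
    r = z @ z[gene] / expr.shape[1]       # Pearson r of every gene vs. the query
    order = np.argsort(-r)                # descending correlation
    return order[order != gene][:n]       # drop the query itself

neighbors = pearson_neighbors(expr, gene=0)
```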
Fluorescence-based bioassays for the detection and evaluation of food materials.
Nishi, Kentaro; Isobe, Shin-Ichiro; Zhu, Yun; Kiyama, Ryoiti
2015-10-13
We summarize here the recent progress in fluorescence-based bioassays for the detection and evaluation of food materials by focusing on fluorescent dyes used in bioassays and applications of these assays for food safety, quality and efficacy. Fluorescent dyes have been used in various bioassays, such as biosensing, cell assay, energy transfer-based assay, probing, protein/immunological assay and microarray/biochip assay. Among the arrays used in microarray/biochip assay, fluorescence-based microarrays/biochips, such as antibody/protein microarrays, bead/suspension arrays, capillary/sensor arrays, DNA microarrays/polymerase chain reaction (PCR)-based arrays, glycan/lectin arrays, immunoassay/enzyme-linked immunosorbent assay (ELISA)-based arrays, microfluidic chips and tissue arrays, have been developed and used for the assessment of allergy/poisoning/toxicity, contamination and efficacy/mechanism, and quality control/safety. DNA microarray assays have been used widely for food safety and quality as well as searches for active components. DNA microarray-based gene expression profiling may be useful for such purposes due to its advantages in the evaluation of pathway-based intracellular signaling in response to food materials. PMID:26473869
Chockalingam, Sriram; Aluru, Maneesha; Aluru, Srinivas
2016-09-19
Pre-processing of microarray data is a well-studied problem. Furthermore, all popular platforms come with their own recommended best practices for differential analysis of genes. However, for genome-scale network inference using microarray data collected from large public repositories, these methods filter out a considerable number of genes. This is primarily due to the effects of aggregating a diverse array of experiments with different technical and biological scenarios. Here we introduce a pre-processing pipeline suitable for inferring genome-scale gene networks from large microarray datasets. We show that partitioning of the available microarray datasets according to biological relevance into tissue- and process-specific categories significantly extends the limits of downstream network construction. We demonstrate the effectiveness of our pre-processing pipeline by inferring genome-scale networks for the model plant Arabidopsis thaliana using two different construction methods and a collection of 11,760 Affymetrix ATH1 microarray chips. Our pre-processing pipeline and the datasets used in this paper are made available at http://alurulab.cc.gatech.edu/microarray-pp.
Microfluidic microarray systems and methods thereof
West, Jay A. A. (Castro Valley, CA); Hukari, Kyle W. (San Ramon, CA); Hux, Gary A. (Tracy, CA)
2009-04-28
Disclosed are systems that include a manifold in fluid communication with a microfluidic chip having a microarray, an illuminator, and a detector in optical communication with the microarray. Methods for using these systems for biological detection are also disclosed.
cDNA Microarray Screening in Food Safety
ROY, SASHWATI; SEN, CHANDAN K
2009-01-01
The cDNA microarray technology and related bioinformatics tools present a wide range of novel application opportunities. The technology may be productively applied to address food safety. In this mini-review article, we present an update highlighting the late-breaking discoveries that demonstrate the vitality of cDNA microarray technology as a tool to analyze food safety with reference to microbial pathogens and genetically modified foods. In order to bring the microarray technology to mainstream food safety, it is important to develop robust user-friendly tools that may be applied in a field setting. In addition, there needs to be a standardized process for regulatory agencies to interpret and act upon microarray-based data. The cDNA microarray approach is an emergent technology in diagnostics. Its value lies in being able to provide complementary molecular insight when employed in addition to traditional tests for food safety, as part of a more comprehensive battery of tests. PMID:16466843
Li, Zhiguang; Kwekel, Joshua C; Chen, Tao
2012-01-01
Functional comparison across microarray platforms is used to assess the comparability or similarity of the biological relevance associated with the gene expression data generated by multiple microarray platforms. Comparisons at the functional level are very important considering that the ultimate purpose of microarray technology is to determine the biological meaning behind the gene expression changes under a specific condition, not just to generate a list of genes. Herein, we present a method named percentage of overlapping functions (POF) and illustrate how it is used to perform the functional comparison of microarray data generated across multiple platforms. This method facilitates the determination of functional differences or similarities in microarray data generated from multiple array platforms across all the functions that are presented on these platforms. This method can also be used to compare the functional differences or similarities between experiments, projects, or laboratories.
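The abstract does not spell out how the percentage of overlapping functions (POF) is normalized; as an illustrative assumption, a Jaccard-style definition over the function sets represented on two platforms can be sketched as:

```python
def percentage_overlapping_functions(funcs_a, funcs_b):
    """Percentage of overlapping functions between the function sets
    represented on two microarray platforms: intersection size over
    union size, expressed as a percentage. (The published method may
    normalize differently; this definition is an assumption.)"""
    a, b = set(funcs_a), set(funcs_b)
    if not a | b:
        return 0.0
    return 100.0 * len(a & b) / len(a | b)
```

The same function applies unchanged to comparing experiments, projects, or laboratories, since each is just a set of represented functions.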
Emerging Use of Gene Expression Microarrays in Plant Physiology
Wullschleger, Stan D.; Difazio, Stephen P.
2003-01-01
Microarrays have become an important technology for the global analysis of gene expression in humans, animals, plants, and microbes. Implemented in the context of a well-designed experiment, cDNA and oligonucleotide arrays can provide high-throughput, simultaneous analysis of transcript abundance for hundreds, if not thousands, of genes. However, despite widespread acceptance, the use of microarrays as a tool to better understand processes of interest to the plant physiologist is still being explored. To help illustrate current uses of microarrays in the plant sciences, several case studies that we believe demonstrate the emerging application of gene expression arrays in plant physiology were selected from among the many posters and presentations at the 2003 Plant and Animal Genome XI Conference. Based on this survey, microarrays are being used to assess gene expression in plants exposed to the experimental manipulation of air temperature, soil water content and aluminium concentration in the root zone. Analysis often includes characterizing transcript profiles for multiple post-treatment sampling periods and categorizing genes with common patterns of response using hierarchical clustering techniques. In addition, microarrays are also providing insights into developmental changes in gene expression associated with fibre and root elongation in cotton and maize, respectively. Technical and analytical limitations of microarrays are discussed and projects attempting to advance areas of microarray design and data analysis are highlighted. Finally, although much work remains, we conclude that microarrays are a valuable tool for the plant physiologist interested in the characterization and identification of individual genes and gene families with potential application in the fields of agriculture, horticulture and forestry.
Profiling In Situ Microbial Community Structure with an Amplification Microarray
Knickerbocker, Christopher; Bryant, Lexi; Golova, Julia; Wiles, Cory; Williams, Kenneth H.; Peacock, Aaron D.; Long, Philip E.
2013-01-01
The objectives of this study were to unify amplification, labeling, and microarray hybridization chemistries within a single, closed microfluidic chamber (an amplification microarray) and verify technology performance on a series of groundwater samples from an in situ field experiment designed to compare U(VI) mobility under conditions of various alkalinities (as HCO3−) during stimulated microbial activity accompanying acetate amendment. Analytical limits of detection were between 2 and 200 cell equivalents of purified DNA. Amplification microarray signatures were well correlated with 16S rRNA-targeted quantitative PCR results and hybridization microarray signatures. The succession of the microbial community was evident with and consistent between the two microarray platforms. Amplification microarray analysis of acetate-treated groundwater showed elevated levels of iron-reducing bacteria (Flexibacter, Geobacter, Rhodoferax, and Shewanella) relative to the average background profile, as expected. Identical molecular signatures were evident in the transect treated with acetate plus NaHCO3, but at much lower signal intensities and with a much more rapid decline (to nondetection). Azoarcus, Thauera, and Methylobacterium were responsive in the acetate-only transect but not in the presence of bicarbonate. Observed differences in microbial community composition or response to bicarbonate amendment likely had an effect on measured rates of U reduction, with higher rates probable in the part of the field experiment that was amended with bicarbonate. The simplification in microarray-based work flow is a significant technological advance toward entirely closed-amplicon microarray-based tests and is generally extensible to any number of environmental monitoring applications. PMID:23160129
Arora, Sheen Juneja; Arora, Aman; Upadhyaya, Viram; Jain, Shilpi
2016-01-01
As the longevity of provisional restorations depends on a precise adaptation and a strong, long-term bond between the restoration and the tooth structure, evaluation of the marginal leakage of provisional restorative materials luted with cements using standardized procedures is essential. The aims were: to compare the marginal leakage of provisional crowns fabricated from autopolymerizing acrylic resin and bisphenol A-glycidyl dimethacrylate (BIS-GMA) resin; to compare the marginal leakage of provisional crowns fabricated from autopolymerizing acrylic resin and BIS-GMA resin cemented with different temporary luting cements; to compare the marginal leakage of provisional crowns fabricated from autopolymerizing acrylic resin (SC-10) cemented with different temporary luting cements; and to compare the marginal leakage of provisional crowns fabricated from BIS-GMA resin (Protemp 4) cemented with different temporary luting cements. Sixty freshly extracted maxillary premolars of approximately similar dimensions were mounted in dental plaster. Tooth reduction with a shoulder margin was carried out using a customized handpiece-holding jig. Provisional crowns were prepared using wax patterns fabricated on a computer-aided design/computer-aided manufacturing milling machine following tooth preparation. Sixty provisional crowns were made, thirty each of SC-10 and Protemp 4, and were then cemented with three different luting cements. Specimens were thermocycled, submerged in a 2% methylene blue solution, then sectioned and observed under a stereomicroscope for the evaluation of marginal microleakage. A five-level scale was used to score dye penetration at the tooth/cement interface, and the results were analyzed using the Chi-square test, Mann-Whitney U-test and Kruskal-Wallis H-test; results were considered statistically significant at P < 0.05, with a study power of 80%.
Marginal leakage was significant for both provisional crown types cemented with the three different luting cements along the axial walls of teeth (P < 0.05; 95% confidence interval). The temporary cements with eugenol showed more microleakage than those without eugenol. SC-10 crowns showed more microleakage compared to Protemp 4 crowns. SC-10 crowns cemented with Kalzinol showed the maximum microleakage, and Protemp 4 crowns cemented with HY bond showed the least microleakage.
PRACTICAL STRATEGIES FOR PROCESSING AND ANALYZING SPOTTED OLIGONUCLEOTIDE MICROARRAY DATA
Thoughtful data analysis is as important as experimental design, biological sample quality, and appropriate experimental procedures for making microarrays a useful supplement to traditional toxicology. In the present study, spotted oligonucleotide microarrays were used to profile...
DNA Microarray-based Ecotoxicological Biomarker Discovery in a Small Fish Model Species
This paper addresses several issues critical to use of zebrafish oligonucleotide microarrays for computational toxicology research on endocrine disrupting chemicals using small fish models, and more generally, the use of microarrays in aquatic toxicology.
IMPROVING THE RELIABILITY OF MICROARRAYS FOR TOXICOLOGY RESEARCH: A COLLABORATIVE APPROACH
Microarray-based gene expression profiling is a critical tool to identify molecular biomarkers of specific chemical stressors. Although current microarray technologies have progressed from their infancy, biological and technical repeatability and reliability are often still limit...
Organochlorine pesticide residues in ground water of Thiruvallur district, India.
Jayashree, R; Vasudevan, N
2007-05-01
Modern agricultural practices reveal an increased use of pesticides and fertilizers to meet the food demand of a growing population, which results in contamination of the environment. In India, crop production has increased by 100% while the cropped area has increased only marginally, by 20%. Pesticides have played a major role in achieving maximum crop production, but heavy usage and accumulation of pesticide residues are highly detrimental to aquatic and other ecosystems. The present study assessed the level of organochlorine contamination in the ground water of Thiruvallur district, Tamil Nadu, India. The samples were highly contaminated with DDT, HCH, endosulfan and their derivatives. Among the HCH derivatives, gamma-HCH residues reached a maximum of 9.8 microg/l in Arumbakkam open wells. Concentrations of pp-DDT and op-DDT were 14.3 microg/l and 0.8 microg/l, respectively. The maximum residue (15.9 microg/l) of endosulfan sulfate was recorded in a Kandigai village bore well. The study showed that the ground water samples were highly contaminated with organochlorine residues.
Statistical use of argonaute expression and RISC assembly in microRNA target identification.
Stanhope, Stephen A; Sengupta, Srikumar; den Boon, Johan; Ahlquist, Paul; Newton, Michael A
2009-09-01
MicroRNAs (miRNAs) posttranscriptionally regulate targeted messenger RNAs (mRNAs) by inducing cleavage or otherwise repressing their translation. We address the problem of detecting m/miRNA targeting relationships in Homo sapiens from microarray data by developing statistical models that are motivated by the biological mechanisms used by miRNAs. The focus of our modeling is the construction, activity, and mediation of RNA-induced silencing complexes (RISCs) competent for targeted mRNA cleavage. We demonstrate that regression models accommodating RISC abundance and controlling for other mediating factors fit the expression profiles of known target pairs substantially better than models based on m/miRNA expressions alone, and lead to verifications of computational target pair predictions that are more sensitive than those based on marginal expression levels. Because our models are fully independent of exogenous results from sequence-based computational methods, they are appropriate for use as either a primary or secondary source of information regarding m/miRNA target pair relationships, especially in conjunction with high-throughput expression studies.
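The comparison made here is between regression models of target-mRNA expression that do or do not include RISC abundance as a covariate, judged by goodness of fit. A generic version of such a nested-model comparison via R² (the variable names `mirna` and `risc` below are hypothetical placeholders, not the study's data):

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept;
    X may be a vector (one covariate) or a matrix (several)."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
```

On the same outcome, a model accommodating RISC abundance fits better than one based on miRNA expression alone exactly when its R² is higher.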
Integrative prescreening in analysis of multiple cancer genomic studies
2012-01-01
Background In high throughput cancer genomic studies, results from the analysis of single datasets often suffer from a lack of reproducibility because of small sample sizes. Integrative analysis can effectively pool and analyze multiple datasets and provides a cost effective way to improve reproducibility. In integrative analysis, simultaneously analyzing all genes profiled may incur high computational cost. A computationally affordable remedy is prescreening, which fits marginal models, can be conducted in a parallel manner, and has low computational cost. Results An integrative prescreening approach is developed for the analysis of multiple cancer genomic datasets. Simulation shows that the proposed integrative prescreening has better performance than alternatives, particularly including prescreening with individual datasets, an intensity approach and meta-analysis. We also analyze multiple microarray gene profiling studies on liver and pancreatic cancers using the proposed approach. Conclusions The proposed integrative prescreening provides an effective way to reduce the dimensionality in cancer genomic studies. It can be coupled with existing analysis methods to identify cancer markers. PMID:22799431
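Prescreening with marginal models can be run gene by gene and pooled across studies. The sketch below scores each gene with a two-sample t-like statistic per dataset and sums the squared statistics across datasets; the combining rule is our illustrative assumption, not the paper's exact specification:

```python
import numpy as np

def integrative_prescreen(datasets, labels, keep=100):
    """Marginal prescreening across multiple studies: score each gene
    with a two-sample t-like statistic in every dataset, pool the
    squared scores, and keep the top-ranked genes. (Illustrative
    sketch; the published method defines its own marginal models.)"""
    n_genes = datasets[0].shape[0]
    total = np.zeros(n_genes)
    for X, y in zip(datasets, labels):
        y = np.asarray(y, dtype=bool)
        m1, m0 = X[:, y].mean(axis=1), X[:, ~y].mean(axis=1)
        s1 = X[:, y].var(axis=1, ddof=1) / y.sum()
        s0 = X[:, ~y].var(axis=1, ddof=1) / (~y).sum()
        t = (m1 - m0) / np.sqrt(s1 + s0 + 1e-12)
        total += t ** 2  # pool evidence across studies
    return np.argsort(-total)[:keep]
```

Because each gene is scored independently, the loop over genes (vectorized here) is trivially parallelizable across datasets or gene blocks, which is the low-computational-cost property the abstract emphasizes.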
Direct labeling of serum proteins by fluorescent dye for antibody microarray.
Klimushina, M V; Gumanova, N G; Metelskaya, V A
2017-05-06
Analysis of the serum proteome by antibody microarray is used to identify novel biomarkers and to study signaling pathways, including protein phosphorylation and protein-protein interactions. Labeling of serum proteins is important for optimal performance of the antibody microarray. Proper choice of fluorescent label and an optimal concentration of protein loaded on the microarray ensure high-quality images that can be reliably scanned and processed by the software. We have optimized direct serum protein labeling using the fluorescent dye Arrayit Green 540 (Arrayit Corporation, USA) for antibody microarray. The optimized procedure produces high-quality images that can be readily scanned and used for statistical analysis of the protein composition of the serum. Copyright © 2017 Elsevier Inc. All rights reserved.
Transfection microarray and the applications.
Miyake, Masato; Yoshikawa, Tomohiro; Fujita, Satoshi; Miyake, Jun
2009-05-01
Microarray transfection has been extensively studied for high-throughput functional analysis of mammalian cells. However, control of efficiency and reproducibility are critical issues for practical use. By using solid-phase transfection accelerators and a nano-scaffold, we provide a highly efficient and reproducible microarray-transfection device, the "transfection microarray". The device could be applied to the limited numbers of available primary cells and stem cells, not only for large-scale functional analysis but also for reporter-based time-lapse cellular event analysis.
Zivicova, Veronika; Gal, Peter; Mifkova, Alzbeta; Novak, Stepan; Kaltner, Herbert; Kolar, Michal; Strnad, Hynek; Sachova, Jana; Hradilova, Miluse; Chovanec, Martin; Gabius, Hans-Joachim; Smetana, Karel; Fik, Zdenek
2018-03-01
Having previously initiated genome-wide expression profiling in head and neck squamous cell carcinoma (HNSCC) for regions of the tumor, the margin of surgical resection (MSR) and normal mucosa (NM), we here proceed with respective analysis of cases after stratification according to the expression status of tenascin (Ten). Tissue specimens of each anatomical site were analyzed by immunofluorescent detection of Ten, fibronectin (Fn) and galectin-1 (Gal-1) as well as by microarrays. Histopathological examination demonstrated that Ten + Fn + Gal-1 + co-expression occurs more frequently in samples of HNSCC (55%) than in NM (9%; p<0.01). In contrast, the Ten - Fn + Gal-1 - (45%) and Ten - Fn - Gal-1 - (39%) statuses occurred in NM with significantly (p<0.01) higher frequency than in HNSCC (3% and 4%, respectively). In MSRs, the different immunophenotypes were distributed rather equally (Ten + Fn + Gal-1 + =24%; Ten - Fn + Gal-1 - =36%; Ten - Fn - Gal-1 - =33%), differing from the results in tumors (p<0.05). Absence/presence of Ten was used for stratification of patients into cohorts without a difference in prognosis, to comparatively examine gene-activity signatures. Microarray analysis revealed i) expression of several tumor progression-associated genes in Ten + HNSCC tumors and ii) a strong up-regulation of gene expression assigned to lipid metabolism in MSRs of Ten - tumors, while NM profiles remained similar. The presented data reveal marked and specific changes in tumors and MSR specimens of HNSCC without a separation based on prognosis. Copyright © 2018, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Briggs, Brandon R; Graw, Michael; Brodie, Eoin L
2013-11-01
The biogeochemical processes that occur in marine sediments on continental margins are complex; however, from one perspective they can be considered with respect to three geochemical zones based on the presence and form of methane: sulfate–methane transition (SMTZ), gas hydrate stability zone (GHSZ), and free gas zone (FGZ). These geochemical zones may harbor distinct microbial communities that are important in biogeochemical carbon cycles. The objective of this study was to describe the microbial communities in sediments from the SMTZ, GHSZ, and FGZ using molecular ecology methods (i.e. PhyloChip microarray analysis and terminal restriction fragment length polymorphism (T-RFLP)) and examining the results in the context of non-biological parameters in the sediments. Non-metric multidimensional scaling and multi-response permutation procedures were used to determine whether microbial community compositions were significantly different in the three geochemical zones and to correlate samples with abiotic characteristics of the sediments. This analysis indicated that microbial communities from all three zones were distinct from one another and that variables such as sulfate concentration, hydrate saturation of the nearest gas hydrate layer, and depth (or unmeasured variables associated with depth e.g. temperature, pressure) were correlated to differences between the three zones. The archaeal anaerobic methanotrophs typically attributed to performing anaerobic oxidation of methane were not detected in the SMTZ; however, the marine benthic group-B, which is often found in SMTZ, was detected. Within the GHSZ, samples that were typically closer to layers that contained higher hydrate saturation had indicator sequences related to Vibrio-type taxa. These results suggest that the biogeographic patterns of microbial communities in marine sediments are distinct based on geochemical zones defined by methane.
Cho, Sung Yoon; Ki, Chang-Seok; Jang, Ja-Hyun; Sohn, Young Bae; Park, Sung Won; Kim, Se Hwa; Kim, Su Jin; Jin, Dong-Kyu
2012-06-01
Patients with Xp deletions have short stature and may have some somatic traits typical of Turner syndrome (TS), whereas gonadal function is generally preserved. In most studies of these patients, microsatellites have been used to determine the break point of the Xp deletion. In the present study, we describe the clinical, cytogenetic, and chromosomal microarray (CMA) analysis of a family with an Xp22.33-Xp22.12 deletion. Two female siblings, aged 8 years 9 months and 11 years 10 months, presented with short stature. The older sibling (index case) had a height of 137.9 cm (-1.81 SDS) and the younger sibling a height of 118.6 cm (-2.13 SDS). The mother and both daughters had only short stature; a skeletal survey showed normal findings except for mildly shortened 4th and 5th metacarpal bones. No features of TS were present. The deletion appeared terminal, with a breakpoint within Xp22.2 located about 19.9 Mb from the Xp telomere. The deletion contained 102 protein-coding genes. A probe at the end breakage point was located at the 19,908,986th base of the X chromosome, and a probe in the marginal normal region near the breakage point was located at the 19,910,848th base of the X chromosome. Therefore, the breakage point was concluded to be located between these two probes. In summary, we report a familial case of an Xp deletion. The findings of our study may be helpful in further analyzing the phenotypes associated with Xp deletions. Copyright © 2012 Wiley Periodicals, Inc.
A Human Lectin Microarray for Sperm Surface Glycosylation Analysis *
Sun, Yangyang; Cheng, Li; Gu, Yihua; Xin, Aijie; Wu, Bin; Zhou, Shumin; Guo, Shujuan; Liu, Yin; Diao, Hua; Shi, Huijuan; Wang, Guangyu; Tao, Sheng-ce
2016-01-01
Glycosylation is one of the most abundant and functionally important protein post-translational modifications. As such, technology for efficient glycosylation analysis is in high demand. Lectin microarrays are a powerful tool for such investigations and have been successfully applied for a variety of glycobiological studies. However, most of the current lectin microarrays are primarily constructed from plant lectins, which are not well suited for studies of human glycosylation because of the extreme complexity of human glycans. Herein, we constructed a human lectin microarray with 60 human lectin and lectin-like proteins. All of the lectins and lectin-like proteins were purified from yeast, and most showed binding to human glycans. To demonstrate the applicability of the human lectin microarray, human sperm were probed on the microarray and strong bindings were observed for several lectins, including galectin-1, 7, 8, GalNAc-T6, and ERGIC-53 (LMAN1). These bindings were validated by flow cytometry and fluorescence immunostaining. Further, mass spectrometry analysis showed that galectin-1 binds several membrane-associated proteins including heat shock protein 90. Finally, functional assays showed that binding of galectin-8 could significantly enhance the acrosome reaction in human sperm. To our knowledge, this is the first construction of a human lectin microarray, and we anticipate it will find wide use for a range of human or mammalian studies, alone or in combination with plant lectin microarrays. PMID:27364157
Barton, G; Abbott, J; Chiba, N; Huang, DW; Huang, Y; Krznaric, M; Mack-Smith, J; Saleem, A; Sherman, BT; Tiwari, B; Tomlinson, C; Aitman, T; Darlington, J; Game, L; Sternberg, MJE; Butcher, SA
2008-01-01
Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. 
Conclusion EMAAS enables users to track and perform microarray data management and analysis tasks through a single easy-to-use web application. The system architecture is flexible and scalable to allow new array types, analysis algorithms and tools to be added with relative ease and to cope with large increases in data volume. PMID:19032776
Biofuels, land, and water: a systems approach to sustainability.
Gopalakrishnan, Gayathri; Negri, M Cristina; Wang, Michael; Wu, May; Snyder, Seth W; Lafreniere, Lorraine
2009-08-01
There is a strong societal need to evaluate and understand the sustainability of biofuels, especially because of the significant increases in production mandated by many countries, including the United States. Sustainability will be a strong factor in the regulatory environment and investments in biofuels. Biomass feedstock production is an important contributor to environmental, social, and economic impacts from biofuels. This study presents a systems approach where the agricultural, energy, and environmental sectors are considered as components of a single system, and environmental liabilities are used as recoverable resources for biomass feedstock production. We focus on efficient use of land and water resources. We conducted a spatial analysis evaluating marginal land and degraded water resources to improve feedstock productivity with concomitant environmental restoration for the state of Nebraska. Results indicate that utilizing marginal land resources such as riparian and roadway buffer strips, brownfield sites, and marginal agricultural land could produce enough feedstocks to meet a maximum of 22% of the energy requirements of the state compared to the current supply of 2%. Degraded water resources such as nitrate-contaminated groundwater and wastewater were evaluated as sources of nutrients and water to improve feedstock productivity. Spatial overlap between degraded water and marginal land resources was found to be as high as 96% and could maintain sustainable feedstock production on marginal lands. Other benefits of implementing this strategy include feedstock intensification to decrease biomass transportation costs, restoration of contaminated water resources, and mitigation of greenhouse gas emissions.
Schwager, Silke S; Leiter, Ulrike; Buettner, Petra G; Voit, Christiane; Marsch, Wolfgang; Gutzmer, Ralf; Näher, Helmut; Gollnick, Harald; Bröcker, Eva Bettina; Garbe, Claus
2008-04-01
This study analysed the changes of excision margins in correlation with tumour thickness as recorded over the last three decades in Germany. The study also evaluated surgical management in different geographical regions and treatment options for metastasized melanoma. A total of 42 625 patients with invasive primary cutaneous melanoma, recorded by the German Central Malignant Melanoma Registry between 1976 and 2005 were included. Multiple linear regression analysis was used to investigate time trends of excision margins adjusted for tumour thickness. Excision margins of 5.0 cm were widely used in the late 1970s but since then have been replaced by smaller margins that are dependent on tumour thickness. In the case of primary melanoma, one-step surgery dominated until 1985 and was mostly replaced by two-step excisions since the early 1990s. In eastern Germany, one-step management remained common until the late 1990s. During the last three decades loco-regional metastases were predominantly treated by surgery (up to 80%), whereas systemic therapy decreased. The primary treatment of distant metastases has consistently been systemic chemotherapy. This descriptive retrospective study revealed a significant decrease in excision margins to a maximum of 2.00 cm. A significant trend towards two-step excisions in primary cutaneous melanoma was observed throughout Germany. Management of metastasized melanoma showed a tendency towards surgical procedures in limited disease and an ongoing trend to systemic treatment in advanced disease.
Wang, Wei; Griswold, Michael E
2016-11-30
The random effect Tobit model is a regression model that accommodates both left- and/or right-censoring and within-cluster dependence of the outcome variable. Regression coefficients of random effect Tobit models have conditional interpretations on a constructed latent dependent variable and do not provide inference of overall exposure effects on the original outcome scale. The marginalized random effects model (MREM) permits likelihood-based estimation of marginal mean parameters for clustered data. For random effect Tobit models, we extend the MREM to marginalize over both the random effects and the normal space and boundary components of the censored response to estimate overall exposure effects at the population level. We also extend the 'Average Predicted Value' method to estimate the model-predicted marginal means for each person under different exposure status in a designated reference group by integrating over the random effects, and then use the calculated difference to assess the overall exposure effect. Maximum likelihood estimation is proposed utilizing a quasi-Newton optimization algorithm with Gauss-Hermite quadrature to approximate the integration over the random effects. We use these methods to carefully analyze two real datasets. Copyright © 2016 John Wiley & Sons, Ltd.
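Gauss-Hermite quadrature, used above to integrate out the random effects, approximates expectations under a normal density with a weighted sum over fixed nodes. A self-contained sketch for a scalar random effect b ~ N(0, sigma²):

```python
import numpy as np

def gauss_hermite_normal_mean(f, sigma, n=20):
    """Approximate E[f(b)] for b ~ N(0, sigma^2) with n-point
    Gauss-Hermite quadrature, the same device used to integrate
    random effects out of a marginalized likelihood."""
    nodes, weights = np.polynomial.hermite.hermgauss(n)
    # Change of variables: b = sqrt(2) * sigma * x absorbs the
    # Gaussian weight exp(-x^2) built into the quadrature rule.
    return np.sum(weights * f(np.sqrt(2.0) * sigma * nodes)) / np.sqrt(np.pi)
```

With n = 20 nodes the rule is exact for polynomial integrands up to degree 39, so for example E[b²] = sigma² is recovered exactly.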
THE MAQC PROJECT: ESTABLISHING QC METRICS AND THRESHOLDS FOR MICROARRAY QUALITY CONTROL
Microarrays represent a core technology in pharmacogenomics and toxicogenomics; however, before this technology can successfully and reliably be applied in clinical practice and regulatory decision-making, standards and quality measures need to be developed. The Microarray Qualit...
Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D
2013-01-01
Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, the Significance Analysis of Microarrays method's fold change criteria are problematic and can critically alter the conclusions of a study as a result of compositional changes of the control data set in the analysis. We propose a novel approach, combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but is also insensitive to the fold change threshold, since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control between the approaches are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed.
The Resampling-based empirical Bayes Methods also offer higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large, for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next-generation sequencing RNA-seq data analysis.
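For readers unfamiliar with false discovery rate control, the standard Benjamini-Hochberg step-up procedure (a baseline FDR method, not the resampling-based empirical Bayes method of this abstract) can be sketched as:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up: reject the k smallest p-values, where k is
    the largest i such that p_(i) <= alpha * i / m."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, m + 1) / m
    passed = p[order] <= thresholds
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    rejected = np.zeros(m, dtype=bool)
    rejected[order[:k]] = True  # step-up: reject everything up to the last pass
    return rejected
```

For p-values (0.01, 0.02, 0.03, 0.5) at alpha = 0.05, the first three hypotheses are rejected.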
Development of a DNA microarray for species identification of quarantine aphids.
Lee, Won Sun; Choi, Hwalran; Kang, Jinseok; Kim, Ji-Hoon; Lee, Si Hyeock; Lee, Seunghwan; Hwang, Seung Yong
2013-12-01
Aphid pests are being brought into Korea as a result of increased crop trading. Aphids live on the growing parts of plants, and thus plant growth is seriously affected by aphid pests. However, aphids are very small and have several sexual morphs and life stages, so it is difficult to identify species on the basis of morphological features. This problem was approached using DNA microarray technology. DNA targets of the cytochrome c oxidase subunit I gene were generated with a fluorescent dye-labelled primer and were hybridised onto a DNA microarray consisting of specific probes. After analysing the signal intensity of the specific probes, the unique patterns from the DNA microarray, consisting of 47 species-specific probes, were obtained to identify 23 aphid species. To confirm the accuracy of the developed DNA microarray, ten individual blind samples were used in blind trials, and the identifications were completely consistent with the sequencing data of all individual blind samples. A microarray has been developed to distinguish aphid species. DNA microarray technology provides a rapid, easy, cost-effective and accurate method for identifying aphid species for pest control management. © 2013 Society of Chemical Industry.
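Species identification from probe-level hybridization patterns can be sketched as a nearest-pattern match; the binarization threshold, probe layout, and Hamming-distance rule below are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def identify_species(signal, reference_patterns, threshold=0.5):
    """Call the species whose presence/absence probe pattern best matches the
    binarized probe signals (minimum Hamming distance)."""
    calls = (np.asarray(signal, dtype=float) >= threshold).astype(int)
    best_species, best_dist = None, None
    for species, pattern in reference_patterns.items():
        d = int(np.sum(calls != np.asarray(pattern)))  # Hamming distance
        if best_dist is None or d < best_dist:
            best_species, best_dist = species, d
    return best_species, best_dist
```

A perfect match returns distance 0; a nonzero distance flags an ambiguous or novel pattern.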
Evaluating concentration estimation errors in ELISA microarray experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daly, Don S.; White, Amanda M.; Varnum, Susan M.
Enzyme-linked immunosorbent assay (ELISA) is a standard immunoassay to predict a protein concentration in a sample. Deploying ELISA in a microarray format permits simultaneous prediction of the concentrations of numerous proteins in a small sample. These predictions, however, are uncertain due to processing error and biological variability. Evaluating prediction error is critical to interpreting biological significance and improving the ELISA microarray process. Evaluating prediction error must be automated to realize a reliable high-throughput ELISA microarray system. Methods: In this paper, we present a statistical method based on propagation of error to evaluate prediction errors in the ELISA microarray process. Although propagation of error is central to this method, it is effective only when comparable data are available. Therefore, we briefly discuss the roles of experimental design, data screening, normalization and statistical diagnostics when evaluating ELISA microarray prediction errors. We use an ELISA microarray investigation of breast cancer biomarkers to illustrate the evaluation of prediction errors. The illustration begins with a description of the design and resulting data, followed by a brief discussion of data screening and normalization. In our illustration, we fit a standard curve to the screened and normalized data, review the modeling diagnostics, and apply propagation of error.
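For a linear standard curve y = a + b·x (real ELISA curves are typically nonlinear, e.g. four-parameter logistic, so this is a simplified illustration), first-order propagation of error for the predicted concentration x = (y − a)/b looks like:

```python
def propagate_error_linear(y, var_y, a, var_a, b, var_b):
    """Delta-method variance of x = (y - a)/b, assuming y, a, b independent.
    Var(x) ~ (dx/dy)^2 Var(y) + (dx/da)^2 Var(a) + (dx/db)^2 Var(b)."""
    x = (y - a) / b
    dx_dy = 1.0 / b
    dx_da = -1.0 / b
    dx_db = -(y - a) / b ** 2
    var_x = dx_dy ** 2 * var_y + dx_da ** 2 * var_a + dx_db ** 2 * var_b
    return x, var_x
```

For y = 10 with Var(y) = 0.04, a = 0 with Var(a) = 0.01, and an exactly known slope b = 2, the predicted concentration is 5 with variance 0.0125.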
NASA Technical Reports Server (NTRS)
Harris, C. E.; Jelalian, A. V.
1979-01-01
Analyses of the mounting and mount support systems of the clear air turbulence transmitters verify that satisfactory shock and vibration isolation are attained. The mount support structure conforms to flight crash safety requirements with high margins of safety. Restraint cables reinforce the mounts in the critical loaded forward direction limiting maximum forward system deflection to 1 1/4 inches.
A re-appraisal of the stratigraphy and volcanology of the Cerro Galán volcanic system, NW Argentina
Folkes, Christopher B.; Wright, Heather M.; Cas, Ray A.F.; de Silva, Shanaka L.; Lesti, Chiara; Viramonte, Jose G.
2011-01-01
From detailed fieldwork and biotite 40Ar/39Ar dating correlated with paleomagnetic analyses of lithic clasts, we present a revision of the stratigraphy, areal extent and volume estimates of ignimbrites in the Cerro Galán volcanic complex. We find evidence for nine distinct outflow ignimbrites, including two newly identified ignimbrites in the Toconquis Group (the Pitas and Vega Ignimbrites). Toconquis Group Ignimbrites (~5.60–4.51 Ma biotite ages) have been discovered to the southwest and north of the caldera, increasing their spatial extents from previous estimates. We distinguish the Real Grande Ignimbrite (4.68 ± 0.07 Ma biotite age) from the Cueva Negra Ignimbrite (3.77 ± 0.08 Ma biotite age), which were previously thought to be contemporaneous. The form and collapse processes of the Cerro Galán caldera are also reassessed. Based on re-interpretation of the margins of the caldera, we find evidence for a fault-bounded trapdoor collapse hinged along a regional N-S fault on the eastern side of the caldera and accommodated on a N-S fault on the western caldera margin. The collapsed area defines a roughly isosceles trapezoid shape elongated E-W and with maximum dimensions 27 × 16 km. The Cerro Galán Ignimbrite (CGI; 2.08 ± 0.02 Ma sanidine age) outflow sheet extends to 40 km in all directions from the inferred structural margins, with a maximum runout distance of ~80 km to the north of the caldera. New deposit volume estimates confirm an increase in eruptive volume through time, wherein the Toconquis Group Ignimbrites increase in volume from the ~10 km3 Lower Merihuaca Ignimbrite to a maximum of ~390 km3 (Dense Rock Equivalent; DRE) with the Real Grande Ignimbrite. The climactic CGI has a revised volume of ~630 km3 (DRE), approximately two thirds of the commonly quoted value.
Essays in the California electricity reserves markets
NASA Astrophysics Data System (ADS)
Metaxoglou, Konstantinos
This dissertation examines inefficiencies in the California electricity reserves markets. In Chapter 1, I use the information released during the investigation of the state's electricity crisis of 2000 and 2001 by the Federal Energy Regulatory Commission to diagnose allocative inefficiencies. Building upon the work of Wolak (2000), I calculate a lower bound for the sellers' price-cost margins using the inverse elasticities of their residual demand curves. The downward bias in my estimates stems from the fact that I don't account for the hierarchical substitutability of the reserve types. The margins averaged at least 20 percent for the two highest quality types of reserves, regulation and spinning, generating millions of dollars in transfers to a handful of sellers. I provide evidence that the deviations from marginal cost pricing were due to the markets' high concentration and a principal-agent relationship that emerged from their design. In Chapter 2, I document systematic differences between the markets' day- and hour-ahead prices. I use a high-dimensional vector moving average model to estimate the premia and conduct correct inferences. To obtain exact maximum likelihood estimates of the model, I employ the EM algorithm that I develop in Chapter 3. I uncover significant day-ahead premia, which I attribute to market design characteristics too. On the demand side, the market design established a principal-agent relationship between the markets' buyers (principal) and their supervisory authority (agent). The agent had very limited incentives to shift reserve purchases to the lower priced hour-ahead markets. On the supply side, the market design raised substantial entry barriers by precluding purely speculative trading and by introducing a complicated code of conduct that induced uncertainty about which actions were subject to regulatory scrutiny. 
In Chapter 3, I introduce a state-space representation for vector autoregressive moving average models that enables exact maximum likelihood estimation using the EM algorithm. Moreover, my algorithm uses only analytical expressions; it requires the Kalman filter and a fixed-interval smoother in the E step and least squares-type regression in the M step. In contrast, existing maximum likelihood estimation methods require numerical differentiation, both for univariate and multivariate models.
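The E step described above relies on the Kalman filter. As a minimal illustration (an AR(1)-plus-noise model, far simpler than the VARMA case in the chapter), the exact Gaussian log-likelihood follows from the filter's prediction-error decomposition:

```python
import math

def kalman_loglik(y, phi, q, r):
    """Exact log-likelihood of x_t = phi*x_{t-1} + w_t, w~N(0,q);
    y_t = x_t + v_t, v~N(0,r), via the Kalman filter (|phi| < 1 assumed)."""
    # Stationary prior for the initial state
    x_pred, P_pred = 0.0, q / (1.0 - phi ** 2)
    ll = 0.0
    for obs in y:
        # Innovation and its variance
        S = P_pred + r
        e = obs - x_pred
        ll += -0.5 * (math.log(2.0 * math.pi * S) + e * e / S)
        # Measurement update
        K = P_pred / S
        x_filt = x_pred + K * e
        P_filt = (1.0 - K) * P_pred
        # Time update (one-step-ahead prediction)
        x_pred = phi * x_filt
        P_pred = phi ** 2 * P_filt + q
    return ll
```

For a single observation the result reduces to a plain Gaussian density with variance q/(1 − phi²) + r, which gives a quick correctness check.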
Denny, M W; Dowd, W W
2012-03-15
As the air temperature of the Earth rises, ecological relationships within a community might shift, in part due to differences in the thermal physiology of species. Prediction of these shifts - an urgent task for ecologists - will be complicated if thermal tolerance itself can rapidly evolve. Here, we employ a mechanistic approach to predict the potential for rapid evolution of thermal tolerance in the intertidal limpet Lottia gigantea. Using biophysical principles to predict body temperature as a function of the state of the environment, and an environmental bootstrap procedure to predict how the environment fluctuates through time, we create hypothetical time-series of limpet body temperatures, which are in turn used as a test platform for a mechanistic evolutionary model of thermal tolerance. Our simulations suggest that environmentally driven stochastic variation of L. gigantea body temperature results in rapid evolution of a substantial 'safety margin': the average lethal limit is 5-7°C above the average annual maximum temperature. This predicted safety margin approximately matches that found in nature, and once established is sufficient, in our simulations, to allow some limpet populations to survive a drastic, century-long increase in air temperature. By contrast, in the absence of environmental stochasticity, the safety margin is dramatically reduced. We suggest that the risk of exceeding the safety margin, rather than the absolute value of the safety margin, plays an underappreciated role in the evolution of thermal tolerance. Our predictions are based on a simple, hypothetical, allelic model that connects genetics to thermal physiology. To move beyond this simple model - and thereby potentially to predict differential evolution among populations and among species - will require significant advances in our ability to translate the details of thermal histories into physiological and population-genetic consequences.
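A drastically simplified caricature of the environmental-bootstrap idea above: resample observed daily maxima into hypothetical years, then measure the safety margin as the gap between a lethal limit and the mean annual maximum. The 365-day years, day-to-day independence, and margin definition are illustrative assumptions, not the authors' model.

```python
import numpy as np

def environmental_bootstrap(daily_max_temps, n_years, rng):
    """Resample observed daily maxima into n_years hypothetical 365-day years
    and return the mean annual maximum temperature."""
    sample = rng.choice(np.asarray(daily_max_temps, dtype=float),
                        size=(n_years, 365), replace=True)
    return sample.max(axis=1).mean()

def safety_margin(lethal_limit, mean_annual_max):
    """Safety margin: lethal limit minus the average annual maximum."""
    return lethal_limit - mean_annual_max
```

A 5-7°C margin, as predicted in the abstract, would correspond to a lethal limit that far above the bootstrap's mean annual maximum.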
NASA Astrophysics Data System (ADS)
Maestro, A.; Jané, G.; Llave, E.; López-Martínez, J.; Bohoyo, F.; Druet, M.
2018-06-01
The identification of recent major tectonic structures in the Galicia continental margin and adjacent abyssal plains was carried out by means of a quantitative analysis of the linear structures having bathymetric expression on the seabed. About 5800 lineaments were identified throughout the study area of approximately 271,500 km2. Most lineaments are located in the Charcot and Coruña highs, in the western sector of the Galicia Bank, in the area of the Marginal Platforms and in the northern sector of the margin. Analysis of the lineament orientations shows a predominant NE-SW direction and three relative maximum directions: NW-SE, E-W and N-S. The total length of the lineaments identified is over 44,000 km, with a mode around 5000 m and an average length of about 7800 m. In light of different tectonic studies undertaken on the northwestern margin of the Iberian Peninsula, we establish that the lineaments obtained from analysis of the digital bathymetric model of the Galicia continental margin and adjacent abyssal plains correspond to fracture systems. In general, the orientation of lineaments corresponds to main faults, tectonic structures following the directions of ancient faults that resulted from late stages of the Variscan orogeny and Mesozoic extension phases related to Triassic rifting and Upper Jurassic to Early Cretaceous opening of the North Atlantic Ocean. The N-S convergence between the Eurasian and African plates from Palaeogene times until the Miocene, and NW-SE convergence from the Neogene to the present, reactivated the Variscan and Mesozoic fault systems and related physiography.
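Orientation analyses of this kind typically bin axial data, where a lineament trending θ is the same as one trending θ + 180°. A minimal sketch of such a histogram (bin width and folding convention are assumptions):

```python
import numpy as np

def orientation_histogram(azimuths_deg, bin_width=10):
    """Histogram of lineament azimuths folded into [0, 180) degrees;
    returns bin counts and the lower edge of the modal bin."""
    a = np.mod(np.asarray(azimuths_deg, dtype=float), 180.0)
    bins = np.arange(0.0, 180.0 + bin_width, bin_width)
    counts, edges = np.histogram(a, bins=bins)
    modal_edge = edges[np.argmax(counts)]
    return counts, modal_edge
```

Applied to a set dominated by ~45° trends, the modal bin comes out at the 40-50° (NE-SW) class, mirroring the predominant NE-SW direction reported above.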
Inter- and Intrafraction Uncertainty in Prostate Bed Image-Guided Radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Kitty; Palma, David A.; Department of Oncology, University of Western Ontario, London
2012-10-01
Purpose: The goals of this study were to measure inter- and intrafraction setup error and prostate bed motion (PBM) in patients undergoing post-prostatectomy image-guided radiotherapy (IGRT) and to propose appropriate population-based three-dimensional clinical target volume to planning target volume (CTV-PTV) margins in both non-IGRT and IGRT scenarios. Methods and Materials: In this prospective study, 14 patients underwent adjuvant or salvage radiotherapy to the prostate bed under image guidance using linac-based kilovoltage cone-beam CT (kV-CBCT). Inter- and intrafraction uncertainty/motion was assessed by offline analysis of three consecutive daily kV-CBCT images of each patient: (1) after initial setup to skin marks, (2) after correction for positional error/immediately before radiation treatment, and (3) immediately after treatment. Results: The magnitude of interfraction PBM was 2.1 mm, and intrafraction PBM was 0.4 mm. The maximum inter- and intrafraction prostate bed motion was primarily in the anterior-posterior direction. Margins of at least 3-5 mm with IGRT and 4-7 mm without IGRT (aligning to skin marks) will ensure 95% of the prescribed dose to the clinical target volume in 90% of patients. Conclusions: PBM is a predominant source of intrafraction error compared with setup error and has implications for appropriate PTV margins. Based on inter- and estimated intrafraction motion of the prostate bed using pre- and post-kV-CBCT images, CBCT IGRT to correct for day-to-day variances can potentially reduce CTV-PTV margins by 1-2 mm. CTV-PTV margins for prostate bed treatment in the IGRT and non-IGRT scenarios are proposed; however, in cases with more uncertainty of target delineation and image guidance accuracy, larger margins are recommended.
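The "95% of the prescribed dose to the CTV in 90% of patients" criterion matches the widely used van Herk margin recipe M = 2.5Σ + 0.7σ, where Σ and σ are the systematic and random error standard deviations in mm; whether this study used exactly that recipe is an assumption.

```python
def van_herk_margin(sigma_systematic, sigma_random):
    """Population CTV-PTV margin (mm) per the van Herk recipe:
    M = 2.5*Sigma + 0.7*sigma, targeting 95% dose coverage of the CTV
    in 90% of patients."""
    return 2.5 * sigma_systematic + 0.7 * sigma_random
```

For example, a 1 mm systematic and 2 mm random error gives a 3.9 mm margin, consistent in scale with the 3-5 mm IGRT margins proposed above.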
Assessment of Marginal Adaptation and Sealing Ability of Root Canal Sealers: An in vitro Study.
Remy, Vimal; Krishnan, Vineesh; Job, Tisson V; Ravisankar, Madhavankutty S; Raj, C V Renjith; John, Seena
2017-12-01
This study aims to compare the marginal adaptation and sealing ability of three root canal sealers: mineral trioxide aggregate (MTA)-Fillapex, AH Plus, and Endofill. The inclusion criteria comprised 45 single-rooted extracted mandibular premolar teeth with a single canal and complete root formation. The samples were sectioned at the cementoenamel junction using a low-speed diamond disc. Root canals were prepared manually using the step-back technique. The 45 teeth were distributed among the three experimental sealer groups (MTA-Fillapex, AH Plus, and Endofill). Under a scanning electron microscope (SEM), the marginal gap at the sealer-root dentin interface was examined at the coronal and apical halves of the root canal. Among the three sealers, the maximum marginal adaptation was seen with AH Plus (4.10 ± 0.10), followed by Endofill (1.44 ± 0.18) and MTA-Fillapex (0.80 ± 0.22). A statistically significant difference between coronal and apical marginal adaptation (p = 0.001) was seen for the AH Plus sealer. Mann-Whitney U-tests comparing MTA-Fillapex vs AH Plus and AH Plus vs Endofill found statistically significant differences (p < 0.05) between the groups at the coronal and apical thirds. The present study shows that the AH Plus sealer has better marginal adaptation than the other sealers tested. Sealers play an important role in sealing the space between the canal wall and the main cone in root canal treatment. They are also used to fill voids and irregularities in the root canal, secondary and lateral canals, and the space between applied gutta-percha cones, and they act as a lubricant during filling.
The Importance of Normalization on Large and Heterogeneous Microarray Datasets
DNA microarray technology is a powerful functional genomics tool increasingly used for investigating global gene expression in environmental studies. Microarrays can also be used in identifying biological networks, as they give insight on the complex gene-to-gene interactions, ne...
O-Charoen, Sirimon; Srivannavit, Onnop; Gulari, Erdogan
2008-01-01
Microfluidic microarrays have been developed for economical and rapid parallel synthesis of oligonucleotide and peptide libraries. For a synthesis system to be reproducible and uniform, it is crucial to have a uniform reagent delivery throughout the system. Computational fluid dynamics (CFD) is used to model and simulate the microfluidic microarrays to study geometrical effects on flow patterns. By proper design geometry, flow uniformity could be obtained in every microreactor in the microarrays. PMID:17480053
The application of DNA microarrays in gene expression analysis.
van Hal, N L; Vorst, O; van Houwelingen, A M; Kok, E J; Peijnenburg, A; Aharoni, A; van Tunen, A J; Keijer, J
2000-03-31
DNA microarray technology is a new and powerful technology that will substantially increase the speed of molecular biological research. This paper gives a survey of DNA microarray technology and its use in gene expression studies. The technical aspects and their potential improvements are discussed, comprising array manufacturing and design, array hybridisation, scanning, and data handling. Furthermore, it is discussed how DNA microarrays can be applied in the fields of safety, functionality and health of food, and of gene discovery and pathway engineering in plants.
Sandwich ELISA Microarrays: Generating Reliable and Reproducible Assays for High-Throughput Screens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonzalez, Rachel M.; Varnum, Susan M.; Zangar, Richard C.
The sandwich ELISA microarray is a powerful screening tool in biomarker discovery and validation due to its ability to simultaneously probe for multiple proteins in a miniaturized assay. The technical challenges of generating and processing the arrays are numerous. However, careful attention to possible pitfalls in the development of your antibody microarray assay can overcome these challenges. In this chapter, we describe in detail the steps that are involved in generating a reliable and reproducible sandwich ELISA microarray assay.
Comparison of RNA-seq and microarray-based models for clinical endpoint prediction.
Zhang, Wenqian; Yu, Ying; Hertwig, Falk; Thierry-Mieg, Jean; Zhang, Wenwei; Thierry-Mieg, Danielle; Wang, Jian; Furlanello, Cesare; Devanarayan, Viswanath; Cheng, Jie; Deng, Youping; Hero, Barbara; Hong, Huixiao; Jia, Meiwen; Li, Li; Lin, Simon M; Nikolsky, Yuri; Oberthuer, André; Qing, Tao; Su, Zhenqiang; Volland, Ruth; Wang, Charles; Wang, May D; Ai, Junmei; Albanese, Davide; Asgharzadeh, Shahab; Avigad, Smadar; Bao, Wenjun; Bessarabova, Marina; Brilliant, Murray H; Brors, Benedikt; Chierici, Marco; Chu, Tzu-Ming; Zhang, Jibin; Grundy, Richard G; He, Min Max; Hebbring, Scott; Kaufman, Howard L; Lababidi, Samir; Lancashire, Lee J; Li, Yan; Lu, Xin X; Luo, Heng; Ma, Xiwen; Ning, Baitang; Noguera, Rosa; Peifer, Martin; Phan, John H; Roels, Frederik; Rosswog, Carolina; Shao, Susan; Shen, Jie; Theissen, Jessica; Tonini, Gian Paolo; Vandesompele, Jo; Wu, Po-Yen; Xiao, Wenzhong; Xu, Joshua; Xu, Weihong; Xuan, Jiekun; Yang, Yong; Ye, Zhan; Dong, Zirui; Zhang, Ke K; Yin, Ye; Zhao, Chen; Zheng, Yuanting; Wolfinger, Russell D; Shi, Tieliu; Malkas, Linda H; Berthold, Frank; Wang, Jun; Tong, Weida; Shi, Leming; Peng, Zhiyu; Fischer, Matthias
2015-06-25
Gene expression profiling is being widely applied in cancer research to identify biomarkers for clinical endpoint prediction. Since RNA-seq provides a powerful tool for transcriptome-based applications beyond the limitations of microarrays, we sought to systematically evaluate the performance of RNA-seq-based and microarray-based classifiers in this MAQC-III/SEQC study for clinical endpoint prediction using neuroblastoma as a model. We generate gene expression profiles from 498 primary neuroblastomas using both RNA-seq and 44 k microarrays. Characterization of the neuroblastoma transcriptome by RNA-seq reveals that more than 48,000 genes and 200,000 transcripts are being expressed in this malignancy. We also find that RNA-seq provides much more detailed information on specific transcript expression patterns in clinico-genetic neuroblastoma subgroups than microarrays. To systematically compare the power of RNA-seq and microarray-based models in predicting clinical endpoints, we divide the cohort randomly into training and validation sets and develop 360 predictive models on six clinical endpoints of varying predictability. Evaluation of factors potentially affecting model performances reveals that prediction accuracies are most strongly influenced by the nature of the clinical endpoint, whereas technological platforms (RNA-seq vs. microarrays), RNA-seq data analysis pipelines, and feature levels (gene vs. transcript vs. exon-junction level) do not significantly affect performances of the models. We demonstrate that RNA-seq outperforms microarrays in determining the transcriptomic characteristics of cancer, while RNA-seq and microarray-based models perform similarly in clinical endpoint prediction. Our findings may be valuable to guide future studies on the development of gene expression-based predictive models and their implementation in clinical practice.
van Huet, Ramon A. C.; Pierrache, Laurence H.M.; Meester-Smoor, Magda A.; Klaver, Caroline C.W.; van den Born, L. Ingeborgh; Hoyng, Carel B.; de Wijs, Ilse J.; Collin, Rob W. J.; Hoefsloot, Lies H.
2015-01-01
Purpose To determine the efficacy of multiple versions of a commercially available arrayed primer extension (APEX) microarray chip for autosomal recessive retinitis pigmentosa (arRP). Methods We included 250 probands suspected of arRP who were genetically analyzed with the APEX microarray between January 2008 and November 2013. The mode of inheritance had to be autosomal recessive according to the pedigree (including isolated cases). If the microarray identified a heterozygous mutation, we performed Sanger sequencing of exons and exon–intron boundaries of that specific gene. The efficacy of this microarray chip with the additional Sanger sequencing approach was determined by the percentage of patients that received a molecular diagnosis. We also collected data from genetic tests other than the APEX analysis for arRP to provide a detailed description of the molecular diagnoses in our study cohort. Results The APEX microarray chip for arRP identified the molecular diagnosis in 21 (8.5%) of the patients in our cohort. Additional Sanger sequencing yielded a second mutation in 17 patients (6.8%), thereby establishing the molecular diagnosis. In total, 38 patients (15.2%) received a molecular diagnosis after analysis using the microarray and additional Sanger sequencing approach. Further genetic analyses after a negative result of the arRP microarray (n = 107) resulted in a molecular diagnosis of arRP (n = 23), autosomal dominant RP (n = 5), X-linked RP (n = 2), and choroideremia (n = 1). Conclusions The efficacy of the commercially available APEX microarray chips for arRP appears to be low, most likely caused by the limitations of this technique and the genetic and allelic heterogeneity of RP. Diagnostic yields up to 40% have been reported for next-generation sequencing (NGS) techniques that, as expected, thereby outperform targeted APEX analysis. PMID:25999674
Best practices for hybridization design in two-colour microarray analysis.
Knapen, Dries; Vergauwen, Lucia; Laukens, Kris; Blust, Ronny
2009-07-01
Two-colour microarrays are a popular platform of choice in gene expression studies. Because two different samples are hybridized on a single microarray, and several microarrays are usually needed in a given experiment, there are many possible ways to combine samples on different microarrays. The actual combination employed is commonly referred to as the 'hybridization design'. Different types of hybridization designs have been developed, all aimed at optimizing the experimental setup for the detection of differentially expressed genes while coping with technical noise. Here, we first provide an overview of the different classes of hybridization designs, discussing their advantages and limitations, and then we illustrate the current trends in the use of different hybridization design types in contemporary research.
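One common class of hybridization designs discussed in such overviews is the loop design, in which each sample is hybridized twice, once in each dye channel, around a closed loop. A sketch of the pairing logic (dye orientation and technical replicates omitted; assumes at least two samples):

```python
def loop_design(samples):
    """Pair each sample with its successor around a closed loop, so every
    sample appears on exactly two arrays (once per dye channel)."""
    n = len(samples)
    return [(samples[i], samples[(i + 1) % n]) for i in range(n)]
```

For samples A, B, C this yields the arrays (A,B), (B,C), (C,A); a reference design would instead pair every sample with a common reference.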
Experimental Approaches to Microarray Analysis of Tumor Samples
ERIC Educational Resources Information Center
Furge, Laura Lowe; Winter, Michael B.; Meyers, Jacob I.; Furge, Kyle A.
2008-01-01
Comprehensive measurement of gene expression using high-density nucleic acid arrays (i.e. microarrays) has become an important tool for investigating the molecular differences in clinical and research samples. Consequently, inclusion of discussion in biochemistry, molecular biology, or other appropriate courses of microarray technologies has…
Challenges of microarray applications for microbial detection and gene expression profiling in food
USDA-ARS?s Scientific Manuscript database
Microarray technology represents one of the latest advances in molecular biology. The diverse types of microarrays have been applied to clinical and environmental microbiology, microbial ecology, and in human, veterinary, and plant diagnostics. Since multiple genes can be analyzed simultaneously, ...
CEM-designer: design of custom expression microarrays in the post-ENCODE Era.
Arnold, Christian; Externbrink, Fabian; Hackermüller, Jörg; Reiche, Kristin
2014-11-10
Microarrays are widely used in gene expression studies, and custom expression microarrays are popular to monitor expression changes of a customer-defined set of genes. However, the complexity of transcriptomes uncovered recently makes custom expression microarray design a non-trivial task. Pervasive transcription and alternative processing of transcripts generate a wealth of interweaved transcripts, which requires well-considered probe design strategies and is largely neglected in existing approaches. We developed the web server CEM-Designer that facilitates microarray-platform-independent design of custom expression microarrays for complex transcriptomes. CEM-Designer covers (i) the collection and generation of a set of unique target sequences from different sources and (ii) the selection of a set of sensitive and specific probes that optimally represents the target sequences. Probe design itself is left to third-party software to ensure that probes meet provider-specific constraints. CEM-Designer is available at http://designpipeline.bioinf.uni-leipzig.de. Copyright © 2014 Elsevier B.V. All rights reserved.
Multiplex cDNA quantification method that facilitates the standardization of gene expression data
Gotoh, Osamu; Murakami, Yasufumi; Suyama, Akira
2011-01-01
Microarray-based gene expression measurement is one of the major methods for transcriptome analysis. However, current microarray data are substantially affected by microarray platforms and RNA references because the microarray method provides only the relative amounts of gene expression levels. Valid comparisons of microarray data therefore require standardized platforms, internal and/or external controls, and complicated normalizations. These requirements impose limitations on the extensive comparison of gene expression data. Here, we report an effective approach to removing these unfavorable limitations by measuring the absolute amounts of gene expression levels on common DNA microarrays. We have developed a multiplex cDNA quantification method called GEP-DEAN (Gene expression profiling by DCN-encoding-based analysis). The method was validated by using chemically synthesized DNA strands of known quantities and cDNA samples prepared from mouse liver, demonstrating that the absolute amounts of cDNA strands were successfully measured with a sensitivity of 18 zmol in a highly multiplexed manner in 7 h. PMID:21415008
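Absolute quantification against standards of known quantity can be illustrated by a log-log calibration fit; this is a generic sketch of the idea, not the GEP-DEAN protocol itself.

```python
import numpy as np

def fit_calibration(known_amounts, signals):
    """Fit log10(signal) = a*log10(amount) + b to spiked-in standards."""
    la = np.log10(np.asarray(known_amounts, dtype=float))
    ls = np.log10(np.asarray(signals, dtype=float))
    a, b = np.polyfit(la, ls, 1)
    return a, b

def predict_amount(signal, a, b):
    """Invert the calibration to estimate an absolute amount from a signal."""
    return 10 ** ((np.log10(signal) - b) / a)
```

With standards at 1, 10 and 100 units producing signals of 100, 1000 and 10000, a measured signal of 1000 maps back to an absolute amount of 10 units.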
Spot detection and image segmentation in DNA microarray data.
Qin, Li; Rueda, Luis; Ali, Adnan; Ngom, Alioune
2005-01-01
Following the invention of microarrays in 1994, the development and applications of this technology have grown exponentially. The numerous applications of microarray technology include clinical diagnosis and treatment, drug design and discovery, tumour detection, and environmental health research. One of the key issues in the experimental approaches utilising microarrays is to extract quantitative information from the spots, which represent genes in a given experiment. For this process, the initial stages are important and they influence future steps in the analysis. Identifying the spots and separating the background from the foreground is a fundamental problem in DNA microarray data analysis. In this review, we present an overview of state-of-the-art methods for microarray image segmentation. We discuss the foundations of the circle-shaped approach, adaptive shape segmentation, histogram-based methods and the recently introduced clustering-based techniques. We analytically show that clustering-based techniques are equivalent to the one-dimensional, standard k-means clustering algorithm that utilises the Euclidean distance.
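The one-dimensional two-means clustering the review refers to can be sketched directly on a spot's pixel intensities, splitting background from foreground (initialization at the intensity extremes and the convergence test are assumptions of this sketch):

```python
import numpy as np

def segment_spot(intensities, n_iter=50):
    """1-D 2-means: label each pixel 0 (background, low intensity) or
    1 (foreground, high intensity)."""
    x = np.asarray(intensities, dtype=float)
    c = np.array([x.min(), x.max()])  # initial centroids at the extremes
    for _ in range(n_iter):
        # Assign each pixel to its nearer centroid
        labels = (np.abs(x - c[0]) > np.abs(x - c[1])).astype(int)
        new_c = np.array([x[labels == k].mean() if np.any(labels == k) else c[k]
                          for k in (0, 1)])
        if np.allclose(new_c, c):
            break
        c = new_c
    return labels, c
```

The foreground mean minus the background mean then estimates the spot's net signal, the quantity downstream expression analysis consumes.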
Caryoscope: An Open Source Java application for viewing microarray data in a genomic context
Awad, Ihab AB; Rees, Christian A; Hernandez-Boussard, Tina; Ball, Catherine A; Sherlock, Gavin
2004-01-01
Background Microarray-based comparative genome hybridization experiments generate data that can be mapped onto the genome. These data are interpreted more easily when represented graphically in a genomic context. Results We have developed Caryoscope, which is an open source Java application for visualizing microarray data from array comparative genome hybridization experiments in a genomic context. Caryoscope can read General Feature Format files (GFF files), as well as comma- and tab-delimited files, that define the genomic positions of the microarray reporters for which data are obtained. The microarray data can be browsed using an interactive, zoomable interface, which helps users identify regions of chromosomal deletion or amplification. The graphical representation of the data can be exported in a number of graphic formats, including publication-quality formats such as PostScript. Conclusion Caryoscope is a useful tool that can aid in the visualization, exploration and interpretation of microarray data in a genomic context. PMID:15488149
Grubaugh, Nathan D.; Petz, Lawrence N.; Melanson, Vanessa R.; McMenamy, Scott S.; Turell, Michael J.; Long, Lewis S.; Pisarcik, Sarah E.; Kengluecha, Ampornpan; Jaichapor, Boonsong; O'Guinn, Monica L.; Lee, John S.
2013-01-01
Highly multiplexed assays, such as microarrays, can benefit arbovirus surveillance by allowing researchers to screen for hundreds of targets at once. We evaluated amplification strategies and the practicality of a portable DNA microarray platform to analyze virus-infected mosquitoes. The prototype microarray design used here targeted the non-structural protein 5, ribosomal RNA, and cytochrome b genes for the detection of flaviviruses, mosquitoes, and bloodmeals, respectively. We identified 13 of 14 flaviviruses from virus inoculated mosquitoes and cultured cells. Additionally, we differentiated between four mosquito genera and eight whole blood samples. The microarray platform was field evaluated in Thailand and successfully identified flaviviruses (Culex flavivirus, dengue-3, and Japanese encephalitis viruses), differentiated between mosquito genera (Aedes, Armigeres, Culex, and Mansonia), and detected mammalian bloodmeals (human and dog). We showed that the microarray platform and amplification strategies described here can be used to discern specific information on a wide variety of viruses and their vectors. PMID:23249687
Guo, Qingsheng; Bai, Zhixiong; Liu, Yuqian; Sun, Qingjiang
2016-03-15
In this work, we report the application of streptavidin-coated quantum dots (strAV-QDs) in molecular beacon (MB) microarray assays by using the strAV-QD to label the immobilized MB, avoiding target labeling and obviating the need for amplification. The MBs are stem-loop structured oligodeoxynucleotides, modified with a thiol and a biotin at the two terminals of the stem. With the strAV-QD labeling an "opened" MB rather than a "closed" MB via the streptavidin-biotin reaction, sensitive and specific detection of a label-free target DNA sequence is demonstrated by the MB microarray, with a signal-to-background ratio of 8. The immobilized MBs can be fully regenerated, allowing reuse of the microarray. The MB microarray is also able to detect single nucleotide polymorphisms, exhibiting genotype-dependent fluorescence signals. It is demonstrated that the MB microarray can perform as a 4-to-2 encoder, compressing the genotype information into two outputs. Copyright © 2015 Elsevier B.V. All rights reserved.
MIGS-GPU: Microarray Image Gridding and Segmentation on the GPU.
Katsigiannis, Stamos; Zacharia, Eleni; Maroulis, Dimitris
2017-05-01
Complementary DNA (cDNA) microarray is a powerful tool for simultaneously studying the expression level of thousands of genes. Nevertheless, the analysis of microarray images remains an arduous and challenging task due to the poor quality of the images that often suffer from noise, artifacts, and uneven background. In this study, the MIGS-GPU [Microarray Image Gridding and Segmentation on Graphics Processing Unit (GPU)] software for gridding and segmenting microarray images is presented. MIGS-GPU's computations are performed on the GPU by means of the compute unified device architecture (CUDA) in order to achieve fast performance and increase the utilization of available system resources. Evaluation on both real and synthetic cDNA microarray images showed that MIGS-GPU provides better performance than state-of-the-art alternatives, while the proposed GPU implementation achieves significantly lower computational times compared to the respective CPU approaches. Consequently, MIGS-GPU can be an advantageous and useful tool for biomedical laboratories, offering a user-friendly interface that requires minimum input in order to run.
2012-01-01
Over the last decade, the introduction of microarray technology has had a profound impact on gene expression research. The publication of studies with dissimilar or altogether contradictory results, obtained using different microarray platforms to analyze identical RNA samples, has raised concerns about the reliability of this technology. The MicroArray Quality Control (MAQC) project was initiated to address these concerns, as well as other performance and data analysis issues. Expression data on four titration pools from two distinct reference RNA samples were generated at multiple test sites using a variety of microarray-based and alternative technology platforms. Here we describe the experimental design and probe mapping efforts behind the MAQC project. We show intraplatform consistency across test sites as well as a high level of interplatform concordance in terms of genes identified as differentially expressed. This study provides a resource that represents an important first step toward establishing a framework for the use of microarrays in clinical and regulatory settings. PMID:16964229
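Interplatform concordance of the kind measured in the MAQC project can be sketched as a simple overlap computation between two platforms' differentially expressed gene lists. The gene symbols and list contents below are hypothetical examples for illustration, not MAQC data.

```python
# Hypothetical differentially expressed gene lists from two platforms.
platform_a = {"TP53", "EGFR", "MYC", "BRCA1", "CDK4", "VEGFA"}
platform_b = {"TP53", "EGFR", "MYC", "CDK4", "KRAS"}

# One simple concordance measure: overlap relative to the shorter list.
overlap = platform_a & platform_b
concordance = 100 * len(overlap) / min(len(platform_a), len(platform_b))
print(sorted(overlap), concordance)
```

Real concordance analyses typically also account for the direction and rank of expression changes, not just list membership.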
Parthasarathy, N; Saksena, R; Kováč, P; Deshazer, D; Peacock, S J; Wuthiekanun, V; Heine, H S; Friedlander, A M; Cote, C K; Welkos, S L; Adamovicz, J J; Bavari, S; Waag, D M
2008-11-03
We developed a microarray platform by immobilizing bacterial 'signature' carbohydrates onto epoxide modified glass slides. The carbohydrate microarray platform was probed with sera from non-melioidosis and melioidosis (Burkholderia pseudomallei) individuals. The platform was also probed with sera from rabbits vaccinated with Bacillus anthracis spores and Francisella tularensis bacteria. By employing this microarray platform, we were able to detect and differentiate B. pseudomallei, B. anthracis and F. tularensis antibodies in infected patients, and infected or vaccinated animals. These antibodies were absent in the sera of naïve test subjects. The advantages of the carbohydrate microarray technology over the traditional indirect hemagglutination and microagglutination tests for the serodiagnosis of melioidosis and tularemia are discussed. Furthermore, this array is a multiplex carbohydrate microarray for the detection of all three biothreat bacterial infections including melioidosis, anthrax and tularemia with one, multivalent device. The implication is that this technology could be expanded to include a wide array of infectious and biothreat agents.
Split-plot microarray experiments: issues of design, power and sample size.
Tsai, Pi-Wen; Lee, Mei-Ling Ting
2005-01-01
This article focuses on microarray experiments with two or more factors in which treatment combinations of the factors corresponding to the samples paired together onto arrays are not completely random. A main effect of one (or more) factor(s) is confounded with arrays (the experimental blocks). This is called a split-plot microarray experiment. We utilise an analysis of variance (ANOVA) model to assess differentially expressed genes for between-array and within-array comparisons that are generic under a split-plot microarray experiment. Instead of standard t- or F-test statistics that rely on mean square errors of the ANOVA model, we use a robust method, referred to as 'a pooled percentile estimator', to identify genes that are differentially expressed across different treatment conditions. We illustrate the design and analysis of split-plot microarray experiments based on a case application described by Jin et al. A brief discussion of power and sample size for split-plot microarray experiments is also presented.
Women's experiences receiving abnormal prenatal chromosomal microarray testing results.
Bernhardt, Barbara A; Soucier, Danielle; Hanson, Karen; Savage, Melissa S; Jackson, Laird; Wapner, Ronald J
2013-02-01
Genomic microarrays can detect copy-number variants not detectable by conventional cytogenetics. This technology is diffusing rapidly into prenatal settings even though the clinical implications of many copy-number variants are currently unknown. We conducted a qualitative pilot study to explore the experiences of women receiving abnormal results from prenatal microarray testing performed in a research setting. Participants were a subset of women participating in a multicenter prospective study "Prenatal Cytogenetic Diagnosis by Array-based Copy Number Analysis." Telephone interviews were conducted with 23 women receiving abnormal prenatal microarray results. We found that five key elements dominated the experiences of women who had received abnormal prenatal microarray results: an offer too good to pass up, blindsided by the results, uncertainty and unquantifiable risks, need for support, and toxic knowledge. As prenatal microarray testing is increasingly used, uncertain findings will be common, resulting in greater need for careful pre- and posttest counseling, and more education of and resources for providers so they can adequately support the women who are undergoing testing.
Fuzzy support vector machine: an efficient rule-based classification technique for microarrays.
Hajiloo, Mohsen; Rabiee, Hamid R; Anooshahpour, Mahdi
2013-01-01
The abundance of gene expression microarray data has led to the development of machine learning algorithms applicable to disease diagnosis, disease prognosis, and treatment selection problems. However, these algorithms often produce classifiers with weaknesses in terms of accuracy, robustness, and interpretability. This paper introduces the fuzzy support vector machine, a learning algorithm based on a combination of fuzzy classifiers and kernel machines for microarray classification. Experimental results on public leukemia, prostate, and colon cancer datasets show that the fuzzy support vector machine, applied in combination with filter or wrapper feature selection methods, develops a robust model with higher accuracy than conventional microarray classification models such as the support vector machine, artificial neural network, decision trees, k nearest neighbors, and diagonal linear discriminant analysis. Furthermore, the interpretable rule base inferred from the fuzzy support vector machine helps extract biological knowledge from microarray data. As a new classification model with high generalization power, robustness, and good interpretability, the fuzzy support vector machine seems to be a promising tool for gene expression microarray classification.
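The filter feature selection step mentioned above can be sketched generically as a two-sample t-statistic ranking that keeps the top-scoring genes before any classifier is trained. This illustrates the filter idea only, not the fuzzy support vector machine itself, and all data are synthetic.

```python
import numpy as np

def t_filter(X, y, k):
    """Filter feature selection: rank genes by the absolute Welch
    t-statistic between the two classes and keep the top-k genes."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    v0, v1 = X0.var(axis=0, ddof=1), X1.var(axis=0, ddof=1)
    t = np.abs(m0 - m1) / np.sqrt(v0 / len(X0) + v1 / len(X1))
    return np.argsort(t)[::-1][:k]           # indices of the k largest |t|

# Synthetic dataset: 20 samples x 100 genes; only gene 7 differs by class.
rng = np.random.default_rng(1)
y = np.array([0] * 10 + [1] * 10)
X = rng.normal(0.0, 1.0, (20, 100))
X[y == 1, 7] += 3.0                          # planted differential gene
selected = t_filter(X, y, 3)
print(selected)
```

The selected gene indices would then be passed to whatever classifier follows; wrapper methods instead score gene subsets by the classifier's own accuracy.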
A coupled ice-ocean model of ice breakup and banding in the marginal ice zone
NASA Technical Reports Server (NTRS)
Smedstad, O. M.; Roed, L. P.
1985-01-01
A coupled ice-ocean numerical model for the marginal ice zone is considered. The model consists of a nonlinear sea ice model and a two-layer (reduced gravity) ocean model. The dependence of the upwelling response on wind stress direction is discussed. The results confirm earlier analytical work: there exist directions for which there is no upwelling, while other directions give maximum upwelling in terms of the volume of uplifted water. The ice and ocean are coupled directly through the stress at the ice-ocean interface. An interesting consequence of the coupling is found in cases when the ice edge is almost stationary; in these cases the ice tends to break up a few tens of kilometers inside the ice edge.
Abou Assi, Hala; Gómez-Pinto, Irene; González, Carlos
2017-01-01
In situ fabricated nucleic acid microarrays are versatile and very high-throughput platforms for aptamer optimization and discovery, but the chemical space that can be probed against a given target has largely been confined to DNA, while RNA and non-natural nucleic acid microarrays are still an essentially uncharted territory. 2′-Fluoroarabinonucleic acid (2′F-ANA) is a prime candidate for such use in microarrays. Indeed, 2′F-ANA chemistry is readily amenable to photolithographic microarray synthesis, and its potential in high affinity aptamers has been recently discovered. We thus synthesized the first microarrays containing 2′F-ANA and 2′F-ANA/DNA chimeric sequences to fully map the binding affinity landscape of the TBA1 thrombin-binding G-quadruplex aptamer, covering all 32,768 possible DNA-to-2′F-ANA mutations. The resulting microarray was screened against thrombin to identify a series of promising 2′F-ANA-modified aptamer candidates with Kds significantly lower than that of the unmodified control, which were found to adopt highly stable, antiparallel-folded G-quadruplex structures. The solution structure of the TBA1 aptamer modified with 2′F-ANA at position T3 shows that fluorine substitution preorganizes the dinucleotide loop into the proper conformation for interaction with thrombin. Overall, our work strengthens the potential of 2′F-ANA in aptamer research and further expands non-genomic applications of nucleic acid microarrays. PMID:28100695
Geue, Lutz; Stieber, Bettina; Monecke, Stefan; Engelmann, Ines; Gunzer, Florian; Slickers, Peter; Braun, Sascha D; Ehricht, Ralf
2014-08-01
In this study, we developed a new rapid, economic, and automated microarray-based genotyping test for the standardized subtyping of Shiga toxins 1 and 2 of Escherichia coli. The microarrays from Alere Technologies can be used in two different formats, the ArrayTube and the ArrayStrip (which enables high-throughput testing in a 96-well format). One microarray chip harbors all the gene sequences necessary to distinguish between all Stx subtypes, facilitating the identification of single and multiple subtypes within a single isolate in one experiment. Specific software was developed to automatically analyze all data obtained from the microarray. The assay was validated with 21 Shiga toxin-producing E. coli (STEC) reference strains that were previously tested by the complete set of conventional subtyping PCRs. The microarray results showed 100% concordance with the PCR results. Essentially identical results were detected when the standard DNA extraction method was replaced by a time-saving heat lysis protocol. For further validation of the microarray, we identified the Stx subtypes or combinations of the subtypes in 446 STEC field isolates of human and animal origin. In summary, this oligonucleotide array represents an excellent diagnostic tool that provides some advantages over standard PCR-based subtyping. Additional probes, such as those for novel alleles, species markers, or resistance genes, can be spotted onto the microarray should the need arise. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
Mohamed, Abdallah S R; Cardenas, Carlos E; Garden, Adam S; Awan, Musaddiq J; Rock, Crosby D; Westergaard, Sarah A; Brandon Gunn, G; Belal, Abdelaziz M; El-Gowily, Ahmed G; Lai, Stephen Y; Rosenthal, David I; Fuller, Clifton D; Aristophanous, Michalis
2017-08-01
To identify the radio-resistant subvolumes in pretreatment FDG-PET by mapping the spatial location of the origin of tumor recurrence after IMRT for head-and-neck squamous cell cancer to the pretreatment FDG-PET/CT. Patients with local/regional recurrence after IMRT with available FDG-PET/CT and post-failure CT were included. For each patient, both pre-therapy PET/CT and recurrence CT were co-registered with the planning CT (pCT). A 4-mm radius was added to the centroid of mapped recurrence growth target volumes (rGTVs) to create recurrence nidus-volumes (NVs). The overlap between boost-tumor-volumes (BTV) representing different SUV thresholds/margins combinations and NVs was measured. Forty-seven patients were eligible. Forty-two (89.4%) had type A central high dose failure. Twenty-six (48%) of type A rGTVs were at the primary site and 28 (52%) were at the nodal site. The mean dose of type A rGTVs was 71 Gy. BTV consisting of 50% of the maximum SUV plus a 10 mm margin was the best subvolume for dose boosting due to high coverage of primary site NVs (92.3%), low average relative volume to CTV1 (41%), and least average percent voxels outside CTV1 (19%). The majority of loco-regional recurrences originate in the regions of central-high-dose. When correlated with pretreatment FDG-PET, the majority of recurrences originated in an area that would be covered by an additional 10 mm margin on the volume of 50% of the maximum FDG uptake. Copyright © 2017 Elsevier B.V. All rights reserved.
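The boost-target-volume construction described above (threshold at a fraction of the maximum SUV, then expand by a margin) can be sketched in one dimension. The profile values and the voxel-wise margin below are illustrative, not clinical data.

```python
import numpy as np

def boost_volume(suv, frac=0.5, margin=1):
    """Toy 1-D sketch: voxels at or above frac * max(SUV), dilated by
    `margin` voxels on each side (stand-in for a 3-D mm margin)."""
    mask = suv >= frac * suv.max()
    out = mask.copy()
    for shift in range(1, margin + 1):   # simple binary dilation
        out[:-shift] |= mask[shift:]
        out[shift:] |= mask[:-shift]
    return out

# Toy SUV profile along one image axis; hot core around index 3.
suv = np.array([1.0, 2.0, 8.0, 10.0, 9.0, 3.0, 1.0])
print(boost_volume(suv).astype(int))
```

In three dimensions the same idea is usually expressed with a morphological dilation by a physical radius on the thresholded PET mask.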
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fury, Matthew G.; Department of Medicine, Weill Cornell Medical College, New York, New York; Lee, Nancy Y.
Purpose: Elevated expression of eukaryotic protein synthesis initiation factor 4E (eIF4E) in histologically cancer-free margins of resected head and neck squamous cell carcinomas (HNSCCs) is mediated by mammalian target of rapamycin complex 1 (mTORC1) and has been associated with increased risk of disease recurrence. Preclinically, inhibition of mTORC1 with everolimus sensitizes cancer cells to cisplatin and radiation. Methods and Materials: This was a single-institution phase 1 study to establish the maximum tolerated dose of daily everolimus given with fixed-dose cisplatin (30 mg/m² weekly × 6) and concurrent intensity modulated radiation therapy for patients with locally and/or regionally advanced head-and-neck cancer. The study had a standard 3 + 3 dose-escalation design. Results: Tumor primary sites were oral cavity (4), salivary gland (4), oropharynx (2), nasopharynx (1), scalp (1), and neck node with occult primary (1). In 4 of 4 cases in which resected HNSCC surgical pathology specimens were available for immunohistochemistry, elevated expression of eIF4E was observed in the cancer-free margins. The most common grade ≥3 treatment-related adverse event was lymphopenia (92%), and dose-limiting toxicities (DLTs) were mucositis (n=2) and failure to thrive (n=1). With a median follow-up of 19.4 months, 2 patients have experienced recurrent disease. The maximum tolerated dose was everolimus 5 mg/day. Conclusions: Head-and-neck cancer patients tolerated everolimus at therapeutic doses (5 mg/day) given with weekly cisplatin and intensity modulated radiation therapy. The regimen merits further evaluation, especially among patients who are status post resection of HNSCCs that harbor mTORC1-mediated activation of eIF4E in histologically negative surgical margins.
NASA Astrophysics Data System (ADS)
Dove, Dayton; Evans, David J. A.; Lee, Jonathan R.; Roberts, David H.; Tappin, David R.; Mellett, Claire L.; Long, David; Callard, S. Louise
2017-05-01
Along the terrestrial margin of the southern North Sea, previous studies of the MIS 2 glaciation impacting eastern Britain have played a significant role in the development of principles relating to ice sheet dynamics (e.g. deformable beds), and the practice of reconstructing the style, timing, and spatial configuration of palaeo-ice sheets. These detailed terrestrially-based findings have however relied on observations made from only the outer edges of the former ice mass, as the North Sea Lobe (NSL) of the British-Irish Ice Sheet (BIIS) occupied an area that is now almost entirely submarine (c.21-15 ka). Compounded by the fact that marine-acquired data have been primarily of insufficient quality and density, the configuration and behaviour of the last BIIS in the southern North Sea remains surprisingly poorly constrained. This paper presents analysis of a new, integrated set of extensive seabed geomorphological and seismo-stratigraphic observations that both advances the principles developed previously onshore (e.g. multiple advance and retreat cycles), and provides a more detailed and accurate reconstruction of the BIIS at its southern-most extent in the North Sea. A new bathymetry compilation of the region reveals a series of broad sedimentary wedges and associated moraines that represent several terminal positions of the NSL. These former still-stand ice margins (1-4) are also found to relate to newly-identified architectural patterns (shallow stacked sedimentary wedges) in the region's seismic stratigraphy (previously mapped singularly as the Bolders Bank Formation). With ground-truthing constraint provided by sediment cores, these wedges are interpreted as sub-marginal till wedges, formed by complex subglacial accretionary processes that resulted in till thickening towards the former ice-sheet margins. The newly sub-divided shallow seismic stratigraphy (at least five units) also provides an indication of the relative event chronology of the NSL. 
While there is a general record of south-to-north retreat, seismic data also indicate episodes of ice-sheet re-advance suggestive of an oscillating margin (e.g. MIS 2 maximum not related to first incursion of ice into region). Demonstrating further landform interdependence, geographically-grouped sets of tunnel valleys are shown to be genetically related to these individual ice margins, providing clear insight into how meltwater drainage was organised at the evolving termini of this dynamic ice lobe. The newly reconstructed offshore ice margins are found to be well correlated with previously observed terrestrial limits in Lincolnshire and E. Yorkshire (Holderness) (e.g. MIS 2 maximum and Withernsea Till). This reconstruction will hopefully provide a useful framework for studies targeting the climatic, mass-balance, and external glaciological factors (i.e. Fennoscandian Ice Sheet) that influenced late-stage advance and deglaciation, important for accurately characterising both modern and palaeo-ice sheets.
Schneider, Lea; Rinke, Sven
2018-01-01
This study evaluated the marginal accuracy of CAD/CAM-fabricated crown copings from four different materials within the same processing route. Twenty stone replicas of a metallic master die (prepared upper premolar) were scanned and divided into two groups. Group 1 (n = 10) was used for a pilot test to determine the design parameters for best marginal accuracy. Group 2 (n = 10) was used to fabricate 10 specimens from the following materials with one identical CAD/CAM system (GAMMA 202, Wissner GmbH, Goettingen, Germany): A = commercially pure (cp) titanium, B = cobalt-chromium alloy, C = yttria-stabilized zirconia (YSZ), and D = leucite-reinforced glass-ceramics. Copings from group 2 were evaluated for the mean marginal gap size (MeanMG) and average maximum marginal gap size (AMaxMG) with a light microscope in the “as-machined” state. The effect of the material on the marginal accuracy was analyzed by multiple pairwise comparisons (Mann–Whitney U-test, α = 0.05, adjusted by the Bonferroni-Holm method). MeanMG values were as follows: A: 46.92 ± 23.12 μm, B: 48.37 ± 29.72 μm, C: 68.25 ± 28.54 μm, and D: 58.73 ± 21.15 μm. The differences in the MeanMG values proved to be significant for groups A/C (p = 0.0024), A/D (p = 0.008), and B/C (p = 0.0332). AMaxMG values (A: 91.54 ± 23.39 μm, B: 96.86 ± 24.19 μm, C: 120.66 ± 32.75 μm, and D: 100.22 ± 10.83 μm) revealed no significant differences. The material had a significant impact on the marginal accuracy of CAD/CAM-fabricated copings. PMID:29765979
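The step-down Bonferroni (Holm) adjustment used for pairwise comparisons of this kind can be sketched in a few lines. The raw p-value list below reuses the quoted values purely as example inputs, padded with three invented non-significant ones; it does not claim to reproduce the study's actual six comparisons or whether its quoted values were raw or adjusted.

```python
def holm_adjust(pvals):
    """Holm step-down adjustment; returns adjusted p-values in the
    original order of `pvals`."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # ascending p
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        p = min(1.0, (m - rank) * pvals[i])   # multiply by remaining tests
        running_max = max(running_max, p)     # enforce monotonicity
        adjusted[i] = running_max
    return adjusted

raw = [0.0024, 0.008, 0.0332, 0.21, 0.47, 0.62]   # example inputs only
print([round(p, 4) for p in holm_adjust(raw)])
```

An adjusted value below α = 0.05 is then declared significant; Holm is uniformly less conservative than plain Bonferroni while controlling the same family-wise error rate.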
Mancini, E.A.; Tew, B.H.
1997-01-01
The maximum flooding event within a depositional sequence is an important datum for correlation because it represents a virtually synchronous horizon. This event is typically recognized by a distinctive physical surface and/or a significant change in microfossil assemblages (relative fossil abundance peaks) in siliciclastic deposits from shoreline to continental slope environments in a passive margin setting. Recognition of maximum flooding events in mixed siliciclastic-carbonate sediments is more complicated because the entire section usually represents deposition in continental shelf environments with varying rates of biologic and carbonate productivity versus siliciclastic influx. Hence, this event cannot be consistently identified simply by relative fossil abundance peaks. Factors such as siliciclastic input, carbonate productivity, sediment accumulation rates, and paleoenvironmental conditions dramatically affect the relative abundances of microfossils. Failure to recognize these complications can lead to a sequence stratigraphic interpretation that substantially overestimates the number of depositional sequences of 1 to 10 m.y. duration.
A Maximum Likelihood Approach to Functional Mapping of Longitudinal Binary Traits
Wang, Chenguang; Li, Hongying; Wang, Zhong; Wang, Yaqun; Wang, Ningtao; Wang, Zuoheng; Wu, Rongling
2013-01-01
Despite their importance in biology and biomedicine, genetic mapping of binary traits that change over time has not been well explored. In this article, we develop a statistical model for mapping quantitative trait loci (QTLs) that govern longitudinal responses of binary traits. The model is constructed within the maximum likelihood framework, by which the association between binary responses is modeled in terms of conditional log odds-ratios. With this parameterization, the maximum likelihood estimates (MLEs) of marginal mean parameters are robust to misspecification of the time dependence. We implement an iterative procedure to obtain the MLEs of QTL genotype-specific parameters that define longitudinal binary responses. The usefulness of the model was validated by analyzing a real example in rice. Simulation studies were performed to investigate the statistical properties of the model, showing that the model has power to identify and map specific QTLs responsible for the temporal pattern of binary traits. PMID:23183762
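The conditional log odds-ratio parameterization mentioned above can be made concrete with a toy 2x2 table of paired binary responses at two time points; the counts are invented for illustration and carry no connection to the rice data.

```python
import math

def log_odds_ratio(table):
    """Log odds-ratio from a 2x2 table of paired binary responses,
    laid out as [[n00, n01], [n10, n11]]."""
    (n00, n01), (n10, n11) = table
    return math.log((n00 * n11) / (n01 * n10))

# Toy counts: trait absent/present at time 1 (rows) vs. time 2 (columns).
counts = [[30, 10], [5, 25]]
print(round(log_odds_ratio(counts), 3))
```

A positive value indicates that the binary responses at the two time points tend to agree, which is the kind of serial association the model's odds-ratio parameters capture.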
Maximum entropy deconvolution of the optical jet of 3C 273
NASA Technical Reports Server (NTRS)
Evans, I. N.; Ford, H. C.; Hui, X.
1989-01-01
The technique of maximum entropy image restoration is applied to the problem of deconvolving the point spread function from a deep, high-quality V band image of the optical jet of 3C 273. The resulting maximum entropy image has an approximate spatial resolution of 0.6 arcsec and has been used to study the morphology of the optical jet. Four regularly-spaced optical knots are clearly evident in the data, together with an optical 'extension' at each end of the optical jet. The jet oscillates around its center of gravity, and the spatial scale of the oscillations is very similar to the spacing between the optical knots. The jet is marginally resolved in the transverse direction and has an asymmetric profile perpendicular to the jet axis. The distribution of V band flux along the length of the jet, and accurate astrometry of the optical knot positions are presented.
NASA Astrophysics Data System (ADS)
Schildgen, T. F.; Cosentino, D.; Caruso, A.; Yildirim, C.; Echtler, H.; Strecker, M. R.
2011-12-01
The Central Anatolian plateau in Turkey borders one of the most complex tectonic regions on Earth, where collision of the Arabian plate with Eurasia in Eastern Anatolia transitions to a cryptic pattern of subduction of the African beneath the Eurasian plate, with concurrent westward extrusion of the Anatolian microplate. Topographic growth of the southern margin of the Central Anatolian plateau has proceeded in discrete stages that can be distinguished based on the outcrop pattern and ages of uplifted marine sediments. These marine units, together with older basement rocks and younger continental sedimentary fills, also record an evolving nature of crustal deformation and uplift patterns that can be used to test the viability of different uplift mechanisms that have contributed to generate the world's third-largest orogenic plateau. Late Miocene marine sediments outcrop along the SW plateau margin at 1.5 km elevation, while they blanket the S and SE margins at up to more than 2 km elevation. Our new biostratigraphic data limit the age of 1.5-km-high marine sediments along the SW plateau margin to < 7.17 Ma, while regional lithostratigraphic correlations imply that the age is < 6.7 Ma. After reconstructing the post-Late Miocene surface uplift pattern from elevations of uplifted marine sediments and geomorphic reference surfaces, it is clear that regional surface uplift reaches maximum values along the modern plateau margin, with the SW margin experiencing less cumulative uplift compared to the S and SE margins. Our structural measurements and inversion modeling of faults within the uplifted region agree with previous findings in surrounding regions, with early contraction followed by strike-slip and extensional deformation. Shallow earthquake focal mechanisms show that the extensional phase has continued to the present. 
Broad similarities in the onset of surface uplift (after 7 Ma) and a change in the kinematic evolution of the plateau margin (after 8 Ma) suggest that these phenomena may have been linked with a change in the tectonic stress field associated with the process(es) causing post-7 Ma surface uplift. The complex geometry of lithospheric slabs beneath the southern plateau margin, early Pliocene to recent alkaline volcanism, and the localized uplift pattern with accompanying tensional/transtensional stresses point toward slab tearing and localized heating at the base of the lithosphere as a probable mechanism for post-7 Ma uplift of the SW margin. Considering previous work in the region, slab break-off is more likely responsible for non-contractional uplift along the S and SE margins. Overall there appears to be an important link between slab dynamics and surface uplift across the whole southern margin of the Central Anatolian plateau.
USDA-ARS's Scientific Manuscript database
The development of a fluorescent multiplexed microarray platform able to detect and quantify a wide variety of pollutants in seawater is reported. The microarray platform has been manufactured by spotting 6 different bioconjugate competitors and it uses a cocktail of 6 monoclonal and polyclonal anti...
Microarray technology is a powerful tool to investigate the gene expression profiles for thousands of genes simultaneously. In recent years, microarrays have been used to characterize environmental pollutants and identify molecular mode(s) of action of chemicals including endocri...
USDA-ARS's Scientific Manuscript database
The amount of microarray gene expression data in public repositories has been increasing exponentially for the last couple of decades. High-throughput microarray data integration and analysis has become a critical step in exploring the large amount of expression data for biological discovery. Howeve...
Microarrays Made Simple: "DNA Chips" Paper Activity
ERIC Educational Resources Information Center
Barnard, Betsy
2006-01-01
DNA microarray technology is revolutionizing biological science. DNA microarrays (also called DNA chips) allow simultaneous screening of many genes for changes in expression between different cells. Now researchers can obtain information about genes in days or weeks that used to take months or years. The paper activity described in this article…
Over the last decade, the introduction of microarray technology has had a profound impact on gene expression research. The publication of studies with dissimilar or altogether contradictory results, obtained using different microarray platforms to analyze identical RNA samples, h...
ERIC Educational Resources Information Center
Plomin, Robert; Schalkwyk, Leonard C.
2007-01-01
Microarrays are revolutionizing genetics by making it possible to genotype hundreds of thousands of DNA markers and to assess the expression (RNA transcripts) of all of the genes in the genome. Microarrays are slides the size of a postage stamp that contain millions of DNA sequences to which single-stranded DNA or RNA can hybridize. This…
Karampetsou, Evangelia; Morrogh, Deborah; Chitty, Lyn
2014-01-01
The advantage of microarray (array) over conventional karyotyping for the diagnosis of fetal pathogenic chromosomal anomalies has prompted the use of microarrays in prenatal diagnostics. In this review we compare the performance of different array platforms (BAC, oligonucleotide CGH, SNP) and designs (targeted; whole genome; whole genome plus targeted; custom) and discuss their advantages and disadvantages in relation to prenatal testing. We also discuss the factors to consider when implementing a microarray testing service for the diagnosis of fetal chromosomal aberrations. PMID:26237396
A Perspective on DNA Microarrays in Pathology Research and Practice
Pollack, Jonathan R.
2007-01-01
DNA microarray technology matured in the mid-1990s, and the past decade has witnessed a tremendous growth in its application. DNA microarrays have provided powerful tools for pathology researchers seeking to describe, classify, and understand human disease. There has also been great expectation that the technology would advance the practice of pathology. This review highlights some of the key contributions of DNA microarrays to experimental pathology, focusing on the area of cancer research. Also discussed are some of the current challenges in translating this utility to clinical practice. PMID:17600117
Bingle, Lynne; Fonseca, Felipe P; Farthing, Paula M
2017-01-01
Tissue microarrays were first constructed in the 1980s but were used by only a limited number of researchers for a considerable period of time. In the last 10 years there has been a dramatic increase in the number of publications describing the successful use of tissue microarrays in studies aimed at discovering and validating biomarkers. This, along with the increased availability of both manual and automated microarray builders on the market, has encouraged even greater use of this novel and powerful tool. This chapter describes the basic techniques required to build a tissue microarray using a manual method in order that the theory behind the practical steps can be fully explained. Guidance is given to ensure potential disadvantages of the technique are fully considered.
Identification of differentially expressed genes and false discovery rate in microarray studies.
Gusnanto, Arief; Calza, Stefano; Pawitan, Yudi
2007-04-01
To highlight the development in microarray data analysis for the identification of differentially expressed genes, particularly via control of false discovery rate. The emergence of high-throughput technology such as microarrays raises two fundamental statistical issues: multiplicity and sensitivity. We focus on the biological problem of identifying differentially expressed genes. First, multiplicity arises due to testing tens of thousands of hypotheses, rendering the standard P value meaningless. Second, known optimal single-test procedures such as the t-test perform poorly in the context of highly multiple tests. The standard approach of dealing with multiplicity is too conservative in the microarray context. The false discovery rate concept is fast becoming the key statistical assessment tool replacing the P value. We review the false discovery rate approach and argue that it is more sensible for microarray data. We also discuss some methods to take into account additional information from the microarrays to improve the false discovery rate. There is growing consensus on how to analyse microarray data using the false discovery rate framework in place of the classical P value. Further research is needed on the preprocessing of the raw data, such as the normalization step and filtering, and on finding the most sensitive test procedure.
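The false discovery rate framework the review advocates is most commonly implemented as the Benjamini-Hochberg step-up procedure. A minimal sketch follows; the p-values are invented for illustration, and this is a generic textbook procedure rather than any specific method from the review.

```python
def benjamini_hochberg(pvals, q=0.05):
    """Return indices of hypotheses rejected at FDR level q (step-up rule)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    # find the largest rank k with p_(k) <= q * k / m
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= q * rank / m:
            k_max = rank
    return sorted(order[:k_max])

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.5, 0.8]
print(benjamini_hochberg(pvals, q=0.05))  # rejects the two smallest p-values
```

Unlike a Bonferroni correction, the threshold grows with the rank, which is what makes the procedure far less conservative for the tens of thousands of tests typical of a microarray experiment.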
Steger, Doris; Berry, David; Haider, Susanne; Horn, Matthias; Wagner, Michael; Stocker, Roman; Loy, Alexander
2011-01-01
The hybridization of nucleic acid targets with surface-immobilized probes is a widely used assay for the parallel detection of multiple targets in medical and biological research. Despite its widespread application, DNA microarray technology still suffers from several biases and lack of reproducibility, stemming in part from an incomplete understanding of the processes governing surface hybridization. In particular, non-random spatial variations within individual microarray hybridizations are often observed, but the mechanisms underpinning this positional bias remain incompletely explained. This study identifies and rationalizes a systematic spatial bias in the intensity of surface hybridization, characterized by markedly increased signal intensity of spots located at the boundaries of the spotted areas of the microarray slide. Combining observations from a simplified single-probe block array format with predictions from a mathematical model, the mechanism responsible for this bias is found to be a position-dependent variation in lateral diffusion of target molecules. Numerical simulations reveal a strong influence of microarray well geometry on the spatial bias. Reciprocal adjustment of the size of the microarray hybridization chamber to the area of surface-bound probes is a simple and effective measure to minimize or eliminate the diffusion-based bias, resulting in increased uniformity and accuracy of quantitative DNA microarray hybridization.
Haider, Susanne; Horn, Matthias; Wagner, Michael; Stocker, Roman; Loy, Alexander
2011-01-01
Background The hybridization of nucleic acid targets with surface-immobilized probes is a widely used assay for the parallel detection of multiple targets in medical and biological research. Despite its widespread application, DNA microarray technology still suffers from several biases and lack of reproducibility, stemming in part from an incomplete understanding of the processes governing surface hybridization. In particular, non-random spatial variations within individual microarray hybridizations are often observed, but the mechanisms underpinning this positional bias remain incompletely explained. Methodology/Principal Findings This study identifies and rationalizes a systematic spatial bias in the intensity of surface hybridization, characterized by markedly increased signal intensity of spots located at the boundaries of the spotted areas of the microarray slide. Combining observations from a simplified single-probe block array format with predictions from a mathematical model, the mechanism responsible for this bias is found to be a position-dependent variation in lateral diffusion of target molecules. Numerical simulations reveal a strong influence of microarray well geometry on the spatial bias. Conclusions Reciprocal adjustment of the size of the microarray hybridization chamber to the area of surface-bound probes is a simple and effective measure to minimize or eliminate the diffusion-based bias, resulting in increased uniformity and accuracy of quantitative DNA microarray hybridization. PMID:21858215
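The qualitative edge effect reported above can be reproduced with a toy one-dimensional model: spots at the boundary of the spotted block draw target from adjacent probe-free regions by lateral diffusion, so they bind more. All parameter values below are illustrative assumptions; this is not the authors' mathematical model.

```python
# 1D toy model: diffusion of free target plus first-order binding at spots
N = 200            # grid points across the slide
D = 0.2            # diffusion coefficient (stable explicit scheme: D < 0.5)
k = 0.05           # binding rate at spot positions
c = [1.0] * N      # free target concentration
b = [0.0] * N      # cumulative bound target per position
spots = range(80, 120)  # spotted block in the middle of the slide

for _ in range(500):
    # explicit diffusion step with no-flux boundaries
    new = c[:]
    for i in range(N):
        left = c[i - 1] if i > 0 else c[i]
        right = c[i + 1] if i < N - 1 else c[i]
        new[i] = c[i] + D * (left - 2 * c[i] + right)
    c = new
    # binding over the spotted block depletes the local free target
    for i in spots:
        bound = k * c[i]
        b[i] += bound
        c[i] -= bound

edge, center = b[80], b[100]
print(edge > center)  # boundary spots capture more target than central ones
```

The same logic explains the proposed remedy: shrinking the hybridization chamber to match the spotted area removes the probe-free reservoir that feeds the boundary spots.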
2015-01-01
Biological assays formatted as microarrays have become a critical tool for the generation of the comprehensive data sets required for systems-level understanding of biological processes. Manual annotation of data extracted from images of microarrays, however, remains a significant bottleneck, particularly for protein microarrays due to the sensitivity of this technology to weak artifact signal. In order to automate the extraction and curation of data from protein microarrays, we describe an algorithm called Crossword that logically combines information from multiple approaches to fully automate microarray segmentation. Automated artifact removal is also accomplished by segregating structured pixels from the background noise using iterative clustering and pixel connectivity. Correlation of the location of structured pixels across image channels is used to identify and remove artifact pixels from the image prior to data extraction. This component improves the accuracy of data sets while reducing the requirement for time-consuming visual inspection of the data. Crossword enables a fully automated protocol that is robust to significant spatial and intensity aberrations. Overall, the average amount of user intervention is reduced by an order of magnitude and the data quality is increased through artifact removal and reduced user variability. The increase in throughput should aid the further implementation of microarray technologies in clinical studies. PMID:24417579
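The pixel-connectivity step described can be illustrated with a generic connected-component pass: threshold the image, group adjacent foreground pixels, and discard components too small to be real spots. This is a hedged sketch of the general technique, not the Crossword implementation; the toy image and size threshold are invented.

```python
def connected_components(mask):
    """4-connected components of True pixels in a 2D boolean mask."""
    h, w = len(mask), len(mask[0])
    seen, comps = set(), []
    for r in range(h):
        for c in range(w):
            if mask[r][c] and (r, c) not in seen:
                stack, comp = [(r, c)], []
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] \
                                and (ny, nx) not in seen:
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                comps.append(comp)
    return comps

# toy image: one 2x2 "spot" and one isolated bright artifact pixel
img = [[0, 0, 0, 0, 0],
       [0, 9, 9, 0, 0],
       [0, 9, 9, 0, 7],
       [0, 0, 0, 0, 0]]
mask = [[v > 5 for v in row] for row in img]
spots = [c for c in connected_components(mask) if len(c) >= 3]  # drop tiny artifacts
print(len(spots))  # prints 1
```

Correlating such components across image channels, as the abstract describes, then lets artifact pixels (structured in one channel only) be removed before data extraction.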
The detection and differentiation of canine respiratory pathogens using oligonucleotide microarrays.
Wang, Lih-Chiann; Kuo, Ya-Ting; Chueh, Ling-Ling; Huang, Dean; Lin, Jiunn-Horng
2017-05-01
Canine respiratory diseases are commonly seen in dogs, often as co-infections with multiple respiratory pathogens, including viruses and bacteria. Virus infections have also been reported even in vaccinated dogs. The clinical signs caused by different respiratory etiological agents are similar, which makes differential diagnosis imperative. An oligonucleotide microarray system was developed in this study. The wild-type and vaccine strains of canine distemper virus (CDV), influenza virus, canine herpesvirus (CHV), Bordetella bronchiseptica and Mycoplasma cynos were detected and differentiated simultaneously on a microarray chip. The detection limits are 10, 10, 100, 50 and 50 copies for CDV, influenza virus, CHV, B. bronchiseptica and M. cynos, respectively. Clinical testing of nasal swab samples showed that the microarray had remarkably better efficacy than the multiplex PCR-agarose gel method: the positive detection rates of the microarray and agarose gel were 59.0% (n=33) and 41.1% (n=23), respectively, among the 56 samples. CDV vaccine strain and pathogen co-infections were further demonstrated by the microarray but not by the multiplex PCR-agarose gel. The oligonucleotide microarray provides a highly efficient diagnostic alternative that could be applied in clinical practice, greatly assisting disease therapy and control. Copyright © 2017 Elsevier B.V. All rights reserved.
Gene selection for microarray data classification via subspace learning and manifold regularization.
Tang, Chang; Cao, Lijuan; Zheng, Xiao; Wang, Minhui
2017-12-19
With the rapid development of DNA microarray technology, a large amount of genomic data has been generated. Classification of these microarray data is a challenging task because gene expression data typically comprise thousands of genes but only a small number of samples. In this paper, an effective gene selection method is proposed to select the best subset of genes for microarray data, with irrelevant and redundant genes removed. Compared with the original data, the selected gene subset can benefit the classification task. We formulate gene selection as a manifold-regularized subspace learning problem. In detail, a projection matrix is used to project the original high-dimensional microarray data into a lower-dimensional subspace, with the constraint that the original genes can be well represented by the selected genes. Meanwhile, the local manifold structure of the original data is preserved by a Laplacian graph regularization term on the low-dimensional data space. The projection matrix can serve as an importance indicator for the different genes. An iterative update algorithm is developed for solving the problem. Experimental results on six publicly available microarray datasets and one clinical dataset demonstrate that the proposed method performs better than other state-of-the-art methods in terms of microarray data classification.
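The Laplacian graph regularization term mentioned above can be sketched generically: with the unnormalized graph Laplacian L = D - W, the penalty trace(Z^T L Z) is small when samples connected in the affinity graph W receive similar low-dimensional representations. The affinity matrix and projections below are invented for illustration; this is not the paper's algorithm.

```python
def laplacian(W):
    """Unnormalized graph Laplacian L = D - W for a symmetric affinity matrix."""
    n = len(W)
    return [[(sum(W[i]) if i == j else 0) - W[i][j] for j in range(n)] for i in range(n)]

def smoothness(Z, L):
    """trace(Z^T L Z): small when graph neighbours have similar projections."""
    n, d = len(Z), len(Z[0])
    return sum(Z[i][k] * L[i][j] * Z[j][k]
               for i in range(n) for j in range(n) for k in range(d))

W = [[0, 1, 0], [1, 0, 0], [0, 0, 0]]  # samples 0 and 1 are neighbours
Z_smooth = [[1.0], [1.0], [5.0]]       # neighbours agree -> low penalty
Z_rough  = [[1.0], [5.0], [1.0]]       # neighbours disagree -> high penalty
L = laplacian(W)
print(smoothness(Z_smooth, L), smoothness(Z_rough, L))
```

Minimizing this term alongside the reconstruction objective is what keeps the selected-gene subspace faithful to the local manifold structure.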
Kumar, Mukesh; Rath, Nitish Kumar; Rath, Santanu Kumar
2016-04-01
Microarray-based gene expression profiling has emerged as an efficient technique for the classification, prognosis, diagnosis, and treatment of cancer. Frequent changes in the behavior of this disease generate an enormous volume of data. Microarray data satisfy both the veracity and velocity properties of big data, as they keep changing with time. Therefore, analyzing microarray datasets in a small amount of time is essential. They often contain a large number of expression values, but only a fraction of these correspond to significantly expressed genes. The precise identification of genes of interest that are responsible for causing cancer is imperative in microarray data analysis. Most existing schemes employ a two-phase process: feature selection/extraction followed by classification. In this paper, various statistical methods (tests) based on MapReduce are proposed for selecting relevant features. After feature selection, a MapReduce-based K-nearest neighbor (mrKNN) classifier is also employed to classify microarray data. These algorithms are successfully implemented in a Hadoop framework. A comparative analysis of these MapReduce-based models is performed using microarray datasets of various dimensions. From the obtained results, it is observed that these models consume much less execution time than conventional models in processing big data. Copyright © 2016 Elsevier Inc. All rights reserved.
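The two-phase pipeline described (statistical feature scoring, then KNN classification) can be sketched on a single machine with Python's built-in map/reduce idioms. The toy expression matrix and the choice of a t-statistic as the score are illustrative assumptions, not the paper's Hadoop implementation.

```python
from functools import reduce
from math import sqrt
from statistics import mean, stdev

# toy expression matrix: rows = samples, columns = genes; two classes
X = [[5.0, 1.0, 2.1], [5.2, 0.9, 2.0], [1.0, 1.1, 2.2], [0.8, 1.0, 1.9]]
y = [1, 1, 0, 0]

def t_stat(g):
    """Welch-style t statistic for one gene (column index g)."""
    a = [X[i][g] for i in range(len(X)) if y[i] == 1]
    b = [X[i][g] for i in range(len(X)) if y[i] == 0]
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return abs(mean(a) - mean(b)) / se if se else 0.0

# "map" phase: score every gene; "reduce" phase: keep the top-k genes
k = 1
scores = list(map(t_stat, range(len(X[0]))))
selected = reduce(lambda acc, g: sorted(acc + [g], key=lambda i: -scores[i])[:k],
                  range(len(X[0])), [])

def knn_predict(x, K=3):
    """K-nearest-neighbour majority vote using only the selected genes."""
    d = lambda u: sqrt(sum((u[g] - x[g]) ** 2 for g in selected))
    nearest = sorted(range(len(X)), key=lambda i: d(X[i]))[:K]
    votes = [y[i] for i in nearest]
    return max(set(votes), key=votes.count)

print(selected, knn_predict([4.9, 1.0, 2.0]))
```

In the actual MapReduce setting, the per-gene scoring would be distributed across mappers and the top-k selection performed in the reducer.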
Marginal Structural Models with Counterfactual Effect Modifiers.
Zheng, Wenjing; Luo, Zhehui; van der Laan, Mark J
2018-06-08
In health and social sciences, research questions often involve systematic assessment of the modification of treatment causal effect by patient characteristics. In longitudinal settings, time-varying or post-intervention effect modifiers are also of interest. In this work, we investigate the robust and efficient estimation of the Counterfactual-History-Adjusted Marginal Structural Model (van der Laan MJ, Petersen M. Statistical learning of origin-specific statically optimal individualized treatment rules. Int J Biostat. 2007;3), which models the conditional intervention-specific mean outcome given a counterfactual modifier history in an ideal experiment. We establish the semiparametric efficiency theory for these models, and present a substitution-based, semiparametric efficient and doubly robust estimator using the targeted maximum likelihood estimation methodology (TMLE, e.g. van der Laan MJ, Rubin DB. Targeted maximum likelihood learning. Int J Biostat. 2006;2, van der Laan MJ, Rose S. Targeted learning: causal inference for observational and experimental data, 1st ed. Springer Series in Statistics. Springer, 2011). To facilitate implementation in applications where the effect modifier is high dimensional, our third contribution is a projected influence function (and the corresponding projected TMLE estimator), which retains most of the robustness of its efficient peer and can be easily implemented in applications where the use of the efficient influence function becomes taxing. We compare the projected TMLE estimator with an Inverse Probability of Treatment Weighted estimator (e.g. Robins JM. Marginal structural models. In: Proceedings of the American Statistical Association. Section on Bayesian Statistical Science, 1-10. 1997a, Hernan MA, Brumback B, Robins JM. Marginal structural models to estimate the causal effect of zidovudine on the survival of HIV-positive men. 2000;11:561-570), and a non-targeted G-computation estimator (Robins JM. 
A new approach to causal inference in mortality studies with sustained exposure periods - application to control of the healthy worker survivor effect. Math Modell. 1986;7:1393-1512.). The comparative performance of these estimators is assessed in a simulation study. The use of the projected TMLE estimator is illustrated in a secondary data analysis for the Sequenced Treatment Alternatives to Relieve Depression (STAR*D) trial where effect modifiers are subject to missing at random.
NASA Astrophysics Data System (ADS)
Farnsworth, L. B.; Kelly, M. A.; Axford, Y.; Bromley, G. R.; Osterberg, E. C.; Howley, J. A.; Zimmerman, S. R. H.; Jackson, M. S.; Lasher, G. E.; McFarlin, J. M.
2015-12-01
Defining the late glacial and Holocene fluctuations of the Greenland Ice Sheet (GrIS) margin, particularly during periods that were as warm or warmer than present, provides a longer-term perspective on present ice margin fluctuations and informs how the GrIS may respond to future climate conditions. We focus on mapping and dating past GrIS extents in the Nunatarssuaq region of northwestern Greenland. During the summer of 2014, we conducted geomorphic mapping and collected rock samples for 10Be surface exposure dating as well as subfossil plant samples for 14C dating. We also obtained sediment cores from an ice-proximal lake. Preliminary 10Be ages of boulders deposited during deglaciation of the GrIS subsequent to the Last Glacial Maximum range from ~30-15 ka. The apparently older ages of some samples indicate the presence of 10Be inherited from prior periods of exposure. These ages suggest deglaciation occurred by ~15 ka; however, further data are needed to test this hypothesis. Subfossil plants exposed at the GrIS margin on shear planes date to ~4.6-4.8 cal. ka BP and indicate less extensive ice during middle Holocene time. Additional radiocarbon ages from in situ subfossil plants on a nunatak date to ~3.1 cal. ka BP. Geomorphic mapping of glacial landforms near Nordsø, a large proglacial lake, including grounding lines, moraines, paleo-shorelines, and deltas, indicates the existence of a higher lake level that resulted from a more extensive GrIS margin, likely during Holocene time. A fresh drift limit, characterized by unweathered, lichen-free clasts approximately 30-50 m distal to the modern GrIS margin, is estimated to be late Holocene in age. 10Be dating of samples from these geomorphic features is in progress. Radiocarbon ages of subfossil plants exposed by recent retreat of the GrIS margin suggest that the GrIS was at or behind its present location at AD ~1650-1800 and ~1816-1889.
Results thus far indicate that the GrIS margin in northwestern Greenland responded sensitively to Holocene climate changes. Ongoing research will improve the chronological constraints on these fluctuations.
Evolution of the Marginal Ice Zone: Adaptive Sampling with Autonomous Gliders
2015-09-30
kinetic energy (ε). Gliders also sampled dissolved oxygen, optical backscatter (chlorophyll and CDOM fluorescence) and multi-spectral downwelling... (Fig. 2). In the pack, Pacific Summer Water and a deep chlorophyll maximum form distinct layers at roughly 60 m and 80 m, respectively, which become... Sections across the ice edge just prior to recovery, during freeze-up, reveal elevated chlorophyll fluorescence throughout the mixed layer (Fig. 4).
NASA Technical Reports Server (NTRS)
Miller, W. S.
1974-01-01
A structural analysis was performed on the 1/4-watt cryogenic refrigerator. The analysis covered the complete assembly except for the cooling jacket and mounting brackets. Maximum stresses, margins of safety, and natural frequencies were calculated for structurally loaded refrigerator components shown in assembly drawings. The stress analysis indicates that the design is satisfactory for the specified vibration environment and for the proof, burst, and normal operating loads.
Maximum entropy approach to statistical inference for an ocean acoustic waveguide.
Knobles, D P; Sagers, J D; Koch, R A
2012-02-01
A conditional probability distribution suitable for estimating the statistical properties of ocean seabed parameter values inferred from acoustic measurements is derived from a maximum entropy principle. The specification of the expectation value for an error function constrains the maximization of an entropy functional. This constraint determines the sensitivity factor (β) to the error function of the resulting probability distribution, which is a canonical form that provides a conservative estimate of the uncertainty of the parameter values. From the conditional distribution, marginal distributions for individual parameters can be determined from integration over the other parameters. The approach is an alternative to obtaining the posterior probability distribution without an intermediary determination of the likelihood function followed by an application of Bayes' rule. In this paper the expectation value that specifies the constraint is determined from the values of the error function for the model solutions obtained from a sparse number of data samples. The method is applied to ocean acoustic measurements taken on the New Jersey continental shelf. The marginal probability distribution for the values of the sound speed ratio at the surface of the seabed and the source levels of a towed source are examined for different geoacoustic model representations. © 2012 Acoustical Society of America
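In hedged notation (the symbols below are assumed, not taken from the paper), the canonical form described is

```latex
p(\mathbf{m} \mid d) = \frac{e^{-\beta E(\mathbf{m}; d)}}{Z(\beta)},
\qquad
Z(\beta) = \int e^{-\beta E(\mathbf{m}; d)} \, d\mathbf{m},
```

where $E$ is the error function, $\beta$ is fixed by the constraint $\langle E \rangle = E_0$ on its expectation value, and the marginal distribution for a single parameter $m_j$ follows by integrating out the others:

```latex
p(m_j \mid d) = \int p(\mathbf{m} \mid d) \prod_{i \neq j} dm_i .
```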
Characterization and simulation of cDNA microarray spots using a novel mathematical model
Kim, Hye Young; Lee, Seo Eun; Kim, Min Jung; Han, Jin Il; Kim, Bo Kyung; Lee, Yong Sung; Lee, Young Seek; Kim, Jin Hyuk
2007-01-01
Background The quality of cDNA microarray data is crucial for expanding its application to other research areas, such as the study of gene regulatory networks. Despite the fact that a number of algorithms have been suggested to increase the accuracy of microarray gene expression data, it is necessary to obtain reliable microarray images by improving wet-lab experiments. As the first step of a cDNA microarray experiment, spotting cDNA probes is critical to determining the quality of spot images. Results We developed a governing equation of cDNA deposition during evaporation of a drop in the microarray spotting process. The governing equation included four parameters: the surface site density on the support, the extrapolated equilibrium constant for the binding of cDNA molecules with surface sites on glass slides, the macromolecular interaction factor, and the volume constant of a drop of cDNA solution. We simulated cDNA deposition from the single model equation by varying the value of the parameters. The morphology of the resulting cDNA deposit can be classified into three types: a doughnut shape, a peak shape, and a volcano shape. The spot morphology can be changed into a flat shape by varying the experimental conditions while considering the parameters of the governing equation of cDNA deposition. The four parameters were estimated by fitting the governing equation to the real microarray images. With the results of the simulation and the parameter estimation, the phenomenon of the formation of cDNA deposits in each type was investigated. Conclusion This study explains how various spot shapes can exist and suggests which parameters are to be adjusted for obtaining a good spot. This system is able to explore the cDNA microarray spotting process in a predictable, manageable and descriptive manner. 
We hope it can provide a way to predict the incidents that can occur during a real cDNA microarray experiment, and produce useful data for several research applications involving cDNA microarrays. PMID:18096047
Mallén, Maria; Díaz-González, María; Bonilla, Diana; Salvador, Juan P; Marco, María P; Baldi, Antoni; Fernández-Sánchez, César
2014-06-17
Low-density protein microarrays are emerging tools in diagnostics whose deployment could be primarily limited by the cost of fluorescence detection schemes. This paper describes an electrical readout system for microarrays comprising an array of gold interdigitated microelectrodes and an array of polydimethylsiloxane microwells, which enabled multiplexed detection of up to thirty-six biological events on the same substrate. Similarly to fluorescent readout counterparts, the microarray can be developed on disposable glass slide substrates. However, unlike them, the presented approach is compact and requires simple and inexpensive instrumentation. The system makes use of urease-labeled affinity reagents for developing the microarrays and is based on detection of the conductivity changes taking place when ionic species are generated in solution by the catalytic hydrolysis of urea. The use of a polydimethylsiloxane microwell array facilitates the positioning of the measurement solution on every spot of the microarray. It also ensures liquid tightness and the isolation of each well from the surrounding ones during the microarray readout process, thereby avoiding the evaporation and chemical cross-talk effects that were shown to affect the sensitivity and reliability of the system. The performance of the system is demonstrated by carrying out the readout of a microarray for the anabolic androgenic steroid hormone boldenone. Analytical results are comparable to those obtained by fluorescent scanner detection approaches. The estimated detection limit is 4.0 ng mL(-1), below the threshold value set by the World Anti-Doping Agency and the European Community. Copyright © 2014 Elsevier B.V. All rights reserved.
Sevenler, Derin; Daaboul, George G; Ekiz Kanik, Fulya; Ünlü, Neşe Lortlar; Ünlü, M Selim
2018-05-21
DNA and protein microarrays are a high-throughput technology that allow the simultaneous quantification of tens of thousands of different biomolecular species. The mediocre sensitivity and limited dynamic range of traditional fluorescence microarrays compared to other detection techniques have been the technology's Achilles' heel and prevented their adoption for many biomedical and clinical diagnostic applications. Previous work to enhance the sensitivity of microarray readout to the single-molecule ("digital") regime have either required signal amplifying chemistry or sacrificed throughput, nixing the platform's primary advantages. Here, we report the development of a digital microarray which extends both the sensitivity and dynamic range of microarrays by about 3 orders of magnitude. This technique uses functionalized gold nanorods as single-molecule labels and an interferometric scanner which can rapidly enumerate individual nanorods by imaging them with a 10× objective lens. This approach does not require any chemical signal enhancement such as silver deposition and scans arrays with a throughput similar to commercial fluorescence scanners. By combining single-nanoparticle enumeration and ensemble measurements of spots when the particles are very dense, this system achieves a dynamic range of about 6 orders of magnitude directly from a single scan. As a proof-of-concept digital protein microarray assay, we demonstrated detection of hepatitis B virus surface antigen in buffer with a limit of detection of 3.2 pg/mL. More broadly, the technique's simplicity and high-throughput nature make digital microarrays a flexible platform technology with a wide range of potential applications in biomedical research and clinical diagnostics.
Burgarella, Sarah; Cattaneo, Dario; Pinciroli, Francesco; Masseroli, Marco
2005-12-01
Improvements in bio-nano-technologies and biomolecular techniques have led to increasing production of high-throughput experimental data. Spotted cDNA microarray is one of the most widespread technologies, used in single research laboratories and in biotechnology service facilities. Although they are routinely performed, spotted microarray experiments are complex procedures entailing several experimental steps and actors with different technical skills and roles. During an experiment, the actors involved, who may be located at a distance from one another, need to access and share specific experiment information according to their roles. Furthermore, complete information describing all experimental steps must be orderly collected to allow subsequent correct interpretation of experimental results. We developed MicroGen, a web system for managing information and workflow in the production pipeline of spotted microarray experiments. It is constituted of a core multi-database system able to store all data completely characterizing different spotted microarray experiments according to the Minimum Information About Microarray Experiments (MIAME) standard, and of an intuitive and user-friendly web interface able to support the collaborative work required among the multidisciplinary actors and roles involved in spotted microarray experiment production. MicroGen supports six types of user roles: the researcher who designs and requests the experiment, the spotting operator, the hybridisation operator, the image processing operator, the system administrator, and the generic public user who can access the unrestricted part of the system to get information about MicroGen services. MicroGen represents a MIAME-compliant information system that enables managing workflow and supporting collaborative work in spotted microarray experiment production.
DNA Microarray Detection of 18 Important Human Blood Protozoan Species
Chen, Jun-Hu; Feng, Xin-Yu; Chen, Shao-Hong; Cai, Yu-Chun; Lu, Yan; Zhou, Xiao-Nong; Chen, Jia-Xu; Hu, Wei
2016-01-01
Background Accurate detection of blood protozoa from clinical samples is important for diagnosis, treatment and control of related diseases. Developing a rapid, simple, and convenient method for protozoan detection is an urgent need. In this preliminary study, a novel DNA microarray system was assessed for the detection of Plasmodium, Leishmania, Trypanosoma, Toxoplasma gondii and Babesia in humans, animals, and vectors, in comparison with microscopy and PCR data. Methodology/Principal Findings The microarray assay simultaneously identified 18 species of common blood protozoa based on differences in their respective target genes. A total of 20 specific primer pairs and 107 microarray probes were selected according to conserved regions designed to identify 18 species in 5 blood protozoan genera. The positive detection rate of the microarray assay was 91.78% (402/438). Sensitivity and specificity for blood protozoan detection ranged from 82.4% (95%CI: 65.9% ~ 98.8%) to 100.0% and from 95.1% (95%CI: 93.2% ~ 97.0%) to 100.0%, respectively. Positive predictive value (PPV) and negative predictive value (NPV) ranged from 20.0% (95%CI: 2.5% ~ 37.5%) to 100.0% and from 96.8% (95%CI: 95.0% ~ 98.6%) to 100.0%, respectively. The Youden index varied from 0.82 to 0.98. The detection limit of the DNA microarrays ranged from 200 to 500 copies/reaction, similar to PCR findings. The concordance rate between microarray data and DNA sequencing results was 100%. Conclusions/Significance Overall, the newly developed microarray platform provides a convenient, highly accurate, and reliable clinical assay for the determination of blood protozoan species. PMID:27911895
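The screening statistics reported above all follow from the standard 2x2 contingency-table definitions. A minimal sketch with invented counts:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, NPV and Youden index from a 2x2 table."""
    sens = tp / (tp + fn)          # true positive rate
    spec = tn / (tn + fp)          # true negative rate
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    youden = sens + spec - 1       # Youden index J
    return sens, spec, ppv, npv, youden

# hypothetical counts, not the study's data
sens, spec, ppv, npv, youden = diagnostic_metrics(tp=90, fp=5, tn=95, fn=10)
print(round(sens, 2), round(spec, 2), round(youden, 2))
```

Note that, unlike sensitivity and specificity, PPV and NPV depend on prevalence, which is why PPV can drop as low as 20.0% in the study even when specificity is high.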
Goodman, Corey W.; Major, Heather J.; Walls, William D.; Sheffield, Val C.; Casavant, Thomas L.; Darbro, Benjamin W.
2016-01-01
Chromosomal microarrays (CMAs) are routinely used in both research and clinical laboratories; yet, little attention has been given to the estimation of genome-wide true and false negatives during the assessment of these assays and how such information could be used to calibrate various algorithmic metrics to improve performance. Low-throughput, locus-specific methods such as fluorescence in situ hybridization (FISH), quantitative PCR (qPCR), or multiplex ligation-dependent probe amplification (MLPA) preclude rigorous calibration of various metrics used by copy number variant (CNV) detection algorithms. To aid this task, we have established a comparative methodology, CNV-ROC, which is capable of performing a high throughput, low cost, analysis of CMAs that takes into consideration genome-wide true and false negatives. CNV-ROC uses a higher resolution microarray to confirm calls from a lower resolution microarray and provides for a true measure of genome-wide performance metrics at the resolution offered by microarray testing. CNV-ROC also provides for a very precise comparison of CNV calls between two microarray platforms without the need to establish an arbitrary degree of overlap. Comparison of CNVs across microarrays is done on a per-probe basis and receiver operator characteristic (ROC) analysis is used to calibrate algorithmic metrics, such as log2 ratio threshold, to enhance CNV calling performance. CNV-ROC addresses a critical and consistently overlooked aspect of analytical assessments of genome-wide techniques like CMAs which is the measurement and use of genome-wide true and false negative data for the calculation of performance metrics and comparison of CNV profiles between different microarray experiments. PMID:25595567
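One common way to calibrate a calling metric such as a log2 ratio threshold against ROC analysis is to scan candidate thresholds and keep the one that maximizes the Youden index. The sketch below illustrates this generic idea with invented scores and labels; it is not the CNV-ROC implementation.

```python
def best_threshold(scores, labels):
    """Scan candidate thresholds; return the one maximizing the Youden index."""
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < t and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# invented per-probe scores with truth labels from a higher-resolution platform
scores = [0.1, 0.2, 0.3, 0.35, 0.6, 0.7, 0.8, 0.9]
labels = [0,   0,   0,   1,    1,   1,   1,   1]
t, j = best_threshold(scores, labels)
print(t, j)
```

In the CNV-ROC setting, the truth labels would come from the per-probe comparison with the higher-resolution confirmatory microarray.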
McCulloh, Katherine A; Johnson, Daniel M; Petitmermet, Joshua; McNellis, Brandon; Meinzer, Frederick C; Lachenbruch, Barbara
2015-07-01
The physiological mechanisms underlying the short maximum height of shrubs are not understood. One possible explanation is that differences in the hydraulic architecture of shrubs compared with co-occurring taller trees prevent the shrubs from growing taller. To explore this hypothesis, we examined various hydraulic parameters, including vessel lumen diameter, hydraulic conductivity and vulnerability to drought-induced embolism, of three co-occurring species that differed in their maximum potential height. We examined one species of shrub, one short-statured tree and one taller tree. We worked with individuals that were approximately the same age and height, which was near the maximum for the shrub species. A number of variables correlated with the maximum potential height of the species. For example, vessel diameter and vulnerability to embolism both increased while wood density declined with maximum potential height. The difference between the pressure causing 50% reduction in hydraulic conductance in the leaves and the midday leaf water potential (the leaf's hydraulic safety margin) was much larger in the shrub than the other two species. In general, trends were consistent with understory shrubs having a more conservative life history strategy than co-occurring taller species. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Parthasarathy, Narayanan; DeShazer, David; England, Marilyn; Waag, David M
2006-11-01
A polysaccharide microarray platform was prepared by immobilizing Burkholderia pseudomallei and Burkholderia mallei polysaccharides. This polysaccharide array was tested successfully for detecting B. pseudomallei and B. mallei antibodies in human and animal serum. The advantages of this microarray technology over the current serodiagnosis of these bacterial infections are discussed.
EDRN Biomarker Reference Lab: Pacific Northwest National Laboratory — EDRN Public Portal
The purpose of this project is to develop antibody microarrays incorporating three major improvements over previous antibody microarray platforms, and to produce and disseminate these antibody microarray technologies for the Early Detection Research Network (EDRN) and the research community focusing on early detection and risk assessment of cancer.
DEVELOPMENT AND VALIDATION OF A 2,000 GENE MICROARRAY FOR THE FATHEAD MINNOW, PIMEPHALES PROMELAS
The development of the gene microarray has provided the field of ecotoxicology a new tool to identify modes of action (MOA) of chemicals and chemical mixtures. Herein we describe the development and application of a 2,000 gene oligonucleotide microarray for the fathead minnow (P...
Fabrication of Carbohydrate Microarrays by Boronate Formation.
Adak, Avijit K; Lin, Ting-Wei; Li, Ben-Yuan; Lin, Chun-Cheng
2017-01-01
The interactions between soluble carbohydrates and/or surface-displayed glycans and protein receptors are essential to many biological processes and cellular recognition events. Carbohydrate microarrays provide opportunities for high-throughput quantitative analysis of carbohydrate-protein interactions. Over the past decade, various techniques have been implemented for immobilizing glycans on solid surfaces in a microarray format. Herein, we describe a detailed protocol for fabricating carbohydrate microarrays that capitalizes on the intrinsic reactivity of boronic acid toward carbohydrates to form stable boronate diesters. A large variety of unprotected carbohydrates, ranging in structure from simple disaccharides and trisaccharides to considerably more complex human milk and blood group (oligo)saccharides, have been covalently immobilized in a single step on glass slides derivatized with high-affinity boronic acid ligands. The immobilized glycans in these microarrays retain their receptor-binding activities, including binding by lectins and antibodies according to the structures of their pendant carbohydrates, allowing rapid analysis of a number of carbohydrate-recognition events within 30 h. This method facilitates the direct construction of otherwise difficult-to-obtain carbohydrate microarrays from underivatized glycans.
Advanced Power Conditioning System
NASA Technical Reports Server (NTRS)
Johnson, N. L.
1971-01-01
The second portion of the advanced power conditioning system development program is reported. Five 100-watt parallel power stages with a majority-vote-logic feedback regulator were breadboarded and tested to the design goals. The input voltage range was 22.1 to 57.4 volts at loads from zero to 500 watts. The maximum input ripple current was 200 mA pk-pk (not including spikes) at 511 watts load; the output voltage was 56V dc with a maximum change of 0.89 volts for all variations of line, load, and temperature; the maximum output ripple was 320 mV pk-pk at 512 watts load (dependent on filter capacitance value); the maximum efficiency was 93.9% at 212 watts and 50V dc input; the minimum efficiency was 87.2% at 80-watt load and 50V dc input; the efficiency was above 90% from 102 watts to 372 watts; the maximum excursion for an 80-watt load change was 2.1 volts with a recovery time of 7 milliseconds; and the unit performed within regulation limits from -20 C to +85 C. During the test sequence, margin tests and failure mode tests were run with no resulting degradation in performance.
A study of longitudinal tumor motion in helical tomotherapy using a cylindrical phantom
Klein, Michael; Gaede, Stewart
2013-01-01
Tumor motion during radiation treatment on a helical tomotherapy unit may create problems due to interplay with motion of the multileaf collimator, gantry rotation, and patient couch translation through the gantry. This study evaluated this interplay effect for typical clinical parameters using a cylindrical phantom consisting of 1386 diode detectors placed on a respiratory motion platform. All combinations of radiation field widths (1, 2.5, and 5 cm) and gantry rotation periods (16, 30, and 60 s) were considered for sinusoidal motions with a period of 4 s and amplitudes of 5, 6, 7, 8, 9, and 10 mm, as well as for a real patient breathing pattern. Gamma comparisons (2% dose difference, 2 mm distance to agreement) and dose profiles were used for evaluation. The required motion margins were determined for each set of parameters. The required margin size increased with decreasing field width and increasing tumor motion amplitude, but was not affected by rotation period. The plans with the smallest field width of 1 cm had required motion margins approximately equal to the amplitude of motion (±25%), while those with the largest field width of 5 cm had required motion margins approximately equal to 20% of the motion amplitude (±20%). For tumor motion amplitudes below 6 mm and field widths above 1 cm, the required additional motion margins were very small, at a maximum of 2.5 mm for sinusoidal breathing patterns and 1.2 mm for the real patient breathing pattern. PACS numbers: 87.55.km, 87.55.Qr, 87.56.Fc
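The gamma comparison used for evaluation combines a dose-difference criterion with a distance-to-agreement criterion; a point passes when the combined metric is at most 1. A simplified 1-D sketch with global normalization (the profiles are illustrative, not the study's measurements):

```python
import math

# 1-D gamma index: for each measured point, minimize the combined
# dose-difference / distance-to-agreement metric over reference points.
# Criteria: 2% dose difference, 2 mm distance-to-agreement (global norm).
def gamma_index(ref, meas, dose_crit=0.02, dist_crit=2.0):
    d_max = max(d for _, d in ref)               # global normalization dose
    gammas = []
    for xm, dm in meas:
        g = min(math.sqrt(((xr - xm) / dist_crit) ** 2 +
                          ((dr - dm) / (dose_crit * d_max)) ** 2)
                for xr, dr in ref)
        gammas.append(g)
    return gammas

ref = [(x * 1.0, 100.0) for x in range(11)]      # flat reference profile
meas = [(x * 1.0, 101.0) for x in range(11)]     # measured 1% high everywhere
passing = sum(1 for g in gamma_index(ref, meas) if g <= 1.0) / len(meas)
print(passing)
```

Here every point has gamma 0.5 (1% dose error against a 2% criterion, zero spatial offset), so the pass rate is 1.0.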
NASA Astrophysics Data System (ADS)
Mojtahid, M.; Toucanne, S.; Fentimen, R.; Barras, C.; Le Houedec, S.; Soulet, G.; Bourillet, J.-F.; Michel, E.
2017-11-01
Using benthic foraminiferal-based proxies in sediments from the Celtic margin, we provide a well-dated record across the last deglaciation of the Channel River dynamics and its potential impact on the hydrology of intermediate water masses along the European margin. Our results describe three main periods: 1) During the Last Glacial Maximum, and before ∼21 ka BP, the predominance of meso-oligotrophic species suggests well oxygenated water masses. After ∼21 ka BP, increasing proportions of eutrophic species related to enhanced riverine supply occur concomitantly with early warming in Greenland air temperatures; 2) A thick laminated deposit, formed during a 1500-year-long period of seasonal melting of the European Ice Sheet (EIS), is associated with the early Heinrich Stadial 1 (HS1) period (∼18.2-16.7 ka BP). The benthic proxies describe low-salinity episodes, cold temperatures, severe dysoxia and eutrophic conditions on the sea floor, perhaps evidence for cascading of turbid meltwaters; 3) During late HS1 (∼16.7-14.7 ka BP), conditions on the Celtic margin's seafloor changed drastically, and faunas indicate oligotrophic conditions as a result of the ceasing of EIS meltwater discharges. While surface waters were cold due to Laurentide Ice Sheet (LIS) iceberg releases, increasing benthic Mg/Ca ratios reveal a progressive warming of intermediate water masses, whereas oxygen proxies indicate overall well oxygenated conditions. In addition to the well-known effect of EIS meltwaters on surface waters in the Celtic margin, our benthic record documents a pronounced impact on intermediate water depths during HS1, which coincided with major AMOC disruptions.
NASA Astrophysics Data System (ADS)
Vesely, Fernando F.; Trzaskos, Barbara; Kipper, Felipe; Assine, Mario Luis; Souza, Paulo A.
2015-08-01
The Paraná Basin is a key locality in the context of the Late Paleozoic Ice Age (LPIA) because of its location east of the Andean proto-margin of Gondwana and west of contiguous interior basins today found in western Africa. In this paper we document the sedimentary record associated with an ice margin that reached the eastern border of the Paraná Basin during the Pennsylvanian, with the aim of interpreting the depositional environments and discussing paleogeographic implications. The examined stratigraphic succession is divided into four stacked facies associations that record an upward transition from subglacial to glaciomarine environments. Deposition took place during deglaciation but was punctuated by minor readvances of the ice margin that deformed the sediment pile. Tillites, well-preserved landforms of subglacial erosion and glaciotectonic deformational structures indicate that the ice flowed to the north and northwest and that the ice margin did not advance far into the basin during the glacial maximum. Consequently, time-equivalent glacial deposits that crop out in other localities of the eastern Paraná Basin are better explained by assuming multiple smaller ice lobes instead of one single large glacier. These ice lobes flowed from an ice cap covering uplifted lands now located in western Namibia, where glacial deposits are younger and occur confined within paleovalleys cut into the Precambrian basement. This conclusion corroborates the idea of a topographically controlled ice-spreading center in southwestern Africa and does not support the view of a large polar ice sheet controlling deposition in the Paraná Basin during the LPIA.
Signal amplification by rolling circle amplification on DNA microarrays
Nallur, Girish; Luo, Chenghua; Fang, Linhua; Cooley, Stephanie; Dave, Varshal; Lambert, Jeremy; Kukanskis, Kari; Kingsmore, Stephen; Lasken, Roger; Schweitzer, Barry
2001-01-01
While microarrays hold considerable promise in large-scale biology on account of their massively parallel analytical nature, there is a need for compatible signal amplification procedures to increase sensitivity without loss of multiplexing. Rolling circle amplification (RCA) is a molecular amplification method with the unique property of product localization. This report describes the application of RCA signal amplification for multiplexed, direct detection and quantitation of nucleic acid targets on planar glass and gel-coated microarrays. As few as 150 molecules bound to the surface of microarrays can be detected using RCA. Because of the linear kinetics of RCA, nucleic acid target molecules may be measured with a dynamic range of four orders of magnitude. Consequently, RCA is a promising technology for the direct measurement of nucleic acids on microarrays without the need for a potentially biasing preamplification step. PMID:11726701
Cell-Based Microarrays for In Vitro Toxicology
NASA Astrophysics Data System (ADS)
Wegener, Joachim
2015-07-01
DNA/RNA and protein microarrays have proven their outstanding bioanalytical performance throughout the past decades, given the unprecedented level of parallelization by which molecular recognition assays can be performed and analyzed. Cell microarrays (CMAs) make use of similar construction principles. They are applied to profile a given cell population with respect to the expression of specific molecular markers and also to measure functional cell responses to drugs and chemicals. This review focuses on the use of cell-based microarrays for assessing the cytotoxicity of drugs, toxins, or chemicals in general. It also summarizes CMA construction principles with respect to the cell types that are used for such microarrays, the readout parameters to assess toxicity, and the various formats that have been established and applied. The review ends with a critical comparison of CMAs and well-established microtiter plate (MTP) approaches.
The use of open source bioinformatics tools to dissect transcriptomic data.
Nitsche, Benjamin M; Ram, Arthur F J; Meyer, Vera
2012-01-01
Microarrays are a valuable technology to study fungal physiology on a transcriptomic level. Various microarray platforms are available, comprising both single- and two-channel arrays. Despite different technologies, preprocessing of microarray data generally includes quality control, background correction, normalization, and summarization of probe-level data. Subsequently, depending on the experimental design, diverse statistical analyses can be performed, including the identification of differentially expressed genes and the construction of gene coexpression networks. We describe how Bioconductor, a collection of open source and open development packages for the statistical programming language R, can be used for dissecting microarray data. We provide fundamental details that facilitate the process of getting started with R and Bioconductor. Using two publicly available microarray datasets from Aspergillus niger, we give detailed protocols on how to identify differentially expressed genes and how to construct gene coexpression networks.
Zhang, Zhaowei; Li, Peiwu; Hu, Xiaofeng; Zhang, Qi; Ding, Xiaoxia; Zhang, Wen
2012-01-01
Chemical contaminants in food have caused serious health issues in both humans and animals. Microarray technology is an advanced technique suitable for the analysis of chemical contaminants. In particular, the immuno-microarray approach is one of the most promising methods for chemical contaminant analysis. The use of microarrays for the analysis of chemical contaminants is the subject of this review. Fabrication strategies and detection methods for chemical contaminants are discussed in detail. Application to the analysis of mycotoxins, biotoxins, pesticide residues, and pharmaceutical residues is also described. Finally, future challenges and opportunities are discussed.
Enhancing Results of Microarray Hybridizations Through Microagitation
Toegl, Andreas; Kirchner, Roland; Gauer, Christoph; Wixforth, Achim
2003-01-01
Protein and DNA microarrays have become a standard tool in proteomics/genomics research. In order to guarantee fast and reproducible hybridization results, the diffusion limit must be overcome. Surface acoustic wave (SAW) micro-agitation chips efficiently agitate the smallest sample volumes (down to 10 μL and below) without introducing any dead volume. The advantages are reduced reaction time, increased signal-to-noise ratio, improved homogeneity across the microarray, and better slide-to-slide reproducibility. The SAW micromixer chips are the heart of the Advalytix ArrayBooster, which is compatible with all microarrays based on the microscope slide format. PMID:13678150
AFM 4.0: a toolbox for DNA microarray analysis
Breitkreutz, Bobby-Joe; Jorgensen, Paul; Breitkreutz, Ashton; Tyers, Mike
2001-01-01
We have developed a series of programs, collectively packaged as Array File Maker 4.0 (AFM), that manipulate and manage DNA microarray data. AFM 4.0 is simple to use, applicable to any organism or microarray, and operates within the familiar confines of Microsoft Excel. Given a database of expression ratios, AFM 4.0 generates input files for clustering, helps prepare colored figures and Venn diagrams, and can uncover aneuploidy in yeast microarray data. AFM 4.0 should be especially useful to laboratories that do not have access to specialized commercial or in-house software. PMID:11532221
Progress in the application of DNA microarrays.
Lobenhofer, E K; Bushel, P R; Afshari, C A; Hamadeh, H K
2001-01-01
Microarray technology has been applied to a variety of different fields to address fundamental research questions. The use of microarrays, or DNA chips, to study the gene expression profiles of biologic samples began in 1995. Since that time, the fundamental concepts behind the chip, the technology required for making and using these chips, and the multitude of statistical tools for analyzing the data have been extensively reviewed. For this reason, the focus of this review will be not on the technology itself but on the application of microarrays as a research tool and the future challenges of the field. PMID:11673116
Issues in the analysis of oligonucleotide tiling microarrays for transcript mapping
NASA Technical Reports Server (NTRS)
Royce, Thomas E.; Rozowsky, Joel S.; Bertone, Paul; Samanta, Manoj; Stolc, Viktor; Weissman, Sherman; Snyder, Michael; Gerstein, Mark
2005-01-01
Traditional microarrays use probes complementary to known genes to quantitate the differential gene expression between two or more conditions. Genomic tiling microarray experiments differ in that probes that span a genomic region at regular intervals are used to detect the presence or absence of transcription. This difference means the same sets of biases and the methods for addressing them are unlikely to be relevant to both types of experiment. We introduce the informatics challenges arising in the analysis of tiling microarray experiments as open problems to the scientific community and present initial approaches for the analysis of this nascent technology.
Denion, Eric; Hitier, Martin; Levieil, Eric; Mouriaux, Frédéric
2015-01-01
While convergent, the human orbit differs from that of non-human apes in that its lateral orbital margin is significantly more rearward. This rearward position does not obstruct the additional visual field gained through eye motion. This additional visual field is therefore considered to be wider in humans than in non-human apes. A mathematical model was designed to quantify this difference. The mathematical model is based on published computed tomography data in the human neuro-ocular plane (NOP) and on additional anatomical data from 100 human skulls and 120 non-human ape skulls (30 gibbons; 30 chimpanzees / bonobos; 30 orangutans; 30 gorillas). It is used to calculate temporal visual field eccentricity values in the NOP first in the primary position of gaze then for any eyeball rotation value in abduction up to 45° and any lateral orbital margin position between 85° and 115° relative to the sagittal plane. By varying the lateral orbital margin position, the human orbit can be made “non-human ape-like”. In the Pan-like orbit, the orbital margin position (98.7°) was closest to the human orbit (107.1°). This modest 8.4° difference resulted in a large 21.1° difference in maximum lateral visual field eccentricity with eyeball abduction (Pan-like: 115°; human: 136.1°). PMID:26190625
Analysis of seaweed marketing in warbal village, Southeast Maluku Regency, Indonesia
NASA Astrophysics Data System (ADS)
Tumiwa, Bruri B.; Renjaan, Meiskyana R.; B. A Somnaikubun, Glen; Betauubun, Kamilius D.; Hungan, Marselus
2017-10-01
Seaweed farming in Warbal Village, West Kei Kecil Subdistrict, Southeast Maluku Regency offers adequate prospects and business opportunities to help farmers improve their welfare. In practice, however, seaweed farming has not yet delivered the better, maximum results the farmers desire. This study aims to evaluate the marketing channels, marketing margins, and profit shares of the marketing agencies involved. The research site, Warbal Village, West Kei Kecil Subdistrict, Southeast Maluku Regency, was determined purposively. The sample comprised 30 farmers taken by simple random sampling, together with 2 wholesaler traders and 2 collector traders selected by the snowball method. Data were collected through direct interviews and questionnaires with farmers and marketing agencies, and through literature and data from institutions related to the research aims. The results show two marketing channels: Channel I: farmers, wholesaler traders, collector traders, PAP; Channel II: farmers, collector traders, PAP. The marketing margin and the profit share of each marketing agency differ between the channels. On Channel I, the margin is IDR 3,250, with profit shares of 71.11% for farmers, 17.76% for wholesaler traders, and 11.09% for collector traders. On Channel II, the margin is IDR 1,250, with profit shares of 88.88% for farmers and 11.09% for collector traders.
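The margin and profit-share figures follow the usual definitions: the marketing margin is the final selling price minus the farm-gate price, and each agency's share is its price increment as a percentage of the final price. A sketch with hypothetical prices (IDR/kg), not the study's survey data:

```python
# Marketing margin = final selling price - farm-gate price.
# Each agent's profit share = its price increment / final price * 100.
# The chain and prices below are hypothetical, not the study's figures.
def margin_and_shares(prices):
    # prices: ordered list of (agent, selling price) along the channel
    final = prices[-1][1]
    margin = final - prices[0][1]
    shares, prev = [], 0
    for agent, p in prices:
        shares.append((agent, (p - prev) / final * 100))
        prev = p
    return margin, shares

chain = [("farmer", 8000), ("collector", 8600), ("wholesaler", 9250)]
margin, shares = margin_and_shares(chain)
print(margin)
for agent, s in shares:
    print(agent, round(s, 2))
```

By construction the shares sum to 100% of the final price, which is why the reported channel shares each total roughly 100%.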
Shrinkage regression-based methods for microarray missing value imputation.
Wang, Hsiuying; Chiu, Chia-Chun; Wu, Yi-Ching; Wu, Wei-Sheng
2013-01-01
Missing values commonly occur in microarray data, which usually contain more than 5% missing values, with up to 90% of genes affected. Inaccurate missing value estimation reduces the power of downstream microarray data analyses. Many types of methods have been developed to estimate missing values. Among them, the regression-based methods are very popular and have been shown to perform better than the other types of methods on many test microarray datasets. To further improve the performance of the regression-based methods, we propose shrinkage regression-based methods. Our methods take advantage of the correlation structure in the microarray data and select similar genes for the target gene by Pearson correlation coefficients. In addition, our methods incorporate the least squares principle, utilize a shrinkage estimation approach to adjust the coefficients of the regression model, and then use the new coefficients to estimate missing values. Simulation results show that the proposed methods provide more accurate missing value estimation on six test microarray datasets than the existing regression-based methods do. Imputation of missing values is a very important aspect of microarray data analyses because most of the downstream analyses require a complete dataset. Therefore, exploring accurate and efficient methods for estimating missing values has become an essential issue. Since our proposed shrinkage regression-based methods provide accurate missing value estimation, they are competitive alternatives to the existing regression-based methods.
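A toy version of the shrinkage idea, reduced to a single most-correlated neighbor gene and a fixed shrinkage factor (the published methods use multiple similar genes and data-driven coefficients; all values below are invented):

```python
from statistics import mean

# Toy shrinkage-regression imputation for one missing microarray value:
# choose the gene most Pearson-correlated with the target, fit a
# least-squares slope, shrink it toward zero, and impute. The shrinkage
# factor lam is illustrative; the published method derives it from data.
def pearson(x, y):
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def impute(target, neighbors, missing_idx, lam=0.8):
    obs = [i for i in range(len(target)) if i != missing_idx]
    y = [target[i] for i in obs]
    best = max(neighbors, key=lambda g: abs(pearson([g[i] for i in obs], y)))
    x = [best[i] for i in obs]
    mx, my = mean(x), mean(y)
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) \
            / sum((a - mx) ** 2 for a in x)        # least-squares slope
    return my + lam * slope * (best[missing_idx] - mx)

target    = [1.0, 2.0, 3.0, None, 5.0]             # value at index 3 missing
neighbors = [[1.1, 2.1, 2.9, 4.0, 5.2],            # highly correlated gene
             [5.0, 1.0, 4.0, 2.0, 3.0]]            # weakly correlated gene
print(round(impute(target, neighbors, missing_idx=3), 3))
```

Shrinking the slope toward zero pulls the imputed value toward the target gene's mean, which damps the influence of noisy neighbor genes.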
Arora, Sheen Juneja; Arora, Aman; Upadhyaya, Viram; Jain, Shilpi
2016-01-01
Background or Statement of Problem: Because the longevity of provisional restorations depends on a precise adaptation and a strong, long-term bond between the restoration and tooth structure, evaluation of the marginal leakage of provisional restorative materials luted with cements using standardized procedures is essential. Aims and Objectives: To compare the marginal leakage of provisional crowns fabricated from autopolymerizing acrylic resin and from bisphenol A-glycidyl dimethacrylate (BIS-GMA) resin; to compare the marginal leakage of these two crown types when cemented with different temporary luting cements; to compare the marginal leakage of autopolymerizing acrylic resin (SC-10) crowns cemented with different temporary luting cements; and to compare the marginal leakage of BIS-GMA resin (Protemp 4) crowns cemented with different temporary luting cements. Methodology: Sixty freshly extracted maxillary premolars of approximately similar dimensions were mounted in dental plaster. Tooth reduction with a shoulder margin was performed using a customized handpiece-holding jig. Following tooth preparation, provisional crowns were fabricated using wax patterns produced on a computer-aided design/computer-aided manufacturing (CAD/CAM) milling machine. Sixty provisional crowns were made, thirty each of SC-10 and Protemp 4, and were then cemented with three different luting cements. Specimens were thermocycled, submerged in a 2% methylene blue solution, then sectioned and observed under a stereomicroscope for the evaluation of marginal microleakage.
A five-level scale was used to score dye penetration at the tooth/cement interface, and the results were analyzed using the Chi-square test, Mann–Whitney U-test, and Kruskal–Wallis H-test; results were considered statistically significant at P < 0.05, and the power of the study was 80%. Results: Marginal leakage was significant in both types of provisional crowns cemented with the three different luting cements along the axial walls of the teeth (P < 0.05; 95% confidence interval). Conclusion: The temporary cements with eugenol showed more microleakage than those without eugenol. SC-10 crowns showed more microleakage than Protemp 4 crowns. SC-10 crowns cemented with Kalzinol showed the maximum microleakage, and Protemp 4 crowns cemented with HY bond showed the least. PMID:27134427
2004-01-01
of RNA From Peripheral Blood Cells: A Validation Study for Molecular Diagnostics by Microarray and Kinetic RT-PCR Assays: Application in Aerospace Medicine
Microarray profiling of chemical-induced effects is being increasingly used in medium and high-throughput formats. In this study, we describe computational methods to identify molecular targets from whole-genome microarray data using as an example the estrogen receptor α (ERα), ...
Chromosomal Microarray versus Karyotyping for Prenatal Diagnosis
Wapner, Ronald J.; Martin, Christa Lese; Levy, Brynn; Ballif, Blake C.; Eng, Christine M.; Zachary, Julia M.; Savage, Melissa; Platt, Lawrence D.; Saltzman, Daniel; Grobman, William A.; Klugman, Susan; Scholl, Thomas; Simpson, Joe Leigh; McCall, Kimberly; Aggarwal, Vimla S.; Bunke, Brian; Nahum, Odelia; Patel, Ankita; Lamb, Allen N.; Thom, Elizabeth A.; Beaudet, Arthur L.; Ledbetter, David H.; Shaffer, Lisa G.; Jackson, Laird
2013-01-01
Background Chromosomal microarray analysis has emerged as a primary diagnostic tool for the evaluation of developmental delay and structural malformations in children. We aimed to evaluate the accuracy, efficacy, and incremental yield of chromosomal microarray analysis as compared with karyotyping for routine prenatal diagnosis. Methods Samples from women undergoing prenatal diagnosis at 29 centers were sent to a central karyotyping laboratory. Each sample was split in two; standard karyotyping was performed on one portion and the other was sent to one of four laboratories for chromosomal microarray. Results We enrolled a total of 4406 women. Indications for prenatal diagnosis were advanced maternal age (46.6%), abnormal result on Down’s syndrome screening (18.8%), structural anomalies on ultrasonography (25.2%), and other indications (9.4%). In 4340 (98.8%) of the fetal samples, microarray analysis was successful; 87.9% of samples could be used without tissue culture. Microarray analysis of the 4282 nonmosaic samples identified all the aneuploidies and unbalanced rearrangements identified on karyotyping but did not identify balanced translocations and fetal triploidy. In samples with a normal karyotype, microarray analysis revealed clinically relevant deletions or duplications in 6.0% of those with a structural anomaly and in 1.7% of those whose indications were advanced maternal age or positive screening results. Conclusions In the context of prenatal diagnostic testing, chromosomal microarray analysis identified additional, clinically significant cytogenetic information as compared with karyotyping and was equally efficacious in identifying aneuploidies and unbalanced rearrangements but did not identify balanced translocations and triploidies. (Funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development and others; ClinicalTrials.gov number, NCT01279733.) PMID:23215555
The MGED Ontology: a resource for semantics-based description of microarray experiments.
Whetzel, Patricia L; Parkinson, Helen; Causton, Helen C; Fan, Liju; Fostel, Jennifer; Fragoso, Gilberto; Game, Laurence; Heiskanen, Mervi; Morrison, Norman; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Taylor, Chris; White, Joseph; Stoeckert, Christian J
2006-04-01
The generation of large amounts of microarray data and the need to share these data bring challenges for both data management and annotation and highlight the need for standards. MIAME specifies the minimum information needed to describe a microarray experiment, and the Microarray Gene Expression Object Model (MAGE-OM) and resulting MAGE-ML provide a mechanism to standardize data representation for data exchange; however, a common terminology for data annotation is needed to support these standards. Here we describe the MGED Ontology (MO) developed by the Ontology Working Group of the Microarray Gene Expression Data (MGED) Society. The MO provides terms for annotating all aspects of a microarray experiment, from the design of the experiment and array layout, through to the preparation of the biological sample and the protocols used to hybridize the RNA and analyze the data. The MO was developed to provide terms for annotating experiments in line with the MIAME guidelines, i.e. to provide the semantics to describe a microarray experiment according to the concepts specified in MIAME. The MO does not attempt to incorporate terms from existing ontologies, e.g. those that deal with anatomical parts or developmental stage terms, but provides a framework to reference terms in other ontologies and therefore facilitates the use of ontologies in microarray data annotation. The MGED Ontology version 1.2.0 is available as a file in both DAML and OWL formats at http://mged.sourceforge.net/ontologies/index.php. Release notes and annotation examples are provided. The MO is also provided via the NCICB's Enterprise Vocabulary System (http://nciterms.nci.nih.gov/NCIBrowser/Dictionary.do). Stoeckrt@pcbi.upenn.edu Supplementary data are available at Bioinformatics online.
Pine, P S; Boedigheimer, M; Rosenzweig, B A; Turpaz, Y; He, Y D; Delenstarr, G; Ganter, B; Jarnagin, K; Jones, W D; Reid, L H; Thompson, K L
2008-11-01
Effective use of microarray technology in clinical and regulatory settings is contingent on the adoption of standard methods for assessing performance. The MicroArray Quality Control project evaluated the repeatability and comparability of microarray data on the major commercial platforms and laid the groundwork for the application of microarray technology to regulatory assessments. However, methods for assessing performance that are commonly applied to diagnostic assays used in laboratory medicine remain to be developed for microarray assays. A reference system for microarray performance evaluation and process improvement was developed that includes reference samples, metrics and reference datasets. The reference material is composed of two mixes of four different rat tissue RNAs that allow defined target ratios to be assayed using a set of tissue-selective analytes that are distributed along the dynamic range of measurement. The diagnostic accuracy of detected changes in expression ratios, measured as the area under the curve from receiver operating characteristic plots, provides a single commutable value for comparing assay specificity and sensitivity. The utility of this system for assessing overall performance was evaluated for relevant applications like multi-laboratory proficiency testing programs and single-laboratory process drift monitoring. The diagnostic accuracy of detection of a 1.5-fold change in signal level was found to be a sensitive metric for comparing overall performance. This test approaches the technical limit for reliable discrimination of differences between two samples using this technology. We describe a reference system that provides a mechanism for internal and external assessment of laboratory proficiency with microarray technology and is translatable to performance assessments on other whole-genome expression arrays used for basic and clinical research.
A meta-data based method for DNA microarray imputation.
Jörnsten, Rebecka; Ouyang, Ming; Wang, Hui-Yu
2007-03-29
DNA microarray experiments are conducted in logical sets, such as time course profiling after a treatment is applied to the samples, or comparisons of the samples under two or more conditions. Due to cost and design constraints of spotted cDNA microarray experiments, each logical set commonly includes only a small number of replicates per condition. Despite the vast improvement of the microarray technology in recent years, missing values are prevalent. Intuitively, imputation of missing values is best done using many replicates within the same logical set. In practice, there are few replicates and thus reliable imputation within logical sets is difficult. However, it is in the case of few replicates that the presence of missing values, and how they are imputed, can have the most profound impact on the outcome of downstream analyses (e.g. significance analysis and clustering). This study explores the feasibility of imputation across logical sets, using the vast amount of publicly available microarray data to improve imputation reliability in the small sample size setting. We download all cDNA microarray data of Saccharomyces cerevisiae, Arabidopsis thaliana, and Caenorhabditis elegans from the Stanford Microarray Database. Through cross-validation and simulation, we find that, for all three species, our proposed imputation using data from public databases is far superior to imputation within a logical set, sometimes to an astonishing degree. Furthermore, the imputation root mean square error for significant genes is generally much lower than that of non-significant genes. Since downstream analysis of significant genes, such as clustering and network analysis, can be very sensitive to small perturbations of estimated gene effects, it is highly recommended that researchers apply reliable data imputation prior to further analysis. Our method can also be applied to cDNA microarray experiments from other species, provided good reference data are available.
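The core idea of borrowing information from reference data can be illustrated with a generic KNN-style imputation sketch (the paper's actual estimator may differ; the profiles below are invented):

```python
import math

def knn_impute(target, reference, k=2):
    """Fill None entries in `target` (one gene's profile) using the k
    reference profiles closest on the observed positions.
    A generic KNN-style scheme, not the authors' exact method."""
    obs = [i for i, v in enumerate(target) if v is not None]

    def dist(ref):
        # Euclidean distance restricted to positions observed in target
        return math.sqrt(sum((target[i] - ref[i]) ** 2 for i in obs))

    nearest = sorted(reference, key=dist)[:k]
    filled = list(target)
    for i, v in enumerate(filled):
        if v is None:
            filled[i] = sum(r[i] for r in nearest) / k  # neighbor average
    return filled

profile   = [1.0, None, 3.0]
reference = [[1.0, 2.0, 3.0], [1.1, 2.1, 3.1], [9.0, 9.0, 9.0]]
print(knn_impute(profile, reference))  # -> [1.0, 2.05, 3.0]
```

With a large public reference compendium, the neighbor pool is far richer than the two or three replicates available within a single logical set, which is the intuition behind imputing across sets.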
Goodman, Corey W; Major, Heather J; Walls, William D; Sheffield, Val C; Casavant, Thomas L; Darbro, Benjamin W
2015-04-01
Chromosomal microarrays (CMAs) are routinely used in both research and clinical laboratories; yet, little attention has been given to the estimation of genome-wide true and false negatives during the assessment of these assays and how such information could be used to calibrate various algorithmic metrics to improve performance. Low-throughput, locus-specific methods such as fluorescence in situ hybridization (FISH), quantitative PCR (qPCR), or multiplex ligation-dependent probe amplification (MLPA) preclude rigorous calibration of various metrics used by copy number variant (CNV) detection algorithms. To aid this task, we have established a comparative methodology, CNV-ROC, which is capable of performing a high-throughput, low-cost analysis of CMAs that takes into consideration genome-wide true and false negatives. CNV-ROC uses a higher resolution microarray to confirm calls from a lower resolution microarray and provides a true measure of genome-wide performance metrics at the resolution offered by microarray testing. CNV-ROC also provides a very precise comparison of CNV calls between two microarray platforms without the need to establish an arbitrary degree of overlap. Comparison of CNVs across microarrays is done on a per-probe basis and receiver operating characteristic (ROC) analysis is used to calibrate algorithmic metrics, such as log2 ratio threshold, to enhance CNV calling performance. CNV-ROC addresses a critical and consistently overlooked aspect of analytical assessments of genome-wide techniques like CMAs: the measurement and use of genome-wide true and false negative data for the calculation of performance metrics and comparison of CNV profiles between different microarray experiments. Copyright © 2015 Elsevier Inc. All rights reserved.
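ROC-based calibration of a log2 ratio threshold against per-probe truth calls can be sketched as follows (a generic illustration of the idea, not the CNV-ROC implementation; the data below are invented):

```python
def best_threshold(log2_ratios, truth, thresholds):
    """Pick the |log2 ratio| cutoff maximizing Youden's J (TPR - FPR),
    scoring each probe against the higher-resolution 'truth' call."""
    best, best_j = None, -1.0
    p = sum(truth)            # probes truly altered per high-res array
    n = len(truth) - p        # probes truly unaltered
    for t in thresholds:
        tp = sum(1 for r, y in zip(log2_ratios, truth) if abs(r) >= t and y)
        fp = sum(1 for r, y in zip(log2_ratios, truth) if abs(r) >= t and not y)
        j = tp / p - fp / n
        if j > best_j:
            best, best_j = t, j
    return best

ratios = [0.8, 0.9, -0.7, 0.1, -0.05, 0.2]        # per-probe log2 ratios
truth  = [True, True, True, False, False, False]  # high-res array calls
print(best_threshold(ratios, truth, [0.1, 0.3, 0.5]))  # -> 0.3
```

Because scoring is per-probe, no arbitrary reciprocal-overlap fraction between CNV intervals is needed, which is the comparison advantage the abstract highlights.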
Construction of a cDNA microarray derived from the ascidian Ciona intestinalis.
Azumi, Kaoru; Takahashi, Hiroki; Miki, Yasufumi; Fujie, Manabu; Usami, Takeshi; Ishikawa, Hisayoshi; Kitayama, Atsusi; Satou, Yutaka; Ueno, Naoto; Satoh, Nori
2003-10-01
A cDNA microarray was constructed from a basal chordate, the ascidian Ciona intestinalis. The draft genome of Ciona has been read and inferred to contain approximately 16,000 protein-coding genes, and cDNAs for transcripts of 13,464 genes have been characterized and compiled as the "Ciona intestinalis Gene Collection Release I". In the present study, we constructed a cDNA microarray of these 13,464 Ciona genes. A preliminary experiment with Cy3- and Cy5-labeled probes showed extensive differential gene expression between fertilized eggs and larvae. In addition, there was a good correlation between results obtained by the present microarray analysis and those from previous EST analyses. This first microarray of a large collection of Ciona intestinalis cDNA clones should facilitate the analysis of global gene expression and gene networks during the embryogenesis of basal chordates.
Improvement in the amine glass platform by bubbling method for a DNA microarray
Jee, Seung Hyun; Kim, Jong Won; Lee, Ji Hyeong; Yoon, Young Soo
2015-01-01
A glass platform with high sensitivity for a sexually transmitted diseases microarray is described here. An amino-silane-based self-assembled monolayer was coated on the surface of a glass platform using a novel bubbling method. The optimized surface of the glass platform had highly uniform surface modifications using this method, as well as improved hybridization properties with capture probes in the DNA microarray. On the basis of these results, the improved glass platform serves as a highly reliable and optimal material for the DNA microarray. Moreover, in this study, we demonstrated that our glass platform, manufactured by utilizing the bubbling method, had higher uniformity, shorter processing time, lower background signal, and higher spot signal than the platforms manufactured by the general dipping method. The DNA microarray manufactured with a glass platform prepared using the bubbling method can be used as a clinical diagnostic tool. PMID:26468293
Prediction of regulatory gene pairs using dynamic time warping and gene ontology.
Yang, Andy C; Hsu, Hui-Huang; Lu, Ming-Da; Tseng, Vincent S; Shih, Timothy K
2014-01-01
Selecting informative genes is the most important task for data analysis on microarray gene expression data. In this work, we aim at identifying regulatory gene pairs from microarray gene expression data. However, microarray data often contain multiple missing expression values. Missing value imputation is thus needed before further processing for regulatory gene pairs becomes possible. We develop a novel approach to first impute missing values in microarray time series data by combining k-Nearest Neighbour (KNN), Dynamic Time Warping (DTW) and Gene Ontology (GO). After missing values are imputed, we then perform gene regulation prediction based on our proposed DTW-GO distance measurement of gene pairs. Experimental results show that our approach is more accurate when compared with existing missing value imputation methods on real microarray data sets. Furthermore, our approach can also discover more regulatory gene pairs that are known in the literature than other methods.
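The Dynamic Time Warping distance at the heart of the DTW-GO measure can be computed with the standard dynamic-programming recurrence. A minimal sketch (the example series are invented; the published method additionally weights distances with Gene Ontology similarity):

```python
def dtw(a, b):
    """Dynamic Time Warping distance between two expression time series;
    warping absorbs phase shifts that Euclidean distance would penalize."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three allowed alignment moves
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# y is a time-warped copy of x (one point duplicated): DTW distance is 0.
x = [0.0, 1.0, 2.0, 1.0, 0.0]
y = [0.0, 1.0, 1.0, 2.0, 1.0, 0.0]
print(dtw(x, y))  # -> 0.0
```

A regulator's profile often leads its target's by a lag, so a warping-tolerant distance is a natural fit for scoring candidate regulatory gene pairs.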
Draghici, Sorin; Tarca, Adi L; Yu, Longfei; Ethier, Stephen; Romero, Roberto
2008-03-01
The BioArray Software Environment (BASE) is a very popular MIAME-compliant, web-based microarray data repository. However, in BASE, as in most other microarray data repositories, experiment annotation and raw data uploading can be very time-consuming, especially for large microarray experiments. We developed KUTE (Karmanos Universal daTabase for microarray Experiments) as a plug-in for BASE 2.0 that addresses these issues. KUTE provides an automatic experiment annotation feature and a completely redesigned data work-flow that dramatically reduce the human-computer interaction time. For instance, in BASE 2.0 a typical Affymetrix experiment involving 100 arrays required 4 h 30 min of user interaction time for experiment annotation, and 45 min for data upload/download. In contrast, for the same experiment, KUTE required only 28 min of user interaction time for experiment annotation, and 3.3 min for data upload/download. http://vortex.cs.wayne.edu/kute/index.html.
Temperature Gradient Effect on Gas Discrimination Power of a Metal-Oxide Thin-Film Sensor Microarray
Sysoev, Victor V.; Kiselev, Ilya; Frietsch, Markus; Goschnick, Joachim
2004-01-01
The paper presents results concerning the effect of a spatially inhomogeneous operating temperature on the gas discrimination power of a gas-sensor microarray, with the latter based on a thin SnO2 film employed in the KAMINA electronic nose. Three different temperature distributions over the substrate are discussed: a nearly homogeneous one and two temperature gradients, equal to approx. 3.3 °C/mm and 6.7 °C/mm, applied across the sensor elements (segments) of the array. The gas discrimination power of the microarray is judged by using the Mahalanobis distance in the LDA (Linear Discriminant Analysis) coordinate system between the data clusters obtained by the response of the microarray to four target vapors: ethanol, acetone, propanol and ammonia. It is shown that the application of a temperature gradient increases the gas discrimination power of the microarray by up to 35%.
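The Mahalanobis distance between two response clusters in a 2-D discriminant space can be sketched as below (an illustrative implementation with invented numbers, not the KAMINA analysis pipeline):

```python
def mahalanobis_2d(mu_a, mu_b, cov):
    """Mahalanobis distance between two cluster centroids in a 2-D
    (e.g. LDA) coordinate system, given a shared 2x2 covariance matrix."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]  # 2x2 inverse
    dx = [mu_a[0] - mu_b[0], mu_a[1] - mu_b[1]]
    q = (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
         + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))
    return q ** 0.5

# With identity covariance the measure reduces to Euclidean distance.
print(mahalanobis_2d([3.0, 0.0], [0.0, 4.0], [[1.0, 0.0], [0.0, 1.0]]))  # -> 5.0
```

Larger inter-cluster Mahalanobis distances mean the vapor classes are easier to separate, which is why the metric serves as the paper's figure of merit for discrimination power.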
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gentry, T.; Schadt, C.; Zhou, J.
Microarray technology has the unparalleled potential to simultaneously determine the dynamics and/or activities of most, if not all, of the microbial populations in complex environments such as soils and sediments. Researchers have developed several types of arrays that characterize the microbial populations in these samples based on their phylogenetic relatedness or functional genomic content. Several recent studies have used these microarrays to investigate ecological issues; however, most have only analyzed a limited number of samples, with relatively few experiments utilizing the full high-throughput potential of microarray analysis. This is due in part to the unique analytical challenges that these samples present with regard to sensitivity, specificity, quantitation, and data analysis. This review discusses specific applications of microarrays to microbial ecology research along with some of the latest studies addressing the difficulties encountered during analysis of complex microbial communities within environmental samples. With continued development, microarray technology may ultimately achieve its potential for comprehensive, high-throughput characterization of microbial populations in near real-time.
Chondrocyte channel transcriptomics
Lewis, Rebecca; May, Hannah; Mobasheri, Ali; Barrett-Jolley, Richard
2013-01-01
To date, a range of ion channels have been identified in chondrocytes using a number of different techniques, predominantly electrophysiological and/or biomolecular; each of these has its advantages and disadvantages. Here we aim to compare and contrast the data available from biophysical and microarray experiments. This letter analyses recent transcriptomics datasets from chondrocytes, accessible from the European Bioinformatics Institute (EBI). We discuss whether such bioinformatic analysis of microarray datasets can potentially accelerate identification and discovery of ion channels in chondrocytes. The ion channels which appear most frequently across these microarray datasets are discussed, along with their possible functions. We discuss whether functional or protein data exist which support the microarray data. A microarray experiment comparing gene expression in osteoarthritis and healthy cartilage is also discussed and we verify the differential expression of 2 of these genes, namely the genes encoding large calcium-activated potassium (BK) and aquaporin channels. PMID:23995703
Antipodal hotspot pairs on the earth
NASA Technical Reports Server (NTRS)
Rampino, Michael R.; Caldeira, Ken
1992-01-01
The results of statistical analyses performed on three published hotspot distributions suggest that significantly more hotspots occur as nearly antipodal pairs than is anticipated from a random distribution, or from their association with geoid highs and divergent plate margins. The observed number of antipodal hotspot pairs depends on the maximum allowable deviation from exact antipodality. At a maximum deviation of not greater than 700 km, 26 to 37 percent of hotspots form antipodal pairs in the published lists examined here, significantly more than would be expected from the general hotspot distribution. Two possible mechanisms that might create such a distribution include: (1) symmetry in the generation of mantle plumes, and (2) melting related to antipodal focusing of seismic energy from large-body impacts.
Lezon, Timothy R; Banavar, Jayanth R; Cieplak, Marek; Maritan, Amos; Fedoroff, Nina V
2006-12-12
We describe a method based on the principle of entropy maximization to identify the gene interaction network with the highest probability of giving rise to experimentally observed transcript profiles. In its simplest form, the method yields the pairwise gene interaction network, but it can also be extended to deduce higher-order interactions. Analysis of microarray data from genes in Saccharomyces cerevisiae chemostat cultures exhibiting energy metabolic oscillations identifies a gene interaction network that reflects the intracellular communication pathways that adjust cellular metabolic activity and cell division to the limiting nutrient conditions that trigger metabolic oscillations. The success of the present approach in extracting meaningful genetic connections suggests that the maximum entropy principle is a useful concept for understanding living systems, as it is for other complex, nonequilibrium systems.
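For Gaussian-distributed transcript profiles, the maximum-entropy pairwise model is determined by the inverse covariance (precision) matrix, whose off-diagonal entries flag direct interactions. The sketch below illustrates that standard result for two genes with invented data; the paper's estimator and sign conventions may differ:

```python
def pairwise_interactions(profiles):
    """Return J = C^{-1} for two transcript profiles, where C is the
    sample covariance matrix; a large |J[0][1]| suggests a direct
    pairwise interaction under the maximum-entropy (Gaussian) model."""
    n = len(profiles[0])
    means = [sum(g) / n for g in profiles]

    def cov(i, j):
        return sum((profiles[i][k] - means[i]) * (profiles[j][k] - means[j])
                   for k in range(n)) / (n - 1)

    a, b, d = cov(0, 0), cov(0, 1), cov(1, 1)
    det = a * d - b * b
    return [[d / det, -b / det], [-b / det, a / det]]  # 2x2 inverse

g1 = [1.0, 2.0, 3.0, 4.0]
g2 = [1.1, 2.0, 2.9, 4.2]   # tracks g1 closely -> strong coupling
J = pairwise_interactions([g1, g2])
print(abs(J[0][1]) > 10.0)  # -> True
```

Higher-order interactions, as the abstract notes, require going beyond this pairwise (Gaussian) form, but the pairwise network is the natural first deliverable of the entropy-maximization principle.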
NASA Astrophysics Data System (ADS)
Nelson, C. H.; Gutiérrez Pastor, J.; Goldfinger, C.; Escutia, C.
2012-11-01
We summarize the importance of great earthquakes (Mw ≳ 8) for hazards, stratigraphy of basin floors, and turbidite lithology along the active tectonic continental margins of the Cascadia subduction zone and the northern San Andreas Transform Fault by utilizing studies of swath bathymetry, visual core descriptions, grain size analysis, X-ray radiographs and physical properties. Recurrence times of Holocene turbidites as proxies for earthquakes on the Cascadia and northern California margins are analyzed using two methods: (1) radiometric dating (14C method), and (2) relative dating, using hemipelagic sediment thickness and sedimentation rates (H method). The H method provides (1) the best estimate of minimum recurrence times, which are the most important for seismic hazards risk analysis, and (2) the most complete dataset of recurrence times, which shows a normal distribution pattern for paleoseismic turbidite frequencies. We observe that, on these tectonically active continental margins, during the sea-level highstand of Holocene time, triggering of turbidity currents is controlled dominantly by earthquakes, and paleoseismic turbidites have an average recurrence time of ~550 yr in northern Cascadia Basin and ~200 yr along the northern California margin. The minimum recurrence times for great earthquakes are approximately 300 yr for the Cascadia subduction zone and 130 yr for the northern San Andreas Fault, which indicates both fault systems are in (Cascadia) or very close to (San Andreas) the early window for another great earthquake. On active tectonic margins with great earthquakes, the volumes of mass transport deposits (MTDs) are limited on basin floors along the margins. The maximum run-out distances of MTD sheets across abyssal-basin floors along active margins are an order of magnitude less (~100 km) than on passive margins (~1000 km).
The great earthquakes along the Cascadia and northern California margins cause seismic strengthening of the sediment, which results in a margin stratigraphy of minor MTDs compared to the turbidite-system deposits. In contrast, the MTDs and turbidites are equally intermixed on basin floors along passive margins with a mud-rich continental slope, such as the northern Gulf of Mexico. Great earthquakes also result in characteristic seismo-turbidite lithology. Along the Cascadia margin, the number and character of multiple coarse pulses for correlative individual turbidites generally remain constant both upstream and downstream in different channel systems for 600 km along the margin. This suggests that the earthquake shaking or aftershock signature is normally preserved for the stronger (Mw ≥ 9) Cascadia earthquakes. In contrast, the generally weaker (Mw ≤ 8) California earthquakes result in upstream simple fining-up turbidites in single tributary canyons and channels; however, downstream mainly stacked turbidites result from synchronously triggered multiple turbidity currents that deposit in channels below confluences of the tributaries. Consequently, both downstream channel confluences and the strongest (Mw ≥ 9) great earthquakes contribute to multi-pulsed and stacked turbidites that are typical for seismo-turbidites generated by a single great earthquake. Earthquake triggering and multi-pulsed or stacked turbidites also become an alternative explanation for amalgamated turbidite beds in active tectonic margins, in addition to other classic explanations. The sedimentologic characteristics of turbidites triggered by great earthquakes along the Cascadia and northern California margins provide criteria to help distinguish seismo-turbidites in other active tectonic margins.
The dynamics of continental breakup-related magmatism on the Norwegian volcanic margin
NASA Astrophysics Data System (ADS)
Breivik, A. J.; Faleide, J. I.; Mjelde, R.
2007-12-01
The Vøring margin off mid-Norway was initiated during the earliest Eocene (~54 Ma), and large volumes of magmatic rocks were emplaced during and after continental breakup. In 2003, an ocean bottom seismometer survey was acquired on the Norwegian margin to constrain continental breakup and early seafloor spreading processes. The profile P-wave model described here crosses the northern part of the Vøring Plateau. Maximum igneous crustal thickness was found to be 18 km, decreasing to ~6.5 km over ~6 M.y. after continental breakup. Both the volume and the duration of excess magmatism after breakup are about twice what is observed off the Møre Margin south of the Jan Mayen Fracture Zone, which offsets the margin segments by ~170 km. A similar reduction in magmatism occurs to the north over an along-margin distance of ~100 km to the Lofoten margin, but without a margin offset. There is a strong correlation between magma productivity and early plate spreading rate, which are highest just after breakup, falling with time. This is seen both at the Møre and the Vøring margin segments, suggesting a common cause. A model for the breakup-related magmatism should be able to explain (1) this correlation, (2) the magma production peak at breakup, and (3) the magmatic segmentation. Proposed end-member hypotheses are elevated upper-mantle temperatures caused by a hot mantle plume, or edge-driven small-scale convection fluxing mantle rocks through the melt zone. Both the average P-wave velocity and the major-element data at the Vøring margin indicate a low degree of melting consistent with convection. However, small scale convection does not easily explain the issues listed above. An elaboration of the mantle plume model by N.
Sleep, in which buoyant plume material fills the rift-topography at the base of the lithosphere, can explain these: When the continents break apart, the buoyant plume-material flows up into the rift zone, causing excess magmatism by both elevated temperature and excess flux, and magmatism dies off as this rift-restricted material is spent. The buoyancy of the plume-material also elevates the plate boundaries and enhances plate spreading forces initially. The rapid drop in magma productivity to the north correlates with the northern boundary of the wide and deep Cretaceous Vøring Basin, thus less plume material was accommodated off Lofoten. This model predicts that the magma segmentation will show little variation in the geochemical signature.
Ruettger, Anke; Nieter, Johanna; Skrypnyk, Artem; Engelmann, Ines; Ziegler, Albrecht; Moser, Irmgard; Monecke, Stefan; Ehricht, Ralf; Sachse, Konrad
2012-01-01
Membrane-based spoligotyping has been converted to DNA microarray format to qualify it for high-throughput testing. We have shown the assay's validity and suitability for direct typing from tissue and detecting new spoligotypes. Advantages of the microarray methodology include rapidity, ease of operation, automatic data processing, and affordability. PMID:22553239
Applications of microarray technology in breast cancer research
Cooper, Colin S
2001-01-01
Microarrays provide a versatile platform for utilizing information from the Human Genome Project to benefit human health. This article reviews the ways in which microarray technology may be used in breast cancer research. Its diverse applications include monitoring chromosome gains and losses, tumour classification, drug discovery and development, DNA resequencing, mutation detection and investigating the mechanism of tumour development. PMID:11305951
ERIC Educational Resources Information Center
Chang, Ming-Mei; Briggs, George M.
2007-01-01
DNA microarrays are microscopic arrays on a solid surface, typically a glass slide, on which DNA oligonucleotides are deposited or synthesized in a high-density matrix with a predetermined spatial order. Several types of DNA microarrays have been developed and used for various biological studies. Here, we developed an undergraduate laboratory…
Deciphering the glycosaminoglycan code with the help of microarrays.
de Paz, Jose L; Seeberger, Peter H
2008-07-01
Carbohydrate microarrays have become a powerful tool to elucidate the biological role of complex sugars. Microarrays are particularly useful for the study of glycosaminoglycans (GAGs), a key class of carbohydrates. The high-throughput chip format enables rapid screening of large numbers of potential GAG sequences produced via a complex biosynthesis while consuming very little sample. Here, we briefly highlight the most recent advances involving GAG microarrays built with synthetic or naturally derived oligosaccharides. These chips are powerful tools for characterizing GAG-protein interactions and determining structure-activity relationships for specific sequences. Thereby, they contribute to decoding the information contained in specific GAG sequences.
Walt, David R
2010-01-01
This tutorial review describes how fibre optic microarrays can be used to create a variety of sensing and measurement systems. This review covers the basics of optical fibres and arrays, the different microarray architectures, and describes a multitude of applications. Such arrays enable multiplexed sensing for a variety of analytes including nucleic acids, vapours, and biomolecules. Polymer-coated fibre arrays can be used for measuring microscopic chemical phenomena, such as corrosion and localized release of biochemicals from cells. In addition, these microarrays can serve as a substrate for fundamental studies of single molecules and single cells. The review covers topics of interest to chemists, biologists, materials scientists, and engineers.
A database for the analysis of immunity genes in Drosophila: PADMA database.
Lee, Mark J; Mondal, Ariful; Small, Chiyedza; Paddibhatla, Indira; Kawaguchi, Akira; Govind, Shubha
2011-01-01
While microarray experiments generate voluminous data, discerning trends that support an existing or alternative paradigm is challenging. To synergize hypothesis building and testing, we designed the Pathogen Associated Drosophila MicroArray (PADMA) database for easy retrieval and comparison of microarray results from immunity-related experiments (www.padmadatabase.org). PADMA also allows biologists to upload their own microarray results and compare them with datasets housed within PADMA. We tested PADMA using a preliminary dataset from Ganaspis xanthopoda-infected fly larvae, and uncovered unexpected trends in gene expression, reshaping our hypothesis. Thus, the PADMA database will be a useful resource for fly researchers to evaluate, revise, and refine hypotheses.
Schönmann, Susan; Loy, Alexander; Wimmersberger, Céline; Sobek, Jens; Aquino, Catharine; Vandamme, Peter; Frey, Beat; Rehrauer, Hubert; Eberl, Leo
2009-04-01
For cultivation-independent and highly parallel analysis of members of the genus Burkholderia, an oligonucleotide microarray (phylochip) consisting of 131 hierarchically nested 16S rRNA gene-targeted oligonucleotide probes was developed. A novel primer pair was designed for selective amplification of a 1.3 kb 16S rRNA gene fragment of Burkholderia species prior to microarray analysis. The diagnostic performance of the microarray for identification and differentiation of Burkholderia species was tested with 44 reference strains of the genera Burkholderia, Pandoraea, Ralstonia and Limnobacter. Hybridization patterns based on presence/absence of probe signals were interpreted semi-automatically using the novel likelihood-based strategy of the web-tool PhyloDetect. Eighty-eight per cent of the reference strains were correctly identified at the species level. The evaluated microarray was applied to investigate shifts in the Burkholderia community structure in acidic forest soil upon addition of cadmium, a condition that selected for Burkholderia species. The microarray results were in agreement with those obtained from phylogenetic analysis of Burkholderia 16S rRNA gene sequences recovered from the same cadmium-contaminated soil, demonstrating the value of the Burkholderia phylochip for determinative and environmental studies.
Cheng, Ningtao; Wu, Leihong; Cheng, Yiyu
2013-01-01
The promise of microarray technology in providing prediction classifiers for cancer outcome estimation has been confirmed by a number of demonstrable successes. However, the reliability of prediction results relies heavily on the accuracy of statistical parameters involved in classifiers. These parameters cannot be reliably estimated with only a small number of training samples. Therefore, it is of vital importance to determine the minimum number of training samples and to ensure the clinical value of microarrays in cancer outcome prediction. We evaluated the impact of training sample size on model performance extensively based on 3 large-scale cancer microarray datasets provided by the second phase of the MicroArray Quality Control project (MAQC-II). An SSNR-based (scale of signal-to-noise ratio) protocol was proposed in this study for minimum training sample size determination. External validation results based on another 3 cancer datasets confirmed that the SSNR-based approach could not only determine the minimum number of training samples efficiently, but also provide a valuable strategy for estimating the underlying performance of classifiers in advance. Once translated into clinical routine applications, the SSNR-based protocol would greatly facilitate microarray-based cancer outcome prediction by improving classifier reliability. PMID:23861920
Ogunnaike, Babatunde A; Gelmi, Claudio A; Edwards, Jeremy S
2010-05-21
Gene expression studies generate large quantities of data with the defining characteristic that the number of genes (whose expression profiles are to be determined) exceed the number of available replicates by several orders of magnitude. Standard spot-by-spot analysis still seeks to extract useful information for each gene on the basis of the number of available replicates, and thus plays to the weakness of microarrays. On the other hand, because of the data volume, treating the entire data set as an ensemble, and developing theoretical distributions for these ensembles provides a framework that plays instead to the strength of microarrays. We present theoretical results that under reasonable assumptions, the distribution of microarray intensities follows the Gamma model, with the biological interpretations of the model parameters emerging naturally. We subsequently establish that for each microarray data set, the fractional intensities can be represented as a mixture of Beta densities, and develop a procedure for using these results to draw statistical inference regarding differential gene expression. We illustrate the results with experimental data from gene expression studies on Deinococcus radiodurans following DNA damage using cDNA microarrays. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
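A simple way to see the ensemble idea in practice is to fit the Gamma model to a pool of spot intensities by the method of moments (a stand-in for the authors' estimation procedure; the data below are invented):

```python
def fit_gamma_moments(intensities):
    """Method-of-moments fit of a Gamma(k, theta) model to spot
    intensities: shape k = mean^2 / var, scale theta = var / mean."""
    n = len(intensities)
    mean = sum(intensities) / n
    var = sum((x - mean) ** 2 for x in intensities) / (n - 1)
    return mean * mean / var, var / mean

data = [5.0, 7.0, 6.0, 9.0, 8.0, 7.0]
k, theta = fit_gamma_moments(data)
print(k, theta)  # fitted mean k*theta recovers the sample mean (7.0, up to rounding)
```

Treating all spots on an array as draws from one such ensemble distribution leverages the data volume, rather than fighting the tiny per-gene replicate count, which is the framework the abstract advocates.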
Development and characterization of a disposable plastic microarray printhead.
Griessner, Matthias; Hartig, Dave; Christmann, Alexander; Pohl, Carsten; Schellhase, Michaela; Ehrentreich-Förster, Eva
2011-06-01
During the last decade microarrays have become a powerful analytical tool. Commonly, microarrays are produced in a non-contact manner using silicone printheads. However, silicone printheads are expensive and cannot be used as disposables. Here, we show the development and functional characterization of 8-channel plastic microarray printheads that overcome both disadvantages of their conventional silicone counterparts. A combination of injection-molding and laser processing allows us to produce a high quantity of cheap, customizable and disposable microarray printheads. The use of plastics (e.g., polystyrene) minimizes the need for surface modifications required previously for proper printing results. Time-consuming regeneration processes, cleaning procedures and contaminations caused by residual samples are avoided. The utilization of plastic printheads for viscous liquids, such as cell suspensions or whole blood, is possible. Furthermore, functional parts within the plastic printhead (e.g., particle filters) can be included. Our printhead is compatible with commercially available TopSpot devices but provides additional economic and technical benefits as compared to conventional TopSpot printheads, while fulfilling all requirements demanded of the latter. All in all, this work describes how the field of traditional microarray spotting can be extended significantly by low-cost plastic printheads.
Optimal Control of Shock Wave Turbulent Boundary Layer Interactions Using Micro-Array Actuation
NASA Technical Reports Server (NTRS)
Anderson, Bernhard H.; Tinapple, Jon; Surber, Lewis
2006-01-01
The intent of this study on micro-array flow control is to demonstrate the viability and economy of Response Surface Methodology (RSM) for determining optimal designs of micro-array actuation to control the shock wave turbulent boundary layer interactions within supersonic inlets, and to compare these concepts to conventional bleed performance. The term micro-array refers to micro-actuator arrays whose heights are 25 to 40 percent of the undisturbed supersonic boundary layer thickness. This study covers optimal control of shock wave turbulent boundary layer interactions using standard micro-vane, tapered micro-vane, and standard micro-ramp arrays at a free stream Mach number of 2.0. The effectiveness of the three micro-array devices was tested using a shock pressure rise induced by a 10° shock generator, which was sufficiently strong to separate the turbulent supersonic boundary layer. The overall design purpose of the micro-arrays was to alter the properties of the supersonic boundary layer by introducing a cascade of counter-rotating micro-vortices in the near-wall region. In this manner, the impact of the shock wave boundary layer (SWBL) interaction on the main flow field was minimized without boundary layer bleed.
Tra, Yolande V; Evans, Irene M
2010-01-01
BIO2010 put forth the goal of improving the mathematical educational background of biology students. The analysis and interpretation of microarray high-dimensional data can be very challenging and is best done by a statistician and a biologist working and teaching in a collaborative manner. We set up such a collaboration and designed a course on microarray data analysis. We started using Genome Consortium for Active Teaching (GCAT) materials and Microarray Genome and Clustering Tool software and added R statistical software along with Bioconductor packages. In response to student feedback, one microarray data set was fully analyzed in class, starting from preprocessing to gene discovery to pathway analysis using the latter software. A class project was to conduct a similar analysis where students analyzed their own data or data from a published journal paper. This exercise showed the impact that filtering, preprocessing, and different normalization methods had on gene inclusion in the final data set. We conclude that this course achieved its goals to equip students with skills to analyze data from a microarray experiment. We offer our insight about collaborative teaching as well as how other faculty might design and implement a similar interdisciplinary course.
Design of microarray experiments for genetical genomics studies.
Bueno Filho, Júlio S S; Gilmour, Steven G; Rosa, Guilherme J M
2006-10-01
Microarray experiments have been used recently in genetical genomics studies, as an additional tool to understand the genetic mechanisms governing variation in complex traits, such as for estimating heritabilities of mRNA transcript abundances, for mapping expression quantitative trait loci, and for inferring regulatory networks controlling gene expression. Several articles on the design of microarray experiments discuss situations in which treatment effects are assumed fixed and without any structure. In the case of two-color microarray platforms, several authors have studied reference and circular designs. Here, we discuss the optimal design of microarray experiments whose goals refer to specific genetic questions. Some examples are used to illustrate the choice of a design for comparing fixed, structured treatments, such as genotypic groups. Experiments targeting single genes or chromosomic regions (such as with transgene research) or multiple epistatic loci (such as within a selective phenotyping context) are discussed. In addition, microarray experiments in which treatments refer to families or to subjects (within family structures or complex pedigrees) are presented. In these cases treatments are more appropriately considered to be random effects, with specific covariance structures, in which the genetic goals relate to the estimation of genetic variances and the heritability of transcriptional abundances.
WebArray: an online platform for microarray data analysis
Xia, Xiaoqin; McClelland, Michael; Wang, Yipeng
2005-01-01
Background Many cutting-edge microarray analysis tools and algorithms, including the commonly used limma and affy packages in Bioconductor, require sophisticated knowledge of mathematics, statistics and computing to use. Commercially available software can provide a user-friendly interface at considerable cost. To facilitate the use of these tools for microarray data analysis on an open platform, we developed an online microarray data analysis platform, WebArray, for bench biologists to explore data from single/dual color microarray experiments. Results The currently implemented functions are based on the limma and affy packages from Bioconductor, the spacings LOESS histogram (SPLOSH) method, a PCA-assisted normalization method and a genome mapping method. WebArray incorporates these packages and provides a user-friendly interface for accessing a wide range of key functions of limma and others, such as spot quality weighting, background correction, graphical plotting, normalization, linear modeling, empirical Bayes statistical analysis, false discovery rate (FDR) estimation, and chromosomal mapping for genome comparison. Conclusion WebArray offers a convenient platform for bench biologists to access several cutting-edge microarray data analysis tools. The website is freely available at . It runs on a Linux server with Apache and MySQL. PMID:16371165
Support vector machine and principal component analysis for microarray data classification
NASA Astrophysics Data System (ADS)
Astuti, Widi; Adiwijaya
2018-03-01
Cancer is a leading cause of death worldwide, although a significant proportion of cases can be cured if detected early. In recent decades, microarray technology has taken an important role in the diagnosis of cancer. Using data mining techniques, microarray data classification can improve the accuracy of cancer diagnosis compared to traditional techniques. Microarray data are characterized by small sample sizes but very high dimensionality, which challenges researchers to provide classification solutions that perform well in both accuracy and running time. This research proposes the use of Principal Component Analysis (PCA) as a dimension-reduction method along with a Support Vector Machine (SVM), optimized by kernel functions, as a classifier for microarray data. The proposed scheme was applied to seven data sets using 5-fold cross-validation and then evaluated in terms of both accuracy and running time. The results showed that the scheme obtained 100% accuracy on the Ovarian and Lung Cancer data when the Linear and Cubic kernel functions were used. In terms of running time, PCA greatly reduced the running time for every data set.
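The dimension-reduction-then-classify pipeline can be sketched without external libraries. To stay dependency-free, this sketch computes only the first principal component by power iteration and swaps the SVM for a nearest-centroid rule on the projected data, so it illustrates the general idea rather than the paper's exact method; the 50-feature simulated "expression" data are an assumption standing in for a real microarray set.

```python
import random

def pca_first_component(X, iters=200):
    """Top principal component of row-vectors X via power iteration."""
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    C = [[row[j] - means[j] for j in range(d)] for row in X]  # centered data
    v = [1.0] * d
    for _ in range(iters):
        # One power-iteration step on the covariance: w = (C^T C) v
        proj = [sum(c[j] * v[j] for j in range(d)) for c in C]
        w = [sum(proj[i] * C[i][j] for i in range(n)) for j in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return means, v

def project(x, means, v):
    return sum((xi - mi) * vi for xi, mi, vi in zip(x, means, v))

random.seed(1)
d = 50  # tiny stand-in for a microarray's thousands of genes
def sample(label, n):
    # Two classes separated along every coordinate, plus Gaussian noise
    return [[(2.0 if label else -2.0) + random.gauss(0, 1) for _ in range(d)]
            for _ in range(n)]

train = [(x, 0) for x in sample(0, 30)] + [(x, 1) for x in sample(1, 30)]
means, v = pca_first_component([x for x, _ in train])
c0 = sum(project(x, means, v) for x, y in train if y == 0) / 30
c1 = sum(project(x, means, v) for x, y in train if y == 1) / 30
held_out = [(x, 0) for x in sample(0, 20)] + [(x, 1) for x in sample(1, 20)]
correct = sum(
    (abs(project(x, means, v) - c1) < abs(project(x, means, v) - c0)) == (y == 1)
    for x, y in held_out)
acc = correct / len(held_out)
print(acc)
```

Because the between-class separation dominates the variance, the first component captures the discriminative direction and a one-dimensional classifier suffices, which is the intuition behind reducing dimension before classification.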
Microarray-based screening of heat shock protein inhibitors.
Schax, Emilia; Walter, Johanna-Gabriela; Märzhäuser, Helene; Stahl, Frank; Scheper, Thomas; Agard, David A; Eichner, Simone; Kirschning, Andreas; Zeilinger, Carsten
2014-06-20
Based on the importance of heat shock proteins (HSPs) in diseases such as cancer, Alzheimer's disease or malaria, inhibitors of these chaperones are needed. Today's state-of-the-art techniques to identify HSP inhibitors are performed in microplate format, requiring large amounts of protein and potential inhibitors. In contrast, we have developed a miniaturized protein microarray-based assay to identify novel inhibitors, allowing analysis with 300 pmol of protein. The assay is based on competitive binding of fluorescence-labeled ATP and potential inhibitors to the ATP-binding site of the HSP. Therefore, the developed microarray enables the parallel analysis of different ATP-binding proteins on a single microarray. We have demonstrated the possibility of multiplexing by immobilizing full-length human HSP90α and HtpG of Helicobacter pylori on microarrays. Fluorescence-labeled ATP was outcompeted by novel geldanamycin/reblastatin derivatives with IC50 values in the range of 0.5 nM to 4 μM and Z(*)-factors between 0.60 and 0.96. Our results demonstrate the potential of a target-oriented multiplexed protein microarray to identify novel inhibitors for different members of the HSP90 family. Copyright © 2014 Elsevier B.V. All rights reserved.
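The Z'-factor quoted above is a standard screening-assay quality metric, Z' = 1 − 3(σ_pos + σ_neg)/|μ_pos − μ_neg| (values above ~0.5 indicate an excellent assay). A minimal sketch, using hypothetical fluorescence readings rather than the paper's data:

```python
import statistics

def z_prime(pos, neg):
    """Z'-factor assay-quality metric from positive- and negative-control
    readings: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    mu_p, mu_n = statistics.fmean(pos), statistics.fmean(neg)
    sd_p, sd_n = statistics.stdev(pos), statistics.stdev(neg)
    return 1.0 - 3.0 * (sd_p + sd_n) / abs(mu_p - mu_n)

# Hypothetical readings: uninhibited (positive) vs. fully competed
# (negative) control spots
pos = [980, 1010, 995, 1005, 990, 1020]
neg = [110, 95, 105, 100, 90, 100]
print(round(z_prime(pos, neg), 2))  # prints 0.93
```

Tight control distributions relative to their separation drive Z' toward 1, which is why the 0.60-0.96 range reported above indicates a robust miniaturized assay.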
Sun, Xiuhua; Wang, Huaixin; Wang, Yuanyuan; Gui, Taijiang; Wang, Ke; Gao, Changlu
2018-04-15
Nonspecific binding or adsorption of biomolecules presents as a major obstacle to higher sensitivity, specificity and reproducibility in microarray technology. We report herein a method to fabricate antifouling microarray via photopolymerization of biomimetic betaine compounds. In brief, carboxybetaine methacrylate was polymerized as arrays for protein sensing, while sulfobetaine methacrylate was polymerized as background. With the abundant carboxyl groups on array surfaces and zwitterionic polymers on the entire surfaces, this microarray allows biomolecular immobilization and recognition with low nonspecific interactions due to its antifouling property. Therefore, low concentration of target molecules can be captured and detected by this microarray. It was proved that a concentration of 10ngmL -1 bovine serum albumin in the sample matrix of bovine serum can be detected by the microarray derivatized with anti-bovine serum albumin. Moreover, with proper hydrophilic-hydrophobic designs, this approach can be applied to fabricate surface-tension droplet arrays, which allows surface-directed cell adhesion and growth. These light controllable approaches constitute a clear improvement in the design of antifouling interfaces, which may lead to greater flexibility in the development of interfacial architectures and wider application in blood contact microdevices. Copyright © 2017 Elsevier B.V. All rights reserved.
EDGE3: A web-based solution for management and analysis of Agilent two color microarray experiments
Vollrath, Aaron L; Smith, Adam A; Craven, Mark; Bradfield, Christopher A
2009-01-01
Background The ability to generate transcriptional data on the scale of entire genomes has been a boon both in the improvement of biological understanding and in the amount of data generated. The latter, the amount of data generated, has implications when it comes to effective storage, analysis and sharing of these data. A number of software tools have been developed to store, analyze, and share microarray data. However, a majority of these tools do not offer all of these features nor do they specifically target the commonly used two color Agilent DNA microarray platform. Thus, the motivating factor for the development of EDGE3 was to incorporate the storage, analysis and sharing of microarray data in a manner that would provide a means for research groups to collaborate on Agilent-based microarray experiments without a large investment in software-related expenditures or extensive training of end-users. Results EDGE3 has been developed with two major functions in mind. The first function is to provide a workflow process for the generation of microarray data by a research laboratory or a microarray facility. The second is to store, analyze, and share microarray data in a manner that doesn't require complicated software. To satisfy the first function, EDGE3 has been developed as a means to establish a well defined experimental workflow and information system for microarray generation. To satisfy the second function, the software application utilized as the user interface of EDGE3 is a web browser. Within the web browser, a user is able to access the entire functionality, including, but not limited to, the ability to perform a number of bioinformatics based analyses, collaborate between research groups through a user-based security model, and access to the raw data files and quality control files generated by the software used to extract the signals from an array image. 
Conclusion Here, we present EDGE3, an open-source, web-based application that allows for the storage, analysis, and controlled sharing of transcription-based microarray data generated on the Agilent DNA platform. In addition, EDGE3 provides a means for managing RNA samples and arrays during the hybridization process. EDGE3 is freely available for download at . PMID:19732451
Crustal anisotropy in the forearc of the Northern Cascadia Subduction Zone, British Columbia
NASA Astrophysics Data System (ADS)
Balfour, N. J.; Cassidy, J. F.; Dosso, S. E.
2012-01-01
This paper aims to identify sources and variations of crustal anisotropy from shear-wave splitting measurements in the forearc of the Northern Cascadia Subduction Zone of southwest British Columbia. Over 20 permanent stations and 15 temporary stations were available for shear-wave splitting analysis on ˜4500 event-station pairs for local crustal earthquakes. Results from 1100 useable shear-wave splitting measurements show spatial variations in fast directions, with margin-parallel fast directions at most stations and margin-perpendicular fast directions at stations in the northeast of the region. Crustal anisotropy is often attributed to stress and has been interpreted as the fast direction being related to the orientation of the maximum horizontal compressive stress. However, studies have also shown anisotropy can be complicated by crustal structure. Southwest British Columbia is a complex region of crustal deformation and some of the stations are located near large ancient faults. To use seismic anisotropy as a stress indicator requires identifying which stations are influenced by stress and which by structure. We determine the source of anisotropy at each station by comparing fast directions from shear-wave splitting results to the maximum horizontal compressive stress orientation determined from earthquake focal mechanism inversion. Most stations show agreement between the fast direction and the maximum horizontal compressive stress. This suggests that anisotropy is related to stress-aligned fluid-filled microcracks based on extensive dilatancy anisotropy. These stations are further analysed for temporal variations to lay groundwork for monitoring temporal changes in the stress over extended time periods. Determining the sources of variability in anisotropy can lead to a better understanding of the crustal structure and stress, and in the future may be used as a monitoring and mapping tool.
Magnitude of Interfractional Vaginal Cuff Movement: Implications for External Irradiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Daniel J.; Michaletz-Lorenz, Martha; Goddu, S. Murty
2012-03-15
Purpose: To quantify the extent of interfractional vaginal cuff movement in patients receiving postoperative irradiation for cervical or endometrial cancer in the absence of bowel/bladder instruction. Methods and Materials: Eleven consecutive patients with cervical or endometrial cancer underwent placement of three gold seed fiducial markers in the vaginal cuff apex as part of standard of care before simulation. Patients subsequently underwent external irradiation and brachytherapy treatment based on institutional guidelines. Daily megavoltage CT imaging was performed during each external radiation treatment fraction. The daily positions of the vaginal apex fiducial markers were subsequently compared with the original position of the fiducial markers on the simulation CT. Composite dose-volume histograms were also created by summing daily target positions. Results: The average (± standard deviation) vaginal cuff movement throughout daily pelvic external radiotherapy when referenced to the simulation position was 16.2 ± 8.3 mm. The maximum vaginal cuff movement for any patient during treatment was 34.5 mm. In the axial plane the mean vaginal cuff movement was 12.9 ± 6.7 mm. The maximum vaginal cuff axial movement was 30.7 mm. In the craniocaudal axis the mean movement was 10.3 ± 7.6 mm, with a maximum movement of 27.0 mm. The probability of cuff excursion outside of the clinical target volume steadily dropped as margin size increased (53%, 26%, 4.2%, and 1.4% for 1.0, 1.5, 2.0, and 2.5 cm margins, respectively). However, rectal and bladder doses steadily increased with larger margin sizes. Conclusions: The magnitude of vaginal cuff movement is highly patient specific and can impact target coverage in patients without bowel/bladder instructions at simulation. The use of vaginal cuff fiducials can help identify patients at risk for target volume excursion.
Structure and evolution of the NE Atlantic conjugate margins off Norway and Greenland (Invited)
NASA Astrophysics Data System (ADS)
Faleide, J.; Planke, S.; Theissen-Krah, S.; Abdelmalak, M.; Zastrozhnov, D.; Tsikalas, F.; Breivik, A. J.; Torsvik, T. H.; Gaina, C.; Schmid, D. W.; Myklebust, R.; Mjelde, R.
2013-12-01
The continental margins off Norway and NE Greenland evolved in response to the Cenozoic opening of the NE Atlantic. The margins exhibit a distinct along-margin segmentation reflecting structural inheritance extending back to a complex pre-breakup geological history. The sedimentary basins at the conjugate margins developed as a result of multiple phases of post-Caledonian rifting from Late Paleozoic time to final NE Atlantic breakup at the Paleocene-Eocene transition. The >200 million years of repeated extension caused comprehensive crustal thinning and formation of deep sedimentary basins. The main rift phases span the following time intervals: Late Permian, late Middle Jurassic-earliest Cretaceous, Early-mid Cretaceous and Late Cretaceous-Paleocene. The late Mesozoic-early Cenozoic rifting was related to the northward propagation of North Atlantic sea floor spreading, but also linked to important tectonic events in the Arctic. The pre-drift extension is quantified based on observed geometries of crustal thinning and stretching factors derived from tectonic modeling. The total (cumulative) pre-drift extension amounts to in the order of 300 km which correlates well with estimates from plate reconstructions based on paleomagnetic data. Final lithospheric breakup at the Paleocene-Eocene transition culminated in a 3-6 m.y. period of massive magmatic activity during breakup and onset of early sea-floor spreading, forming a part of the North Atlantic Volcanic Province. At the outer parts of the conjugate margins, the lavas form characteristic seaward dipping reflector sequences and lava deltas that drilling has demonstrated to be subaerially and/or neritically erupted basalts. The continent-ocean transition is usually well defined as a rapid increase of P-wave velocities at mid- to lower-crustal levels. 
A maximum igneous crustal thickness of about 18 km is found across the outer Vøring Plateau on the Norwegian Margin, and lower-crustal P-wave velocities of up to 7.3 km/s are found at the bottom of the igneous crust here. The igneous crust, including the characteristic 7+ km/s lower crustal body, is even thicker on the East Greenland Margin. During the main igneous episode, sills intruded into the thick Cretaceous successions throughout the NE Atlantic margins. Strong crustal reflections can be mapped widely on both conjugate margins. In some areas they are associated with the top of the high-velocity lower crustal body; in other areas they may represent deeply buried sedimentary sequence boundaries or the Moho at the base of the crust. Following breakup, the subsiding margins experienced modest sedimentation until the late Pliocene, when large wedges of glacial sediments prograded into the deep ocean from uplifted areas along the continental margins. The outbuilding was probably initiated in Miocene time, indicating pre-glacial tectonic uplift of Greenland, Fennoscandia and the Barents Shelf. The NE Atlantic margins also reveal evidence of widespread Cenozoic compressional deformation.
[Oligonucleotide microarray for subtyping avian influenza virus].
Xueqing, Han; Xiangmei, Lin; Yihong, Hou; Shaoqiang, Wu; Jian, Liu; Lin, Mei; Guangle, Jia; Zexiao, Yang
2008-09-01
Avian influenza viruses are important human and animal respiratory pathogens, and rapid diagnosis of novel emerging avian influenza viruses is vital for effective global influenza surveillance. We developed an oligonucleotide microarray-based method for subtyping all avian influenza viruses (16 HA and 9 NA subtypes). In total, 25 pairs of primers specific for the different subtypes and 1 pair of universal primers were carefully designed based on genomic sequences of influenza A viruses retrieved from the GenBank database. Several multiplex RT-PCR methods were then developed, and the target cDNAs of the 25 subtype viruses were amplified by RT-PCR or overlapping PCR for evaluating the microarray. A further 52 oligonucleotide probes specific for all 25 subtype viruses were designed according to published gene sequences of avian influenza viruses within the amplified target cDNA domains, and a microarray for subtyping influenza A virus was developed. Its specificity and sensitivity were then validated using different subtype strains and 2653 samples from 49 different areas. The results showed that all subtypes of influenza virus could be identified simultaneously on this microarray with high sensitivity, detecting as little as 2.47 pfu/mL of virus or 2.5 ng of target DNA. Furthermore, there was no cross-reaction with other avian respiratory viruses. Such a diagnostic microarray will be useful in discovering and identifying all subtypes of avian influenza virus.
Dosso, Stan E; Wilmut, Michael J; Nielsen, Peter L
2010-07-01
This paper applies Bayesian source tracking in an uncertain environment to Mediterranean Sea data, and investigates the resulting tracks and track uncertainties as a function of data information content (number of data time-segments, number of frequencies, and signal-to-noise ratio) and of prior information (environmental uncertainties and source-velocity constraints). To track low-level sources, acoustic data recorded for multiple time segments (corresponding to multiple source positions along the track) are inverted simultaneously. Environmental uncertainty is addressed by including unknown water-column and seabed properties as nuisance parameters in an augmented inversion. Two approaches are considered: Focalization-tracking maximizes the posterior probability density (PPD) over the unknown source and environmental parameters. Marginalization-tracking integrates the PPD over environmental parameters to obtain a sequence of joint marginal probability distributions over source coordinates, from which the most-probable track and track uncertainties can be extracted. Both approaches apply track constraints on the maximum allowable vertical and radial source velocity. The two approaches are applied for towed-source acoustic data recorded at a vertical line array at a shallow-water test site in the Mediterranean Sea where previous geoacoustic studies have been carried out.
Wilson, P R
1996-07-01
The marginal adaptation of full-coverage restorations is adversely affected by the introduction of luting agents of various minimum film thicknesses during the cementation process. The increase in the marginal opening may have long-term detrimental effects on the health of both pulpal and periodontal tissues. The purpose of this study was to determine the effects of varying seating forces (2.5, 12.5, 25 N), venting, and cement types on post-cementation marginal elevation in cast crowns. A standardized cement space of 40 microns was provided between a machined gold crown and a stainless steel die. An occlusal vent was placed that could be opened or closed. The post-cementation crown elevation was measured following the use of two commercially available encapsulated dental cements (Phosphacap and Ketac-Cem Applicap). The results indicate that only the combination of Ketac-Cem Applicap and crown venting produced post-cementation crown elevation of less than 20 microns when a 12.5 N seating force was used. Higher forces (25 N) and venting were required for comparable seating when using Phosphacap (19 microns). The amount of force required to allow maximum seating of cast crowns appears to be cement specific, and is reduced by effective venting procedures.
Stevenson, David A; Carey, John C; Cowley, Brett C; Bayrak-Toydemir, Pinar; Mao, Rong; Brothman, Arthur R
2004-12-01
We report a de novo cryptic 11p duplication, identified by genomic microarray, in a patient with a cytogenetically detected 4p deletion. Terminal 4p deletions cause Wolf-Hirschhorn syndrome, but the phenotype was probably modified by the paternally derived 11p duplication. This emphasizes the clinical utility of genomic microarray.
DNA microarrays and their use in dermatology.
Mlakar, Vid; Glavac, Damjan
2007-03-01
Multiple different DNA microarray technologies are available on the market today. They can be used for studying either DNA or RNA with the purpose of identifying and explaining the role of genes involved in different processes. This paper reviews different DNA microarray platforms available for such studies and their usage in cases of malignant melanomas, psoriasis, and exposure of keratinocytes and melanocytes to UV illumination.
mRNA-Based Parallel Detection of Active Methanotroph Populations by Use of a Diagnostic Microarray
Bodrossy, Levente; Stralis-Pavese, Nancy; Konrad-Köszler, Marianne; Weilharter, Alexandra; Reichenauer, Thomas G.; Schöfer, David; Sessitsch, Angela
2006-01-01
A method was developed for the mRNA-based application of microbial diagnostic microarrays to detect active microbial populations. DNA- and mRNA-based analyses of environmental samples were compared and confirmed via quantitative PCR. Results indicated that mRNA-based microarray analyses may provide additional information on the composition and functioning of microbial communities. PMID:16461725
DNA Microarray Wet Lab Simulation Brings Genomics into the High School Curriculum
ERIC Educational Resources Information Center
Campbell, A. Malcolm; Zanta, Carolyn A.; Heyer, Laurie J.; Kittinger, Ben; Gabric, Kathleen M.; Adler, Leslie
2006-01-01
We have developed a wet lab DNA microarray simulation as part of a complete DNA microarray module for high school students. The wet lab simulation has been field tested with high school students in Illinois and Maryland as well as in workshops with high school teachers from across the nation. Instead of using DNA, our simulation is based on pH…
Identifying Beneficial Qualities of Trichoderma parareesei for Plants
Rubio, M. Belén; Quijada, Narciso M.; Pérez, Esclaudys; Domínguez, Sara; Hermosa, Rosa
2014-01-01
Trichoderma parareesei and Trichoderma reesei (teleomorph Hypocrea jecorina) produce cellulases and xylanases of industrial interest. Here, the anamorphic strain T6 (formerly T. reesei) has been identified as T. parareesei, showing biocontrol potential against fungal and oomycete phytopathogens and enhanced hyphal growth in the presence of tomato exudates or plant cell wall polymers in in vitro assays. A Trichoderma microarray was used to examine the transcriptomic changes in T6 at 20 h of interaction with tomato plants. Out of a total of 34,138 Trichoderma probe sets deposited on the microarray, 250 showed a significant change of at least 2-fold in expression in the presence of tomato plants, most of them downregulated. T. parareesei T6 exerted beneficial effects on tomato plants in terms of seedling lateral root development, and in adult plants it improved defense against Botrytis cinerea and growth promotion under salt stress. Time course expression patterns (0 to 6 days) observed for defense-related genes suggest that T6 was able to prime defense responses in the tomato plants against biotic and abiotic stresses. Such responses undulated, with a maximum upregulation of the jasmonic acid (JA)/ethylene (ET)-related LOX1 and EIN2 genes and the salt tolerance SOS1 gene at 24 h, and that of the salicylic acid (SA)-related PR-1 gene at 48 h after T6 inoculation. Our study demonstrates that the T. parareesei T6-tomato interaction is beneficial to both partners. PMID:24413597
Optimization of cDNA microarrays procedures using criteria that do not rely on external standards.
Bruland, Torunn; Anderssen, Endre; Doseth, Berit; Bergum, Hallgeir; Beisvag, Vidar; Laegreid, Astrid
2007-10-18
The measurement of gene expression using microarray technology is a complicated process in which a large number of factors can be varied. Due to the lack of standard calibration samples such as are used in traditional chemical analysis, it can be difficult to evaluate whether changes made to the microarray procedure actually improve the identification of truly differentially expressed genes. The purpose of the present work is to report the optimization of several steps in the microarray process, both in laboratory practices and in data processing, using criteria that do not rely on external standards. We performed a cDNA microarray experiment including RNA from samples with high expected differential gene expression, termed "high contrasts" (rat cell lines AR42J and NRK52E), compared to self-self hybridization, and optimized a pipeline to maximize the number of genes found to be differentially expressed in the "high contrasts" RNA samples by estimating the false discovery rate (FDR) using a null distribution obtained from the self-self experiment. The proposed high-contrast versus self-self method (HCSSM) requires only four microarrays per evaluation. The effects of blocking reagent dose, filtering, and background correction methodologies were investigated. In our experiments, a dose of 250 ng LNA (locked nucleic acid) dT blocker, no background correction and weight-based filtering gave the largest number of differentially expressed genes. The choice of background correction method had a stronger impact on the estimated number of differentially expressed genes than the choice of filtering method. Cross-platform microarray (Illumina) analysis was used to validate that the increase in the number of differentially expressed genes found by HCSSM was real. The results show that HCSSM can be a useful and simple approach to optimize microarray procedures without including external standards.
Our optimizing method is highly applicable both to long oligo-probe microarrays, which have become commonly used for well-characterized organisms such as human, mouse and rat, and to cDNA microarrays, which are still of importance for organisms with incomplete genome sequence information, such as many bacteria, plants and fish.
Optimization of cDNA microarrays procedures using criteria that do not rely on external standards
Bruland, Torunn; Anderssen, Endre; Doseth, Berit; Bergum, Hallgeir; Beisvag, Vidar; Lægreid, Astrid
2007-01-01
Background The measurement of gene expression using microarray technology is a complicated process in which a large number of factors can be varied. Due to the lack of standard calibration samples such as are used in traditional chemical analysis, it can be difficult to evaluate whether changes made to the microarray procedure actually improve the identification of truly differentially expressed genes. The purpose of the present work is to report the optimization of several steps in the microarray process, both in laboratory practices and in data processing, using criteria that do not rely on external standards. Results We performed a cDNA microarray experiment including RNA from samples with high expected differential gene expression, termed "high contrasts" (rat cell lines AR42J and NRK52E), compared to self-self hybridization, and optimized a pipeline to maximize the number of genes found to be differentially expressed in the "high contrasts" RNA samples by estimating the false discovery rate (FDR) using a null distribution obtained from the self-self experiment. The proposed high-contrast versus self-self method (HCSSM) requires only four microarrays per evaluation. The effects of blocking reagent dose, filtering, and background correction methodologies were investigated. In our experiments, a dose of 250 ng LNA (locked nucleic acid) dT blocker, no background correction and weight-based filtering gave the largest number of differentially expressed genes. The choice of background correction method had a stronger impact on the estimated number of differentially expressed genes than the choice of filtering method. Cross-platform microarray (Illumina) analysis was used to validate that the increase in the number of differentially expressed genes found by HCSSM was real. Conclusion The results show that HCSSM can be a useful and simple approach to optimize microarray procedures without including external standards.
Our optimizing method is highly applicable both to long oligo-probe microarrays, which have become commonly used for well-characterized organisms such as human, mouse and rat, and to cDNA microarrays, which are still of importance for organisms with incomplete genome sequence information, such as many bacteria, plants and fish. PMID:17949480
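The FDR logic behind HCSSM, using the self-self hybridization as an empirical null and, for any cutoff, dividing the expected number of null exceedances by the number of observed calls, can be sketched generically (all statistics below are simulated, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated per-gene |log-ratio| statistics: a self-self hybridization gives
# the empirical null; the "high contrast" comparison mixes unchanged genes
# with a block of truly changed ones. All values are invented.
null_stats = np.abs(rng.normal(0.0, 0.2, size=5000))       # self-self
obs_stats = np.abs(np.concatenate([
    rng.normal(0.0, 0.2, size=4500),                       # unchanged genes
    rng.normal(2.0, 0.5, size=500),                        # truly changed genes
]))

def empirical_fdr(threshold):
    # Expected false calls: null exceedance rate scaled to the observed list,
    # divided by the number of genes actually called at this threshold.
    false_calls = np.mean(null_stats >= threshold) * len(obs_stats)
    calls = max(int(np.sum(obs_stats >= threshold)), 1)
    return false_calls / calls

# Lowest cutoff whose estimated FDR is below 5% -> most genes called.
t = next(x for x in np.linspace(0.0, 3.0, 301) if empirical_fdr(x) <= 0.05)
print(f"cutoff {t:.2f}: {int(np.sum(obs_stats >= t))} genes called")
```

Scanning cutoffs from low to high and keeping the first one whose estimated FDR drops below the target mirrors the idea of maximizing the number of called genes at a controlled FDR.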
Erickson, A; Fisher, M; Furukawa-Stoffer, T; Ambagala, A; Hodko, D; Pasick, J; King, D P; Nfon, C; Ortega Polo, R; Lung, O
2018-04-01
Microarray technology can be useful for pathogen detection as it allows simultaneous interrogation of the presence or absence of a large number of genetic signatures. However, most microarray assays are labour-intensive and time-consuming to perform. This study describes the development and initial evaluation of a multiplex reverse transcription (RT)-PCR and a novel accompanying automated electronic microarray assay for simultaneous detection and differentiation of seven important viruses that affect swine (foot-and-mouth disease virus [FMDV], swine vesicular disease virus [SVDV], vesicular exanthema of swine virus [VESV], African swine fever virus [ASFV], classical swine fever virus [CSFV], porcine respiratory and reproductive syndrome virus [PRRSV] and porcine circovirus type 2 [PCV2]). The novel electronic microarray assay utilizes a single, user-friendly instrument that integrates and automates capture probe printing, hybridization, washing and reporting on a disposable electronic microarray cartridge with 400 features. This assay accurately detected and identified a total of 68 isolates of the seven targeted virus species, including 23 samples of FMDV, representing all seven serotypes, and 10 CSFV strains, representing all three genotypes. The assay successfully detected viruses in clinical samples from the field, in experimentally infected animals (as early as 1 day post-infection (dpi) for FMDV and SVDV, 4 dpi for ASFV, and 5 dpi for CSFV), as well as in biological materials that were spiked with target viruses. The limit of detection was 10 copies/μl for ASFV, PCV2 and PRRSV, 100 copies/μl for SVDV, CSFV and VESV, and 1,000 copies/μl for FMDV. The electronic microarray component had reduced analytical sensitivity for several of the target viruses when compared with the multiplex RT-PCR.
The integration of capture probe printing allows custom onsite array printing as needed, while electrophoretically driven hybridization generates results faster than conventional microarrays that rely on passive hybridization. With further refinement, this novel, rapid, highly automated microarray technology has potential applications in multipathogen surveillance of livestock diseases. © 2017 Her Majesty the Queen in Right of Canada • Transboundary and Emerging Diseases.
NASA Astrophysics Data System (ADS)
Nabel, Moritz; Bueno Piaz Barbosa, Daniela; Horsch, David; Jablonowski, Nicolai David
2014-05-01
The global demand for energy security and the mitigation of climate change are the main drivers pushing energy-plant production in Germany. However, the cultivation of these plants can cause land-use conflicts, since agricultural soil is mostly used for food production. A sustainable alternative to the conventional cultivation of food-based energy crops is the cultivation of specially adapted energy plants on marginal lands. To further increase the sustainability of energy-plant cultivation systems, the dependency on synthetic fertilizers needs to be reduced via closed nutrient loops. In the present study, the energy plant Sida hermaphrodita (Malvaceae) will be used to evaluate the potential of growing this promising energy crop on a marginal sandy soil in combination with fertilization via digestate from biogas production. With this dose-response experiment we will further identify an optimum dose, which will be compared to equivalent doses of NPK fertilizer; lethal doses and deficiency doses will also be observed. Two-week-old Sida seedlings were transplanted to 1 L pots and fertilized with six doses of digestate (equivalent to field applications of 5, 10, 20, 40, 80 and 160 t/ha) and three equivalent doses of NPK fertilizer. Control plants were left untreated. Sida plants will grow for 45 days under greenhouse conditions. We hypothesize that the nutrient status of the marginal soil can be increased and maintained by defined digestate applications, compared to control plants suffering from nutrient deficiency due to the low nutrient status of the marginal substrate. The dose of 40 t/ha is expected to give a maximum biomass yield without causing toxicity symptoms. Results shall be used as the basis for further experiments at the field scale, in a field trial that was set up to investigate sustainable production systems for energy-crop production under marginal soil conditions.
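Locating an optimum dose from such a dose-response experiment usually comes down to fitting a response curve and finding its maximum; a minimal sketch with invented yield values (a simple quadratic fit, not the study's analysis):

```python
import numpy as np

# Digestate doses from the experiment (field-equivalent t/ha) paired with
# hypothetical biomass yields (g/plant) invented for illustration only.
dose = np.array([0, 5, 10, 20, 40, 80, 160], dtype=float)
yield_g = np.array([2.1, 4.0, 6.5, 9.8, 12.4, 10.1, 3.0])

a, b, c = np.polyfit(dose, yield_g, deg=2)   # y = a*d^2 + b*d + c
optimum = -b / (2 * a)                       # vertex of the fitted parabola
print(f"fitted optimum dose ~ {optimum:.0f} t/ha")
```

In practice a log-logistic or other saturating model is often preferred; the quadratic is only the simplest curve with an interior maximum.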
Breakup Style and Magmatic Underplating West of the Lofoten Islands, Norway, Based on OBS Data.
NASA Astrophysics Data System (ADS)
Breivik, A. J.; Faleide, J. I.; Mjelde, R.; Murai, Y.; Flueh, E. R.
2014-12-01
The breakup of the Northeast Atlantic in the Early Eocene was magma-rich, forming the major part of the North Atlantic Igneous Province (NAIP). This is seen as extrusive and intrusive magmatism in the continental domain, and as a thicker than normal oceanic crust produced during the first few million years after continental breakup. The maximum magma productivity and the duration of excess magmatism vary along the margins of Northwest Europe and East Greenland, to some extent as a function of the distance from the Iceland hotspot. The Vøring Plateau off mid-Norway is the northernmost of the margin segments in northwestern Europe with extensive magmatism. North of the plateau, magmatism dies off towards the Lofoten Margin, marking the northern boundary of the NAIP here. In 2003, as part of the Euromargins Program, we collected an Ocean Bottom Seismometer (OBS) profile from mainland Norway, across the Lofoten Islands, and out into the deep ocean. Forward velocity modeling using raytracing reveals a continental margin that shows transitional features between magma-rich and magma-poor rifting. On the one hand, we detect an up to 2 km thick and 40-50 km wide magmatic underplate of the outer continent; on the other hand, continental thinning is greater and intrusive magmatism less than farther south. Continental breakup also appears to be somewhat delayed compared to breakup on the Vøring Plateau, consistent with increased extension. This indicates that magmatic diking, believed to quickly lead to continental breakup of volcanic margins and thus to reduce continental thinning, played a much lesser role here than at the plateau. Early post-breakup oceanic crust is up to 8 km thick, less than half of that observed farther south.
The most likely interpretation of these observations is that the source for the excess magmatism of the NAIP was not present at the Lofoten Margin during rifting, and that the excess magmatism actually observed was the result of lateral transport from the south around breakup time.
Deciphering the evolution of the last Eurasian ice sheets
NASA Astrophysics Data System (ADS)
Hughes, Anna; Gyllencreutz, Richard; Mangerud, Jan; Svendsen, John Inge
2016-04-01
Glacial geologists need ice sheet-scale chronological reconstructions of former ice extent to set individual records in a wider context and compare interpretations of ice sheet response to records of past environmental changes. Ice sheet modellers require empirical reconstructions of the size and volume of past ice sheets that are fully documented, specified in time and include uncertainty estimates for model validation or constraints. Motivated by these demands, in 2005 we started a project (Database of the Eurasian Deglaciation, DATED) to compile and archive all published dates relevant to constraining the build-up and retreat of the last Eurasian ice sheets, including the British-Irish, Scandinavian and Svalbard-Barents-Kara Seas ice sheets (BIIS, SIS and SBKIS respectively). Over 5000 dates were assessed for reliability and used together with published ice-sheet margin positions to reconstruct time-slice maps of the ice sheets' extent, with uncertainty bounds, every 1000 years between 25 and 10 kyr ago and at four additional periods back to 40 kyr ago. Ten years after the idea for a database was conceived, the first version of results (DATED-1) has now been released (Hughes et al. 2016). We observe that: i) both the BIIS and SBKIS achieved maximum extent, and commenced retreat, earlier than the larger SIS; ii) the eastern terrestrial margin of the SIS reached its maximum extent up to 7000 years later than the westernmost marine margin; iii) the combined maximum ice volume (~24 m sea-level equivalent) was reached c. 21 ka; iv) large uncertainties exist, predominantly across marine sectors (e.g. the timing of coalescence and separation of the SIS and SBKIS) but also in well-studied areas, due to conflicting yet equally robust data. In just three years since the DATED-1 census (1 January 2013), the volume of new information (from both dates and mapped glacial geomorphology) has grown significantly (~1000 new dates).
Here, we present the DATED-1 results in the context of the climatic changes of the last glacial, discuss the implications of emerging post-census data, and describe plans for the next version of the database, DATED-2. Hughes, A. L. C., Gyllencreutz, R., Lohne, Ø. S., Mangerud, J., Svendsen, J. I. 2016: The last Eurasian ice sheets - a chronological database and time-slice reconstruction, DATED-1. Boreas, 45, 1-45. 10.1111/bor.12142
Alizadeh Oskoee, Parnian; Pournaghi Azar, Fatemeh; Jafari Navimipour, Elmira; Ebrahimi Chaharom, Mohammad Esmaeel; Naser Alavi, Fereshteh; Salari, Ashkan
2017-01-01
Background. One of the problems with composite resin restorations is gap formation at the resin‒tooth interface. The present study evaluated the effect of preheating cycles of silorane- and dimethacrylate-based composite resins on gap formation at the gingival margins of Class V restorations. Methods. In this in vitro study, standard Class V cavities were prepared on the buccal surfaces of 48 bovine incisors. For the restorative procedure, the samples were randomly divided into 2 groups based on the type of composite resin (group 1: dimethacrylate composite [Filtek Z250]; group 2: silorane composite [Filtek P90]) and each group was randomly divided into 2 subgroups based on the composite temperature (A: room temperature; B: after 40 preheating cycles up to 55°C). Marginal gaps were measured using a stereomicroscope at ×40 and analyzed with two-way ANOVA. Inter- and intra-group comparisons were analyzed with post-hoc Tukey tests. The significance level was defined at P < 0.05. Results. The maximum and minimum gaps were detected in groups 1-A and 2-B, respectively. The effects of composite resin type, preheating, and the interactive effect of these variables on gap formation were significant (P < 0.001). Post-hoc Tukey tests showed a greater gap in dimethacrylate compared to silorane composite resins (P < 0.001). In each group, gap values were greater in composite resins at room temperature compared to composite resins after 40 preheating cycles (P < 0.001). Conclusion. Gap formation at the gingival margins of Class V cavities decreased due to preheating of both composite resins. Preheating of silorane-based composites can result in the best marginal adaptation. PMID:28413594
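The two-way ANOVA described (composite type × preheating, with their interaction) can be reproduced generically; the gap values below are simulated with effect directions matching the reported findings, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical gap widths (micrometres) for a balanced 2x2 design:
# factor A = composite type, factor B = preheating. Values are invented.
n = 12
data = np.stack([
    rng.normal(18, 2, n),   # dimethacrylate, room temperature
    rng.normal(12, 2, n),   # dimethacrylate, preheated
    rng.normal(10, 2, n),   # silorane, room temperature
    rng.normal(5, 2, n),    # silorane, preheated
]).reshape(2, 2, n)         # axes: (composite, temperature, replicate)

grand = data.mean()
ss_a = 2 * n * np.sum((data.mean(axis=(1, 2)) - grand) ** 2)   # composite
ss_b = 2 * n * np.sum((data.mean(axis=(0, 2)) - grand) ** 2)   # preheating
ss_cells = n * np.sum((data.mean(axis=2) - grand) ** 2)
ss_ab = ss_cells - ss_a - ss_b                                 # interaction
ss_err = np.sum((data - data.mean(axis=2, keepdims=True)) ** 2)

df_err = 4 * (n - 1)
f_a = ss_a / (ss_err / df_err)       # each factor has df1 = 1
f_b = ss_b / (ss_err / df_err)
# For df = (1, 44), F > ~12.6 corresponds to p < 0.001.
print(f"F(composite) = {f_a:.1f}, F(preheating) = {f_b:.1f}")
```

The post-hoc Tukey comparisons in the study would then be run on the four cell means.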
Statistical Use of Argonaute Expression and RISC Assembly in microRNA Target Identification
Stanhope, Stephen A.; Sengupta, Srikumar; den Boon, Johan; Ahlquist, Paul; Newton, Michael A.
2009-01-01
MicroRNAs (miRNAs) posttranscriptionally regulate targeted messenger RNAs (mRNAs) by inducing cleavage or otherwise repressing their translation. We address the problem of detecting m/miRNA targeting relationships in Homo sapiens from microarray data by developing statistical models that are motivated by the biological mechanisms used by miRNAs. The focus of our modeling is the construction, activity, and mediation of RNA-induced silencing complexes (RISCs) competent for targeted mRNA cleavage. We demonstrate that regression models accommodating RISC abundance and controlling for other mediating factors fit the expression profiles of known target pairs substantially better than models based on m/miRNA expressions alone, and lead to verifications of computational target pair predictions that are more sensitive than those based on marginal expression levels. Because our models are fully independent of exogenous results from sequence-based computational methods, they are appropriate for use as either a primary or secondary source of information regarding m/miRNA target pair relationships, especially in conjunction with high-throughput expression studies. PMID:19779550
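The core comparison, whether adding a RISC-abundance term explains target mRNA levels better than miRNA expression alone, can be illustrated with a generic nested least-squares fit (all variables are simulated; `ago2` is a hypothetical Argonaute-abundance proxy, not the paper's covariate set):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
mirna = rng.normal(size=n)                  # miRNA expression (simulated)
ago2 = rng.uniform(0.5, 2.0, size=n)        # hypothetical RISC-abundance proxy
# Simulated target mRNA: repression scales with miRNA level *and* the amount
# of RISC available to load it.
mrna = 10.0 - 1.5 * mirna * ago2 + rng.normal(scale=0.5, size=n)

def rss(design, y):
    # Residual sum of squares of an ordinary least-squares fit.
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return float(np.sum((y - design @ beta) ** 2))

ones = np.ones(n)
rss_marginal = rss(np.column_stack([ones, mirna]), mrna)            # miRNA only
rss_risc = rss(np.column_stack([ones, mirna, mirna * ago2]), mrna)  # + RISC term
print(f"RSS miRNA-only {rss_marginal:.1f} vs with RISC term {rss_risc:.1f}")
```

A formal analysis would compare the nested models with an F-test rather than raw RSS.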
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, M; Chetty, I; Zhong, H
2014-06-01
Purpose: Tumor control probability (TCP) calculated with accumulated radiation doses may help design appropriate treatment margins. Image registration errors, however, may compromise the calculated TCP. The purpose of this study is to develop benchmark CT images to quantify registration-induced errors in the accumulated doses and their corresponding TCP. Methods: 4DCT images were registered from end-inhale (EI) to end-exhale (EE) using a “demons” algorithm. The demons DVFs were corrected by an FEM model to get realistic deformation fields. The FEM DVFs were used to warp the EI images to create the FEM-simulated images. The two images combined with the FEM DVF formed a benchmark model. Maximum intensity projection (MIP) images, created from the EI and simulated images, were used to develop IMRT plans. Two plans with 3 and 5 mm margins were developed for each patient. With these plans, radiation doses were recalculated on the simulated images and warped back to the EI images using the FEM DVFs to get the accumulated doses. The Elastix software was used to register the FEM-simulated images to the EI images. TCPs calculated with the Elastix-accumulated doses were compared with those generated by the FEM to get the TCP error of the Elastix registrations. Results: For six lung patients, the mean Elastix registration error ranged from 0.93 to 1.98 mm. Their relative dose errors in PTV were between 0.28% and 6.8% for 3 mm-margin plans, and between 0.29% and 6.3% for 5 mm-margin plans. As the PTV margin was reduced from 5 to 3 mm, the mean TCP error of the Elastix-reconstructed doses increased from 2.0% to 2.9%, and the mean NTCP errors decreased from 1.2% to 1.1%. Conclusion: Patient-specific benchmark images can be used to evaluate the impact of registration errors on the computed TCPs, and may help select appropriate PTV margins for lung SBRT patients.
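How accumulated voxel doses translate into a TCP can be sketched with a textbook Poisson model (an alpha-only cell-survival term; the parameter values are illustrative and this is not the authors' model):

```python
import numpy as np

# Textbook Poisson TCP sketch: the tumor is controlled only if no clonogenic
# cell survives. alpha and the clonogen number are invented for illustration.
alpha = 0.3                      # Gy^-1, radiosensitivity
clonogens_per_voxel = 1e5

def tcp(voxel_doses):
    # Expected survivors per voxel, then Poisson probability of zero survivors.
    surviving = clonogens_per_voxel * np.exp(-alpha * voxel_doses)
    return float(np.exp(-np.sum(surviving)))

uniform = np.full(100, 60.0)     # accumulated dose per voxel, Gy
cold_spot = uniform.copy()
cold_spot[:5] = 45.0             # e.g. dose misplaced by a registration error
print(f"TCP uniform {tcp(uniform):.3f} vs cold spot {tcp(cold_spot):.3f}")
```

Even a small cold spot noticeably lowers the TCP, which is why dose-accumulation (registration) errors matter when choosing PTV margins.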
Sakulchairungreung, Bundit; Chirappapha, Prakasit; Suvikapakornkul, Ronnarat; Wasuthit, Yodying; Sukarayothin, Thongchai; Leesombatpaiboon, Montchai; Kongdan, Youwanush
2016-01-01
Background To determine the risk factors for disease recurrence after breast-conserving therapy (BCT) for breast cancer in a group of South-East Asian women. Methods Medical and pathological records of women who underwent BCT during the 10-year period from 2001 to 2010 were reviewed. Data collected included age (≤35 years defined as young age), type of operation, pathological data, hormonal receptor (HR) status, human epidermal growth factor receptor-2 (HER-2) expression status, and surgical margin status. Data on adjuvant therapy were also collected. Main outcomes were overall breast cancer recurrence, locoregional recurrence, and distant recurrence. Risk factors for each type of recurrence were identified using Cox proportional hazards regression models. Results There were 294 BCTs in 290 patients during the study period. The overwhelming majority (91%) had early stage (stages I-II) breast cancers. Young patients constituted 9% of all patients, and triple negative cancers (HR negative and HER-2 negative) were seen in 19%. Involved margins on initial surgery were found in 9% of cases, and after reoperation, only 2% had involved margins. After a median follow-up of 50 months and a maximum follow-up of 135 months, there were 30 recurrences and 6 deaths. Of the 30 recurrences, 19 included locoregional recurrence, 20 included distant recurrence, and 13 had in-breast recurrences. The disease-free survival at 10 years was 82.5% (95% CI: 74.8% to 88.1%), and the cumulative in-breast recurrence was 9.3% (95% CI: 4.9% to 17.2%) at 10 years. Multivariable Cox regression analysis revealed that young age, larger tumor size, involved margins, and no breast irradiation were associated with higher risk of locoregional recurrence. Triple negative status, larger tumor size, more positive nodes, and involved margins were associated with higher risk of distant recurrence.
Conclusions We found young age to be a significant prognosticator of locoregional recurrence, and triple negative status of distant recurrence. Involved surgical margin status was associated with both recurrences. Tumor size was associated with both recurrences, and axillary lymph node metastasis was associated with distant recurrence. PMID:26855904
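The Cox proportional hazards analysis used here can be sketched from first principles for a single binary risk factor (simulated survival times with no ties or censoring; `margin_involved` is an invented stand-in covariate, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 120
margin_involved = rng.integers(0, 2, size=n)     # invented binary covariate
# Exponential event times: involved margins double the hazard (true HR = 2).
times = rng.exponential(1.0 / (1.0 + margin_involved))
# All subjects experience the event (no censoring), and continuous times make
# ties vanishingly unlikely, so the plain partial likelihood applies.

def cox_partial_loglik(beta):
    order = np.argsort(times)
    x = margin_involved[order].astype(float)
    ll = 0.0
    for i in range(n):
        risk_set = x[i:]                          # subjects still at risk
        ll += beta * x[i] - np.log(np.sum(np.exp(beta * risk_set)))
    return ll

betas = np.linspace(-2.0, 2.0, 401)
beta_hat = betas[int(np.argmax([cox_partial_loglik(b) for b in betas]))]
print(f"estimated log hazard ratio {beta_hat:.2f} (HR ~ {np.exp(beta_hat):.2f})")
```

A multivariable analysis like the study's would add further covariate columns and maximize the same partial likelihood over a coefficient vector, typically with Newton's method rather than a grid.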
NASA Astrophysics Data System (ADS)
Abdelmalak, M. M.; Planke, S.; Millett, J.; Jerram, D. A.; Maharjan, D.; Zastrozhnov, D.; Schmid, D. W.; Faleide, J. I.; Svensen, H.; Myklebust, R.
2017-12-01
The Vøring Margin offshore mid-Norway is a classic volcanic rifted margin, characterized by voluminous Paleogene igneous rocks present on both sides of the continent-ocean boundary. The margin displays (1) thickened transitional crust with a well-defined lower crustal high-velocity body and prominent deep crustal reflections, the so-called T-Reflection, (2) seaward dipping reflector (SDR) wedges and a prominent northeast-trending escarpment on the Vøring Marginal High, and (3) extensive sill complexes in the adjacent Cretaceous Vøring Basin. During the last decade, new 2D and 3D industry seismic data along with improved processing techniques, such as broadband processing and noise reduction processing sequences, have made it possible to image and map the breakup igneous complex in much greater detail than previously possible. Our interpretation includes a combination of (1) seismic horizon picking, (2) integrated seismic-gravity-magnetic (SGM) interpretation, (3) seismic volcanostratigraphy, and (4) igneous seismic geomorphology. The results are integrated with published wide-angle seismic data, re-analyzed borehole data including new geochronology, and new geodynamic modeling of the effects of magmatism on the thermal history and subsidence of the margin. The extensive sill complexes and associated hydrothermal vent complexes in the Vøring Basin have a Paleocene-Eocene boundary age based on high-precision U/Pb dating combined with seismic mapping constraints. On the marginal high, our results show a highly variable crustal structure, with a pre-breakup configuration consisting of large-scale structural highs and sedimentary basins. These structures were in-filled and covered by basalt flows and volcanogenic sediments during the early stages of continental breakup in the earliest Eocene. Subsequently, rift basins developed along the continent-ocean boundary and were infilled by up to ca.
6 km thick basalt sequences, currently imaged as SDRs fed by a dike swarm imaged on seismic data. The addition of magma within the crust had a prominent effect on the thermal history and hydrocarbon maturation of the sedimentary basin, causing uplift, delayed subsidence, and possibly contributing to the triggering of global warming during the Paleocene-Eocene Thermal Maximum (PETM).
NASA Astrophysics Data System (ADS)
Brazhnik, Kristina; Sokolova, Zinaida; Baryshnikova, Maria; Bilan, Regina; Nabiev, Igor; Sukhanova, Alyona
Multiplexed analysis of cancer markers is crucial for early tumor diagnosis and screening. We have designed a lab-on-a-bead microarray for quantitative detection of three breast cancer markers in human serum. Quantum dots were used as bead-bound fluorescent tags for identifying each marker by means of flow cytometry. Antigen-specific beads reliably detected CA 15-3, CEA, and CA 125 in serum samples, providing clear discrimination between the samples with respect to antigen levels. The novel microarray is advantageous over routine single-analyte assays due to its simultaneous detection of several markers. Therefore, the developed microarray is a promising tool for serum tumor marker profiling.
Emergent FDA biodefense issues for microarray technology: process analytical technology.
Weinberg, Sandy
2004-11-01
A successful biodefense strategy relies upon some combination of four approaches. A nation can protect its troops and citizenry, first, by advance mass vaccination; second, by responsive ring vaccination; and third, by post-exposure therapeutic treatment (including vaccine therapies). Finally, protection can be achieved by rapid detection followed by exposure limitation (suits and air filters) or immediate treatment (e.g., antibiotics, rapid vaccines and iodine pills). All of these strategies rely upon or are enhanced by microarray technologies. Microarrays can be used to screen, engineer and test vaccines. They are also used to construct early detection tools. While effective biodefense utilizes a variety of tactical tools, microarray technology is a valuable arrow in that quiver.
NASA Astrophysics Data System (ADS)
Shi, Lei; Chu, Zhenyu; Dong, Xueliang; Jin, Wanqin; Dempsey, Eithne
2013-10-01
Highly oriented growth of a hybrid microarray was realized by a facile template-free method on gold substrates for the first time. The proposed formation mechanism involves an interfacial structure-directing force arising from self-assembled monolayers (SAMs) between gold substrates and hybrid crystals. Different SAMs and variable surface coverage of the assembled molecules play a critical role in the interfacial directing forces and influence the morphologies of hybrid films. A highly oriented hybrid microarray was formed on the highly aligned and vertical SAMs of 1,4-benzenedithiol molecules with rigid backbones, which afforded an intense structure-directing power for the oriented growth of hybrid crystals. Additionally, the density of the microarray could be adjusted by controlling the surface coverage of assembled molecules. Based on the hybrid microarray modified electrode with a large specific area (ca. 10 times its geometrical area), a label-free electrochemical DNA biosensor was constructed for the detection of an oligonucleotide fragment of the avian flu virus H5N1. The DNA biosensor displayed a significantly low detection limit of 5 pM (S/N = 3), a wide linear response from 10 pM to 10 nM, as well as excellent selectivity, good regeneration and high stability. We expect that the proposed template-free method can provide a new reference for the fabrication of a highly oriented hybrid array and the as-prepared microarray modified electrode will be a promising paradigm in constructing highly sensitive and selective biosensors.
Electronic supplementary information (ESI) available: Four-probe method for determining the conductivity of the hybrid crystal (Fig. S1); stability comparisons of the hybrid films (Fig. S2); FESEM images of the hybrid microarray (Fig. S3); electrochemical characterizations of the hybrid films (Fig. S4); DFT simulations (Fig. S5); cross-sectional FESEM image of the hybrid microarray (Fig. S6); regeneration and stability tests of the DNA biosensor (Fig. S7). See DOI: 10.1039/c3nr03097k
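Figures such as the 5 pM (S/N = 3) detection limit are typically derived from a calibration line plus blank noise; a generic sketch with invented numbers (not the paper's data):

```python
import numpy as np

# Invented calibration of an electrochemical DNA sensor: response vs
# log10(concentration) over the reported 10 pM - 10 nM linear range,
# plus blank replicates; none of these numbers are from the paper.
conc_pM = np.array([10.0, 100.0, 1e3, 1e4])
signal = np.array([1.9, 4.1, 6.0, 8.1])          # hypothetical responses
blank = np.array([0.50, 0.55, 0.48, 0.52, 0.50]) # hypothetical blank runs

slope, intercept = np.polyfit(np.log10(conc_pM), signal, 1)
lod_signal = blank.mean() + 3.0 * blank.std(ddof=1)   # S/N = 3 criterion
lod_conc = 10.0 ** ((lod_signal - intercept) / slope)
print(f"slope {slope:.2f} per decade, LOD ~ {lod_conc:.1f} pM")
```

The detection limit usually lands below the lowest calibration standard, as it does here.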
Wang, Hong; Bi, Yongyi; Tao, Ning; Wang, Chunhong
2005-08-01
This study aimed to detect the differential expression of cell signal transduction genes associated with benzene poisoning, and to explore the pathogenic mechanisms of blood system damage induced by benzene. The peripheral white blood cell gene expression profiles of 7 benzene poisoning patients, including one with aplastic anemia, were determined by cDNA microarray. Seven chips from normal workers served as controls. Cluster analysis of the gene expression profiles was performed. Among the 4265 target genes, 176 genes associated with cell signal transduction were differentially expressed: 35 up-regulated genes, including PTPRC, STAT4 and IFITM1, were found in at least 6 of the microarrays, and 45 down-regulated genes, including ARHB, PPP3CB and CDC37, were found in at least 5 of the microarrays. cDNA microarray technology is an effective technique for screening differentially expressed cell signal transduction genes. Disorder in cell signal transduction may play a certain role in the pathogenic mechanism of benzene poisoning.
Multi-task feature selection in microarray data by binary integer programming.
Lan, Liang; Vucetic, Slobodan
2013-12-20
A major challenge in microarray classification is that the number of features is typically orders of magnitude larger than the number of examples. In this paper, we propose a novel feature filter algorithm to select the feature subset with maximal discriminative power and minimal redundancy by solving a quadratic objective function with binary integer constraints. To improve the computational efficiency, the binary integer constraints are relaxed and a low-rank approximation to the quadratic term is applied. The proposed feature selection algorithm was extended to solve multi-task microarray classification problems. We compared the single-task version of the proposed feature selection algorithm with 9 existing feature selection methods on 4 benchmark microarray data sets. The empirical results show that the proposed method achieved the most accurate predictions overall. We also evaluated the multi-task version of the proposed algorithm on 8 multi-task microarray datasets. The multi-task feature selection algorithm resulted in significantly higher accuracy than when using the single-task feature selection methods.
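The relax-and-threshold idea above can be sketched in a few lines. This is a minimal illustration, not the authors' algorithm: relevance and redundancy are scored with absolute correlations, the binary indicator vector is relaxed to the box [0, 1], and the relaxed quadratic objective is maximized by projected gradient ascent (the function name and all parameter values are assumptions):

```python
import numpy as np

def select_features(X, y, k=5, lam=1.0, steps=500, lr=0.01):
    """Relaxed quadratic feature filter (illustrative sketch).

    X: (n_samples, n_features) expression matrix, y: (n_samples,) labels.
    Maximizes r^T w - (lam/2) w^T Q w over the box w in [0, 1]^d,
    then keeps the k features with the largest relaxed weights.
    """
    Xs = (X - X.mean(0)) / (X.std(0) + 1e-12)   # standardize features
    ys = (y - y.mean()) / (y.std() + 1e-12)     # standardize labels
    n = len(y)
    r = np.abs(Xs.T @ ys) / n     # relevance: |corr(feature, label)|
    Q = np.abs(Xs.T @ Xs) / n     # redundancy: |corr(feature_i, feature_j)|
    w = np.full(X.shape[1], 0.5)  # relaxed binary indicator in [0, 1]
    for _ in range(steps):
        grad = r - lam * (Q @ w)               # gradient of the relaxed objective
        w = np.clip(w + lr * grad, 0.0, 1.0)   # projected gradient step
    return np.argsort(w)[::-1][:k]             # top-k weights -> selected features
```

In this toy form the relaxation replaces an NP-hard binary integer program with a box-constrained quadratic program, which is the computational shortcut the abstract alludes to.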
Yamamoto, F; Yamamoto, M
2004-07-01
We previously developed a PCR-based DNA fingerprinting technique named the Methylation Sensitive (MS)-AFLP method, which permits comparative genome-wide scanning of methylation status with a manageable number of fingerprinting experiments. The technique uses the methylation sensitive restriction enzyme NotI in the context of the existing Amplified Fragment Length Polymorphism (AFLP) method. Here we report the successful conversion of this gel electrophoresis-based DNA fingerprinting technique into a DNA microarray hybridization technique (DNA Microarray MS-AFLP). By performing a total of 30 (15 x 2 reciprocal labeling) DNA Microarray MS-AFLP hybridization experiments on genomic DNA from two breast and three prostate cancer cell lines in all pairwise combinations, and Southern hybridization experiments using more than 100 different probes, we have demonstrated that the DNA Microarray MS-AFLP is a reliable method for genetic and epigenetic analyses. No statistically significant differences were observed in the number of differences between the breast-prostate hybridization experiments and the breast-breast or prostate-prostate comparisons.
Principles of gene microarray data analysis.
Mocellin, Simone; Rossi, Carlo Riccardo
2007-01-01
The development of several gene expression profiling methods, such as comparative genomic hybridization (CGH), differential display, serial analysis of gene expression (SAGE), and gene microarray, together with the sequencing of the human genome, has provided an opportunity to monitor and investigate the complex cascade of molecular events leading to tumor development and progression. The availability of such large amounts of information has shifted the attention of scientists towards a nonreductionist approach to biological phenomena. High throughput technologies can be used to follow changing patterns of gene expression over time. Among them, gene microarray has become prominent because it is easier to use, does not require large-scale DNA sequencing, and allows for the parallel quantification of thousands of genes from multiple samples. Gene microarray technology is rapidly spreading worldwide and has the potential to drastically change the therapeutic approach to patients affected by tumors. Therefore, it is of paramount importance for both researchers and clinicians to know the principles underlying the analysis of the huge amount of data generated with microarray technology.
Trivedi, Prinal; Edwards, Jode W; Wang, Jelai; Gadbury, Gary L; Srinivasasainagendra, Vinodh; Zakharkin, Stanislav O; Kim, Kyoungmi; Mehta, Tapan; Brand, Jacob P L; Patki, Amit; Page, Grier P; Allison, David B
2005-04-06
Many efforts in microarray data analysis are focused on providing tools and methods for the qualitative analysis of microarray data. HDBStat! (High-Dimensional Biology-Statistics) is a software package designed for the analysis of high-dimensional biology data such as microarray data. It was initially developed for the analysis of microarray gene expression data, but it can also be used for some applications in proteomics and other aspects of genomics. HDBStat! provides statisticians and biologists a flexible and easy-to-use interface to analyze complex microarray data using a variety of methods for data preprocessing, quality control analysis and hypothesis testing. Results generated from data preprocessing, quality control analysis and hypothesis testing methods are output in the form of Excel-compatible CSV tables, graphs and an HTML report summarizing the data analysis. HDBStat! is platform-independent software that is freely available to academic institutions and non-profit organizations. It can be downloaded from our website http://www.soph.uab.edu/ssg_content.asp?id=1164.
Palacín, Arantxa; Gómez-Casado, Cristina; Rivas, Luis A.; Aguirre, Jacobo; Tordesillas, Leticia; Bartra, Joan; Blanco, Carlos; Carrillo, Teresa; Cuesta-Herranz, Javier; de Frutos, Consolación; Álvarez-Eire, Genoveva García; Fernández, Francisco J.; Gamboa, Pedro; Muñoz, Rosa; Sánchez-Monge, Rosa; Sirvent, Sofía; Torres, María J.; Varela-Losada, Susana; Rodríguez, Rosalía; Parro, Victor; Blanca, Miguel; Salcedo, Gabriel; Díaz-Perales, Araceli
2012-01-01
The study of cross-reactivity in allergy is key to both understanding the allergic response of many patients and providing them with a rational treatment. In the present study, protein microarrays and a co-sensitization graph approach were used in conjunction with an allergen microarray immunoassay. This enabled us to include a large number of proteins and a large number of patients, and to study sensitization profiles among members of the LTP family. Fourteen LTPs from the most frequent plant food-induced allergies in the geographical area studied were printed into a microarray specifically designed for this research. 212 patients with fruit allergy and 117 food-tolerant pollen allergic subjects were recruited from seven regions of Spain with different pollen profiles, and their sera were tested with the allergen microarray. This approach has proven itself to be a good tool to study cross-reactivity between members of the LTP family, and could become a useful strategy to analyze other families of allergens. PMID:23272072
Stochastic models for inferring genetic regulation from microarray gene expression data.
Tian, Tianhai
2010-03-01
Microarray expression profiles are inherently noisy and many different sources of variation exist in microarray experiments. It is still a significant challenge to develop stochastic models that capture the noise in microarray expression profiles, which has a profound influence on the reverse engineering of genetic regulation. Using the target genes of the tumour suppressor gene p53 as the test problem, we developed stochastic differential equation models and established the relationship between the noise strength of the stochastic models and the parameters of an error model describing the distribution of the microarray measurements. Numerical results indicate that the simulated variance from stochastic models with a stochastic degradation process can be represented by a monomial in terms of the hybridization intensity, and the order of the monomial depends on the type of stochastic process. The developed stochastic models with multiple stochastic processes generated simulations whose variance is consistent with the prediction of the error model. This work also establishes a general method to develop stochastic models from experimental information.
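A stochastic differential equation model of this kind can be integrated numerically with the Euler-Maruyama scheme. The sketch below is illustrative only: the synthesis/degradation drift, the square-root diffusion term, and every parameter value are assumptions, not the paper's fitted p53-target model:

```python
import numpy as np

def simulate_expression(k_syn=10.0, k_deg=0.1, x0=0.0, T=100.0,
                        dt=0.01, sigma=0.3, n_paths=200, seed=1):
    """Euler-Maruyama integration of a toy expression-noise SDE:

        dx = (k_syn - k_deg * x) dt + sigma * sqrt(x) dW

    i.e. deterministic synthesis/degradation plus intensity-dependent
    (demographic-style) noise. Returns the ensemble of end-point levels.
    """
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    x = np.full(n_paths, x0)
    for _ in range(steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)       # Brownian increments
        drift = (k_syn - k_deg * x) * dt
        diffusion = sigma * np.sqrt(np.maximum(x, 0.0)) * dW
        x = np.maximum(x + drift + diffusion, 0.0)       # levels stay non-negative
    return x
```

Running many paths and comparing the ensemble variance at different mean intensities is the kind of numerical experiment that lets one relate simulated noise strength to an intensity-dependent error model.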
117.6-kilobit telemetry from Mercury in-flight system analysis
NASA Technical Reports Server (NTRS)
Evanchuk, V. L.
1974-01-01
This paper discusses very specifically the mode of the Mariner Venus/Mercury 1973 (MVM'73) telecommunications system in the interplexed dual channel 117.6 kilobits per second (kbps) and 2.45 kbps telemetry. This mode, originally designed for only Venus encounter, was also used at Mercury despite significantly less performance margin. Detailed analysis and careful measurement of system performance before and during flight operations allowed critical operational decisions, which made maximum use of the system capabilities.
Insurgent Design: The Re-Emergence of Al-Qaida from 9/11 to the Present
2015-12-01
Analysts disagree on how to characterize al-Qa’ida’s evolution. One perspective regards jihadi-Islamism in general as self-marginalizing.
[Peritumoral hemorrhage immediately after radiosurgery for metastatic brain tumor].
Uchino, Masafumi; Kitajima, Satoru; Miyazaki, Chikao; Otsuka, Takashi; Seiki, Yoshikatsu; Shibata, Iekado
2003-08-01
We report a case of a 44-year-old woman with metastatic brain tumors who suffered peritumoral hemorrhage soon after stereotactic radiosurgery (SRS). She had been suffering from breast cancer with multiple systemic metastases. She started to have headache, nausea, dizziness and speech disturbance 1 month before admission. There was no bleeding tendency in the hematological examination and the patient was normotensive. Neurological examination disclosed headache and slight aphasia. Magnetic resonance imaging showed a large round mass lesion in the left temporal lobe. It was a well-demarcated, highly enhanced mass, 45 mm in diameter. SRS was performed on four lesions in a single session (main mass: maximum dose 30 Gy at the center and 20 Gy at the tumor margin; other lesions: maximum 25 Gy, margin 20 Gy). After radiosurgery, she had severe headache, nausea and vomiting and showed progression of aphasia. CT scan revealed a peritumoral hemorrhage. Conservative therapy was undertaken and the patient's symptoms improved. After 7 days, she was discharged, able to walk. The patient died of extensive distant metastasis 5 months after SRS. Acute transient swelling following conventional radiotherapy is a well-documented phenomenon. However, the present case indicates that such an occurrence is also possible in SRS. We hypothesize that acute reactions such as brain swelling occur due to breakdown of the fragile vessels of the tumor or surrounding tissue.
Large-scale analysis of gene expression using cDNA microarrays promises the rapid detection of the mode of toxicity for drugs and other chemicals. cDNA microarrays were used to examine chemically-induced alterations of gene expression in HepG2 cells exposed to oxidative ...
Where statistics and molecular biology meet: microarray experiments.
Kelmansky, Diana M
2013-01-01
This review chapter presents a statistical point of view to microarray experiments with the purpose of understanding the apparent contradictions that often appear in relation to their results. We give a brief introduction of molecular biology for nonspecialists. We describe microarray experiments from their construction and the biological principles the experiments rely on, to data acquisition and analysis. The role of epidemiological approaches and sample size considerations are also discussed.
The objective of this study is to develop a microarray to test for cyanobacteria and cyanotoxin genes in drinking water reservoirs as an aid to risk assessment and management of water supplies. The microarray will include probes recognizing important freshwater cyanobacterial tax...
Chao, Jie; Li, Zhenhua; Li, Jing; Peng, Hongzhen; Su, Shao; Li, Qian; Zhu, Changfeng; Zuo, Xiaolei; Song, Shiping; Wang, Lianhui; Wang, Lihua
2016-07-15
Microarrays of biomolecules hold great promise in the fields of genomics, proteomics, and clinical assays on account of their remarkably parallel and high-throughput assay capability. However, the fluorescence detection used in most conventional DNA microarrays is still limited by sensitivity. In this study, we have demonstrated a novel universal and highly sensitive platform for fluorescent detection of sequence-specific DNA at the femtomolar level by combining dextran-coated microarrays with hybridization chain reaction (HCR) signal amplification. A three-dimensional dextran matrix was covalently coated on a glass surface as the scaffold to immobilize DNA recognition probes, increasing the surface binding capacity and accessibility. DNA nanowire tentacles were formed on the matrix surface for efficient signal amplification by capturing multiple fluorescent molecules in a highly ordered way. By quantifying microscopic fluorescent signals, the synergetic effects of dextran and HCR greatly improved the sensitivity of the DNA microarrays, with a detection limit of 10 fM (1×10^5 molecules). This detection assay could recognize a one-base mismatch, with fluorescence signals dropping to ~20%. This cost-effective microarray platform also worked well with samples in serum and thus shows great potential for clinical diagnosis.
Controlling false-negative errors in microarray differential expression analysis: a PRIM approach.
Cole, Steve W; Galic, Zoran; Zack, Jerome A
2003-09-22
Theoretical considerations suggest that current microarray screening algorithms may fail to detect many true differences in gene expression (Type II analytic errors). We assessed 'false negative' error rates in differential expression analyses by conventional linear statistical models (e.g. the t-test), microarray-adapted variants (e.g. SAM, Cyber-T), and a novel strategy based on hold-out cross-validation. The latter approach employs the machine-learning algorithm Patient Rule Induction Method (PRIM) to infer minimum thresholds for reliable change in gene expression from Boolean conjunctions of fold-induction and raw fluorescence measurements. Monte Carlo analyses based on four empirical data sets show that conventional statistical models and their microarray-adapted variants overlook more than 50% of genes showing significant up-regulation. Conjoint PRIM prediction rules recover approximately twice as many differentially expressed transcripts while maintaining strong control over false-positive (Type I) errors. As a result, experimental replication rates increase and total analytic error rates decline. RT-PCR studies confirm that gene inductions detected by PRIM but overlooked by other methods represent true changes in mRNA levels. PRIM-based conjoint inference rules thus represent an improved strategy for high-sensitivity screening of DNA microarrays. A freestanding Java application is available at http://microarray.crump.ucla.edu/focus
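The headline claim — that per-gene t-tests with small samples and stringent multiple-testing control miss a large share of true inductions — can be reproduced qualitatively with a short Monte Carlo sketch. The sample size, effect size, and Bonferroni correction below are hypothetical choices, not the paper's four empirical data sets:

```python
import numpy as np
from scipy import stats

def false_negative_rate(n_genes=1000, n_true=200, n_per_group=4,
                        effect=1.5, alpha=0.05, seed=0):
    """Monte Carlo estimate of the Type II error rate of per-gene t-tests.

    Simulates n_genes genes with n_per_group replicates per condition; the
    first n_true genes carry a true mean shift of `effect` (in SD units).
    Returns the fraction of truly changed genes missed after Bonferroni
    correction.
    """
    rng = np.random.default_rng(seed)
    a = rng.normal(0.0, 1.0, (n_genes, n_per_group))
    b = rng.normal(0.0, 1.0, (n_genes, n_per_group))
    b[:n_true] += effect                      # true up-regulation
    _, p = stats.ttest_ind(a, b, axis=1)      # per-gene two-sample t-test
    p_adj = p * n_genes                       # Bonferroni adjustment
    missed = np.sum(p_adj[:n_true] > alpha)   # true changes not detected
    return missed / n_true
```

With only four replicates per group and a genome-wide correction, the simulated false-negative rate is dramatic, which is consistent with the abstract's argument for more sensitive screening rules.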
Nanotechnology: moving from microarrays toward nanoarrays.
Chen, Hua; Li, Jun
2007-01-01
Microarrays are important tools for high-throughput analysis of biomolecules. The use of microarrays for parallel screening of nucleic acid and protein profiles has become an industry standard. Limitations of microarrays include the requirement for relatively large sample volumes and long incubation times, as well as limits of detection. In addition, traditional microarrays rely on bulky instrumentation for detection, and sample amplification and labeling are laborious, which increases analysis cost and delays results. These problems keep microarray techniques from point-of-care and field applications. One strategy for overcoming these problems is to develop nanoarrays, particularly electronics-based nanoarrays. With further miniaturization, higher sensitivity, and simplified sample preparation, nanoarrays could potentially be employed for biomolecular analysis in personal healthcare and monitoring of trace pathogens. This chapter introduces the concept and advantages of nanotechnology and then describes current methods and protocols for novel nanoarrays in three areas: (1) label-free nucleic acid analysis using nanoarrays, (2) nanoarrays for protein detection by conventional optical fluorescence microscopy as well as by novel label-free methods such as atomic force microscopy, and (3) nanoarrays for enzymatic-based assays. These nanoarrays will have significant applications in drug discovery, medical diagnosis, genetic testing, environmental monitoring, and food safety inspection.
Severgnini, Marco; Bicciato, Silvio; Mangano, Eleonora; Scarlatti, Francesca; Mezzelani, Alessandra; Mattioli, Michela; Ghidoni, Riccardo; Peano, Clelia; Bonnal, Raoul; Viti, Federica; Milanesi, Luciano; De Bellis, Gianluca; Battaglia, Cristina
2006-06-01
Meta-analysis of microarray data is increasingly important, considering both the availability of multiple platforms using disparate technologies and the accumulation in public repositories of data sets from different laboratories. We addressed the issue of comparing gene expression profiles from two microarray platforms by devising a standardized investigative strategy. We tested this procedure by studying MDA-MB-231 cells, which undergo apoptosis on treatment with resveratrol. Gene expression profiles were obtained using high-density, short-oligonucleotide, single-color microarray platforms: GeneChip (Affymetrix) and CodeLink (Amersham). Interplatform analyses were carried out on 8414 common transcripts represented on both platforms, as identified by LocusLink ID, representing 70.8% and 88.6% of annotated GeneChip and CodeLink features, respectively. We identified 105 differentially expressed genes (DEGs) on CodeLink and 42 DEGs on GeneChip. Among them, only 9 DEGs were commonly identified by both platforms. Multiple analyses (BLAST alignment of probes with target sequences, gene ontology, literature mining, and quantitative real-time PCR) permitted us to investigate the factors contributing to the generation of platform-dependent results in single-color microarray experiments. An effective approach to cross-platform comparison involves microarrays of similar technologies, samples prepared by identical methods, and a standardized battery of bioinformatic and statistical analyses.
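The interplatform step described above — restricting both DEG lists to transcripts represented on the two platforms (matched by a common identifier such as LocusLink ID) and intersecting them — can be sketched in a few lines; the function name and the Jaccard summary are illustrative additions:

```python
def cross_platform_overlap(degs_a, degs_b, common_ids):
    """Compare DEG calls from two microarray platforms.

    degs_a, degs_b: iterables of DEG identifiers from each platform.
    common_ids: identifiers represented on both platforms.
    Returns the shared DEGs and a Jaccard index of agreement.
    """
    a = set(degs_a) & set(common_ids)   # platform-A DEGs on shared transcripts
    b = set(degs_b) & set(common_ids)   # platform-B DEGs on shared transcripts
    shared = a & b
    union = a | b
    jaccard = len(shared) / len(union) if union else 0.0
    return shared, jaccard
```

Applied to the counts in the abstract (105 and 42 DEGs, 9 shared), such a summary makes the low interplatform concordance explicit, motivating the probe-alignment and validation analyses the authors performed.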
Janse, Ingmar; Bok, Jasper M.; Hamidjaja, Raditijo A.; Hodemaekers, Hennie M.; van Rotterdam, Bart J.
2012-01-01
Microarrays provide a powerful analytical tool for the simultaneous detection of multiple pathogens. We developed diagnostic suspension microarrays for sensitive and specific detection of the biothreat pathogens Bacillus anthracis, Yersinia pestis, Francisella tularensis and Coxiella burnetii. Two assay chemistries for amplification and labeling were developed, one method using direct hybridization and the other using target-specific primer extension, combined with hybridization to universal arrays. Asymmetric PCR products for both assay chemistries were produced by using a multiplex asymmetric PCR amplifying 16 DNA signatures (16-plex). The performances of both assay chemistries were compared and their advantages and disadvantages are discussed. The developed microarrays detected multiple signature sequences and an internal control which made it possible to confidently identify the targeted pathogens and assess their virulence potential. The microarrays were highly specific and detected various strains of the targeted pathogens. Detection limits for the different pathogen signatures were similar or slightly higher compared to real-time PCR. Probit analysis showed that even a few genomic copies could be detected with 95% confidence. The microarrays detected DNA from different pathogens mixed in different ratios and from spiked or naturally contaminated samples. The assays that were developed have a potential for application in surveillance and diagnostics. PMID:22355407
[Typing and subtyping avian influenza virus using DNA microarrays].
Yang, Zhongping; Wang, Xiurong; Tian, Lina; Wang, Yu; Chen, Hualan
2008-07-01
Outbreaks of highly pathogenic avian influenza (HPAI) virus have caused great economic loss to the poultry industry and resulted in human deaths in Thailand and Vietnam since 2004. Rapid typing and subtyping of viruses, especially HPAI, from clinical specimens are desirable for taking prompt control measures to prevent spread of the disease. We describe a simultaneous approach using a microarray to detect and subtype avian influenza virus (AIV). We designed primers for probe genes and used reverse transcriptase PCR to prepare cDNAs of the AIV M gene, the H5, H7 and H9 subtype haemagglutinin genes, and the N1 and N2 subtype neuraminidase genes. They were cloned, sequenced, reamplified and spotted to form glass-bound microarrays. We labeled samples with Cy3-dUTP by RT-PCR, then hybridized and scanned the microarrays to type and subtype AIV. The hybridization pattern agreed perfectly with the known grid location of each probe; no cross-hybridization could be detected. Examination of HA subtypes 1 through 15, 30 infected samples and 21 field samples revealed that the DNA microarray assay was more sensitive and specific than the RT-PCR test and chicken embryo inoculation. It can simultaneously detect and differentiate the main epidemic AIV strains. The results show that DNA microarray technology is a useful diagnostic method.
Grenville-Briggs, Laura J; Stansfield, Ian
2011-01-01
This report describes a linked series of Masters-level computer practical workshops. They comprise an advanced functional genomics investigation, based upon analysis of a microarray dataset probing yeast DNA damage responses. The workshops require the students to analyse highly complex transcriptomics datasets, and were designed to stimulate active learning through experience of current research methods in bioinformatics and functional genomics. They seek to closely mimic a realistic research environment, and require the students first to propose research hypotheses, then test those hypotheses using specific sections of the microarray dataset. The complexity of the microarray data provides students with the freedom to propose their own unique hypotheses, tested using appropriate sections of the microarray data. This research latitude was highly regarded by students and is a strength of this practical. In addition, the focus on DNA damage by radiation and mutagenic chemicals allows them to place their results in a human medical context, and successfully sparks broad interest in the subject material. In evaluation, 79% of students scored the practical workshops on a five-point scale as 4 or 5 (totally effective) for student learning. More broadly, the general use of microarray data as a "student research playground" is also discussed.
2010-01-01
Background Recent developments in high-throughput methods of analyzing transcriptomic profiles are promising for many areas of biology, including ecophysiology. However, although commercial microarrays are available for most common laboratory models, transcriptome analysis in non-traditional model species still remains a challenge. Indeed, the signal resulting from heterologous hybridization is low and difficult to interpret because of the weak complementarity between probe and target sequences, especially when no microarray dedicated to a genetically close species is available. Results We show here that transcriptome analysis in a species genetically distant from laboratory models is made possible by using MAXRS, a new method of analyzing heterologous hybridization on microarrays. This method takes advantage of the design of several commercial microarrays, with different probes targeting the same transcript. To illustrate and test this method, we analyzed the transcriptome of king penguin pectoralis muscle hybridized to Affymetrix chicken microarrays, two organisms separated by an evolutionary distance of approximately 100 million years. The differential gene expression observed between different physiological situations computed by MAXRS was confirmed by real-time PCR on 10 genes out of 11 tested. Conclusions MAXRS appears to be an appropriate method for gene expression analysis under heterologous hybridization conditions. PMID:20509979
Microarray platform affords improved product analysis in mammalian cell growth studies
Li, Lingyun; Migliore, Nicole; Schaefer, Eugene; Sharfstein, Susan T.; Dordick, Jonathan S.; Linhardt, Robert J.
2014-01-01
High throughput (HT) platforms serve as cost-efficient and rapid screening methods for evaluating the effects of cell culture conditions and screening of chemicals. The aim of the current study was to develop a high-throughput cell-based microarray platform to assess the effect of culture conditions on Chinese hamster ovary (CHO) cells. Specifically, growth, transgene expression and metabolism of a GS/MSX CHO cell line, which produces a therapeutic monoclonal antibody, were examined using the microarray system in conjunction with a conventional shake flask platform in a non-proprietary medium. The microarray system consists of 60 nl spots of cells encapsulated in alginate and separated in groups via an 8-well chamber system attached to the chip. Results show the non-proprietary medium developed allows cell growth, production and normal glycosylation of recombinant antibody and metabolism of the recombinant CHO cells in both the microarray and shake flask platforms. In addition, 10.3 mM glutamate addition to the defined base media results in a lactate metabolism shift in the recombinant GS/MSX CHO cells in the shake flask platform. Ultimately, the results demonstrate that the high-throughput microarray platform has the potential to be utilized for evaluating the impact of media additives on cellular processes such as cell growth, metabolism and productivity. PMID:24227746
Kawaura, Kanako; Mochida, Keiichi; Yamazaki, Yukiko; Ogihara, Yasunari
2006-04-01
In this study, we constructed a 22k wheat oligo-DNA microarray. A total of 148,676 expressed sequence tags of common wheat were collected from the database of the Wheat Genomics Consortium of Japan. These were grouped into 34,064 contigs, which were then used to design an oligonucleotide DNA microarray. Following a multistep selection of the sense strand, 21,939 60-mer oligo-DNA probes were selected for attachment on the microarray slide. This 22k oligo-DNA microarray was used to examine the transcriptional response of wheat to salt stress. More than 95% of the probes gave reproducible hybridization signals when targeted with RNAs extracted from salt-treated wheat shoots and roots. With the microarray, we identified 1,811 genes whose expressions changed more than 2-fold in response to salt. These included genes known to mediate response to salt, as well as unknown genes, and they were classified into 12 major groups by hierarchical clustering. These gene expression patterns were also confirmed by real-time reverse transcription-PCR. Many of the genes with unknown function were clustered together with genes known to be involved in response to salt stress. Thus, analysis of gene expression patterns combined with gene ontology should help identify the function of the unknown genes. Also, functional analysis of these wheat genes should provide new insight into the response to salt stress. Finally, these results indicate that the 22k oligo-DNA microarray is a reliable method for monitoring global gene expression patterns in wheat.
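The screening-and-clustering step described above (keep genes whose expression changes more than 2-fold between conditions, then group their expression patterns by hierarchical clustering) can be sketched as follows; the linkage method, pseudocount, grouping count, and function name are assumptions, not the study's exact pipeline:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_responsive_genes(expr_control, expr_treated, gene_ids,
                             fold=2.0, n_groups=12):
    """Fold-change screen followed by hierarchical clustering (sketch).

    expr_control, expr_treated: (n_genes, n_samples) intensity matrices.
    Keeps genes whose expression changes >= `fold` in any sample, then
    clusters their log-ratio profiles into up to n_groups flat clusters.
    """
    log_ratio = np.log2((expr_treated + 1e-9) / (expr_control + 1e-9))
    hit = np.any(np.abs(log_ratio) >= np.log2(fold), axis=1)  # >= fold change
    profiles = log_ratio[hit]
    Z = linkage(profiles, method="average")                   # hierarchical tree
    labels = fcluster(Z, t=n_groups, criterion="maxclust")    # flat clusters
    hits = [g for g, h in zip(gene_ids, hit) if h]
    return hits, labels
```

Grouping the ~1,800 responsive genes into a small number of clusters is what allows co-clustered genes of unknown function to borrow functional hypotheses from annotated neighbors, as the abstract suggests.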
Identification of new autoantigens for primary biliary cirrhosis using human proteome microarrays.
Hu, Chao-Jun; Song, Guang; Huang, Wei; Liu, Guo-Zhen; Deng, Chui-Wen; Zeng, Hai-Pan; Wang, Li; Zhang, Feng-Chun; Zhang, Xuan; Jeong, Jun Seop; Blackshaw, Seth; Jiang, Li-Zhi; Zhu, Heng; Wu, Lin; Li, Yong-Zhe
2012-09-01
Primary biliary cirrhosis (PBC) is a chronic cholestatic liver disease of unknown etiology and is considered to be an autoimmune disease. Autoantibodies are important tools for accurate diagnosis of PBC. Here, we employed serum profiling analysis using a human proteome microarray composed of about 17,000 full-length unique proteins and identified 23 proteins that correlated with PBC. To validate these results, we fabricated a PBC-focused microarray with 21 of these newly identified candidates and nine additional known PBC antigens. By screening the PBC microarrays with additional cohorts of 191 PBC patients and 321 controls (43 autoimmune hepatitis, 55 hepatitis B virus, 31 hepatitis C virus, 48 rheumatoid arthritis, 45 systematic lupus erythematosus, 49 systemic sclerosis, and 50 healthy), six proteins were confirmed as novel PBC autoantigens with high sensitivities and specificities, including hexokinase-1 (isoforms I and II), Kelch-like protein 7, Kelch-like protein 12, zinc finger and BTB domain-containing protein 2, and eukaryotic translation initiation factor 2C, subunit 1. To facilitate clinical diagnosis, we developed ELISAs for Kelch-like protein 12 and zinc finger and BTB domain-containing protein 2 and tested large cohorts (297 PBC and 637 control sera) to confirm the sensitivities and specificities observed in the microarray-based assays. In conclusion, our research showed that a strategy using a high-content protein microarray combined with a smaller but more focused protein microarray can effectively identify and validate novel PBC-specific autoantigens and has the capacity to be translated to clinical diagnosis by means of an ELISA-based method.
Li, Xiang; Harwood, Valerie J.; Nayak, Bina
2016-01-01
Pathogen identification and microbial source tracking (MST) to identify sources of fecal pollution improve evaluation of water quality. They contribute to improved assessment of human health risks and remediation of pollution sources. An MST microarray was used to simultaneously detect genes for multiple pathogens and indicators of fecal pollution in freshwater, marine water, sewage-contaminated freshwater and marine water, and treated wastewater. Dead-end ultrafiltration (DEUF) was used to concentrate organisms from water samples, yielding a recovery efficiency of >95% for Escherichia coli and human polyomavirus. Whole-genome amplification (WGA) increased gene copies from ultrafiltered samples and increased the sensitivity of the microarray. Viruses (adenovirus, bocavirus, hepatitis A virus, and human polyomaviruses) were detected in sewage-contaminated samples. Pathogens such as Legionella pneumophila, Shigella flexneri, and Campylobacter fetus were detected along with genes conferring resistance to aminoglycosides, beta-lactams, and tetracycline. Nonmetric dimensional analysis of MST marker genes grouped sewage-spiked freshwater and marine samples with sewage and apart from other fecal sources. The sensitivity (percent true positives) of the microarray probes for gene targets anticipated in sewage was 51 to 57% and was lower than the specificity (percent true negatives; 79 to 81%). A linear relationship between gene copies determined by quantitative PCR and microarray fluorescence was found, indicating the semiquantitative nature of the MST microarray. These results indicate that ultrafiltration coupled with WGA provides sufficient nucleic acids for detection of viruses, bacteria, protozoa, and antibiotic resistance genes by the microarray in applications ranging from beach monitoring to risk assessment. PMID:26729716
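The sensitivity and specificity figures quoted above follow the usual definitions (fraction of expected targets detected, and fraction of non-expected targets correctly not detected), which can be made concrete in a few lines; the helper below is a hypothetical illustration, not the study's code:

```python
def sensitivity_specificity(detected, expected, all_targets):
    """Sensitivity/specificity of a multi-target detection panel.

    detected: probe targets called positive in a sample.
    expected: targets anticipated to be present (ground truth positives).
    all_targets: every target on the panel.
    """
    detected, expected = set(detected), set(expected)
    non_expected = set(all_targets) - expected
    tp = len(detected & expected)        # true positives
    tn = len(non_expected - detected)    # true negatives
    sens = tp / len(expected) if expected else 0.0
    spec = tn / len(non_expected) if non_expected else 0.0
    return sens, spec
```

Under these definitions, the reported 51-57% sensitivity versus 79-81% specificity says the microarray missed expected sewage targets more often than it falsely called absent ones.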
MADGE: scalable distributed data management software for cDNA microarrays.
McIndoe, Richard A; Lanzen, Aaron; Hurtz, Kimberly
2003-01-01
The human genome project and the development of new high-throughput technologies have created unparalleled opportunities to study the mechanism of diseases, monitor the disease progression and evaluate effective therapies. Gene expression profiling is a critical tool to accomplish these goals. The use of nucleic acid microarrays to assess the gene expression of thousands of genes simultaneously has seen phenomenal growth over the past five years. Although commercial sources of microarrays exist, investigators wanting more flexibility in the genes represented on the array will turn to in-house production. The creation and use of cDNA microarrays is a complicated process that generates an enormous amount of information. Effective data management of this information is essential to efficiently access, analyze, troubleshoot and evaluate the microarray experiments. We have developed a distributable software package designed to track and store the various pieces of data generated by a cDNA microarray facility. This includes the clone collection storage data, annotation data, workflow queues, microarray data, data repositories, sample submission information, and project/investigator information. This application was designed using a 3-tier client server model. The data access layer (1st tier) contains the relational database system tuned to support a large number of transactions. The data services layer (2nd tier) is a distributed COM server with full database transaction support. The application layer (3rd tier) is an internet based user interface that contains both client and server side code for dynamic interactions with the user. This software is freely available to academic institutions and non-profit organizations at http://www.genomics.mcg.edu/niddkbtc.
2012-01-01
Background High-resolution genetic maps are needed in many crops to help characterize the genetic diversity that determines agriculturally important traits. Hybridization to microarrays to detect single feature polymorphisms is a powerful technique for marker discovery and genotyping because of its highly parallel nature. However, microarrays designed for gene expression analysis rarely provide sufficient gene coverage for optimal detection of nucleotide polymorphisms, which limits utility in species with low rates of polymorphism such as lettuce (Lactuca sativa). Results We developed a 6.5 million feature Affymetrix GeneChip® for efficient polymorphism discovery and genotyping, as well as for analysis of gene expression in lettuce. Probes on the microarray were designed from 26,809 unigenes from cultivated lettuce and an additional 8,819 unigenes from four related species (L. serriola, L. saligna, L. virosa and L. perennis). Where possible, probes were tiled with a 2 bp stagger, alternating on each DNA strand, providing an average of 187 probes covering approximately 600 bp for each of over 35,000 unigenes and resulting in up to 13-fold redundancy in coverage per nucleotide. We developed protocols for hybridization of genomic DNA to the GeneChip® and refined custom algorithms that utilized coverage from multiple, high-quality probes to detect single position polymorphisms in 2 bp sliding windows across each unigene. This allowed us to detect more than 18,000 polymorphisms between the parental lines of our core mapping population, as well as numerous polymorphisms between cultivated lettuce and wild species in the lettuce genepool. Using marker data from our diversity panel comprising 52 accessions from the five species listed above, we were able to separate accessions by species using both phylogenetic and principal component analyses. Additionally, we estimated the diversity between different types of cultivated lettuce and distinguished morphological types.
Conclusion By hybridizing genomic DNA to a custom oligonucleotide array designed for maximum gene coverage, we were able to identify polymorphisms using two approaches for pair-wise comparisons, as well as a highly parallel method that compared all 52 genotypes simultaneously. PMID:22583801
Stoffel, Kevin; van Leeuwen, Hans; Kozik, Alexander; Caldwell, David; Ashrafi, Hamid; Cui, Xinping; Tan, Xiaoping; Hill, Theresa; Reyes-Chin-Wo, Sebastian; Truco, Maria-Jose; Michelmore, Richard W; Van Deynze, Allen
2012-05-14
Li, Jun; Fu, Cuizhang; Lei, Guangchun
2011-01-01
Few studies have explored the role of Cenozoic tectonic evolution in shaping patterns and processes of extant animal distributions within East Asian margins. We select Hynobius salamanders (Amphibia: Hynobiidae) as a model to examine biogeographical consequences of Cenozoic tectonic events within East Asian margins. First, we use GenBank molecular data to reconstruct phylogenetic interrelationships of Hynobius by Bayesian and maximum likelihood analyses. Second, we estimate the divergence time using the Bayesian relaxed clock approach and infer dispersal/vicariance histories under the ‘dispersal–extinction–cladogenesis’ model. Finally, we test whether the evolutionary history and biogeographical processes of Hynobius coincide with the predictions of two major hypotheses (the ‘vicariance’ and ‘out of southwestern Japan’ hypotheses). The resulting phylogeny confirmed Hynobius as a monophyletic group, which could be divided into nine major clades associated with six geographical areas. Our results show that: (1) the most recent common ancestor of Hynobius was distributed in southwestern Japan and Hokkaido Island, (2) a sister taxon relationship between Hynobius retardatus and all remaining species was the result of a vicariance event between Hokkaido Island and southwestern Japan in the Middle Eocene, (3) ancestral Hynobius in southwestern Japan dispersed into Taiwan Island, central China, ‘Korean Peninsula and northeastern China’ as well as northeastern Honshu during the Late Eocene–Late Miocene. Our findings suggest that Cenozoic tectonic evolution plays an important role in shaping disjunctive distributions of extant Hynobius within East Asian margins. PMID:21738684
Maslow, Bat-Sheva L; Budinetz, Tara; Sueldo, Carolina; Anspach, Erica; Engmann, Lawrence; Benadiva, Claudio; Nulsen, John C
2015-07-01
To compare the analysis of chromosome number from paraffin-embedded products of conception using single-nucleotide polymorphism (SNP) microarray with the recommended screening for the evaluation of couples presenting with recurrent pregnancy loss who do not have previous fetal cytogenetic data. We performed a retrospective cohort study including all women who presented for a new evaluation of recurrent pregnancy loss over a 2-year period (January 1, 2012, to December 31, 2013). All participants had at least two documented first-trimester losses and both the recommended screening tests and SNP microarray performed on at least one paraffin-embedded products of conception sample. Single-nucleotide polymorphism microarray identifies all 24 chromosomes (22 autosomes, X, and Y). Forty-two women with a total of 178 losses were included in the study. Paraffin-embedded products of conception from 62 losses were sent for SNP microarray. Single-nucleotide polymorphism microarray successfully diagnosed fetal chromosome number in 71% (44/62) of samples, of which 43% (19/44) were euploid and 57% (25/44) were noneuploid. Seven of 42 (17%) participants had abnormalities on recurrent pregnancy loss screening. The per-person detection rate for a cause of pregnancy loss was significantly higher in the SNP microarray (0.50; 95% confidence interval [CI] 0.36-0.64) compared with recurrent pregnancy loss evaluation (0.17; 95% CI 0.08-0.31) (P=.002). Participants with one or more euploid loss identified on paraffin-embedded products of conception were significantly more likely to have an abnormality on recurrent pregnancy loss screening than those with only noneuploid results (P=.028). The significance remained when controlling for age, number of losses, number of samples, and total pregnancies. 
These results suggest that SNP microarray testing of paraffin-embedded products of conception is a valuable tool for the evaluation of recurrent pregnancy loss in patients without prior fetal cytogenetic results. Recommended recurrent pregnancy loss screening was unnecessary in almost half the patients in our study. Level of Evidence: II.
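The per-person detection rates and 95% confidence intervals quoted above (21/42 for SNP microarray, 7/42 for recommended screening) are consistent with a Wilson score interval for a binomial proportion; the abstract does not state which interval the authors used, so the reconstruction below is illustrative:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# 21 of 42 women had a cause identified by SNP microarray;
# 7 of 42 had an abnormality on recommended screening.
print([round(v, 2) for v in wilson_ci(21, 42)])  # [0.36, 0.64]
print([round(v, 2) for v in wilson_ci(7, 42)])   # [0.08, 0.31]
```

Both intervals reproduce the CIs reported in the abstract to two decimal places.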
Zeller, Tanja; Wild, Philipp S.; Truong, Vinh; Trégouët, David-Alexandre; Munzel, Thomas; Ziegler, Andreas; Cambien, François; Blankenberg, Stefan; Tiret, Laurence
2011-01-01
Background The hypothesis of dosage compensation of genes of the X chromosome, supported by previous microarray studies, was recently challenged by RNA-sequencing data. It was suggested that microarray studies were biased toward an over-estimation of X-linked expression levels as a consequence of the filtering of genes below the detection threshold of microarrays. Methodology/Principal Findings To investigate this hypothesis, we used microarray expression data from circulating monocytes in 1,467 individuals. In total, 25,349 and 1,156 probes were unambiguously assigned to autosomes and the X chromosome, respectively. Globally, there was a clear shift of X-linked expressions toward lower levels than autosomes. We compared the ratio of expression levels of X-linked to autosomal transcripts (X∶AA) using two different filtering methods: 1. gene expressions were filtered out using a detection threshold irrespective of gene chromosomal location (the standard method in microarrays); 2. equal proportions of genes were filtered out separately on the X and on autosomes. For a wide range of filtering proportions, the X∶AA ratio estimated with the first method was not significantly different from 1, the value expected if dosage compensation was achieved, whereas it was significantly lower than 1 with the second method, leading to the rejection of the hypothesis of dosage compensation. We further showed in simulated data that the choice of the most appropriate method was dependent on biological assumptions regarding the proportion of actively expressed genes on the X chromosome comparative to the autosomes and the extent of dosage compensation. Conclusion/Significance This study shows that the method used for filtering out lowly expressed genes in microarrays may have a major impact according to the hypothesis investigated. The hypothesis of dosage compensation of X-linked genes cannot be firmly accepted or rejected using microarray-based data. PMID:21912656
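The two filtering strategies compared above can be sketched on simulated data. The toy distributions below (autosomal log2 expression centered at 8, X-linked at 7, i.e. roughly half the linear level, as expected without dosage compensation) are assumptions for illustration, not the monocyte data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy log2-expression: X-linked genes shifted one unit (~2-fold) lower.
auto = rng.normal(8.0, 2.0, 25349)
x = rng.normal(7.0, 2.0, 1156)

def x_aa_ratio(auto_kept, x_kept):
    # Ratio of mean linear expression, X : autosomes.
    return np.mean(2.0 ** x_kept) / np.mean(2.0 ** auto_kept)

# Method 1: one detection threshold for all genes, regardless of chromosome.
t = 7.0
r1 = x_aa_ratio(auto[auto > t], x[x > t])

# Method 2: filter out the same *proportion* of genes on X and on autosomes.
frac = np.mean(np.concatenate([auto, x]) <= t)  # overall fraction removed
r2 = x_aa_ratio(auto[auto > np.quantile(auto, frac)],
                x[x > np.quantile(x, frac)])

# The common threshold removes more of the (lower-expressed) X-linked genes,
# pulling the surviving X mean upward, so r1 overstates the X:AA ratio.
print(r1 > r2)  # True
```

This reproduces the qualitative effect described in the abstract: the standard threshold-based filter pushes the estimated X∶AA ratio toward 1, while equal-proportion filtering leaves it below 1.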
Genome-scale cluster analysis of replicated microarrays using shrinkage correlation coefficient.
Yao, Jianchao; Chang, Chunqi; Salmi, Mari L; Hung, Yeung Sam; Loraine, Ann; Roux, Stanley J
2008-06-18
Currently, clustering with some form of correlation coefficient as the gene similarity metric has become a popular method for profiling genomic data. The Pearson correlation coefficient and the standard deviation (SD)-weighted correlation coefficient are the two most widely-used correlations as the similarity metrics in clustering microarray data. However, these two correlations are not optimal for analyzing replicated microarray data generated by most laboratories. An effective correlation coefficient is needed to provide statistically sufficient analysis of replicated microarray data. In this study, we describe a novel correlation coefficient, shrinkage correlation coefficient (SCC), that fully exploits the similarity between the replicated microarray experimental samples. The methodology considers both the number of replicates and the variance within each experimental group in clustering expression data, and provides a robust statistical estimation of the error of replicated microarray data. The value of SCC is revealed by its comparison with two other correlation coefficients that are currently the most widely-used (Pearson correlation coefficient and SD-weighted correlation coefficient) using statistical measures on both synthetic expression data as well as real gene expression data from Saccharomyces cerevisiae. Two leading clustering methods, hierarchical and k-means clustering were applied for the comparison. The comparison indicated that using SCC achieves better clustering performance. Applying SCC-based hierarchical clustering to the replicated microarray data obtained from germinating spores of the fern Ceratopteris richardii, we discovered two clusters of genes with shared expression patterns during spore germination. Functional analysis suggested that some of the genetic mechanisms that control germination in such diverse plant lineages as mosses and angiosperms are also conserved among ferns. 
This study shows that SCC is an alternative to the Pearson correlation coefficient and the SD-weighted correlation coefficient, and is particularly useful for clustering replicated microarray data. This computational approach should be generally useful for proteomic data or other high-throughput analysis methodology.
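The published SCC estimator is not reproduced in the abstract; the sketch below only illustrates the general idea of a replicate-aware correlation, shrinking each condition's replicate variance toward a pooled value and down-weighting noisy conditions in a weighted Pearson correlation. The function names and the shrinkage weight `alpha` are hypothetical, not the authors' formula:

```python
import numpy as np

def shrink_var(reps, alpha=0.5):
    """Per-condition replicate variance shrunk toward the pooled variance.
    alpha is a hypothetical shrinkage weight, not the published estimator."""
    v = reps.var(axis=1, ddof=1)
    return alpha * v.mean() + (1 - alpha) * v

def replicate_corr(x_reps, y_reps, alpha=0.5):
    """Weighted Pearson correlation of replicate means; conditions with
    noisier (shrunken-variance) replicates receive less weight."""
    x, y = x_reps.mean(axis=1), y_reps.mean(axis=1)
    w = 1.0 / (shrink_var(x_reps, alpha) + shrink_var(y_reps, alpha))
    w = w / w.sum()
    mx, my = np.sum(w * x), np.sum(w * y)
    cov = np.sum(w * (x - mx) * (y - my))
    return cov / np.sqrt(np.sum(w * (x - mx) ** 2) * np.sum(w * (y - my) ** 2))

rng = np.random.default_rng(1)
profile = rng.normal(0, 1, 10)                      # 10 conditions
a = profile[:, None] + rng.normal(0, 0.1, (10, 3))  # 3 replicates per condition
b = profile[:, None] + rng.normal(0, 0.1, (10, 3))  # a co-expressed gene
print(round(replicate_corr(a, b), 2))
```

The point of the design, as in SCC, is that replicate number and within-group variance enter the similarity metric directly instead of being averaged away before correlation.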
Computational synchronization of microarray data with application to Plasmodium falciparum.
Zhao, Wei; Dauwels, Justin; Niles, Jacquin C; Cao, Jianshu
2012-06-21
Microarrays are widely used to investigate the blood stage of Plasmodium falciparum infection. Starting with synchronized cells, gene expression levels are continually measured over the 48-hour intra-erythrocytic cycle (IDC). However, the cell population gradually loses synchrony during the experiment. As a result, the microarray measurements are blurred. In this paper, we propose a generalized deconvolution approach to reconstruct the intrinsic expression pattern, and apply it to P. falciparum IDC microarray data. We develop a statistical model for the decay of synchrony among cells, and reconstruct the expression pattern through statistical inference. The proposed method can handle microarray measurements with noise and missing data. The original gene expression patterns become more apparent in the reconstructed profiles, making it easier to analyze and interpret the data. We hypothesize that reconstructed gene expression patterns represent better temporally resolved expression profiles that can be probabilistically modeled to match changes in expression level to IDC transitions. In particular, we identify transcriptionally regulated protein kinases putatively involved in regulating the P. falciparum IDC. By analyzing publicly available microarray data sets for the P. falciparum IDC, protein kinases are ranked in terms of their likelihood to be involved in regulating transitions between the ring, trophozoite and schizont developmental stages of the P. falciparum IDC. In our theoretical framework, a few protein kinases have high probability rankings, and could potentially be involved in regulating these developmental transitions. This study proposes a new methodology for extracting intrinsic expression patterns from microarray data. By applying this method to P. falciparum microarray data, several protein kinases are predicted to play a significant role in the P. falciparum IDC. 
Earlier experiments have indeed confirmed that several of these kinases are involved in this process. Overall, these results indicate that further functional analysis of these additional putative protein kinases may reveal new insights into how the P. falciparum IDC is regulated.
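The paper's statistical model is not given in the abstract, but the core idea of correcting for synchrony loss can be sketched with a toy example: treat each measurement as the intrinsic 48-hour periodic pattern blurred by a Gaussian spread of cell ages, then invert the blur with Tikhonov-regularized least squares. The kernel width and regularization constant below are assumed values, not the authors' model:

```python
import numpy as np

T = 48                                       # hours in the IDC
t = np.arange(T)
intrinsic = np.sin(2 * np.pi * t / T) ** 2   # toy intrinsic expression pattern

# Circulant blur matrix: each measurement mixes cells whose ages are
# Gaussian-spread around the nominal time point (loss of synchrony).
sigma = 3.0
d = np.minimum(np.abs(t[:, None] - t[None, :]),
               T - np.abs(t[:, None] - t[None, :]))   # circular time distance
A = np.exp(-d ** 2 / (2 * sigma ** 2))
A /= A.sum(axis=1, keepdims=True)

measured = A @ intrinsic                     # blurred (observed) profile

# Tikhonov-regularized least squares: solve (A^T A + lam*I) x = A^T y.
lam = 1e-6
x = np.linalg.solve(A.T @ A + lam * np.eye(T), A.T @ measured)

print(np.max(np.abs(x - intrinsic)) < 1e-3)  # True: pattern recovered
```

A real application would, as the paper notes, also have to handle noise and missing data, which this noise-free toy sidesteps.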
Schüler, Susann; Wenz, Ingrid; Wiederanders, B; Slickers, P; Ehricht, R
2006-06-12
Recent developments in DNA microarray technology led to a variety of open and closed devices and systems including high and low density microarrays for high-throughput screening applications as well as microarrays of lower density for specific diagnostic purposes. Besides predefined microarrays for specific applications, manufacturers offer the production of custom-designed microarrays adapted to customers' wishes. Array-based assays demand complex procedures including several steps for sample preparation (RNA extraction, amplification and sample labelling), hybridization and detection, thus leading to a high variability between approaches and resulting in the necessity of extensive standardization and normalization procedures. In the present work a custom-designed human proteinase DNA microarray of lower density in ArrayTube format was established. This highly economic open platform only requires standard laboratory equipment and allows the study of the molecular regulation of cell behaviour by proteinases. We established a procedure for sample preparation and hybridization and verified the array-based gene expression profile by quantitative real-time PCR (QRT-PCR). Moreover, we compared the results with the well established Affymetrix microarray. By application of standard labelling procedures with e.g. Klenow fragment exo-, single primer amplification (SPA) or In Vitro Transcription (IVT) we noticed a loss of signal conservation for some genes. To overcome this problem we developed a protocol in accordance with the SPA protocol, in which we included target specific primers designed individually for each spotted oligomer. Here we present a complete array-based assay in which only the specific transcripts of interest are amplified in parallel and in a linear manner. The array represents a proof of principle which can be adapted to other species as well.
As the designed protocol for amplifying mRNA starts from as little as 100 ng total RNA, it presents an alternative method for detecting even lowly expressed genes by microarray experiments in a highly reproducible and sensitive manner. Preservation of signal integrity is demonstrated by QRT-PCR measurements. The small amounts of total RNA necessary for the analyses make this method applicable for investigations with limited material, such as clinical samples from organ or tumour biopsies. Those are arguments in favour of the high potential of our assay compared to established procedures for amplification within the field of diagnostic expression profiling. Nevertheless, the screening character of microarray data must be mentioned, and independent methods should verify the results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Amanda M.; Daly, Don S.; Willse, Alan R.
The Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB is a flexible, open-source microarray image analysis tool that allows the user to customize analysis of sets of microarray images. This tool provides several methods of identifying and quantifying spot statistics, as well as extensive diagnostic statistics and images to identify poor data quality or processing. The open nature of this software allows researchers to understand the algorithms used to provide intensity estimates and to modify them easily if desired.
High-Throughput Nano-Biofilm Microarray for Antifungal Drug Discovery
2013-06-25
Anand Srinivasan, Kai P. Leung, Jose L. Lopez-Ribot, Anand K. Ramasubramanian; Departments of Biomedical Engineering and Biology and South Texas Center for Emerging Infectious Diseases, The University of Texas at San Antonio ... of the opportunistic fungal pathogen Candida albicans on a microarray platform. The microarray consists of 1,200 individual cultures of 30 nl of C. albicans
A Protein Microarray ELISA for the Detection of Botulinum neurotoxin A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varnum, Susan M.
An enzyme-linked immunosorbent assay (ELISA) microarray was developed for the specific and sensitive detection of botulinum neurotoxin A (BoNT/A), using high-affinity recombinant monoclonal antibodies against the receptor binding domain of the heavy chain of BoNT/A. The ELISA microarray assay, because of its sensitivity, offers a screening test with detection limits comparable to the mouse bioassay, with results available in hours instead of days.
Applications of nanotechnology, next generation sequencing and microarrays in biomedical research.
Elingaramil, Sauli; Li, Xiaolong; He, Nongyue
2013-07-01
Next-generation sequencing technologies, microarrays and advances in bionanotechnology have had an enormous impact on research within a short time frame. This impact appears certain to increase further as many biomedical institutions are now acquiring these powerful new technologies. Beyond conventional sampling of genome content, wide-ranging applications are rapidly evolving for next-generation sequencing, microarrays and nanotechnology. To date, these technologies have been applied in a variety of contexts, including whole-genome sequencing, targeted resequencing, discovery of transcription factor binding sites, noncoding RNA expression profiling and molecular diagnostics. This paper thus discusses current applications of nanotechnology, next-generation sequencing technologies and microarrays in biomedical research and highlights the transforming potential these technologies offer.
Polyadenylation state microarray (PASTA) analysis.
Beilharz, Traude H; Preiss, Thomas
2011-01-01
Nearly all eukaryotic mRNAs terminate in a poly(A) tail that serves important roles in mRNA utilization. In the cytoplasm, the poly(A) tail promotes both mRNA stability and translation, and these functions are frequently regulated through changes in tail length. To identify the scope of poly(A) tail length control in a transcriptome, we developed the polyadenylation state microarray (PASTA) method. It involves the purification of mRNA based on poly(A) tail length using thermal elution from poly(U) sepharose, followed by microarray analysis of the resulting fractions. In this chapter we detail our PASTA approach and describe some methods for bulk and mRNA-specific poly(A) tail length measurements of use to monitor the procedure and independently verify the microarray data.
High-throughput screening in two dimensions: binding intensity and off-rate on a peptide microarray.
Greving, Matthew P; Belcher, Paul E; Cox, Conor D; Daniel, Douglas; Diehnelt, Chris W; Woodbury, Neal W
2010-07-01
We report a high-throughput two-dimensional microarray-based screen, incorporating both target binding intensity and off-rate, which can be used to analyze thousands of compounds in a single binding assay. Relative binding intensities and time-resolved dissociation are measured for labeled tumor necrosis factor alpha (TNF-alpha) bound to a peptide microarray. The time-resolved dissociation is fitted to a one-component exponential decay model, from which relative dissociation rates are determined for all peptides with binding intensities above background. We show that most peptides with the slowest off-rates on the microarray also have the slowest off-rates when measured by surface plasmon resonance (SPR).
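Fitting a one-component exponential decay to time-resolved dissociation data can be sketched as follows. On idealized, noise-free intensities that decay fully to zero, the off-rate is simply the negative slope of log-intensity versus time; real microarray data would need a baseline term and a nonlinear fit, and the rate and intensity values below are hypothetical:

```python
import numpy as np

# Idealized time-resolved dissociation: I(t) = I0 * exp(-k_off * t)
k_true, I0 = 0.05, 1000.0       # hypothetical off-rate (1/s) and intensity
t = np.linspace(0, 60, 13)      # seconds
intensity = I0 * np.exp(-k_true * t)

# One-component exponential decay: log I(t) is linear in t with slope -k_off.
slope, log_I0 = np.polyfit(t, np.log(intensity), 1)
k_fit = -slope

print(round(k_fit, 4))  # 0.05
```

Ranking peptides by `k_fit` in this way gives the relative off-rates used for the two-dimensional (intensity × off-rate) screen described above.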
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moran, Meena S., E-mail: meena.moran@yale.edu; Yang Qifeng; Department of Breast Surgery, Qilu Hospital, Shandong University, Jinan, People's Republic of China
2011-12-01
Purpose: Vascular endothelial growth factor (VEGF) is an important protein involved in the process of angiogenesis that has been found to correlate with relapse-free and overall survival in breast cancer, predominantly in locally advanced and metastatic disease. A paucity of data is available on the prognostic implications of VEGF in early-stage breast cancer; specifically, its prognostic value for local relapse after breast-conserving therapy (BCT) is largely unknown. The purpose of our study was to assess VEGF expression in a cohort of early-stage breast cancer patients treated with BCT and to correlate the clinical and pathologic features and outcomes with overexpression of VEGF. Methods and Materials: After obtaining institutional review board approval, the paraffin specimens of 368 patients with early-stage breast cancer treated with BCT between 1975 and 2005 were constructed into tissue microarrays with twofold redundancy. The tissue microarrays were stained for VEGF and read by a trained pathologist, who was unaware of the clinical details, as positive or negative according to the standard guidelines. The clinical and pathologic data, long-term outcomes, and results of VEGF staining were analyzed. Results: The median follow-up for the entire cohort was 6.5 years. VEGF expression was positive in 56 (15%) of the 368 patients. Although VEGF expression did not correlate with age at diagnosis, tumor size, nodal status, histologic type, family history, estrogen receptor/progesterone receptor status, or HER-2 status, a trend was seen toward increased VEGF expression in the black cohort (26% black vs. 13% white, p = .068). Within the margin-negative cohort, VEGF did not predict for local relapse-free survival (RFS) (96% vs. 95%), nodal RFS (100% vs. 100%), distant metastasis-free survival (91% vs. 92%), or overall survival (92% vs. 97%) (all p >.05).
Subset analysis revealed that VEGF was highly predictive of local RFS in node-positive, margin-negative patients (86% vs. 100%, p = .029) on univariate analysis, but it did not retain its significance on multivariate analysis (hazard ratio, 2.52; 95% confidence interval, 0.804-7.920, p = .113). No other subgroups were identified in which a correlation was found between VEGF expression and local relapse. Conclusion: To our knowledge, our study is the first to assess the prognostic value of VEGF with the endpoint of local relapse in early-stage breast cancer treated with BCT, an important question given the recent increased use of targeted antiangiogenic agents in early-stage breast cancer. Our study results suggest that VEGF is not an independent predictor of local RFS after BCT, but additional, larger studies specifically analyzing the endpoint of VEGF and local relapse are warranted.
NASA Astrophysics Data System (ADS)
Cossa, Daniel; Durrieu de Madron, Xavier; Schäfer, Jörg; Lanceleur, Laurent; Guédron, Stéphane; Buscail, Roselyne; Thomas, Bastien; Castelle, Sabine; Naudin, Jean-Jacques
2017-02-01
Despite the ecological and economic importance of coastal areas, fluxes of the neurotoxic, bioaccumulative monomethylmercury (MMHg) within ocean margins and its exchanges with the open sea remain unassessed. The aim of this paper is to address the questions of the abundance, distribution, production and exchanges of methylated mercury species (MeHgT), including MMHg and dimethylmercury (DMHg), in the waters, atmosphere and sediments of the Northwestern Mediterranean margin including the Rhône River delta, the continental shelf and its slope (Gulf of Lions) and the adjacent open sea (North Gyre). Concentrations of MeHgT ranged from <0.02 to 0.48 pmol L-1 with highest values associated with the oxygen-deficient zone of the open sea. The methylated mercury to total mercury proportion (MeHgT/HgT) increased from 2% to 4% in the Rhône River to up to 30% (averaging 18%) in the North Gyre waters, whereas, within the shelf waters, MeHgT/HgT proportions were the lowest (1-3%). We calculate that the open sea is the major source of MeHgT for the shelf waters, with an annual flux estimated at 0.68 ± 0.12 kmol a-1 (i.e., equivalent to 12% of the HgT flux). This MeHgT influx is more than 80 times the direct atmospheric deposition or the in situ net production, more than 40 times the estimated "maximum potential" annual efflux from shelf sediment, and more than 7 times that of the continental sources. In the open sea, ratios of MMHg/DMHg in waters were always <1 and minimum in the oxygen-deficient zones of the water column, where MeHg concentrations are maximum. This observation supports the idea that MMHg could be a degradation product of DMHg produced from inorganic divalent Hg.
NASA Astrophysics Data System (ADS)
López-Carmona, Alicia; Kusky, Timothy M.; Santosh, M.; Abati, Jacobo
2011-01-01
The southern Alaska convergent margin contains several small belts of sedimentary and volcanic rocks metamorphosed to blueschist facies, located along the Border Ranges fault on the contact between the Wrangellia and Chugach terranes. These belts are significant in that they are the most inboard, and thus probably contain the oldest record of Triassic-Jurassic northward-directed subduction beneath Wrangellia. The Liberty Creek HP-LT schist belt is the oldest and the innermost section of the Chugach terrane. Within this belt, lawsonite blueschists contain an initial high-pressure assemblage formed by lawsonite + phengite + chlorite + sphene + albite ± apatite ± carbonates and quartz. Epidote blueschists are composed of sodic, sodic-calcic and calcic amphiboles + epidote + phengite + chlorite + albite + sphene ± carbonates and quartz. P-T pseudosections computed from four representative samples constrain maximum pressures at 16 kbar and 250-280 °C for the lawsonite-bearing blueschists, and 15 kbar and 400-500 °C for the epidote-bearing blueschists, suggesting an initial subduction stage at 50-55 km depth. The growth of late albite porphyroblasts in all samples suggests a dramatic decompression from ca. 9 kbar to 5 kbar. The Liberty Creek schists can be correlated with the Seldovia blueschist belt on the Kenai Peninsula. Metamorphism in both terranes took place in the Early Jurassic (191-192 Ma), recording an early stage of subduction beneath Wrangellia. In the nearby terranes of the same margin, the age of metamorphism records an early stage of subduction at 230 Ma. Based on this difference in age, a maximum of 40 Ma was necessary to subduct the protoliths of the Seldovia and Liberty Creek blueschists to depths of circa 50-55 km, suggesting a minimum vertical component of subduction of 1.2-1.5 cm/year.
NASA Astrophysics Data System (ADS)
Tian, X.; Buck, W. R.
2017-12-01
Seaward dipping reflectors (SDRs) are found at many rifted margins. Drilling indicates that SDRs are interbedded layers of basalts and sediments. Multi-channel seismic reflection data show SDRs with a range of widths (2-100 km), thicknesses (1-15 km) and dip angles (0-30°). Recent studies use analytic thin plate models (AtPM) to describe plate deflections under volcanic loads; these reproduce a wide range of SDR structures without detachment faulting. The models assume that solidified dikes provide downward loads at the rifting center, while erupted lava flows and sediments fill the flexural depression and further load the lithosphere. Because the strength of the lithosphere controls the amount and wavelength of bending, the geometries of SDRs provide a window into the strength of the lithosphere during continental rifting. We attempt to provide a quantitative mapping between SDR geometry and lithospheric strength and thickness during rifting. To do this, we first derive analytic solutions for two observables that are functions of the effective elastic thickness (Te). One observable (Xf) is the horizontal distance over which SDRs evolve from flat layers to maximally bent layers. The other is the ratio between the thickness and the tangent of the maximum slope of the SDRs at Xf. We then extend the AtPM to numerical thin plate models (NtPM) with spatially restricted lava flows. AtPM and NtPM show a stable and small relative difference in the two observables across different values of Te, which provides a mapping of Te between the NtPM and AtPM models. We also employ a fully two-dimensional thermal-mechanical treatment with elasto-visco-plastic rheology to simulate SDR formation. These models show that brittle yielding due to bending can reduce the Te of the lithosphere by as much as 50% of the actual brittle lithospheric thickness.
Quantifying the effects of plastic deformation on bending allows us to use Te to link SDR geometries to brittle lithospheric thickness. From published seismic reflection data, we obtain a global map of Te at volcanic rifted margins that ranges from 2 to 12 km using the AtPM and NtPM mapping. The corresponding brittle lithospheric thickness ranges from 6 to 20 km. In addition, preliminary results show that Te increases along a given margin with distance away from a Large Igneous Province.
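The wavelength of bending that such thin-plate models tie to Te follows from the standard flexural relations. A minimal sketch, using textbook values for Young's modulus, Poisson's ratio, and the density contrast (these constants are assumptions, not values from the paper):

```python
# Standard thin elastic plate relations underlying effective-elastic-thickness
# (Te) estimates. E, nu, drho, and g are assumed textbook values.

def flexural_rigidity(te_m, E=70e9, nu=0.25):
    """Flexural rigidity D (N m) of a plate with elastic thickness te_m (m)."""
    return E * te_m**3 / (12.0 * (1.0 - nu**2))

def flexural_parameter(te_m, drho=2300.0, g=9.81):
    """Flexural parameter alpha (m), which sets the horizontal wavelength of
    plate bending under a load, for infill density contrast drho (kg m^-3)."""
    D = flexural_rigidity(te_m)
    return (4.0 * D / (drho * g)) ** 0.25

# Te range quoted above for volcanic rifted margins: ~2-12 km
for te_km in (2, 12):
    alpha_km = flexural_parameter(te_km * 1e3) / 1e3
    print(f"Te = {te_km:2d} km -> flexural parameter ~ {alpha_km:.0f} km")
```

Because alpha scales as Te^(3/4), the sixfold Te range quoted above implies bending wavelengths that differ by roughly a factor of four, which is why SDR width is a sensitive probe of Te.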
Eminaga, Okyaz; Wei, Wei; Hawley, Sarah J; Auman, Heidi; Newcomb, Lisa F; Simko, Jeff; Hurtado-Coll, Antonio; Troyer, Dean A; Carroll, Peter R; Gleave, Martin E; Lin, Daniel W; Nelson, Peter S; Thompson, Ian M; True, Lawrence D; McKenney, Jesse K; Feng, Ziding; Fazli, Ladan; Brooks, James D
2016-01-01
The uncertainties inherent in clinical measures of prostate cancer (CaP) aggressiveness endorse the investigation of clinically validated tissue biomarkers. MUC1 expression has been previously reported to independently predict aggressive localized prostate cancer. We used a large cohort to validate whether MUC1 protein levels measured by immunohistochemistry (IHC) predict aggressive cancer, recurrence and survival outcomes after radical prostatectomy independent of clinical and pathological parameters. MUC1 IHC was performed on a multi-institutional tissue microarray (TMA) resource including 1,326 men with a median follow-up of 5 years. Associations with clinical and pathological parameters were tested by the Chi-square test and the Wilcoxon rank sum test. Relationships with outcome were assessed with univariable and multivariable Cox proportional hazard models and the Log-rank test. The presence of MUC1 expression was significantly associated with extracapsular extension and higher Gleason score, but not with seminal vesicle invasion, age, positive surgical margins or pre-operative serum PSA levels. In univariable analyses, positive MUC1 staining was significantly associated with worse recurrence-free survival (RFS) (HR: 1.24, CI 1.03-1.49, P = 0.02), although not with disease-specific survival (DSS, P > 0.5). On multivariable analyses, the presence of positive surgical margins, extracapsular extension, seminal vesicle invasion, as well as higher pre-operative PSA and increasing Gleason score were independently associated with RFS, while MUC1 expression was not. Positive MUC1 expression was not independently associated with DSS, but was weakly associated with overall survival (OS). In our large, rigorously designed validation cohort, MUC1 protein expression was associated with adverse pathological features, although it was not an independent predictor of outcome after radical prostatectomy.
Trujillo, Kristina A.; Heaphy, Christopher M.; Mai, Minh; Vargas, Keith M.; Jones, Anna C.; Vo, Phung; Butler, Kimberly S.; Joste, Nancy E.; Bisoffi, Marco; Griffith, Jeffrey K
2011-01-01
Previous studies have shown that a field of genetically altered but histologically normal tissue extends 1 cm or more from the margins of human breast tumors. The extent, composition and biological significance of this field are only partially understood, but the molecular alterations in affected cells could provide mechanisms for limitless replicative capacity, genomic instability and a microenvironment that supports tumor initiation and progression. We demonstrate by microarray, qRT-PCR and immunohistochemistry a signature of differential gene expression that discriminates between patient-matched, tumor-adjacent histologically normal breast tissues located 1 cm and 5 cm from the margins of breast adenocarcinomas (TAHN-1 and TAHN-5, respectively). The signature includes genes involved in extracellular matrix remodeling, wound healing, fibrosis and epithelial to mesenchymal transition (EMT). Myofibroblasts, which are mediators of wound healing and fibrosis, and intra-lobular fibroblasts expressing MMP2, SPARC, TGF-β3, which are inducers of EMT, were both prevalent in TAHN-1 tissues, sparse in TAHN-5 tissues, and absent in normal tissues from reduction mammoplasty. Accordingly, EMT markers S100A4 and vimentin were elevated in both luminal and myoepithelial cells, and EMT markers α-smooth muscle actin and SNAIL were elevated in luminal epithelial cells of TAHN-1 tissues. These results identify cellular processes that are differentially activated between TAHN-1 and TAHN-5 breast tissues, implicate myofibroblasts as likely mediators of these processes, provide evidence that EMT is occurring in histologically normal tissues within the affected field and identify candidate biomarkers to investigate whether or how field cancerization contributes to the development of primary or recurrent breast tumors. PMID:21105047
Lezon, Timothy R.; Banavar, Jayanth R.; Cieplak, Marek; Maritan, Amos; Fedoroff, Nina V.
2006-01-01
We describe a method based on the principle of entropy maximization to identify the gene interaction network with the highest probability of giving rise to experimentally observed transcript profiles. In its simplest form, the method yields the pairwise gene interaction network, but it can also be extended to deduce higher-order interactions. Analysis of microarray data from genes in Saccharomyces cerevisiae chemostat cultures exhibiting energy metabolic oscillations identifies a gene interaction network that reflects the intracellular communication pathways that adjust cellular metabolic activity and cell division to the limiting nutrient conditions that trigger metabolic oscillations. The success of the present approach in extracting meaningful genetic connections suggests that the maximum entropy principle is a useful concept for understanding living systems, as it is for other complex, nonequilibrium systems. PMID:17138668
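For continuous expression data with fixed means and covariances, the maximum-entropy distribution is Gaussian, and the pairwise gene couplings are read off (up to sign) from the inverse covariance matrix. A minimal sketch of that idea on synthetic data (this is not the authors' code, and the toy profiles are invented):

```python
import numpy as np

# Max-entropy pairwise network on synthetic expression profiles:
# the interaction between genes i and j is -(inverse covariance)[i, j].
rng = np.random.default_rng(0)

# Synthetic profiles: gene 0 drives gene 1; gene 2 is independent.
n = 500
g0 = rng.normal(size=n)
g1 = 0.9 * g0 + 0.3 * rng.normal(size=n)
g2 = rng.normal(size=n)
X = np.vstack([g0, g1, g2])           # rows = genes, columns = samples

cov = np.cov(X)
J = -np.linalg.inv(cov)               # off-diagonal entries = pairwise couplings

# The direct 0-1 coupling should dominate the spurious 0-2 and 1-2 couplings.
print(abs(J[0, 1]) > 5 * max(abs(J[0, 2]), abs(J[1, 2])))
```

The key property, mirrored in the abstract's distinction between pairwise and higher-order interactions, is that the inverse covariance captures *direct* couplings: gene 2 shows near-zero interaction with genes 0 and 1 even if raw correlations were nonzero.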
Zadka, Łukasz; Kulus, Michał J; Kurnol, Krzysztof; Piotrowska, Aleksandra; Glatzel-Plucińska, Natalia; Jurek, Tomasz; Czuba, Magdalena; Nowak, Aleksandra; Chabowski, Mariusz; Janczak, Dariusz; Dzięgiel, Piotr
2018-05-03
Despite the widely described role of IL10 in immune response regulation during carcinogenesis, there is no established model describing the role of its receptor. The aim of this study is to elucidate the role of the alpha subunit of the IL10 receptor (IL10RA) in the pathogenesis of colorectal cancer (CRC). The study was conducted on archived paraffin blocks of 125 CRC patients, from which tissue microarrays (TMAs) were made. These were subsequently used for immunohistochemistry to assess the expression of IL10RA, IL10, phosphorylated STAT3 (pSTAT3) and the Ki-67 proliferation index. The intensity of the reactions was assessed by independent researchers using two approaches: digital image analysis and the Remmele and Stegner score (IRS). To assess possible correlations between the investigated markers and the clinical stage of CRC, the Pearson correlation coefficient was calculated. The expression of the aforementioned proteins was assessed in tumor samples, healthy surgical margins and healthy control samples obtained from cadavers during autopsy at the Department of Forensic Medicine. Statistical analysis was conducted using Statistica ver. 13.05 software. The final analysis included 105 CRC patients with complete clinical and pathological data, for whom the expressions of IL10RA, IL10, pSTAT3 and Ki-67 were assessed using two independent methods. There was a positive correlation between IL10RA expression and the Ki-67 proliferation index (R = 0.63, p < 0.001) and a negative correlation between IL10RA expression and the clinical stage of CRC (R = -0.21, p = 0.022). IL10RA correlated positively with pSTAT3 and IL10 in neoplastic tissue and tumor margin (with p < 0.01 for all correlations). We also observed a significantly higher expression of IL10RA in healthy surgical margins when compared to the actual tumor (p = 0.023, the paired t-test).
The expression of IL10 was significantly higher in tumors than in the healthy intestinal endothelium from the control group. The correlations between IL10RA expression and the proliferation index or the clinical stage of CRC seem to confirm the importance of IL10RA in the pathogenesis of CRC. The higher expression of IL10RA in healthy surgical margins than in the tumor itself may suggest that IL10RA plays a role in regulating the immune response to the neoplasm.
Rapid Microarray Detection of DNA and Proteins in Microliter Volumes with SPR Imaging Measurements
Seefeld, Ting Hu; Zhou, Wen-Juan; Corn, Robert M.
2011-01-01
A four-chamber microfluidic biochip is fabricated for the rapid detection of multiple proteins and nucleic acids from microliter-volume samples using surface plasmon resonance imaging (SPRI). The 18 mm × 18 mm biochip consists of four 3 μL microfluidic chambers attached to an SF10 glass substrate, each of which contains three individually addressable SPRI gold thin film microarray elements. The twelve-element (4 × 3) SPRI microarray consists of gold thin film spots (1 mm² area; 45 nm thickness), each in an individually addressable 0.5 μL volume microchannel. Microarrays of single-stranded DNA and RNA (ssDNA and ssRNA, respectively) are fabricated by chemical and/or enzymatic attachment reactions in these microchannels; the SPRI microarrays are then used to detect femtomole amounts (nanomolar concentrations) of DNA and proteins (single-stranded DNA binding protein and thrombin, via aptamer-protein bioaffinity interactions). ssRNA microarray elements were also used for the ultrasensitive detection of zeptomole amounts (femtomolar concentrations) of DNA via RNase H-amplified SPRI. Enzymatic removal of ssRNA from the surface upon hybridization adsorption of target ssDNA is detected as a reflectivity decrease in the SPR imaging measurements. The observed reflectivity loss was proportional to the log of the target ssDNA concentration, with a detection limit of 10 fM or 30 zeptomoles (18,000 molecules). This enzymatically amplified ssDNA detection method is not limited by diffusion of ssDNA to the interface and is thus extremely fast, requiring only 200 seconds in the microliter volume format. PMID:21488682
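The log-linear calibration behaviour described above (reflectivity loss proportional to the log of target concentration) can be sketched as a simple straight-line fit in log space. The data points below are synthetic, chosen only to illustrate the fit and its inversion; they are not the paper's measurements:

```python
import numpy as np

# Log-linear SPRI calibration sketch: reflectivity loss vs. log10(concentration).
# The points below are invented for illustration, not measured values.
conc_M = np.array([1e-14, 1e-13, 1e-12, 1e-11, 1e-10])   # 10 fM .. 100 pM
delta_R = np.array([0.5, 1.6, 2.4, 3.5, 4.6])            # % reflectivity loss

# Least-squares line through (log10 C, delta_R); polyfit returns slope first
slope, intercept = np.polyfit(np.log10(conc_M), delta_R, 1)

def concentration_from_signal(dR):
    """Invert the calibration: estimate concentration (M) from a signal."""
    return 10 ** ((dR - intercept) / slope)

est = concentration_from_signal(2.4)
print(f"{est:.1e} M")   # should fall near the 1e-12 M calibration point
```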
Ling, Zhi-Qiang; Wang, Yi; Mukaisho, Kenichi; Hattori, Takanori; Tatsuta, Takeshi; Ge, Ming-Hua; Jin, Li; Mao, Wei-Min; Sugihara, Hiroyuki
2010-06-01
Tests of differentially expressed genes (DEGs) from microarray experiments are based on the null hypothesis that genes that are irrelevant to the phenotype/stimulus are expressed equally in the target and control samples. However, this strict hypothesis is not always true, as there can be several transcriptomic background differences between target and control samples, including different cell/tissue types, different cell cycle stages and different biological donors. These differences lead to increased false positives, which have little biological/medical significance. In this article, we propose a statistical framework to identify DEGs between target and control samples from expression microarray data that allows for transcriptomic background differences between these samples by introducing a modified null hypothesis: that the gene expression background difference is normally distributed. We use an iterative procedure to perform robust estimation of the null hypothesis and identify DEGs as outliers. We evaluated our method using our own triplicate microarray experiment, followed by validations with reverse transcription-polymerase chain reaction (RT-PCR) and on the MicroArray Quality Control dataset. The evaluations suggest that our technique (i) results in fewer false positive and false negative results, as measured by the degree of agreement with RT-PCR of the same samples, (ii) can be applied to different microarray platforms and results in better reproducibility as measured by the degree of DEG identification concordance both intra- and inter-platform and (iii) can be applied efficiently with only a few microarray replicates. Based on these evaluations, we propose that this method not only identifies more reliable and biologically/medically significant DEGs, but also reduces the power-cost tradeoff problem in the microarray field. Source code and binaries are freely available for download at http://comonca.org.cn/fdca/resources/softwares/deg.zip.
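The iterative idea described above (fit a normal background to the log-ratios, repeatedly excluding outliers, then call the remaining outliers differential) can be sketched in a few lines. This is a minimal illustration on synthetic data, not the authors' implementation, and the z-score cutoff is an assumed parameter:

```python
import numpy as np

# Robust iterative estimation of a normal "background" of log-ratios,
# with DEGs called as the outliers that survive convergence.
def iterative_deg(log_ratios, z_cut=3.0, max_iter=20):
    x = np.asarray(log_ratios, dtype=float)
    null = np.ones(x.size, dtype=bool)          # genes currently in the null set
    for _ in range(max_iter):
        mu, sd = x[null].mean(), x[null].std()
        new_null = np.abs(x - mu) < z_cut * sd  # re-trim against refit background
        if np.array_equal(new_null, null):
            break                                # converged
        null = new_null
    return ~null                                 # True = called differential

# Synthetic data: a background with a nonzero mean shift (the "transcriptomic
# background difference") plus ten strongly shifted spiked genes at the end.
rng = np.random.default_rng(1)
background = rng.normal(0.2, 0.5, size=990)
spiked = np.concatenate([np.full(5, 4.0), np.full(5, -3.5)])
calls = iterative_deg(np.concatenate([background, spiked]))
print(calls[-10:].all(), int(calls[:990].sum()))   # spiked genes flagged; few false calls
```

Note how the background mean is estimated rather than assumed to be zero, which is the modification to the null hypothesis the abstract describes.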
Carlson, Ruth I; Cattet, Marc R L; Sarauer, Bryan L; Nielsen, Scott E; Boulanger, John; Stenhouse, Gordon B; Janz, David M
2016-01-01
A novel antibody-based protein microarray was developed that simultaneously determines expression of 31 stress-associated proteins in skin samples collected from free-ranging grizzly bears (Ursus arctos) in Alberta, Canada. The microarray determines proteins belonging to four broad functional categories associated with stress physiology: hypothalamic-pituitary-adrenal axis proteins, apoptosis/cell cycle proteins, cellular stress/proteotoxicity proteins and oxidative stress/inflammation proteins. Small skin samples (50-100 mg) were collected from captured bears using biopsy punches. Proteins were isolated and labelled with fluorescent dyes, with labelled protein homogenates loaded onto microarrays to hybridize with antibodies. Relative protein expression was determined by comparison with a pooled standard skin sample. The assay was sensitive, requiring 80 µg of protein per sample to be run in triplicate on the microarray. Intra-array and inter-array coefficients of variation for individual proteins were generally <10 and <15%, respectively. With one exception, there were no significant differences in protein expression among skin samples collected from the neck, forelimb, hindlimb and ear in a subsample of n = 4 bears. This suggests that remotely delivered biopsy darts could be used in future sampling. Using generalized linear mixed models, certain proteins within each functional category demonstrated altered expression with respect to differences in year, season, geographical sampling location within Alberta and bear biological parameters, suggesting that these general variables may influence expression of specific proteins in the microarray. Our goal is to apply the protein microarray as a conservation physiology tool that can detect, evaluate and monitor physiological stress in grizzly bears and other species at risk over time in response to environmental change.
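The intra- and inter-array precision figures quoted above are coefficients of variation over replicate measurements. A minimal sketch of that metric, with made-up spot intensities for illustration:

```python
# Coefficient of variation (CV, %) over replicate spot intensities,
# the precision metric quoted for the protein microarray above.
def cv_percent(values):
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return 100.0 * var ** 0.5 / mean

triplicate = [1020.0, 1075.0, 998.0]   # hypothetical spot intensities
print(round(cv_percent(triplicate), 1))   # -> 3.8
```

A triplicate like this, with a CV under 10%, would fall within the intra-array precision the abstract reports.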
Karyotype versus microarray testing for genetic abnormalities after stillbirth.
Reddy, Uma M; Page, Grier P; Saade, George R; Silver, Robert M; Thorsten, Vanessa R; Parker, Corette B; Pinar, Halit; Willinger, Marian; Stoll, Barbara J; Heim-Hall, Josefine; Varner, Michael W; Goldenberg, Robert L; Bukowski, Radek; Wapner, Ronald J; Drews-Botsch, Carolyn D; O'Brien, Barbara M; Dudley, Donald J; Levy, Brynn
2012-12-06
Genetic abnormalities have been associated with 6 to 13% of stillbirths, but the true prevalence may be higher. Unlike karyotype analysis, microarray analysis does not require live cells, and it detects small deletions and duplications called copy-number variants. The Stillbirth Collaborative Research Network conducted a population-based study of stillbirth in five geographic catchment areas. Standardized postmortem examinations and karyotype analyses were performed. A single-nucleotide polymorphism array was used to detect copy-number variants of at least 500 kb in placental or fetal tissue. Variants that were not identified in any of three databases of apparently unaffected persons were then classified into three groups: probably benign, clinical significance unknown, or pathogenic. We compared the results of karyotype and microarray analyses of samples obtained after delivery. In our analysis of samples from 532 stillbirths, microarray analysis yielded results more often than did karyotype analysis (87.4% vs. 70.5%, P<0.001) and provided better detection of genetic abnormalities (aneuploidy or pathogenic copy-number variants, 8.3% vs. 5.8%; P=0.007). Microarray analysis also identified more genetic abnormalities among 443 antepartum stillbirths (8.8% vs. 6.5%, P=0.02) and 67 stillbirths with congenital anomalies (29.9% vs. 19.4%, P=0.008). As compared with karyotype analysis, microarray analysis provided a relative increase in the diagnosis of genetic abnormalities of 41.9% in all stillbirths, 34.5% in antepartum stillbirths, and 53.8% in stillbirths with anomalies. Microarray analysis is more likely than karyotype analysis to provide a genetic diagnosis, primarily because of its success with nonviable tissue, and is especially valuable in analyses of stillbirths with congenital anomalies or in cases in which karyotype results cannot be obtained. (Funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development.).
Chavan, Shweta S; Bauer, Michael A; Peterson, Erich A; Heuck, Christoph J; Johann, Donald J
2013-01-01
Transcriptome analysis by microarrays has produced important advances in biomedicine. For instance in multiple myeloma (MM), microarray approaches led to the development of an effective disease subtyping via cluster assignment, and a 70-gene risk score. Both enabled an improved molecular understanding of MM, and have provided prognostic information for the purposes of clinical management. Many researchers are now transitioning to Next Generation Sequencing (NGS) approaches and RNA-seq in particular, due to its discovery-based nature, improved sensitivity, and dynamic range. Additionally, RNA-seq allows for the analysis of gene isoforms, splice variants, and novel gene fusions. Given the voluminous amounts of historical microarray data, there is now a need to associate and integrate microarray and RNA-seq data via advanced bioinformatic approaches. Custom software was developed following a model-view-controller (MVC) approach to integrate Affymetrix probe-set IDs and gene annotation information from a variety of sources. The tool employs an assortment of strategies to integrate, cross-reference, and associate microarray and RNA-seq datasets. Output from a variety of transcriptome reconstruction and quantitation tools (e.g., Cufflinks) can be directly integrated and/or associated with Affymetrix probe set data, as well as the necessary gene identifiers and/or symbols from a diversity of sources. Strategies are employed to maximize the annotation and cross-referencing process. Custom gene sets (e.g., the MM 70-gene risk score (GEP-70)) can be specified, and the tool can be directly assimilated into an RNA-seq pipeline. This novel bioinformatic approach to annotating and associating historical microarray data with richer RNA-seq data is now assisting with the study of MM cancer biology.
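The core association step described above, joining array-era probe-set IDs to gene symbols so a custom gene set can be looked up against RNA-seq quantitation, reduces to a keyed join. A toy sketch; all probe IDs, symbols, and FPKM values below are invented for illustration and are not from the GEP-70 signature:

```python
# Toy join of Affymetrix probe-set IDs to gene symbols and RNA-seq values.
# All identifiers and numbers here are hypothetical.
probe_to_symbol = {
    "200638_s_at": "YWHAZ",
    "201947_s_at": "CCT2",
    "208931_s_at": "ILF3",
}

# Hypothetical RNA-seq quantitation keyed by gene symbol (e.g. Cufflinks-style)
rnaseq_fpkm = {"YWHAZ": 87.2, "CCT2": 41.5, "TP53": 12.9}

def associate(probes, mapping, quant):
    """Return {probe: (symbol, fpkm-or-None)} for a custom probe gene set."""
    out = {}
    for p in probes:
        sym = mapping.get(p)
        out[p] = (sym, quant.get(sym) if sym else None)
    return out

result = associate(["200638_s_at", "208931_s_at"], probe_to_symbol, rnaseq_fpkm)
print(result["200638_s_at"])   # ('YWHAZ', 87.2)
print(result["208931_s_at"])   # ('ILF3', None)  -> mapped symbol, no RNA-seq value
```

Returning `None` rather than dropping unmatched probes preserves the provenance of every gene in the custom set, which matters when reconciling a fixed historical signature against a newer annotation.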