Wu, Dongliang; Navet, Natasha; Liu, Yingchao; Uchida, Janice; Tian, Miaoying
2016-09-06
As an agriculturally important oomycete genus, Phytophthora contains a large number of destructive plant pathogens that severely threaten agricultural production and natural ecosystems. Among them is the broad-host-range pathogen P. palmivora, which infects many economically important plant species. An essential way to dissect their pathogenesis mechanisms is genetic modification of candidate genes, which requires effective transformation systems. Four methods have been developed for transformation of Phytophthora spp.: PEG (polyethylene glycol)/CaCl2-mediated protoplast transformation, electroporation of zoospores, microprojectile bombardment, and Agrobacterium-mediated transformation (AMT). Among these, AMT has many advantages over the other methods, such as easy handling and mainly generating single-copy integrations in the genome. An AMT method previously reported for P. infestans and P. palmivora has barely been used in oomycete research due to low success rates and poor reproducibility. In this study, we report a simple and efficient AMT system for P. palmivora. Using this system, we were able to reproducibly generate over 40 transformants using zoospores collected from culture grown in a single 100 mm-diameter petri dish. The generated GFP transformants constitutively expressed GFP readily detectable using a fluorescence microscope. All of the transformants tested by Southern blot analysis contained a single-copy T-DNA insertion. This system is highly effective and reproducible for transformation of P. palmivora and is expected to be adaptable to additional Phytophthora spp. and other oomycetes. Its establishment will greatly accelerate their functional genomic studies.
A simple transformation independent method for outlier definition.
Johansen, Martin Berg; Christensen, Peter Astrup
2018-04-10
Definition and elimination of outliers is a key element for medical laboratories establishing or verifying reference intervals (RIs), especially as inclusion of just a few outlying observations may seriously affect the determination of the reference limits. Many methods have been developed for definition of outliers. Several of these methods assume a normal distribution, and data often require transformation before outlier elimination. We have developed a non-parametric, transformation-independent outlier definition. The new method relies on drawing reproducible histograms, using defined bin sizes above and below the median. The method is compared to the method recommended by CLSI/IFCC, which uses the Box-Cox transformation (BCT) and Tukey's fences for outlier definition. The comparison is done on eight simulated distributions and an indirect clinical dataset. The comparison on simulated distributions shows that, without outliers added, the recommended method in general defines fewer outliers. However, when outliers are added on one side, the proposed method often produces better results. With outliers on both sides, the methods are equally good. Furthermore, the presence of outliers is found to affect the BCT, and subsequently the limits determined by the currently recommended method; this is especially pronounced in skewed distributions. The proposed outlier definition reproduced current RI limits on clinical data containing outliers. We find our simple transformation-independent outlier detection method as good as or better than the currently recommended methods.
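The reference method this abstract compares against combines a Box-Cox transformation with Tukey's fences. The histogram-based method itself is not specified here in enough detail to reproduce, but Tukey's fences are standard; a minimal NumPy sketch with hypothetical sample values:

```python
import numpy as np

def tukey_fences(data, k=1.5):
    """Return observations outside Tukey's fences [Q1 - k*IQR, Q3 + k*IQR]."""
    data = np.asarray(data, dtype=float)
    q1, q3 = np.percentile(data, [25, 75])
    iqr = q3 - q1
    return data[(data < q1 - k * iqr) | (data > q3 + k * iqr)]

# Hypothetical reference-interval sample with one aberrant observation:
values = [4.1, 4.3, 4.0, 4.2, 4.4, 4.1, 9.9]
outliers = tukey_fences(values)  # flags the 9.9 observation
```

Note that on a skewed distribution the fences are applied after Box-Cox transformation in the recommended procedure, which is exactly where the abstract reports the transformation itself being distorted by outliers.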
Transformation of the rodent malaria parasite Plasmodium chabaudi.
Spence, Philip J; Cunningham, Deirdre; Jarra, William; Lawton, Jennifer; Langhorne, Jean; Thompson, Joanne
2011-04-01
The rodent malaria parasite Plasmodium chabaudi chabaudi shares many features with human malaria species, including P. falciparum, and is the in vivo model of choice for many aspects of malaria research in the mammalian host, from sequestration of parasitized erythrocytes, to antigenic variation and host immunity and immunopathology. This protocol describes an optimized method for the transformation of mature blood-stage P.c. chabaudi and a description of a vector that targets efficient, single crossover integration into the P.c. chabaudi genome. Transformed lines are reproducibly generated and selected within 14-20 d, and show stable long-term protein expression even in the absence of drug selection. This protocol, therefore, provides the scientific community with a robust and reproducible method to generate transformed P.c. chabaudi parasites expressing fluorescent, bioluminescent and model antigens that can be used in vivo to dissect many of the fundamental principles of malaria infection.
Highly Efficient Agrobacterium-Mediated Transformation of Wheat Via In Planta Inoculation
NASA Astrophysics Data System (ADS)
Risacher, Thierry; Craze, Melanie; Bowden, Sarah; Paul, Wyatt; Barsby, Tina
This chapter details a reproducible method for the transformation of spring wheat using Agrobacterium tumefaciens via the direct inoculation of bacteria into immature seeds in planta, as described in patent WO 00/63398. Transformation efficiencies from 1 to 30% have been obtained, and average efficiencies of at least 5% are routinely achieved. Regenerated plants are phenotypically normal, with 30-50% of transformation events carrying the introduced genes at single insertion sites, a higher rate than is typically reported for transgenic plants produced using biolistic transformation methods.
Adachi, Takumi; Sahara, Takehiko; Okuyama, Hidetoshi; Morita, Naoki
2017-07-01
Here, we describe a new method for genetic transformation of thraustochytrids, well-known producers of polyunsaturated fatty acids (PUFAs) like docosahexaenoic acid, by combining mild glass (zirconia) bead treatment and electroporation. Because the cell wall is a barrier against transfer of exogenous DNA into cells, gentle vortexing of cells with glass beads was performed prior to electroporation for partial cell wall disruption. G418-resistant transformants of thraustochytrid cells (Aurantiochytrium limacinum strain SR21 and thraustochytrid strain 12B) were successfully obtained with good reproducibility. The method reported here is simpler than methods using enzymes to generate spheroplasts and may provide advantages for PUFA production by using genetically modified thraustochytrids.
Producing gapped-ferrite transformer cores
NASA Technical Reports Server (NTRS)
Mclyman, W. T.
1980-01-01
Improved manufacturing techniques make reproducible gaps and minimize cracking. Molded, unfired transformer cores are cut with thin saw and then fired. Hardened semicircular core sections are bonded together, placed in aluminum core box, and fluidized-coated. After winding is run over box, core is potted. Economical method significantly reduces number of rejects.
Pepper, chili (Capsicum annuum).
Min, Jung; Shin, Sun Hee; Jeon, En Mi; Park, Jung Mi; Hyun, Ji Young; Harn, Chee Hark
2015-01-01
Pepper is recalcitrant to Agrobacterium-mediated genetic transformation. Several obstacles remain: transformation rates are extremely low; the choice of the correct genotype is critical; and there is a high frequency of false positives due to direct shoot formation. Here, we report a useful protocol with a suitable selection method. The most important aspect of the pepper transformation protocol is selecting shoots growing from the callus, which is referred to as callus-mediated shoot formation. This protocol is a reproducible and reliable system for pepper transformation.
NASA Astrophysics Data System (ADS)
Yu, Shanshan; Murakami, Yuri; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki
2006-09-01
The article proposes a multispectral image compression scheme using nonlinear spectral transform for better colorimetric and spectral reproducibility. In the method, we show the reduction of colorimetric error under a defined viewing illuminant and also that spectral accuracy can be improved simultaneously using a nonlinear spectral transform called Labplus, which takes into account the nonlinearity of human color vision. Moreover, we show that the addition of diagonal matrices to Labplus can further preserve the spectral accuracy and has a generalized effect of improving the colorimetric accuracy under other viewing illuminants than the defined one. Finally, we discuss the usage of the first-order Markov model to form the analysis vectors for the higher order channels in Labplus to reduce the computational complexity. We implement a multispectral image compression system that integrates Labplus with JPEG2000 for high colorimetric and spectral reproducibility. Experimental results for a 16-band multispectral image show the effectiveness of the proposed scheme.
Minor Distortions with Major Consequences: Correcting Distortions in Imaging Spectrographs
Esmonde-White, Francis W. L.; Esmonde-White, Karen A.; Morris, Michael D.
2010-01-01
Projective transformation is a mathematical correction (implemented in software) used in the remote imaging field to produce distortion-free images. We present the application of projective transformation to correct minor alignment and astigmatism distortions that are inherent in dispersive spectrographs. Patterned white-light images and neon emission spectra were used to produce registration points for the transformation. Raman transects collected on microscopy and fiber-optic systems were corrected using established methods and compared with the same transects corrected using the projective transformation. Even minor distortions have a significant effect on reproducibility and apparent fluorescence background complexity. Simulated Raman spectra were used to optimize the projective transformation algorithm. We demonstrate that the projective transformation reduced the apparent fluorescent background complexity and improved reproducibility of measured parameters of Raman spectra. Distortion correction using a projective transformation provides a major advantage in reducing the background fluorescence complexity even in instrumentation where slit-image distortions and camera rotation were minimized using manual or mechanical means. We expect these advantages should be readily applicable to other spectroscopic modalities using dispersive imaging spectrographs. PMID:21211158
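A projective transformation maps pixel coordinates through a 3x3 homography estimated from registration points (here obtained from patterned white-light images and neon emission lines). As a sketch of the general technique, not the authors' exact implementation, the standard direct linear transform (DLT) estimate with hypothetical registration coordinates:

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 projective matrix H mapping src -> dst.

    Direct linear transform on >= 4 point pairs: the flattened H is the
    right singular vector of the stacked constraints with smallest
    singular value.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def apply_homography(H, points):
    """Map N x 2 pixel coordinates through H with perspective division."""
    pts = np.hstack([points, np.ones((len(points), 1))])
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Hypothetical registration points (e.g. from a patterned white-light image):
src = np.array([[0.0, 0.0], [640.0, 0.0], [0.0, 480.0], [640.0, 480.0]])
dst = src + np.array([2.0, -1.5])  # a small shift standing in for the distortion
H = homography_from_points(src, dst)
```

In practice the corrected spectrum is obtained by resampling the detector image through `H`, so that slit-image curvature and camera rotation no longer smear spectral lines across detector rows.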
Inverse solution of ear-canal area function from reflectance
Rasetshwane, Daniel M.; Neely, Stephen T.
2011-01-01
A number of acoustical applications require the transformation of acoustical quantities, such as impedance and pressure, that are measured at the entrance of the ear canal into quantities at the eardrum. This transformation often requires knowledge of the shape of the ear canal. Previous attempts to measure ear-canal area functions were either invasive, non-reproducible, or could only measure the area function up to a point mid-way along the canal. A method to determine the area function of the ear canal from measurements of acoustic impedance at the entrance of the ear canal is described. The method is based on a solution to the inverse problem in which measurements of impedance are used to calculate reflectance, which is then used to determine the area function of the canal. The mean ear-canal area function determined using this method is similar to mean ear-canal area functions measured by other researchers using different techniques. The advantage of the proposed method over previous methods is that it is non-invasive, fast, and reproducible. PMID:22225043
Orchids (Cymbidium spp., Oncidium, and Phalaenopsis).
Chan, Ming-Tsair; Chan, Yuan-Li; Sanjaya
2006-01-01
Recent advances in genetic engineering have made the transformation and regeneration of plants a powerful tool for orchid improvement. This chapter presents a simple and reproducible Agrobacterium tumefaciens-mediated transformation protocol, together with molecular screening techniques for transgenics, for two orchid species, Oncidium and Phalaenopsis. The target tissues for gene transfer were protocorm-like bodies (PLBs) derived from protocorms, into which the constructed foreign genes were successfully introduced. To establish stable transformants, two stages of selection were applied to the PLBs co-cultivated with A. tumefaciens. About 10% transformation efficiency was achieved in Oncidium, as 108 independent antibiotic-resistant PLBs were proliferated from 1000 infected PLBs. In Phalaenopsis, a transformation efficiency of about 11-12% was achieved using the present protocol. The molecular methods and GUS staining used to screen putative transgenic plants and confirm the integration of foreign DNA into the orchid genome are also described in detail. The methods described should also be useful for transformation of desired genes into other orchid species.
Null hypersurface quantization, electromagnetic duality and asymptotic symmetries of Maxwell theory
NASA Astrophysics Data System (ADS)
Bhattacharyya, Arpan; Hung, Ling-Yan; Jiang, Yikun
2018-03-01
In this paper we consider introducing careful regularization at the quantization of Maxwell theory in the asymptotic null infinity. This allows systematic discussions of the commutators in various boundary conditions, and application of Dirac brackets accordingly in a controlled manner. This method is most useful when we consider asymptotic charges that are not localized at the boundary u → ±∞ like large gauge transformations. We show that our method reproduces the operator algebra in known cases, and it can be applied to other space-time symmetry charges such as the BMS transformations. We also obtain the asymptotic form of the U(1) charge following from the electromagnetic duality in an explicitly EM symmetric Schwarz-Sen type action. Using our regularization method, we demonstrate that the charge generates the expected transformation of a helicity operator. Our method promises applications in more generic theories.
Illusions and Cloaks for Surface Waves
McManus, T. M.; Valiente-Kroon, J. A.; Horsley, S. A. R.; Hao, Y.
2014-01-01
Ever since the inception of Transformation Optics (TO), new and exciting ideas have been proposed in the field of electromagnetics and the theory has been modified to work in such fields as acoustics and thermodynamics. The most well-known application of this theory is to cloaking, but another equally intriguing application of TO is the idea of an illusion device. Here, we propose a general method to transform electromagnetic waves between two arbitrary surfaces. This allows a flat surface to reproduce the scattering behaviour of a curved surface and vice versa, thereby giving rise to perfect optical illusion and cloaking devices, respectively. The performance of the proposed devices is simulated using thin effective media with engineered material properties. The scattering of the curved surface is shown to be reproduced by its flat analogue (for illusions) and vice versa for cloaks. PMID:25145953
New Zealand and Queensland Teachers' Conceptions of Learning: Transforming More than Reproducing
ERIC Educational Resources Information Center
Brown, Gavin T. L.; Lake, Robert; Matters, Gabrielle
2008-01-01
Background: Two major conceptions of learning exist: reproducing new material and transforming material to make meaning. Teachers' understandings of what learning is probably influence their teaching practices and student academic performance. Aims: To validate a short scale derived from Tait, Entwistle, & McCune's (1998) ASSIST inventory and…
NASA Astrophysics Data System (ADS)
Saito, Asaki; Yasutomi, Shin-ichi; Tamura, Jun-ichi; Ito, Shunji
2015-06-01
We introduce a true orbit generation method enabling exact simulations of dynamical systems defined by arbitrary-dimensional piecewise linear fractional maps, including piecewise linear maps, with rational coefficients. This method can generate sufficiently long true orbits which reproduce typical behaviors (inherent behaviors) of these systems, by properly selecting algebraic numbers in accordance with the dimension of the target system, and involving only integer arithmetic. By applying our method to three dynamical systems—that is, the baker's transformation, the map associated with a modified Jacobi-Perron algorithm, and an open flow system—we demonstrate that it can reproduce their typical behaviors that have been very difficult to reproduce with conventional simulation methods. In particular, for the first two maps, we show that we can generate true orbits displaying the same statistical properties as typical orbits, by estimating the marginal densities of their invariant measures. For the open flow system, we show that an obtained true orbit correctly converges to the stable period-1 orbit, which is inherently possessed by the system.
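Of the three systems, the baker's transformation is the simplest to state: (x, y) maps to (2x, y/2) for x < 1/2 and to (2x - 1, (y + 1)/2) otherwise. A sketch of exact iteration using Python's `fractions` for rational arithmetic; note that the paper deliberately selects algebraic irrational initial points rather than rationals, precisely because rational orbits of this map are eventually periodic (the x-coordinate is a binary shift), as the example below shows:

```python
from fractions import Fraction

def baker(x, y):
    """One step of the baker's transformation on the unit square."""
    if x < Fraction(1, 2):
        return 2 * x, y / 2
    return 2 * x - 1, (y + 1) / 2

# Exact iteration from a rational starting point: x cycles 1/3 -> 2/3 -> 1/3,
# an atypical (periodic) orbit, illustrating why algebraic numbers are needed
# to reproduce typical statistical behavior.
x, y = Fraction(1, 3), Fraction(0)
orbit = [(x, y)]
for _ in range(4):
    x, y = baker(x, y)
    orbit.append((x, y))
```

The paper's actual method replaces `Fraction` with exact arithmetic over suitably chosen algebraic number fields, still using only integer operations internally.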
Saving and Reproduction of Human Motion Data by Using Haptic Devices with Different Configurations
NASA Astrophysics Data System (ADS)
Tsunashima, Noboru; Yokokura, Yuki; Katsura, Seiichiro
Recently, there has been increased focus on “haptic recording”. Development of a motion-copying system is an efficient method for the realization of haptic recording, which involves saving and reproduction of human motion data on the basis of haptic information. To increase the number of applications of the motion-copying system in various fields, it is necessary to reproduce human motion data by using haptic devices with different configurations. In this study, a method for such haptic recording is developed. In this method, human motion data are saved and reproduced on the basis of work-space information, which is obtained by coordinate transformation of motor-space information. The validity of the proposed method is demonstrated by experiments: saving and reproduction of human motion data using various devices is achieved, making haptic recording usable in various fields.
Bangerter, A
2000-12-01
The social representation (SR) of conception was investigated using an adapted version of Bartlett's (1932) method of serial reproduction. A sample of 75 participants reproduced a text describing the conception process in 20 segregated chains of four reproductive generations. Changes in sentence structure and content were analysed. Results indicated that when the scientific representation of conception is apprehended by laypersons, two different processes take place. First, the abstract biological description of the process is progressively transformed into an anthropomorphic description centred on the sperm and ovum (personification). Second, stereotypical sex-role attributes are projected onto the sperm and ovum. Limitations of the method of serial reproduction are discussed, as well as its potential for modelling processes of cultural diffusion of knowledge.
Daniele, Gaëlle; Fieu, Maëva; Joachim, Sandrine; Bado-Nilles, Anne; Baudoin, Patrick; Turies, Cyril; Porcher, Jean-Marc; Andres, Sandrine; Vulliet, Emmanuelle
2016-06-01
Pharmaceuticals are emerging organic contaminants ubiquitously present in the environment due to incessant input into the aquatic compartment, mainly resulting from incomplete removal in wastewater treatment plants. One of the major preoccupations concerning pharmaceuticals released into surface waters is their potential for bioaccumulation in biota, possibly leading to deleterious effects on ecosystems, especially as they could affect a broad variety of organisms living in or depending on the aquatic environment. Thus, the development of accurate and sensitive methods is necessary to detect these compounds in aquatic ecosystems. Considering this need, this study deals with the analytical development of a methodology to quantify traces of diclofenac (DCF) together with some of its biotic and abiotic transformation products in whole-body tissue of three-spined stickleback. A simple and reliable extraction method based on a modified QuEChERS extraction is implemented on 200 mg of fish. The detection and quantification of the ten target compounds are performed using liquid chromatography-tandem mass spectrometry. The whole process was successfully validated regarding linearity, recovery, repeatability, and reproducibility. The method limits of detection and quantification do not exceed 1 ng/g. To reproduce environmental conditions, we measured the concentration of DCF and its transformation products in three-spined sticklebacks after a 6-month exposure in mesocosms at several levels of DCF ranging from 0.05 to 4.1 μg/L. The phase I metabolite 4'-hydroxydiclofenac was detected in fish samples exposed at the highest DCF concentration. Graphical abstract: Analysis of diclofenac and some of its transformation products in the three-spined stickleback, Gasterosteus aculeatus, by QuEChERS extraction followed by LC-MS/MS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tomizawa, Shinya; Nozawa, Masato
2006-06-15
We study vacuum solutions of the five-dimensional Einstein equations generated by the inverse scattering method. We reproduce the black ring solution, which was found by Emparan and Reall, by taking the Euclidean Levi-Civita metric plus one-dimensional flat space as a seed. This transformation consists of two successive processes: the first step is to perform the three-solitonic transformation of the Euclidean Levi-Civita metric with one-dimensional flat space as a seed, the result being the Euclidean C-metric with extra one-dimensional flat space; the second is to perform the two-solitonic transformation by taking this as a new seed. Our result may serve as a stepping stone to finding new exact solutions in higher dimensions.
Transgenic watermelon rootstock resistant to CGMMV (cucumber green mottle mosaic virus) infection.
Park, Sang Mi; Lee, Jung Suk; Jegal, Sung; Jeon, Bo Young; Jung, Min; Park, Yoon Sik; Han, Sang Lyul; Shin, Yoon Sup; Her, Nam Han; Lee, Jang Ha; Lee, Mi Yeon; Ryu, Ki Hyun; Yang, Seung Gyun; Harn, Chee Hark
2005-08-01
In watermelon, grafting of seedlings to rootstocks is necessary because watermelon roots are less viable than the rootstock. Moreover, commercially important watermelon varieties require disease-resistant rootstocks to reduce total watermelon yield losses due to infection with viruses such as cucumber green mottle mosaic virus (CGMMV). Therefore, we undertook to develop a CGMMV-resistant watermelon rootstock using a cDNA encoding the CGMMV coat protein gene (CGMMV-CP), and successfully transformed a watermelon rootstock named 'gongdae'. The transformation rate was as low as 0.1-0.3%, depending on the transformation method used (ordinary co-culture vs injection, respectively). However, watermelon transformation was reproducibly and reliably achieved using these two methods. Southern blot analysis confirmed that the CGMMV-CP gene was inserted into different locations in the genome, either singly or in multiple copies. Resistance testing against CGMMV showed that 10 plants among 140 T1 plants were resistant to CGMMV infection. This is the first report of the development by genetic engineering of watermelons resistant to CGMMV infection.
Parameters of Models of Structural Transformations in Alloy Steel Under Welding Thermal Cycle
NASA Astrophysics Data System (ADS)
Kurkin, A. S.; Makarov, E. L.; Kurkin, A. B.; Rubtsov, D. E.; Rubtsov, M. E.
2017-05-01
A mathematical model of structural transformations in an alloy steel under the thermal cycle of multipass welding is suggested for computer implementation. The minimum necessary set of parameters for describing the transformations under heating and cooling is determined. Ferritic-pearlitic, bainitic and martensitic transformations under cooling of a steel are considered. A method for deriving the necessary temperature and time parameters of the model from the chemical composition of the steel is described. Published data are used to derive regression models of the temperature ranges and parameters of transformation kinetics in alloy steels. It is shown that the disadvantages of the active visual methods of analysis of the final phase composition of steels are responsible for inaccuracy and mismatch of published data. The hardness of a specimen, which correlates with some other mechanical properties of the material, is chosen as the most objective and reproducible criterion of the final phase composition. The models developed are checked by a comparative analysis of computational results and experimental data on the hardness of 140 alloy steels after cooling at various rates.
Color reproduction software for a digital still camera
NASA Astrophysics Data System (ADS)
Lee, Bong S.; Park, Du-Sik; Nam, Byung D.
1998-04-01
We have developed color reproduction software for a digital still camera. The image taken by the camera was colorimetrically reproduced on the monitor after characterizing the camera and the monitor and performing color matching between the two devices. The reproduction was performed at three levels: level processing, gamma correction, and color transformation. The image contrast was increased by the level processing, which adjusted the levels of the dark and bright portions of the image. The relationship between the level-processed digital values and the measured luminance values of test gray samples was calculated, and the gamma of the camera was obtained; a method for obtaining the unknown monitor gamma is also proposed. The level-processed values were then adjusted by a look-up table created from the camera and monitor gamma corrections. For the color transformation, a 3-by-3 or 3-by-4 matrix was used, calculated by regression between the gamma-corrected values and the measured tristimulus values of each test color sample. The various reproduced images, generated according to four illuminations for the camera and three color temperatures for the monitor, were displayed in a dialogue box implemented in our software, allowing a user to easily choose the best reproduced image by comparing them.
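The color transformation step described above is a least-squares regression from gamma-corrected camera responses to measured tristimulus values. A minimal sketch with synthetic, hypothetical patch data standing in for real camera and spectrophotometer measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gamma-corrected camera responses for 24 test patches (N x 3):
rgb = rng.random((24, 3))

# Hypothetical "measured" tristimulus values, generated here from a known
# matrix purely so the recovered fit can be checked:
true_M = np.array([[0.41, 0.36, 0.18],
                   [0.21, 0.72, 0.07],
                   [0.02, 0.12, 0.95]])
xyz = rgb @ true_M.T

# 3-by-3 colour transformation matrix by least-squares regression; a 3-by-4
# variant would simply append a column of ones to rgb for an offset term.
M = np.linalg.lstsq(rgb, xyz, rcond=None)[0].T   # xyz_est = rgb @ M.T
```

With real data the fit is not exact, and the residuals over the patch set are the colorimetric error the abstract refers to.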
ERIC Educational Resources Information Center
Kwon, Soonjung; Walker, David Ian; Kristjánsson, Kristján
2018-01-01
The paper illustrates how a culture of violence is perpetuated and reproduced in South Korea through schooling and argues that peace education could help transform a culture of violence to a culture of peace. Critical ethnographic methods and a framework of peace education were applied to a sample of secondary schools in South Korea to argue that…
Reproducible surface-enhanced Raman quantification of biomarkers in multicomponent mixtures.
De Luca, Anna Chiara; Reader-Harris, Peter; Mazilu, Michael; Mariggiò, Stefania; Corda, Daniela; Di Falco, Andrea
2014-03-25
Direct and quantitative detection of unlabeled glycerophosphoinositol (GroPIns), an abundant cytosolic phosphoinositide derivative, would allow rapid evaluation of several malignant cell transformations. Here we report label-free analysis of GroPIns via surface-enhanced Raman spectroscopy (SERS) with a sensitivity of 200 nM, well below its apparent concentration in cells. Crucially, our SERS substrates, based on lithographically defined gold nanofeatures, can be used to predict accurately the GroPIns concentration even in multicomponent mixtures, avoiding the preliminary separation of individual compounds. Our results represent a critical step toward the creation of SERS-based biosensor for rapid, label-free, and reproducible detection of specific molecules, overcoming limits of current experimental methods.
Muniz, C R; da Silva, G F; Souza, M T; Freire, F C O; Kema, G H J; Guedes, M I F
2014-02-21
Lasiodiplodia theobromae is a major pathogen of many different crops, including cashew nut plants. This paper describes an efficient Agrobacterium tumefaciens-mediated transformation (ATMT) system for the successful delivery of T-DNA, transferring the genes for green fluorescent protein (gfp) and hygromycin B phosphotransferase (hph) to L. theobromae. When the fungal pycnidiospores were co-cultured with A. tumefaciens harboring the binary vector with the hph-gfp gene, hygromycin-resistant fungus developed only with acetosyringone supplementation. Cashew plants inoculated with the GFP-expressing fungus revealed characteristic pathogen colonization by epifluorescence microscopy, with intense, bright green hyphae observed throughout the mycelium of transformant cultures. Penetration of parenchyma cells near the inoculation site, beneath the epicuticle surface, was observed prior to 25 dpi and was followed by the development of hyphae within the invaded host cells. These findings provide a rapid and reproducible ATMT method for L. theobromae transformation.
Forecasts of non-Gaussian parameter spaces using Box-Cox transformations
NASA Astrophysics Data System (ADS)
Joachimi, B.; Taylor, A. N.
2011-09-01
Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, and the Fisher matrix calculation is performed on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, with marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between the matter density parameter and the normalization of matter density fluctuations is reproduced for several cases, and the capability of weak-lensing three-point statistics to break this degeneracy is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
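The Box-Cox family used here is the one-parameter transform y = (x^λ − 1)/λ, with the log as the λ → 0 limit. A small sketch (with hypothetical data, not the paper's lensing posteriors) showing how λ = 0 Gaussianizes a strongly skewed sample:

```python
import numpy as np

def boxcox(x, lam):
    """One-parameter Box-Cox transform: (x**lam - 1)/lam, log limit at lam = 0."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def skewness(x):
    """Sample skewness (third standardized moment)."""
    x = np.asarray(x, dtype=float)
    return np.mean((x - x.mean()) ** 3) / x.std() ** 3

rng = np.random.default_rng(1)
sample = np.exp(rng.normal(size=5000))   # strongly right-skewed (lognormal)
gaussianized = boxcox(sample, 0.0)       # lam = 0 recovers the underlying normal
```

In the paper's setting λ is fitted per parameter direction from an initial likelihood evaluation, and the Fisher matrix is then computed in the transformed, approximately Gaussian coordinates.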
Bakshi, Souvika; Saha, Bedabrata; Roy, Nand Kishor; Mishra, Sagarika; Panda, Sanjib Kumar; Sahoo, Lingaraj
2012-06-01
A new method for obtaining transgenic cowpea was developed using positive selection based on the Escherichia coli 6-phosphomannose isomerase gene as the selectable marker and mannose as the selective agent. Only transformed cells were capable of utilizing mannose as a carbon source. Cotyledonary node explants from 4-day-old in vitro-germinated seedlings of cultivar Pusa Komal were inoculated with Agrobacterium tumefaciens strain EHA105 carrying the vector pNOV2819. Regenerating transformed shoots were selected on medium supplemented with a combination of 20 g/l mannose and 5 g/l sucrose as carbon source. The transformed shoots were rooted on medium devoid of mannose. Transformation efficiency based on PCR analysis of individual putative transformed shoots was 3.6%. Southern blot analysis on five randomly chosen PCR-positive plants confirmed the integration of the pmi transgene. Qualitative reverse transcription (qRT-PCR) analysis demonstrated the expression of pmi in T₀ transgenic plants. Chlorophenol red (CPR) assays confirmed the activity of PMI in transgenic plants, and the gene was transmitted to progeny in a Mendelian fashion. The transformation method presented here for cowpea using mannose selection is efficient and reproducible, and could be used to introduce a desirable gene(s) into cowpea for biotic and abiotic stress tolerance.
de Bakker, Chantal M. J.; Altman, Allison R.; Li, Connie; Tribble, Mary Beth; Lott, Carina; Tseng, Wei-Ju; Liu, X. Sherry
2016-01-01
In vivo μCT imaging allows for high-resolution, longitudinal evaluation of bone properties. Based on this technology, several recent studies have developed in vivo dynamic bone histomorphometry techniques that utilize registered μCT images to identify regions of bone formation and resorption, allowing for longitudinal assessment of bone remodeling. However, this analysis requires a direct voxel-by-voxel subtraction between image pairs, necessitating rotation of the images into the same coordinate system, which introduces interpolation errors. We developed a novel image transformation scheme, matched-angle transformation (MAT), whereby the interpolation errors are minimized by equally rotating both the follow-up and baseline images instead of the standard of rotating one image while the other remains fixed. This new method greatly reduced interpolation biases caused by the standard transformation. Additionally, our study evaluated the reproducibility and precision of bone remodeling measurements made via in vivo dynamic bone histomorphometry. Although bone remodeling measurements showed moderate baseline noise, precision was adequate to measure physiologically relevant changes in bone remodeling, and measurements had relatively good reproducibility, with intra-class correlation coefficients of 0.75-0.95. This indicates that, when used in conjunction with MAT, in vivo dynamic histomorphometry provides a reliable assessment of bone remodeling. PMID:26786342
NASA Astrophysics Data System (ADS)
Cecinati, F.; Wani, O.; Rico-Ramirez, M. A.
2016-12-01
It is widely recognised that merging radar rainfall estimates (RRE) with rain gauge data can improve the RRE and provide areal and temporal coverage that rain gauges cannot offer. Many methods to merge radar and rain gauge data are based on kriging and require an assumption of Gaussianity on the variable of interest. In particular, this work looks at kriging with external drift (KED), because it is an efficient, widely used, and well-performing merging method. Rainfall, especially at finer temporal scales, is not normally distributed and presents a bi-modal skewed distribution. In some applications a Gaussianity assumption is made without any correction. In other cases, variables are transformed in order to obtain a distribution closer to Gaussian. This work has two objectives: 1) compare different transformation methods in merging applications; 2) evaluate the uncertainty arising when untransformed rainfall data are used in KED. The comparison of transformation methods is addressed from two points of view. On the one hand, the ability to reproduce the original probability distribution after back-transformation of merged products is evaluated with Q-Q plots; on the other hand, the rainfall estimates are compared with an independent set of rain gauge measurements. The tested methods are 1) no transformation, 2) Box-Cox transformation with parameter λ=0.5 (square root), 3) Box-Cox with λ=0.25 (square root of square root), 4) Box-Cox with λ=0.1 (almost logarithmic), 5) normal quantile transformation, and 6) singularity analysis. The uncertainty associated with the use of non-transformed data in KED is evaluated in comparison with the best performing product. The methods are tested on a case study in Northern England, using hourly data from 211 tipping bucket rain gauges from the Environment Agency and radar rainfall data at 1 km/5-min resolutions from the UK Met Office. In addition, 25 independent rain gauges from the UK Met Office were used to assess the merged products.
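The fixed-λ Box-Cox variants compared above, and their exact back-transformations, are a few lines each. A minimal sketch (the rainfall values are made-up hourly depths, not the study's data):

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox transform; the lam -> 0 limit is the natural logarithm."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

def inv_boxcox(y, lam):
    """Exact back-transformation of boxcox()."""
    y = np.asarray(y, dtype=float)
    return np.exp(y) if lam == 0 else (lam * y + 1.0) ** (1.0 / lam)

rain = np.array([0.2, 0.5, 1.0, 3.5, 12.0])  # hypothetical hourly depths (mm)
for lam in (0.5, 0.25, 0.1):                 # the three variants compared above
    err = np.max(np.abs(inv_boxcox(boxcox(rain, lam), lam) - rain))
    print(f"lambda={lam}: round-trip error {err:.2e}")
```

Note that although the point-wise round trip is exact, back-transforming a kriged mean through a nonlinear transform introduces bias, which is one reason the back-transformed products are checked against the original distribution.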
Karthik, Sivabalan; Pavan, Gadamchetty; Sathish, Selvam; Siva, Ramamoorthy; Kumar, Periyasamy Suresh; Manickavasagam, Markandan
2018-04-01
Agrobacterium infection and regeneration of the putatively transformed plant from the explant remains arduous for some crop species like peanut. Hence, a competent and reproducible in planta genetic transformation protocol was established for peanut cv. CO7 by standardizing various factors such as pre-culture duration, acetosyringone concentration, duration of co-cultivation, sonication and vacuum infiltration. In the present investigation, Agrobacterium tumefaciens strain EHA105 harboring the binary vector pCAMBIA1301-bar was used for transformation. A two-stage selection was carried out using 4 and 250 mg l⁻¹ BASTA® to completely eliminate the chimeric and non-transformed plants. Transgene integration into the plant genome was evaluated by GUS histochemical assay, polymerase chain reaction (PCR), and Southern blot hybridization. Among the various combinations and concentrations analyzed, the highest transformation efficiency was obtained when 2-day pre-cultured explants were subjected to sonication for 6 min and vacuum infiltration for 3 min in Agrobacterium suspension, and co-cultivated on MS medium supplemented with 150 µM acetosyringone for 3 days. The fidelity of the standardized in planta transformation method was assessed in five peanut cultivars, and all the cultivars responded positively, with transformation efficiencies ranging from 31.3% (cv. CO6) to 38.6% (cv. TMV7). The in planta transformation method optimized in this study could be beneficial for developing superior peanut cultivars with desirable genetic traits.
Back to Normal! Gaussianizing posterior distributions for cosmological probes
NASA Astrophysics Data System (ADS)
Schuhmann, Robert L.; Joachimi, Benjamin; Peiris, Hiranya V.
2014-05-01
We present a method to map multivariate non-Gaussian posterior probability densities into Gaussian ones via nonlinear Box-Cox transformations, and generalizations thereof. This is analogous to the search for normal parameters in the CMB, but can in principle be applied to any probability density that is continuous and unimodal. The search for the optimally Gaussianizing transformation amongst the Box-Cox family is performed via a maximum likelihood formalism. We can judge the quality of the found transformation a posteriori: qualitatively via statistical tests of Gaussianity, and more illustratively by how well it reproduces the credible regions. The method permits an analytical reconstruction of the posterior from a sample, e.g. a Markov chain, and simplifies the subsequent joint analysis with other experiments. Furthermore, it permits the characterization of a non-Gaussian posterior in a compact and efficient way. The expression for the non-Gaussian posterior can be employed to find analytic formulae for the Bayesian evidence, and consequently be used for model comparison.
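The maximum-likelihood search over the Box-Cox family can be sketched with the standard profile log-likelihood (a generic textbook formula, not code from the paper). For a lognormal toy sample the optimum should land near λ = 0, i.e. the log transform:

```python
import numpy as np

def boxcox_profile_loglike(x, lam):
    """Profile log-likelihood of the Box-Cox parameter (constants dropped):
    -n/2 * log(var of y) + (lam - 1) * sum(log x), with y the transformed data."""
    n = x.size
    y = np.log(x) if lam == 0 else (x**lam - 1.0) / lam
    return -0.5 * n * np.log(y.var()) + (lam - 1.0) * np.log(x).sum()

rng = np.random.default_rng(2)
x = rng.lognormal(mean=0.0, sigma=0.7, size=5000)  # skewed toy sample

# Grid search stands in for a proper 1-D optimizer.
grid = np.linspace(-1.0, 2.0, 301)
ll = np.array([boxcox_profile_loglike(x, lam) for lam in grid])
lam_best = grid[np.argmax(ll)]
print(f"best lambda on grid: {lam_best:.2f}")
```

The second term is the Jacobian of the transformation; without it the likelihoods at different λ are not comparable, which is the usual pitfall when fitting Box-Cox parameters.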
Transformation of Epichloë typhina by electroporation of conidia
2011-01-01
Background Choke, caused by the endophytic fungus Epichloë typhina, is an important disease affecting orchardgrass (Dactylis glomerata L.) seed production in the Willamette Valley. Little is known concerning the conditions necessary for successful infection of orchardgrass by E. typhina. Detection of E. typhina in plants early in the disease cycle can be difficult due to the sparse distribution of hyphae in the plant. Therefore, a sensitive method to detect fungal infection in plants would provide an invaluable tool for elucidating the conditions for establishment of infection in orchardgrass. Utilization of a marker gene, such as the green fluorescent protein (GFP), transformed into Epichloë will facilitate characterization of the initial stages of infection and establishment of the fungus in plants. Findings We have developed a rapid, efficient, and reproducible transformation method using electroporation of germinating Epichloë conidia isolated from infected plants. Conclusions The GFP labelled E. typhina provides a valuable molecular tool to researchers studying conditions and mechanisms involved in the establishment of choke disease in orchardgrass. PMID:21375770
Water mass transformation along the Indonesian throughflow in an OGCM
NASA Astrophysics Data System (ADS)
Koch-Larrouy, Ariane; Madec, Gurvan; Blanke, Bruno; Molcard, Robert
2008-11-01
The oceanic pathways connecting the Pacific Ocean to the Indian Ocean are described using a quantitative Lagrangian method applied to Eulerian fields from an ocean general circulation model simulation of the Indonesian seas. The main routes diagnosed are in good agreement with those inferred from observations. The secondary routes and the Pacific recirculation are also quantified. The model reproduces the observed salt penetration of subtropical waters from the South Pacific, the homohaline stratification in the southern Indonesian basins, and the cold fresh tongue which exits into the Indian Ocean. These particular water mass characteristics, close to those observed, are obtained when a tidal mixing parameterization is introduced into the model. Trajectories are obtained which link the water masses at the entrance and at the exit of the Indonesian throughflow (ITF), and the mixing along each trajectory is quantified. Both the ITF and the Pacific recirculation are transformed, suggesting that the Indonesian transformation affects both the Indian and Pacific stratification. A recipe to form Indonesian water masses is proposed. We present three major features of the circulation that revisit the classical picture of the ITF and its associated water mass transformation, while still being in agreement with observations. Firstly, the homohaline layer is not a result of pure isopycnal mixing of the North Pacific Intermediate Water and South Pacific Subtropical Water (SPSW) within the Banda Sea, as previously thought. Instead, the observed homohaline layer is reproduced by the model, but it is caused by both isopycnal mixing with the SPSW and a dominant vertical mixing before the Banda Sea with the NPSW. This new mechanism could be real since the model reproduces the SPSW penetration as observed. Secondly, the model explains why the Banda Sea thermocline water is so fresh compared to the SPSW. 
Until now, the only explanation was a recirculation of the freshwater from the western route. The model does not reproduce this recirculation but instead shows strong mixing of the SPSW within the Halmahera and Seram Seas, which erodes the salinity maximum so that its signature is no longer perceptible. Finally, this work highlights the key role of the Java Sea freshwater. Even though its annual net mass contribution is small, its fresh salinity contribution is highly significant and represents the main reason why the Pacific salinity maxima are eroded.
Rubel, Oliver; Bowen, Benjamin P
2018-01-01
Mass spectrometry imaging (MSI) is a transformative imaging method that supports the untargeted, quantitative measurement of the chemical composition and spatial heterogeneity of complex samples with broad applications in life sciences, bioenergy, and health. While MSI data can be routinely collected, its broad application is currently limited by the lack of easily accessible analysis methods that can process data of the size, volume, diversity, and complexity generated by MSI experiments. The development and application of cutting-edge analytical methods is a core driver in MSI research for new scientific discoveries, medical diagnostics, and commercial innovation. However, the lack of means to share, apply, and reproduce analyses hinders the broad application, validation, and use of novel MSI analysis methods. To address this central challenge, we introduce the Berkeley Analysis and Storage Toolkit (BASTet), a novel framework for shareable and reproducible data analysis that supports standardized data and analysis interfaces, integrated data storage, data provenance, workflow management, and a broad set of integrated tools. Based on BASTet, we describe the extension of the OpenMSI mass spectrometry imaging science gateway to enable web-based sharing, reuse, analysis, and visualization of data analyses and derived data products. We demonstrate the application of BASTet and OpenMSI in practice to identify and compare characteristic substructures in the mouse brain based on their chemical composition measured via MSI.
Bian, Wei; Li, Yan; Crane, Jason C; Nelson, Sarah J
2018-02-01
To implement a fully automated atlas-based method for prescribing 3D PRESS MR spectroscopic imaging (MRSI). The PRESS selected volume and outer-volume suppression bands were predefined on the MNI152 standard template image. The template image was aligned to the subject T1-weighted image during a scan, and the resulting transformation was then applied to the predefined prescription. To evaluate the method, ¹H MRSI data were obtained in repeat scan sessions from 20 healthy volunteers. In each session, datasets were acquired twice without repositioning. The overlap ratio of the prescribed volume in the two sessions was calculated and the reproducibility of inter- and intrasession metabolite peak height and area ratios was measured by the coefficient of variation (CoV). The CoVs from intra- and intersession were compared by a paired t-test. The average overlap ratio of the automatically prescribed selection volumes between two sessions was 97.8%. The average voxel-based intersession CoVs were less than 0.124 and 0.163 for peak height and area ratios, respectively. Paired t-test showed no significant difference between the intra- and intersession CoVs. The proposed method provides a time-efficient way to prescribe 3D PRESS MRSI with reproducible imaging positioning and metabolite measurements. Magn Reson Med 79:636-642, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
DNA Integrity and Shock Wave Transformation Efficiency of Bacteria and Fungi
NASA Astrophysics Data System (ADS)
Loske, Achim M.; Campos-Guillén, Juan; Fernández, Francisco; Pastrana, Xóchitl; Magaña-Ortíz, Denis; Coconi-Linares, Nancy; Ortíz-Vázquez, Elizabeth; Gómez-Lim, Miguel
Delivery of DNA into bacteria and fungi is essential in medicine and biotechnology to produce metabolites, enzymes, antibiotics and proteins. So far, protocols to genetically transform bacteria and fungi are inefficient and have low reproducibility.
An International Ki67 Reproducibility Study
2013-01-01
Background In breast cancer, immunohistochemical assessment of proliferation using the marker Ki67 has potential use in both research and clinical management. However, lack of consistency across laboratories has limited Ki67’s value. A working group was assembled to devise a strategy to harmonize Ki67 analysis and increase scoring concordance. Toward that goal, we conducted a Ki67 reproducibility study. Methods Eight laboratories received 100 breast cancer cases arranged into 1-mm core tissue microarrays, one set stained by the participating laboratory and one set stained by the central laboratory, both using antibody MIB-1. Each laboratory scored Ki67 as the percentage of positively stained invasive tumor cells using its own method. Six laboratories repeated scoring of 50 locally stained cases on 3 different days. Sources of variation were analyzed using random effects models with log2-transformed measurements. Reproducibility was quantified by the intraclass correlation coefficient (ICC), and approximate two-sided 95% confidence intervals (CIs) for the true intraclass correlation coefficients in these experiments were provided. Results Intralaboratory reproducibility was high (ICC = 0.94; 95% CI = 0.93 to 0.97). Interlaboratory reproducibility was only moderate (central staining: ICC = 0.71, 95% CI = 0.47 to 0.78; local staining: ICC = 0.59, 95% CI = 0.37 to 0.68). The geometric mean of Ki67 values for each laboratory across the 100 cases ranged from 7.1% to 23.9% with central staining and from 6.1% to 30.1% with local staining. Factors contributing to interlaboratory discordance included tumor region selection, counting method, and subjective assessment of staining positivity. Formal counting methods gave more consistent results than visual estimation. Conclusions Substantial variability in Ki67 scoring was observed among some of the world’s most experienced laboratories.
Ki67 values and cutoffs for clinical decision-making cannot be transferred between laboratories without standardizing scoring methodology because analytical validity is limited. PMID:24203987
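The intraclass correlation used in this study can be illustrated with a one-way random-effects sketch (the paper fits a richer random-effects model on log2 scores; the data below are simulated, not study data):

```python
import numpy as np

def icc_oneway(scores):
    """One-way random-effects ICC(1): scores[i, j] = score of case i by lab j.
    ICC = (MS_between - MS_within) / (MS_between + (k - 1) * MS_within)."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    case_means = scores.mean(axis=1)
    ms_between = k * ((case_means - scores.mean()) ** 2).sum() / (n - 1)
    ms_within = ((scores - case_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

rng = np.random.default_rng(3)
true_log2 = rng.uniform(1.0, 6.0, size=100)  # 100 cases on a log2 Ki67 scale
labs = true_log2[:, None] + rng.normal(0.0, 0.3, size=(100, 8))  # 8 labs
icc = icc_oneway(labs)
print(f"ICC = {icc:.2f}")  # small between-lab noise -> ICC close to 1
```

Inflating the per-lab noise term relative to the between-case spread drives the ICC down, mirroring how interlaboratory discordance lowered the reported values.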
Toward End-to-End Face Recognition Through Alignment Learning
NASA Astrophysics Data System (ADS)
Zhong, Yuanyi; Chen, Jiansheng; Huang, Bo
2017-08-01
Plenty of effective methods have been proposed for face recognition during the past decade. Although these methods differ essentially in many aspects, a common practice among them is to specifically align the facial area based on prior knowledge of human face structure before feature extraction. In most systems, the face alignment module is implemented independently. This has actually caused difficulties in the design and training of end-to-end face recognition models. In this paper we study the possibility of alignment learning in end-to-end face recognition, in which neither prior knowledge of facial landmarks nor artificially defined geometric transformations is required. Specifically, spatial transformer layers are inserted in front of the feature extraction layers in a Convolutional Neural Network (CNN) for face recognition. Only human identity clues are used to drive the neural network to automatically learn the most suitable geometric transformation and the most appropriate facial area for the recognition task. To ensure reproducibility, our model is trained purely on the publicly available CASIA-WebFace dataset, and is tested on the Labeled Faces in the Wild (LFW) dataset. We have achieved a verification accuracy of 99.08%, which is comparable to state-of-the-art single-model-based methods.
Jaganath, Balusamy; Subramanyam, Kondeti; Mayavan, Subramanian; Karthik, Sivabalan; Elayaraja, Dhandapani; Udayakumar, Rajangam; Manickavasagam, Markandan; Ganapathi, Andy
2014-05-01
An efficient and reproducible Agrobacterium-mediated in planta transformation was developed in Jatropha curcas. The various factors affecting J. curcas in planta transformation were optimized, including decapitation, Agrobacterium strain, pin-pricking, vacuum infiltration duration and vacuum pressure. A simple vegetative in vivo cleft grafting method was adopted for the multiplication of transformants without the aid of tissue culture. Among the various parameters evaluated, decapitated plants that were pin-pricked and vacuum infiltrated at 250 mmHg for 3 min with the Agrobacterium strain EHA 105 harbouring the binary vector pGA 492 proved the most efficient, with a transformation efficiency of 62.66%. Transgene integration was evinced by GUS histochemical analysis, and the GUS-positive plants were subjected to grafting. Putatively transformed J. curcas served as the scion and wild-type J. curcas served as the stock. There was no graft rejection, and the plants were confirmed by GUS histochemical analysis, polymerase chain reaction (PCR) and Southern hybridization. Genetic stability of the grafted plants was evaluated using the randomly amplified polymorphic DNA (RAPD) marker, which showed 100% genetic stability between mother and grafted plants. Thus, an efficient in planta transformation and grafting-based multiplication of J. curcas was established.
In Vitro Transformation of Rat and Mouse Cells by DNA from Simian Virus 40
Abrahams, P. J.; van der Eb, A. J.
1975-01-01
Primary rat kidney cells and mouse 3T3 cells can be transformed by DNA of simian virus 40 when use is made of the calcium technique (Graham and van der Eb, 1973). The transformation assay in primary rat cells is reproducible, but the dose response is not linear. PMID:166204
Wagner, Wolfgang
2012-12-20
Mesenchymal stem cells change dramatically during culture expansion. Long-term culture has been suspected to evoke oncogenic transformation: overall, the genome appears to be relatively stable throughout culture but transient clonal aneuploidies have been observed. Oncogenic transformation does not necessarily entail growth advantage in vitro and, therefore, the available methods - such as karyotypic analysis or genomic profiling - cannot exclude this risk. On the other hand, long-term culture is associated with specific senescence-associated DNA methylation (SA-DNAm) changes, particularly in developmental genes. SA-DNAm changes are highly reproducible and can be used to monitor the state of senescence for quality control. Notably, neither telomere attrition nor SA-DNAm changes occur in pluripotent stem cells, which can evade the 'Hayflick limit'. Long-term culture of mesenchymal stem cells seems to involve a tightly regulated epigenetic program. These epigenetic modifications may counteract dominant clones, which are more prone to transformation.
NASA Astrophysics Data System (ADS)
Xu, Dazhi; Cao, Jianshu
2016-08-01
The concept of the polaron, which emerged from condensed matter physics, describes the dynamical interaction of a moving particle with its surrounding bosonic modes. This concept has been developed into a useful method to treat open quantum systems across the complete range of system-bath coupling strengths. In particular, the polaron transformation approach is valid in the intermediate coupling regime, in which the Redfield equation or Fermi's golden rule fails. In the polaron frame, the equilibrium distribution obtained by perturbative expansion deviates from the canonical distribution, which is beyond the usual weak-coupling assumption in thermodynamics. A polaron-transformed Redfield equation (PTRE) not only reproduces the dissipative quantum dynamics but also provides an accurate and efficient way to calculate non-equilibrium steady states. Applications of the PTRE approach to problems such as exciton diffusion, heat transport and light-harvesting energy transfer are presented.
Press, William H.
2006-01-01
Götz, Druckmüller, and, independently, Brady have defined a discrete Radon transform (DRT) that sums an image's pixel values along a set of aptly chosen discrete lines, complete in slope and intercept. The transform is fast, O(N² log N) for an N × N image; it uses only addition, not multiplication or interpolation, and it admits a fast, exact algorithm for the adjoint operation, namely backprojection. This paper shows that the transform additionally has a fast, exact (although iterative) inverse. The inverse reproduces to machine accuracy the pixel-by-pixel values of the original image from its DRT, without artifacts or a finite point-spread function. Fourier or fast Fourier transform methods are not used. The inverse can also be calculated from sampled sinograms and is well conditioned in the presence of noise. Also introduced are generalizations of the DRT that combine pixel values along lines by operations other than addition. For example, there is a fast transform that calculates median values along all discrete lines and is able to detect linear features at low signal-to-noise ratios in the presence of pointlike clutter features of arbitrarily large amplitude. PMID:17159155
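What the transform computes can be shown with a deliberately naive O(N³) version of the line sums. This sketch uses periodic wrap-around for brevity, which is an assumption of this toy, not a property of the actual DRT; the real algorithm uses non-periodic discrete lines and an O(N² log N) recursion:

```python
import numpy as np

def drt_naive(img):
    """Sum pixel values along every discrete line x = (b + s*r) mod N,
    indexed by integer slope s and intercept b.  O(N^3) reference version;
    the fast DRT achieves this kind of sum in O(N^2 log N) with additions only."""
    n = img.shape[0]
    rows = np.arange(n)
    out = np.zeros((n, n))
    for s in range(n):          # slope
        for b in range(n):      # intercept
            out[s, b] = img[rows, (b + s * rows) % n].sum()
    return out

# An image that is itself one discrete line (slope 2, intercept 3):
img = np.zeros((8, 8))
img[np.arange(8), (3 + 2 * np.arange(8)) % 8] = 1.0
sino = drt_naive(img)
print(sino[2, 3])  # the matching (slope, intercept) bin collects all 8 pixels
```

A linear feature in the image concentrates into a single bright bin of the transform, which is why line-like structure is easy to detect in this domain even at low signal-to-noise ratio.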
Govender, Nisha; Wong, Mui-Yun
2017-04-01
A highly efficient and reproducible Agrobacterium-mediated transformation protocol for Ganoderma boninense was developed to facilitate observation of the early stage infection of basal stem rot (BSR). The method was proven amenable to different explants (basidiospore, protoplast, and mycelium) of G. boninense. The transformation efficiency was highest (62%) under a treatment combination of protoplast explant and Agrobacterium strain LBA4404, with successful expression of a hyg marker gene and a gus-gfp fusion gene under the control of the heterologous p416 glyceraldehyde 3-phosphate dehydrogenase promoter. Optimal transformation conditions included a 1:100 Agrobacterium/explant ratio, induction of Agrobacterium virulence genes in the presence of 250 μM acetosyringone, co-cultivation at 22°C for 2 days on nitrocellulose membrane overlaid on an induction medium, and regeneration of transformants on potato glucose agar prepared with 0.6 M sucrose and 20 mM phosphate buffer. Evaluated transformants were able to infect root tissues of oil palm plantlets with needle-like microhyphae during the penetration event. The availability of this model pathogen system for BSR may lead to a better understanding of the pathogenicity factors associated with G. boninense penetration into oil palm roots.
NASA Astrophysics Data System (ADS)
Oyama, Takuro; Ikabata, Yasuhiro; Seino, Junji; Nakai, Hiromi
2017-07-01
This Letter proposes a density functional treatment based on the two-component relativistic scheme at the infinite-order Douglas-Kroll-Hess (IODKH) level. The exchange-correlation energy and potential are calculated using the electron density based on the picture-change corrected density operator transformed by the IODKH method. Numerical assessments indicated that the picture-change uncorrected density functional terms generate significant errors, on the order of hartree for heavy atoms. The present scheme was found to reproduce the energetics in the four-component treatment with high accuracy.
Mendelev, M. I.; Underwood, T. L.; Ackland, G. J.
2016-10-17
New interatomic potentials describing defects, plasticity, and high temperature phase transitions for Ti are presented. Fitting the martensitic hcp-bcc phase transformation temperature requires an efficient and accurate method to determine it. We apply a molecular dynamics method based on determination of the melting temperature of competing solid phases and Gibbs-Helmholtz integration, and a lattice-switch Monte Carlo method: these agree on the hcp-bcc transformation temperatures to within 2 K. We were able to develop embedded atom potentials which give a good fit to either low or high temperature data, but not both. The first developed potential (Ti1) reproduces the hcp-bcc transformation and melting temperatures and is suitable for the simulation of phase transitions and bcc Ti. Two other potentials (Ti2 and Ti3) correctly describe defect properties and can be used to simulate plasticity or radiation damage in hcp Ti. The fact that a single embedded atom method potential cannot describe both low and high temperature phases may be attributed to neglect of electronic degrees of freedom; notably, bcc has a much higher electronic entropy. As a result, a temperature-dependent potential obtained from the combination of potentials Ti1 and Ti2 may be used to simulate Ti properties at any temperature.
Improvements to surrogate data methods for nonstationary time series.
Lucio, J H; Valdés, R; Rodríguez, L R
2012-05-01
The method of surrogate data has been extensively applied to hypothesis testing of system linearity, when only one realization of the system, a time series, is known. Normally, surrogate data should preserve the linear stochastic structure and the amplitude distribution of the original series. Classical surrogate data methods (such as random permutation, amplitude adjusted Fourier transform, or iterative amplitude adjusted Fourier transform) are successful at preserving one or both of these features in stationary cases. However, they always produce stationary surrogates, hence existing nonstationarity could be interpreted as dynamic nonlinearity. Certain modifications have been proposed that additionally preserve some nonstationarity, at the expense of reproducing a great deal of nonlinearity. However, even those methods generally fail to preserve the trend (i.e., global nonstationarity in the mean) of the original series. This is the case of time series with unit roots in their autoregressive structure. Additionally, those methods, based on Fourier transform, either need first and last values in the original series to match, or they need to select a piece of the original series with matching ends. These conditions are often inapplicable and the resulting surrogates are adversely affected by the well-known artefact problem. In this study, we propose a simple technique that, applied within existing Fourier-transform-based methods, generates surrogate data that jointly preserve the aforementioned characteristics of the original series, including (even strong) trends. Moreover, our technique avoids the negative effects of end mismatch. Several artificial and real, stationary and nonstationary, linear and nonlinear time series are examined, in order to demonstrate the advantages of the methods. Corresponding surrogate data are produced with the classical and with the proposed methods, and the results are compared.
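The classical Fourier-transform surrogate that the proposed technique builds on can be sketched as a minimal phase randomization that preserves the amplitude spectrum exactly. This is the baseline method, not the trend-preserving improvement proposed in the text:

```python
import numpy as np

def ft_surrogate(x, rng):
    """Phase-randomized Fourier surrogate: same power spectrum (hence same
    linear autocorrelation) as x, with any nonlinear structure destroyed."""
    n = x.size
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, spec.size)
    phases[0] = 0.0                 # keep the DC component real
    if n % 2 == 0:
        phases[-1] = 0.0            # keep the Nyquist component real
    return np.fft.irfft(spec * np.exp(1j * phases), n)

rng = np.random.default_rng(4)
x = np.cumsum(rng.normal(size=512))  # a trending, nonstationary toy series
s = ft_surrogate(x, rng)
same_spectrum = np.allclose(np.abs(np.fft.rfft(s)), np.abs(np.fft.rfft(x)))
print(same_spectrum)
```

Because the phases are scrambled globally, the trend of x is not preserved in the surrogate, and a circular series is implicitly assumed, so mismatched endpoints leak into the spectrum. These are precisely the failure modes for unit-root-like series that motivate the modification proposed here.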
Suzuki, Yuichi; Nagaoka, Masataka
2017-05-28
Atomistic information of a whole chemical reaction system, e.g., instantaneous microscopic molecular structures and orientations, offers important and deeper insight into clearly understanding unknown chemical phenomena. In accordance with the progress of a number of simultaneous chemical reactions, the Red Moon method (a hybrid Monte Carlo/molecular dynamics reaction method) is capable of simulating atomistically the chemical reaction process from an initial state to the final one of complex chemical reaction systems. In the present study, we have proposed a transformation theory to interpret the chemical reaction process of the Red Moon methodology as the time evolution process in harmony with the chemical kinetics. For the demonstration of the theory, we have chosen the gas reaction system in which the reversible second-order reaction H₂ + I₂ ⇌ 2HI occurs. First, the chemical reaction process was simulated from the initial configurational arrangement containing a number of H₂ and I₂ molecules, each at 300 K, 500 K, and 700 K. To reproduce the chemical equilibrium for the system, the collision frequencies for the reactions were taken into consideration in the theoretical treatment. As a result, the calculated equilibrium concentrations [H₂]eq and equilibrium constants Keq at all the temperatures were in good agreement with their corresponding experimental values. Further, we applied the theoretical treatment for the time transformation to the system and have shown that the calculated half-lives τ of [H₂] reproduce very well the analytical ones at all the temperatures. It is, therefore, concluded that the application of the present theoretical treatment with the Red Moon method makes it possible to analyze reasonably the time evolution of complex chemical reaction systems to chemical equilibrium at the atomistic level.
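The reversible second-order kinetics that the time transformation is checked against can be illustrated by direct integration of the rate equations (a hedged sketch; rate constants, concentrations, and step sizes are invented for the example, not the study's values):

```python
# Forward-Euler integration of the reversible reaction H2 + I2 <=> 2 HI.
# At equilibrium the net rate vanishes, so [HI]^2 / ([H2][I2]) -> kf/kr.
kf, kr = 1.0, 0.02          # illustrative forward / reverse rate constants
h2 = i2 = 1.0               # illustrative initial concentrations
hi = 0.0
dt, steps = 1e-3, 200_000
for _ in range(steps):
    rate = kf * h2 * i2 - kr * hi**2   # net forward rate
    h2 -= rate * dt
    i2 -= rate * dt
    hi += 2.0 * rate * dt              # two HI produced per event
k_eq = hi**2 / (h2 * i2)    # approaches kf/kr = 50 at equilibrium
```

The same bookkeeping (concentrations evolving until the net rate vanishes) is what the Red Moon trajectories must reproduce once reaction steps are mapped onto physical time.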
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, Zhen; Voth, Gregory A., E-mail: gavoth@uchicago.edu
It is essential to be able to systematically construct coarse-grained (CG) models that can efficiently and accurately reproduce key properties of higher-resolution models such as all-atom. To fulfill this goal, a mapping operator is needed to transform the higher-resolution configuration to a CG configuration. Certain mapping operators, however, may lose information related to the underlying electrostatic properties. In this paper, a new mapping operator based on the centers of charge of CG sites is proposed to address this issue. Four example systems are chosen to demonstrate this concept. Within the multiscale coarse-graining framework, CG models that use this mapping operator are found to better reproduce the structural correlations of atomistic models. The present work also demonstrates the flexibility of the mapping operator and the robustness of the force matching method. For instance, important functional groups can be isolated and emphasized in the CG model.
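The center-of-charge idea can be sketched directly: instead of mass-weighting, each atom in a CG site is weighted by the magnitude of its partial charge (a minimal illustration of the mapping operator; the positions and charges are invented for the example):

```python
import numpy as np

def center_of_charge(positions, charges):
    """Map a group of atoms to one CG site weighted by |partial charge|,
    so the site tracks the electrostatically important atoms."""
    w = np.abs(np.asarray(charges, dtype=float))
    return (np.asarray(positions, dtype=float) * w[:, None]).sum(axis=0) / w.sum()

# Example: the CG site shifts toward the strongly charged atom,
# unlike a center-of-mass mapping that would favor the heavy atom.
pos = np.array([[0.0, 0.0, 0.0],   # weakly charged atom
                [1.0, 0.0, 0.0]])  # strongly charged atom
q = np.array([-0.1, 0.5])
site = center_of_charge(pos, q)    # lies at x = 0.5/0.6 of the bond
```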
Waterlike glass polyamorphism in a monoatomic isotropic Jagla model.
Xu, Limei; Giovambattista, Nicolas; Buldyrev, Sergey V; Debenedetti, Pablo G; Stanley, H Eugene
2011-02-14
We perform discrete-event molecular dynamics simulations of a system of particles interacting with a spherically-symmetric (isotropic) two-scale Jagla pair potential characterized by a hard inner core, a linear repulsion at intermediate separations, and a weak attractive interaction at larger separations. This model system has been extensively studied due to its ability to reproduce many thermodynamic, dynamic, and structural anomalies of liquid water. The model is also interesting because: (i) it is very simple, being composed of isotropically interacting particles, (ii) it exhibits polyamorphism in the liquid phase, and (iii) its slow crystallization kinetics facilitate the study of glassy states. There is interest in the degree to which the known polyamorphism in glassy water may have parallels in liquid water. Motivated by parallels between the properties of the Jagla potential and those of water in the liquid state, we study the metastable phase diagram in the glass state. Specifically, we perform the computational analog of the protocols followed in the experimental studies of glassy water. We find that the Jagla potential calculations reproduce three key experimental features of glassy water: (i) the crystal-to-high-density amorphous solid (HDA) transformation upon isothermal compression, (ii) the low-density amorphous solid (LDA)-to-HDA transformation upon isothermal compression, and (iii) the HDA-to-very-high-density amorphous solid (VHDA) transformation upon isobaric annealing at high pressure. In addition, the HDA-to-LDA transformation upon isobaric heating, observed in water experiments, can only be reproduced in the Jagla model if a free surface is introduced in the simulation box. The HDA configurations obtained in cases (i) and (ii) are structurally indistinguishable, suggesting that both processes result in the same glass. 
With the present parametrization, the evolution of density with pressure or temperature is remarkably similar to the corresponding experimental measurements on water. Our simulations also suggest that the Jagla potential may reproduce features of the HDA-VHDA transformations observed in glassy water upon compression and decompression. Snapshots of the system during the HDA-VHDA and HDA-LDA transformations reveal a clear segregation between LDA and HDA but not between HDA and VHDA, consistent with the possibility that LDA and HDA are separated by a first order transformation as found experimentally, whereas HDA and VHDA are not. Our results demonstrate that a system of particles with simple isotropic pair interactions, a Jagla potential with two characteristic length scales, can present polyamorphism in the glass state as well as reproducing many of the distinguishing properties of liquid water. While most isotropic pair potential models crystallize readily on simulation time scales at the low temperatures investigated here, the Jagla potential is an exception, and is therefore a promising model system for the study of glass phenomenology.
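The two-scale pair potential described above (hard inner core, linear repulsive ramp, weak attractive tail) can be written down in a few lines (parameter values here are illustrative, not those of the cited parametrization):

```python
# Piecewise-linear two-scale Jagla pair potential.
A = 3.5                    # repulsive energy at hard-core contact r = a
B = 1.0                    # attractive well depth at r = b
a, b, c = 1.0, 1.72, 3.0   # hard-core radius, well minimum, cutoff

def jagla(r):
    if r < a:
        return float("inf")            # hard inner core
    if r < b:                          # linear repulsive ramp, A down to -B
        return A * (b - r) / (b - a) - B * (r - a) / (b - a)
    if r < c:                          # linear attractive tail, -B up to 0
        return -B * (c - r) / (c - b)
    return 0.0
```

The two characteristic lengths a and b are what give the model its waterlike anomalies: the system can trade an open, low-density local structure (neighbors near b) for a collapsed, high-density one (neighbors near a) under pressure.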
Simplified method for the calculation of irregular waves in the coastal zone
NASA Astrophysics Data System (ADS)
Leont'ev, I. O.
2011-04-01
A method is suggested for estimating the wave parameters along a given bottom profile. It takes into account the principal processes influencing waves in the coastal zone: transformation, refraction, bottom friction, and breaking. A constant mean value of the friction coefficient can be assumed under the conditions of sandy shores. Wave breaking is interpreted from the viewpoint of the concept of the limiting wave height at a given depth. The mean and root-mean-square wave heights are determined by the height distribution function, which transforms under the effect of breaking. Verification of the method against field data shows that the calculated results reproduce the observed variations of the wave heights over a wide range of conditions, including profiles with underwater bars. The deviations from the calculated values mostly do not exceed 25%, and the mean square error is 11%. The method does not require preliminary tuning and can be implemented as a relatively simple calculator accessible even to an inexperienced user.
Off-line real-time FTIR analysis of a process step in imipenem production
NASA Astrophysics Data System (ADS)
Boaz, Jhansi R.; Thomas, Scott M.; Meyerhoffer, Steven M.; Staskiewicz, Steven J.; Lynch, Joseph E.; Egan, Richard S.; Ellison, Dean K.
1992-08-01
We have developed an FT-IR method, using a Spectra-Tech Monit-IR 400 system, to monitor off-line the completion of a reaction in real time. The reaction is moisture-sensitive, and analysis by more conventional methods (normal-phase HPLC) is difficult to reproduce. The FT-IR method is based on the shift of a diazo band when a conjugated beta-diketone is transformed into a silyl enol ether during the reaction. The reaction mixture is examined directly by IR and does not require sample workup. Data acquisition time is less than one minute. The method has been validated for specificity, precision, and accuracy. The results obtained by the FT-IR method for known mixtures and in-process samples compare favorably with those from a normal-phase HPLC method.
Multiple pathways in pressure-induced phase transition of coesite
NASA Astrophysics Data System (ADS)
Liu, Wei; Wu, Xuebang; Liang, Yunfeng; Liu, Changsong; Miranda, Caetano R.; Scandolo, Sandro
2017-12-01
The high-pressure single-crystal X-ray diffraction method with precise control of hydrostatic conditions, typically with helium or neon as the pressure-transmitting medium, has significantly changed our view of what happens with low-density silica phases under pressure. Coesite is a prototype material for pressure-induced amorphization. However, it was found to transform into a high-pressure octahedral (HPO) phase, or into coesite-II and coesite-III. Given that the pressure is believed to be hydrostatic in two recent experiments, the different transformation pathways are striking. Based on molecular dynamics simulations with an ab initio parameterized potential, we reproduced all of the above experiments in three transformation pathways, including the one leading to an HPO phase. This octahedral phase has an oxygen hcp sublattice featuring 2 × 2 zigzag octahedral edge-sharing chains, however with some broken points (i.e., point defects). It transforms into the α-PbO2 phase when it is relaxed under further compression. We show that the HPO phase forms through a continuous rearrangement of the oxygen sublattice toward an hcp arrangement. The high-pressure amorphous phases can be described by a mixture of fcc and hcp sublattices.
A kernel adaptive algorithm for quaternion-valued inputs.
Paul, Thomas K; Ogunfunmi, Tokunbo
2015-10-01
The use of quaternion data can provide benefits in applications like robotics and image recognition, and particularly for performing transforms in 3-D space. Here, we describe a kernel adaptive algorithm for quaternions. A least mean square (LMS)-based method was used, resulting in the derivation of the quaternion kernel LMS (Quat-KLMS) algorithm. Deriving this algorithm required describing the idea of a quaternion reproducing kernel Hilbert space (RKHS), as well as kernel functions suitable for quaternions. A modified HR calculus for Hilbert spaces was used to find the gradient of cost functions defined on a quaternion RKHS. In addition, the use of widely linear (or augmented) filtering is proposed to improve performance. The benefits of the Quat-KLMS and widely linear forms in learning nonlinear transformations of quaternion data are illustrated with simulations.
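For orientation, the real-valued kernel LMS recursion that Quat-KLMS generalizes to quaternions can be sketched as follows (a simplified real-valued illustration; the kernel, step size, and data are invented for the example, and none of the quaternion machinery is reproduced):

```python
import math

def gauss_kernel(x, y, sigma=1.0):
    return math.exp(-((x - y) ** 2) / (2.0 * sigma ** 2))

def kernel_lms(samples, targets, eta=0.5):
    """Kernel LMS: the estimate is a growing kernel expansion whose
    coefficients are eta times the instantaneous prediction errors."""
    centers, coeffs = [], []
    for x, d in zip(samples, targets):
        y = sum(a * gauss_kernel(x, c) for a, c in zip(coeffs, centers))
        e = d - y                 # instantaneous error
        centers.append(x)         # new dictionary element
        coeffs.append(eta * e)    # its coefficient
    return centers, coeffs

# Learn a simple nonlinearity d = x^2 from streaming data.
xs = [i / 10.0 for i in range(-20, 21)]
ds = [x * x for x in xs]
centers, coeffs = kernel_lms(xs, ds)
```

The quaternion version replaces the scalar kernel evaluations and error updates with their RKHS counterparts derived via the modified HR calculus.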
DNA as Genetic Material: Revisiting Classic Experiments through a Simple, Practical Class
ERIC Educational Resources Information Center
Malago, Wilson, Jr.; Soares-Costa, Andrea; Henrique-Silva, Flavio
2009-01-01
In 1928, Frederick Griffith demonstrated a transmission process of genetic information by transforming "Pneumococcus". In 1944, Avery et al. demonstrated that Griffith's transforming principle was DNA. We revisited these classic experiments in a practical class for undergraduate students. Both experiments were reproduced in simple, adapted forms.…
Gaussianization for fast and accurate inference from cosmological data
NASA Astrophysics Data System (ADS)
Schuhmann, Robert L.; Joachimi, Benjamin; Peiris, Hiranya V.
2016-06-01
We present a method to transform multivariate unimodal non-Gaussian posterior probability densities into approximately Gaussian ones via non-linear mappings, such as Box-Cox transformations and generalizations thereof. This permits an analytical reconstruction of the posterior from a point sample, like a Markov chain, and simplifies the subsequent joint analysis with other experiments. This way, a multivariate posterior density can be reported efficiently, by compressing the information contained in Markov Chain Monte Carlo samples. Further, the model evidence integral (i.e., the marginal likelihood) can be computed analytically. This method is analogous to the search for normal parameters in the cosmic microwave background, but is more general. The search for the optimally Gaussianizing transformation is performed computationally through a maximum-likelihood formalism; its quality can be judged by how well the credible regions of the posterior are reproduced. We demonstrate that our method outperforms kernel density estimates in this objective. Further, we select marginal posterior samples from Planck data with several distinct strongly non-Gaussian features, and verify the reproduction of the marginal contours. To demonstrate evidence computation, we Gaussianize the joint distribution of data from weak lensing and baryon acoustic oscillations, for different cosmological models, and find a preference for flat ΛCDM (Λ cold dark matter). Comparing to values computed with the Savage-Dickey density ratio and Population Monte Carlo, we find good agreement of our method within the spread of the other two.
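The Box-Cox family at the heart of the Gaussianization, together with the standard profile-likelihood search for its parameter, can be sketched in one dimension (a hedged sketch; the paper's multivariate generalization is not reproduced, and the data and grid are invented):

```python
import math, random

def box_cox(x, lam):
    """One-parameter Box-Cox transform; requires x > 0."""
    return math.log(x) if lam == 0.0 else (x ** lam - 1.0) / lam

def best_lambda(data, grid):
    """Grid search for the most Gaussianizing lambda via the standard
    Box-Cox profile log-likelihood (Gaussian fit plus Jacobian term)."""
    logs = sum(map(math.log, data))
    def loglik(lam):
        y = [box_cox(v, lam) for v in data]
        m = sum(y) / len(y)
        var = sum((v - m) ** 2 for v in y) / len(y)
        return -0.5 * len(y) * math.log(var) + (lam - 1.0) * logs
    return max(grid, key=loglik)

# Lognormal data should be Gaussianized by lambda ~ 0 (the log transform).
random.seed(0)
data = [math.exp(random.gauss(0.0, 1.0)) for _ in range(1000)]
lam = best_lambda(data, [-1.0, -0.5, 0.0, 0.5, 1.0])
```

Once the optimal transformation is found, the posterior can be summarized by the mean and covariance of the transformed sample, which is what makes the analytical evidence computation possible.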
User-friendly freehand ultrasound calibration using Lego bricks and automatic registration.
Xiao, Yiming; Yan, Charles Xiao Bo; Drouin, Simon; De Nigris, Dante; Kochanowska, Anna; Collins, D Louis
2016-09-01
As an inexpensive, noninvasive, and portable clinical imaging modality, ultrasound (US) has been widely employed in many interventional procedures for monitoring potential tissue deformation, surgical tool placement, and locating surgical targets. The application requires a spatial mapping between 2D US images and the 3D coordinates of the patient. Although the positions of the devices (i.e., the ultrasound transducer) and the patient can be easily recorded by a motion tracking system, the spatial relationship between the US image and the tracker attached to the US transducer needs to be estimated through a US calibration procedure. Previously, various calibration techniques have been proposed, in which a spatial transformation is computed to match the coordinates of corresponding features in a physical phantom with those seen in the US scans. However, most of these methods are difficult for novice users. We propose an ultrasound calibration method that constructs a phantom from simple Lego bricks and applies an automated multi-slice 2D-3D registration scheme without volumetric reconstruction. The method was validated for its calibration accuracy and reproducibility. Our method yields a calibration accuracy of [Formula: see text] mm and a calibration reproducibility of 1.29 mm. We have proposed a robust, inexpensive, and easy-to-use ultrasound calibration method.
A variant selection model for predicting the transformation texture of deformed austenite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butron-Guillen, M.P.; Jonas, J.J.; Da Costa Viana, C.S.
1997-09-01
The occurrence of variant selection during the transformation of deformed austenite is examined, together with its effect on the product texture. A new prediction method is proposed based on the morphology of the austenite grains, on slip activity, and on the residual stresses remaining in the material after rolling. The aspect ratio of pancaked grains is demonstrated to play an important role in favoring selection of the transformed copper ({311}<011> and {211}<011>) components. The extent of shear on active slip planes during prior rolling is shown to promote the formation of the transformed brass ({332}<113> and {211}<113>) components. Finally, the residual stresses remaining in the material after rolling play an essential part by preventing growth of the {110}<110> and {100} orientations selected by the grain shape and slip activity rules. With the aid of these three variant selection criteria combined, it is possible to reproduce all the features of the transformation textures observed experimentally. The criteria also explain why the intensities of the transformed copper components are sensitive to the pancaking strain, while those of the transformed brass are a function of the cooling rate employed after hot rolling.
NASA Astrophysics Data System (ADS)
de Vries, Diemer; Hörchens, Lars; Grond, Peter
2007-12-01
The state of the art of wave field synthesis (WFS) systems is that they can reproduce sound sources and secondary (mirror image) sources with natural spaciousness in a horizontal plane, and thus perform satisfactory 2D auralization of an enclosed space, based on multitrace impulse response data measured or simulated along a 2D microphone array. However, waves propagating with a nonzero elevation angle are also reproduced in the horizontal plane, which is neither physically nor perceptually correct. In most listening environments to be auralized, the floor is highly absorptive since it is covered with upholstered seats, occupied during performances by a well-dressed audience. A first-order ceiling reflection, reaching the floor directly or via a wall, will be severely damped and will not play a significant role in the room response anymore. This means that a spatially correct WFS reproduction of first-order ceiling reflections, by means of a loudspeaker array at the ceiling of the auralization reproduction room, is necessary and probably sufficient to create the desired 3D spatial perception. To determine the driving signals for the loudspeakers in the ceiling array, it is necessary to identify the relevant ceiling reflection(s) in the multichannel impulse response data and separate those events from the data set. Two methods are examined to identify, separate, and reproduce the relevant reflections: application of the Radon transform, and decomposition of the data into cylindrical harmonics. Application to synthesized and measured data shows that both methods in principle are able to identify, separate, and reproduce the relevant events.
[Study on the application of pyroelectric infrared sensor to safety protection system].
Wang, Song-de; Zhang, Shuan-ji; Zhu, Xiao-long; Yang, Jie-hui
2006-11-01
A pyroelectric infrared detector with an automatic voice-warning function was designed: the infrared radiation of the human body is received and amplified by a pyroelectric infrared sensor to form a voltage control signal, which triggers a voice recording-reproducing circuit. The circuit adopts a new pyroelectric infrared detector assembly and a voice recording-reproducing assembly. When someone is present in the detectable range of the detector, the pyroelectric infrared sensor transforms the received radiation energy into an electric signal, which is then amplified and compared by an internal circuit; the resulting output control signal triggers the voice recording-reproducing assembly, whose reproducer plays a prerecorded caution message to warn a person unfamiliar with the surroundings that the area ahead is a danger zone and should not be approached. With its integrated design, the distance-warning device has the advantages of strong anti-jamming ability, low-temperature resistance, stable operation, and convenience of use, and it can be installed in locations that may endanger personal safety, such as substations, high-voltage switch panels, and electric transformers.
Flanking sequence determination and specific PCR identification of transgenic wheat B102-1-2.
Cao, Jijuan; Xu, Junyi; Zhao, Tongtong; Cao, Dongmei; Huang, Xin; Zhang, Piqiao; Luan, Fengxia
2014-01-01
The exogenous fragment sequence and the flanking sequence between the exogenous fragment and the recombinant chromosome of transgenic wheat B102-1-2 were successfully acquired using genome walking technology. The newly acquired exogenous fragment encoded the full-length sequence of the transformed genes, comprising the transformed plasmid and corresponding functional elements, including ubi, vector pBANF-bar, vector pUbiGUSPlus, vector HSP, reporter vector pUbiGUSPlus, promoter ubiquitin, and E. coli DH1. A specific polymerase chain reaction (PCR) identification method for transgenic wheat B102-1-2 was established on the basis of primers designed according to the flanking sequence. This established specific PCR strategy was validated using transgenic wheat, transgenic corn, transgenic soybean, transgenic rice, and non-transgenic wheat. A specifically amplified target band was observed only in transgenic wheat B102-1-2. Therefore, this method is characterized by high specificity, high reproducibility, rapid identification, and excellent accuracy for the identification of transgenic wheat B102-1-2.
Blood flow estimation in gastroscopic true-color images
NASA Astrophysics Data System (ADS)
Jacoby, Raffael S.; Herpers, Rainer; Zwiebel, Franz M.; Englmeier, Karl-Hans
1995-05-01
The assessment of blood flow in the gastrointestinal mucosa might be an important factor for the diagnosis and treatment of several diseases such as ulcers, gastritis, colitis, or early cancer. The quantity of blood flow is roughly estimated by computing the spatial hemoglobin distribution in the mucosa. The presented method enables a practical realization by approximately calculating the hemoglobin concentration based on a spectrophotometric analysis of endoscopic true-color images, which are recorded during routine examinations. A system model based on the reflectance spectroscopic law of Kubelka-Munk is derived, which enables an estimation of the hemoglobin concentration by means of the color values of the images. Additionally, a transformation of the color values is developed in order to improve luminance independence. Applying this transformation and estimating the hemoglobin concentration for each pixel of interest, the hemoglobin distribution can be computed. The obtained results are mostly independent of luminance. An initial validation of the presented method is performed by a quantitative estimation of the reproducibility.
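The Kubelka-Munk relation underlying such a system model can be illustrated directly: the function K/S = (1 − R)²/(2R) links measured reflectance R to the absorption-to-scattering ratio, which scales with absorber (here hemoglobin) concentration (a minimal sketch; the reflectance values and the two-band index are invented for the example, not the paper's calibration):

```python
def kubelka_munk(reflectance):
    """Kubelka-Munk function K/S = (1 - R)^2 / (2R) for diffuse
    reflectance R in (0, 1); K/S grows with absorber concentration."""
    R = reflectance
    return (1.0 - R) ** 2 / (2.0 * R)

# A ratio of K/S values at two color bands is one simple way to build a
# luminance-robust concentration index (band reflectances invented).
index = kubelka_munk(0.4) / kubelka_munk(0.8)
```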
Instrument-independent analysis of music by means of the continuous wavelet transform
NASA Astrophysics Data System (ADS)
Olmo, Gabriella; Dovis, Fabio; Benotto, Paolo; Calosso, Claudio; Passaro, Pierluigi
1999-10-01
This paper deals with the problem of automatic recognition of music. Segments of digitized music are processed by means of a continuous wavelet transform, properly chosen so as to match the spectral characteristics of the signal. In order to achieve a good time-scale representation of the signal components, a novel wavelet has been designed, suited to the features of the musical signal. Particular care has been devoted to an efficient implementation, which operates in the frequency domain and includes proper segmentation and aliasing-reduction techniques to make the analysis of long signals feasible. The method achieves very good performance in terms of both time and frequency selectivity, and can yield the estimate and the localization in time of both the fundamental frequency and the main harmonics of each tone. The analysis is used as a preprocessing step for a recognition algorithm, which we show to be almost independent of the instrument reproducing the sounds. Simulations are provided to demonstrate the effectiveness of the proposed method.
Transforming networking within the ESIP Federation using ResearchBit
NASA Astrophysics Data System (ADS)
Robinson, E.
2015-12-01
Geoscientists increasingly need interdisciplinary teams to solve their research problems. Currently, geoscientists use Research Networking (RN) systems to connect with each other and find people of similar and dissimilar interests. As we shift to digitally mediated scholarship, we need innovative methods for scholarly communication. Formal methods for scholarly communication are undergoing vast transformation owing to the open-access movement and reproducible research. However, informal scholarly communication that takes place at professional society meetings and conferences, like AGU, has received limited research attention and still relies primarily on serendipitous interaction. The ResearchBit project aims to fundamentally improve informal methods of scholarly communication by leveraging the serendipitous interactions of researchers and making them more aware of co-located potential collaborators with mutual interests. This presentation will describe our preliminary hardware testing done at the Federation for Earth Science Information Partners (ESIP) Summer meeting this past July and the initial recommendation system design. The presentation will also cover the cultural shifts and hurdles to introducing new technology, the privacy concerns of tracking technology, and how we are addressing those new issues.
Direct computational approach to lattice supersymmetric quantum mechanics
NASA Astrophysics Data System (ADS)
Kadoh, Daisuke; Nakayama, Katsumasa
2018-07-01
We study the lattice supersymmetric models numerically using the transfer matrix approach. This method consists only of deterministic processes and has no statistical uncertainties. We improve it by performing a scale transformation of variables such that the Witten index is correctly reproduced from the lattice model, and the other prescriptions are shown in detail. Compared to the previous Monte Carlo results, we can estimate the effective masses, the SUSY Ward identity, and the cut-off dependence of the results with high precision. This kind of information is useful in improving the lattice formulation of supersymmetric models.
Electromagnetic properties of thin-film transformer-coupled superconducting tunnel junctions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finnegan, T.F.; Lacquaniti, V.; Vaglio, R.
1981-09-01
Multisection superconducting microstrip transformers with designed output impedances below 0.1 Ω have been fabricated via precise photolithographic techniques to investigate the electromagnetic properties of Nb-Nb oxide-Pb tunnel junctions. The low-impedance transformer sections incorporate an rf-sputtered thin-film Ta-oxide dielectric, and the reproducible external coupling achievable with this type of geometry makes possible the systematic investigation of electromagnetic device parameters as a function of tunneling oxide thickness.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smallwood, D.O.
It is recognized that some dynamic and noise environments are characterized by time histories which are not Gaussian. An example is high intensity acoustic noise. Another example is some transportation vibration. A better simulation of these environments can be generated if a zero mean non-Gaussian time history can be reproduced with a specified auto (or power) spectral density (ASD or PSD) and a specified probability density function (pdf). After the required time history is synthesized, the waveform can be used for simulation purposes. For example, modern waveform reproduction techniques can be used to reproduce the waveform on electrodynamic or electrohydraulic shakers. Or the waveforms can be used in digital simulations. A method is presented for the generation of realizations of zero mean non-Gaussian random time histories with a specified ASD and pdf. First a Gaussian time history with the specified auto (or power) spectral density (ASD) is generated. A monotonic nonlinear function relating the Gaussian waveform to the desired realization is then established based on the Cumulative Distribution Function (CDF) of the desired waveform and the known CDF of a Gaussian waveform. The established function is used to transform the Gaussian waveform to a realization of the desired waveform. Since the transformation preserves the zero-crossings and peaks of the original Gaussian waveform, and does not introduce any substantial discontinuities, the ASD is not substantially changed. Several methods are available to generate a realization of a Gaussian distributed waveform with a known ASD. The method of Smallwood and Paez (1993) is an example. However, the generation of random noise with a specified ASD but with a non-Gaussian distribution is less well known.
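The monotonic CDF-matching step at the core of the method, y = F_target⁻¹(Φ(g)), can be sketched for a zero-mean exponential target (a hedged illustration; the target distribution and sample size are invented, and the ASD-shaped Gaussian generation step is replaced by white noise for brevity):

```python
import math, random

def phi(g):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(g / math.sqrt(2.0)))

def target_inv_cdf(u, rate=1.0):
    """Inverse CDF of an exponential shifted to zero mean."""
    return -math.log(1.0 - u) / rate - 1.0 / rate

# Map each Gaussian sample through the monotone function F^-1(Phi(.)).
# Because the map is monotone, zero crossings and peak ordering of the
# Gaussian input are preserved, so a shaped input keeps its ASD largely
# intact while acquiring the target pdf exactly.
random.seed(1)
gauss = [random.gauss(0.0, 1.0) for _ in range(50_000)]
y = [target_inv_cdf(phi(g)) for g in gauss]
mean = sum(y) / len(y)            # ~ 0 by construction
```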
On the reach of perturbative methods for dark matter density fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baldauf, Tobias; Zaldarriaga, Matias; Schaan, Emmanuel, E-mail: baldauf@ias.edu, E-mail: eschaan@astro.princeton.edu, E-mail: matiasz@ias.edu
We study the mapping from Lagrangian to Eulerian space in the context of the Effective Field Theory (EFT) of Large Scale Structure. We compute Lagrangian displacements with Lagrangian Perturbation Theory (LPT) and perform the full non-perturbative transformation from displacement to density. When expanded up to a given order, this transformation reproduces the standard Eulerian Perturbation Theory (SPT) at the same order. However, the full transformation from displacement to density also includes higher order terms. These terms explicitly resum long wavelength motions, thus making the resulting density field better correlated with the true non-linear density field. As a result, the regime of validity of this approach is expected to extend that of the Eulerian EFT, and match that of the IR-resummed Eulerian EFT. This approach thus effectively enables a test of the IR-resummed EFT at the field level. We estimate the size of stochastic, non-perturbative contributions to the matter density power spectrum. We find that in our highest order calculation, at redshift z = 0 the power spectrum of the density field is reproduced with an accuracy of 1% (10%) up to k = 0.25 h Mpc⁻¹ (k = 0.46 h Mpc⁻¹). We believe that the dominant source of the remaining error is the stochastic contribution. Unfortunately, on these scales the stochastic term does not yet scale as k⁴ as it does in the very low k regime. Thus, modeling this contribution might be challenging.
Flanking sequence determination and event-specific detection of genetically modified wheat B73-6-1.
Xu, Junyi; Cao, Jijuan; Cao, Dongmei; Zhao, Tongtong; Huang, Xin; Zhang, Piqiao; Luan, Fengxia
2013-05-01
In order to establish a specific identification method for genetically modified (GM) wheat, the exogenous insert DNA and the flanking sequence between the exogenous fragment and the recombinant chromosome of GM wheat B73-6-1 were successfully acquired by means of conventional polymerase chain reaction (PCR) and thermal asymmetric interlaced (TAIL)-PCR strategies. The newly acquired exogenous fragment covered the full-length sequence of the transformed genes, including the transformed plasmid and corresponding functional genes: marker uidA, herbicide-resistant bar, the ubiquitin promoter, and a high-molecular-weight glutenin subunit. The flanking sequence adjacent to the insert DNA revealed high similarity to the Triticum turgidum A gene (GenBank: AY494981.1). A specific PCR detection method for GM wheat B73-6-1 was established on the basis of primers designed according to the flanking sequence. This specific PCR method was validated with GM wheat, GM corn, GM soybean, GM rice, and non-GM wheat. The specifically amplified target band was observed only in GM wheat B73-6-1. This method offers high specificity, high reproducibility, rapid identification, and excellent accuracy for the identification of GM wheat B73-6-1.
Transformation From a Conventional Clinical Microbiology Laboratory to Full Automation.
Moreno-Camacho, José L; Calva-Espinosa, Diana Y; Leal-Leyva, Yoseli Y; Elizalde-Olivas, Dolores C; Campos-Romero, Abraham; Alcántar-Fernández, Jonathan
2017-12-22
To validate the performance, reproducibility, and reliability of BD automated instruments in order to establish a fully automated clinical microbiology laboratory. We used control strains and clinical samples to assess the accuracy, reproducibility, and reliability of the BD Kiestra WCA, BD Phoenix, and BD Bruker MALDI-Biotyper instruments and compared them to previously established conventional methods. The following processes were evaluated: sample inoculation and spreading, colony counts, sorting of cultures, antibiotic susceptibility testing, and microbial identification. The BD Kiestra recovered single colonies in less time than conventional methods (e.g., E. coli: 7 h vs. 10 h, respectively), and agreement between the two methodologies was excellent for colony counts (κ=0.824) and sorting of cultures (κ=0.821). Antibiotic susceptibility tests performed with BD Phoenix and disk diffusion demonstrated 96.3% agreement between the two methods. Finally, we compared microbial identification in BD Phoenix and Bruker MALDI-Biotyper and observed perfect agreement (κ=1) and identification at the species level for control strains. Together, these instruments allow us to process clinical urine samples in 36 h (effective time). The BD automated technologies have improved performance compared with conventional methods and are suitable for implementation in very busy microbiology laboratories. © American Society for Clinical Pathology 2017. All rights reserved.
NASA Astrophysics Data System (ADS)
Yoon, Heechul; Lee, Hyuntaek; Jung, Haekyung; Lee, Mi-Young; Won, Hye-Sung
2015-03-01
The objective of this paper is to introduce a novel method for nuchal translucency (NT) boundary detection and thickness measurement; NT is one of the most significant markers in early screening for chromosomal defects, notably Down syndrome. To improve the reliability and reproducibility of NT measurements, several automated methods have been introduced, but their performance degrades when NT borders are tilted by varying fetal movements. We therefore propose a principal-direction-estimation-based NT measurement method that provides reliable and consistent performance regardless of both fetal position and NT direction. First, a Radon transform and a cost function are used to estimate the principal direction of the NT borders. Then, in the estimated angle bin, i.e., along the main direction of the NT, gradient-based features are employed to find initial NT lines, which serve as starting points for an active contour fitting method that locates the true NT borders. Finally, the maximum thickness is measured from the distances between the upper and lower NT borders by searching along lines orthogonal to the main NT direction. To evaluate the performance, 89 in vivo fetal images were collected and ground-truth measurements were made by clinical experts. Quantitative results using intraclass correlation coefficients and difference analysis verify that the proposed method can improve the reliability and reproducibility of maximum NT thickness measurement.
ERIC Educational Resources Information Center
Schizas, Dimitrios; Papatheodorou, Efimia; Stamou, George
2018-01-01
This study conducts a textbook analysis in the frame of the following working hypothesis: The transformation of scientific knowledge into school knowledge is expected to reproduce the problems encountered with the scientific knowledge itself or generate additional problems, which may both induce misconceptions in textbook users. Specifically, we…
ITK: enabling reproducible research and open science
McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis
2014-01-01
Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387
Improving the Efficiency of Free Energy Calculations in the Amber Molecular Dynamics Package.
Kaus, Joseph W; Pierce, Levi T; Walker, Ross C; McCammon, J Andrew
2013-09-10
Alchemical transformations are widely used methods to calculate free energies. Amber has traditionally included support for alchemical transformations as part of the sander molecular dynamics (MD) engine. Here we describe the implementation of a more efficient approach to alchemical transformations in the Amber MD package. Specifically, we have implemented this new approach within the more computationally efficient and scalable pmemd MD engine included with the Amber MD package. The majority of the gain in efficiency comes from the improved design of the calculation, which includes better parallel scaling and a reduction in the calculation of redundant terms. This new implementation reproduces results from equivalent simulations run with the existing functionality, but at 2.5 times greater computational efficiency. It is also able to run softcore simulations at the λ end states, making direct calculation of free energies more accurate compared with the extrapolation required in the existing implementation. The updated alchemical transformation functionality will be included in the next major release of Amber (scheduled for release in Q1 2014) and will be available at http://ambermd.org, under the Amber license.
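The value of running directly at the λ end states can be illustrated with a commonly used soft-core Lennard-Jones form. This is a generic sketch of the soft-core idea (the functional form and parameters are illustrative of the general approach, not Amber's exact implementation):

```python
import numpy as np

def lj_softcore(r, lam, sigma=1.0, eps=1.0, alpha=0.5):
    """Soft-core Lennard-Jones for an appearing particle.

    The alpha * (1 - lam) term keeps the potential finite as r -> 0,
    so simulations remain stable even at the lambda end states,
    avoiding the extrapolation otherwise needed near them.
    """
    s6 = alpha * (1.0 - lam) + (r / sigma) ** 6
    return 4.0 * eps * lam * (1.0 / s6**2 - 1.0 / s6)
```

At λ = 1 the standard Lennard-Jones potential is recovered (minimum of −ε at r = 2^(1/6)σ), while at intermediate λ the energy stays finite even at r = 0.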
Multiple pathways in pressure-induced phase transition of coesite
Liu, Wei; Wu, Xuebang; Liu, Changsong; Miranda, Caetano R.; Scandolo, Sandro
2017-01-01
High-pressure single-crystal X-ray diffraction with precise control of hydrostatic conditions, typically with helium or neon as the pressure-transmitting medium, has significantly changed our view of what happens to low-density silica phases under pressure. Coesite is a prototype material for pressure-induced amorphization, yet it was found to transform into a high-pressure octahedral (HPO) phase, or into coesite-II and coesite-III. Given that the pressure is believed to be hydrostatic in two recent experiments, the different transformation pathways are striking. Based on molecular dynamics simulations with an ab initio parameterized potential, we reproduced all of the above experiments, recovering three transformation pathways, including the one leading to an HPO phase. This octahedral phase has an oxygen hcp sublattice featuring 2 × 2 zigzag octahedral edge-sharing chains, albeit with some broken points (i.e., point defects). It transforms into the α-PbO2 phase when relaxed under further compression. We show that the HPO phase forms through a continuous rearrangement of the oxygen sublattice toward the hcp arrangement. The high-pressure amorphous phases can be described by a mixture of fcc and hcp sublattices. PMID:29162690
Statistical characteristics of surrogate data based on geophysical measurements
NASA Astrophysics Data System (ADS)
Venema, V.; Bachner, S.; Rust, H. W.; Simmer, C.
2006-09-01
In this study, the statistical properties of a range of measurements are compared with those of their surrogate time series. Seven different records are studied, amongst others, historical time series of mean daily temperature, daily rain sums and runoff from two rivers, and cloud measurements. Seven different algorithms are used to generate the surrogate time series. The best-known method is the iterative amplitude adjusted Fourier transform (IAAFT) algorithm, which is able to reproduce the measured distribution as well as the power spectrum. Using this setup, the measurements and their surrogates are compared with respect to their power spectrum, increment distribution, structure functions, annual percentiles and return values. It is found that the surrogates that reproduce the power spectrum and the distribution of the measurements are able to closely match the increment distributions and the structure functions of the measurements, but this often does not hold for surrogates that only mimic the power spectrum of the measurement. However, even the best performing surrogates do not have asymmetric increment distributions, i.e., they cannot reproduce nonlinear dynamical processes that are asymmetric in time. Furthermore, we have found deviations of the structure functions on small scales.
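The IAAFT algorithm mentioned above alternates between imposing the measured power spectrum and the measured amplitude distribution. A minimal sketch with a fixed iteration count (published versions iterate to convergence and handle several refinements this omits):

```python
import numpy as np

def iaaft(x, n_iter=200, seed=0):
    """Iterative amplitude adjusted Fourier transform surrogate.

    Returns a series with exactly the amplitude distribution of x and
    approximately its power spectrum, but randomized phases.
    """
    x = np.asarray(x, dtype=float)
    sorted_x = np.sort(x)
    target_amp = np.abs(np.fft.rfft(x))          # spectrum to impose
    s = np.random.default_rng(seed).permutation(x)
    for _ in range(n_iter):
        # step 1: impose the target Fourier amplitudes, keep current phases
        phases = np.angle(np.fft.rfft(s))
        s = np.fft.irfft(target_amp * np.exp(1j * phases), n=len(x))
        # step 2: impose the target distribution by rank-order remapping
        s = sorted_x[np.argsort(np.argsort(s))]
    return s
```

Because the last step is a rank-order remap onto the sorted measurements, the surrogate's distribution matches the original exactly, at the cost of a small residual error in the spectrum.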
Teacher as Learner: A Personal Reflection on a Short Course for South African University Educators
ERIC Educational Resources Information Center
Clowes, Lindsay
2013-01-01
Higher education is understood to play a critical role in ongoing processes of social transformation in post-apartheid South Africa through the production of graduates who are critical and engaged citizens. A key challenge is that institutions of higher education are themselves implicated in reproducing the very hierarchies they hope to transform.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahluwalia, D.V.; Sawicki, M.
Using the Weinberg-Soper formalism we construct the front-form (j,0)⊕(0,j) spinors. Explicit expressions for the generalized Melosh transformations up to spin two are obtained. The formalism, without explicitly invoking any wave equations, reproduces the spin-1/2 front-form results of Melosh, Lepage and Brodsky, and Dziembowski.
Lee, Jungmin; Durst, Robert W; Wrolstad, Ronald E
2005-01-01
This collaborative study was conducted to determine the total monomeric anthocyanin concentration by the pH differential method, a rapid and simple spectrophotometric method based on the anthocyanin structural transformation that occurs with a change in pH (colored at pH 1.0 and colorless at pH 4.5). Eleven collaborators representing commercial laboratories, academic institutions, and government laboratories participated. Seven Youden pair materials representing fruit juices, beverages, natural colorants, and wines were tested. The repeatability relative standard deviation (RSDr) varied from 1.06 to 4.16%. The reproducibility relative standard deviation (RSDR) ranged from 2.69 to 10.12%. The HorRat values were ≤ 1.33 for all materials. The Study Director recommends that the method be adopted as Official First Action.
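The pH differential calculation itself is simple enough to express directly. A sketch using cyanidin-3-glucoside equivalents (molecular weight 449.2 g/mol, molar absorptivity 26900 L·mol⁻¹·cm⁻¹, values commonly used for reporting); the absorbance readings in the example are illustrative, not from the study:

```python
def monomeric_anthocyanins_mg_per_l(a520_ph1, a700_ph1, a520_ph45, a700_ph45,
                                    dilution_factor=1.0, mw=449.2,
                                    epsilon=26900.0, path_cm=1.0):
    """Total monomeric anthocyanins by the pH differential method.

    The 700 nm readings correct for haze; the pH 4.5 readings subtract
    the absorbance of pigments that do not undergo the reversible
    structural transformation between pH 1.0 and pH 4.5.
    """
    a = (a520_ph1 - a700_ph1) - (a520_ph45 - a700_ph45)
    return a * mw * dilution_factor * 1000.0 / (epsilon * path_cm)
```

A sample diluted tenfold reading A520 = 0.500 / A700 = 0.010 at pH 1.0 and A520 = 0.100 / A700 = 0.005 at pH 4.5 works out to roughly 66 mg/L.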
Dean, Kimberly M; Grayhack, Elizabeth J
2012-12-01
We have developed a robust and sensitive method, called RNA-ID, to screen for cis-regulatory sequences in RNA using fluorescence-activated cell sorting (FACS) of yeast cells bearing a reporter in which expression of both superfolder green fluorescent protein (GFP) and yeast codon-optimized mCherry red fluorescent protein (RFP) is driven by the bidirectional GAL1,10 promoter. This method recapitulates previously reported progressive inhibition of translation mediated by increasing numbers of CGA codon pairs, and restoration of expression by introduction of a tRNA with an anticodon that base pairs exactly with the CGA codon. This method also reproduces effects of paromomycin and context on stop codon read-through. Five key features of this method contribute to its effectiveness as a selection for regulatory sequences: a greater than 250-fold dynamic range; a quantitative, dose-dependent response to known inhibitory sequences; exquisite resolution that allows nearly complete physical separation of distinct populations; a reproducible signal between different cells transformed with the identical reporter; and simple, ligation-independent cloning methods for creating large libraries. Moreover, we provide evidence that there are sequences within a 9-nt library that cause reduced GFP fluorescence, suggesting that novel cis-regulatory sequences are to be found even in this short sequence space. This method is widely applicable to the study of both RNA-mediated and codon-mediated effects on expression.
NASA Astrophysics Data System (ADS)
Thubagere, Anupama J.; Thachuk, Chris; Berleant, Joseph; Johnson, Robert F.; Ardelean, Diana A.; Cherry, Kevin M.; Qian, Lulu
2017-02-01
Biochemical circuits made of rationally designed DNA molecules are proofs of concept for embedding control within complex molecular environments. They hold promise for transforming the current technologies in chemistry, biology, medicine and material science by introducing programmable and responsive behaviour to diverse molecular systems. As the transformative power of a technology depends on its accessibility, two main challenges are an automated design process and simple experimental procedures. Here we demonstrate the use of circuit design software, combined with the use of unpurified strands and simplified experimental procedures, for creating a complex DNA strand displacement circuit that consists of 78 distinct species. We develop a systematic procedure for overcoming the challenges involved in using unpurified DNA strands. We also develop a model that takes synthesis errors into consideration and semi-quantitatively reproduces the experimental data. Our methods now enable even novice researchers to successfully design and construct complex DNA strand displacement circuits.
Ohta, Daisaku; Kanaya, Shigehiko; Suzuki, Hideyuki
2010-02-01
Metabolomics, as an essential part of genomics studies, aims at a holistic understanding of metabolic networks through simultaneous analysis of a myriad of both known and unknown metabolites occurring in living organisms. The initial stage of metabolomics was designed for reproducible analyses of known metabolites based on comparison to available authentic compounds. Such metabolomics platforms were mostly based on mass spectrometry (MS) technologies, enabled by a combination of different ionization methods together with a variety of separation steps including LC, GC, and CE. Among these, Fourier-transform ion cyclotron resonance MS (FT-ICR/MS) is distinguished from other MS technologies by its ultrahigh resolving power in mass-to-charge ratio (m/z). The potential of FT-ICR/MS as a distinctive metabolomics tool has been demonstrated in nontargeted metabolic profiling and the functional characterization of novel genes. Here, we discuss both the advantages and the difficulties encountered in FT-ICR/MS metabolomics studies.
Linearly exact parallel closures for slab geometry
NASA Astrophysics Data System (ADS)
Ji, Jeong-Young; Held, Eric D.; Jhang, Hogun
2013-08-01
Parallel closures are obtained by solving a linearized kinetic equation with a model collision operator using the Fourier transform method. The closures expressed in wave number space are exact for time-dependent linear problems to within the limits of the model collision operator. In the adiabatic, collisionless limit, an inverse Fourier transform is performed to obtain integral (nonlocal) parallel closures in real space; parallel heat flow and viscosity closures for density, temperature, and flow velocity equations replace Braginskii's parallel closure relations, and parallel flow velocity and heat flow closures for density and temperature equations replace Spitzer's parallel transport relations. It is verified that the closures reproduce the exact linear response function of Hammett and Perkins [Phys. Rev. Lett. 64, 3019 (1990)] for Landau damping given a temperature gradient. In contrast to their approximate closures where the vanishing viscosity coefficient numerically gives an exact response, our closures relate the heat flow and nonvanishing viscosity to temperature and flow velocity (gradients).
Liu, Ying; Liu, Guoxuan; Yang, Yali; Niu, Sufang; Yang, Fuguang; Yang, Shaoxia; Tang, Jianian; Chen, Jianping
2017-12-01
An efficient and reproducible protocol is described for shoot-bud regeneration and Agrobacterium tumefaciens-mediated genetic transformation of J. curcas. Treating the explants with high concentrations (5-120 mg/L) of TDZ for short durations (5-80 min) before inoculation culture significantly increased the regeneration frequency and improved the quality of the regenerated buds. The highest shoot-bud induction rate (87.35%) was achieved when petiole explants were treated with 20 mg/L TDZ solution for 20 min and inoculated on hormone-free MS medium for 30 days. Regenerated shoots of 0.5 cm or slightly longer were isolated and grafted onto seedling stocks of the same species, and the grafted plantlets were then planted on half-strength MS medium containing 0.1 mg/L IBA and 2 mg/L sodium nitroprusside (SNP). This grafting strategy proved very effective: healthy grafted plantlets were ready for acclimatization within 20 days. With the above protocol and standard Agrobacterium-mediated genetic transformation methods, only 65 days were needed to obtain intact transgenic plants.
An Efficient PEG/CaCl₂-Mediated Transformation Approach for the Medicinal Fungus Wolfiporia cocos.
Sun, Qiao; Wei, Wei; Zhao, Juan; Song, Jia; Peng, Fang; Zhang, Shaopeng; Zheng, Yonglian; Chen, Ping; Zhu, Wenjun
2015-09-01
Sclerotia of Wolfiporia cocos are of medicinal and culinary value. The genes and molecular mechanisms involved in W. cocos sclerotial formation are poorly investigated because of the lack of a suitable and reproducible transformation system for W. cocos. In this study, a PEG/CaCl₂-mediated genetic transformation system for W. cocos was developed. The promoter Pgpd from Ganoderma lucidum effectively drove expression of the hygromycin B phosphotransferase gene in W. cocos, and approximately 30 transformants were obtained per 10 μg DNA when the protoplast suspension density was 10⁶ protoplasts/ml. However, no transformants were obtained under the regulation of the PtrpC promoter from Aspergillus nidulans.
Optimal wavelet transform for the detection of microaneurysms in retina photographs.
Quellec, Gwénolé; Lamard, Mathieu; Josselin, Pierre Marie; Cazuguel, Guy; Cochener, Béatrice; Roux, Christian
2008-09-01
In this paper, we propose an automatic method to detect microaneurysms in retina photographs. Microaneurysms are the most frequent and usually the first lesions to appear as a consequence of diabetic retinopathy, so their detection is necessary both for screening the pathology and for follow-up (progression measurement). Automating this task, which is currently performed manually, would bring more objectivity and reproducibility. We propose to detect them by locally matching a lesion template in subbands of wavelet-transformed images. To improve the method's performance, we searched for the best-adapted wavelet within the lifting scheme framework. The optimization process is based on a genetic algorithm followed by Powell's direction set descent. Results are evaluated on 120 retinal images analyzed by an expert, and the optimal wavelet is compared to different conventional mother wavelets. The images are of three different modalities: color photographs, green-filtered photographs, and angiographs. Depending on the imaging modality, microaneurysms were detected with sensitivities of 89.62%, 90.24%, and 93.74%, respectively, and positive predictive values of 89.50%, 89.75%, and 91.67%, respectively, which is better than previously published methods.
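The core idea of matching a lesion template in a wavelet subband can be sketched as follows. This toy version uses a single-level Haar approximation subband and brute-force normalized cross-correlation; the paper instead optimizes the wavelet itself within the lifting scheme, so take this only as an illustration of subband template matching:

```python
import numpy as np

def haar_approx(img):
    """One-level 2-D Haar approximation subband: mean over 2x2 blocks."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def subband_template_match(image, template):
    """Normalized cross-correlation of a lesion template against the
    approximation subband of the image; peaks mark candidate lesions."""
    a_img = haar_approx(image)
    a_tpl = haar_approx(template)
    a_tpl = a_tpl - a_tpl.mean()
    h, w = a_tpl.shape
    H, W = a_img.shape
    resp = np.zeros((H - h + 1, W - w + 1))
    for i in range(H - h + 1):
        for j in range(W - w + 1):
            patch = a_img[i:i + h, j:j + w]
            patch = patch - patch.mean()
            denom = np.linalg.norm(patch) * np.linalg.norm(a_tpl)
            resp[i, j] = (patch * a_tpl).sum() / denom if denom > 0 else 0.0
    return resp
```

A planted copy of the template produces a correlation peak of 1.0 at its location in the subband, while flat background regions score near zero.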
2011-01-01
Background: Following genome sequencing of crop plants, one of the main challenges today is determining the function of all the predicted genes. When gene validation approaches are used for woody species, the main obstacle is the low recovery rate of transgenic plants from elite or commercial cultivars. Embryogenic calli have frequently been the target tissue for transformation, but the difficulty of producing or maintaining embryogenic tissues is one of the main problems encountered in genetic transformation of many woody plants, including Coffea arabica. Results: We identified the conditions required for successful long-term proliferation of embryogenic cultures in C. arabica and designed a highly efficient and reliable Agrobacterium tumefaciens-mediated transformation method based on these conditions. The transformation protocol with LBA1119 harboring pBin 35S GFP was established by evaluating the effect of different parameters on transformation efficiency by GFP detection. Using embryogenic callus cultures, co-cultivation with LBA1119 at OD600 = 0.6 for five days at 20 °C enabled reproducible transformation. The maintenance conditions for the embryogenic callus cultures, particularly a high auxin-to-cytokinin ratio, the age of the culture (optimal at 7-10 months of proliferation) and the use of a yellow callus phenotype, were the most important factors for achieving highly efficient transformation (> 90%). At the histological level, successful transformation was related to the number of proembryogenic masses present. All selected plants were confirmed to be transformed by PCR and Southern blot hybridization. Conclusion: Most progress in increasing transformation efficiency in coffee has been achieved by optimizing the production conditions of the embryogenic cultures used as target tissues for transformation. This is the first time that a strong positive effect of culture age on transformation efficiency has been demonstrated.
Our results make Agrobacterium-mediated transformation of embryogenic cultures a viable and useful tool both for coffee breeding and for the functional analysis of agronomically important genes. PMID:21595964
Borade, P; Joshi, K U; Gokarna, A; Lerondel, G; Jejurikar, S M
2016-01-15
In this paper, we report the synthesis of dumbbell-shaped ZnO structures and their subsequent transformation into perfect hexagonal tubes by an extended chemical bath deposition (CBD) method, retaining the method's advantages of reproducibility, simplicity, speed and low cost. Well-dispersed sub-micron-sized dumbbell-shaped ZnO structures were synthesized on a SiO2/Si substrate by the CBD method. As an extension of the CBD process, the synthesized ZnO dumbbells were exposed to the vapor evolving from the chemical bath for a few minutes (simply by adjusting the height of the deposit so that it remained just above the solution) to convert them into hexagonal tubes via a dissolution process. The possible dissolution mechanism responsible for the observed conversion is discussed. The optical properties (photoluminescence) recorded at low temperature on both structures showed an intense, sharp excitonic peak located at ∼370 nm. The increased intensity and low FWHM of the UV peak observed in the hexagonal tubular structures indicate high optical quality, and hence these structures can be used for optoelectronic applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Puerari, Ivânio; Elmegreen, Bruce G.; Block, David L., E-mail: puerari@inaoep.mx
2014-12-01
We examine 8 μm IRAC images of the grand design two-arm spiral galaxies M81 and M51 using a new method whereby pitch angles are locally determined as a function of scale and position, in contrast to traditional Fourier transform spectral analyses which fit average pitch angles for whole galaxies. The new analysis is based on a correlation between pieces of a galaxy in circular windows of (lnR, θ) space and logarithmic spirals with various pitch angles. The diameter of the windows is varied to study different scales. The result is a best-fit pitch angle to the spiral structure as a function of position and scale, or a distribution function of pitch angles as a function of scale for a given galactic region or area. We apply the method to determine the distribution of pitch angles in the arm and interarm regions of these two galaxies. In the arms, the method reproduces the known pitch angles for the main spirals on a large scale, but also shows higher pitch angles on smaller scales resulting from dust feathers. For the interarms, there is a broad distribution of pitch angles representing the continuation and evolution of the spiral arm feathers as the flow moves into the interarm regions. Our method shows a multiplicity of spiral structures on different scales, as expected from gas flow processes in a gravitating, turbulent and shearing interstellar medium. We also present results for M81 using classical 1D and 2D Fourier transforms, together with a new correlation method, which shows good agreement with conventional 2D Fourier transforms.
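In (ln R, θ) coordinates a logarithmic spiral of pitch angle p unrolls into a straight line of slope cot(p), which is what makes the windowed correlation tractable. A toy sketch of scoring candidate pitch angles for one window (an illustrative line-sum estimator, not the authors' exact correlation method):

```python
import numpy as np

def best_pitch_angle(window, pitch_grid_deg):
    """Pick the best-fit pitch angle for a window sampled on a
    (ln R, theta) grid: a log spiral of pitch p maps to a straight line
    of slope cot(p) there, so each candidate pitch is scored by the
    largest line-sum of intensity over all starting columns."""
    nu, nv = window.shape          # rows: ln R, columns: theta
    u = np.arange(nu)
    scores = []
    for p_deg in pitch_grid_deg:
        slope = 1.0 / np.tan(np.deg2rad(p_deg))
        best = -np.inf
        for c in range(nv):
            v = (c + slope * u).astype(int) % nv   # theta wraps around
            best = max(best, window[u, v].sum())
        scores.append(best)
    return pitch_grid_deg[int(np.argmax(scores))]
```

A synthetic window containing a single 20° log-spiral ridge is scored highest at the 20° candidate, since only the matching slope accumulates the whole ridge.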
Belide, Srinivas; Vanhercke, Thomas; Petrie, James Robertson; Singh, Surinder Pal
2017-01-01
Sorghum (Sorghum bicolor L.) is one of the world's most important cereal crops, grown for multiple applications, and has been identified as a potential biofuel crop. Despite several decades of study, sorghum has been widely considered a major crop recalcitrant to transformation due to the accumulation of phenolic compounds, lack of model genotypes, low regeneration frequency and loss of regeneration potential through sub-cultures. Among the different explants used for genetic transformation of sorghum, immature embryos are ideal. However, the continuous supply of quality immature embryos for transformation is labour intensive and expensive. In addition, transformation efficiencies are also influenced by environmental conditions (light and temperature). Despite these challenges, immature embryos remain the predominant choice because of their success rate and the non-availability of other dependable explants that do not compromise transformation efficiency. We report here a robust genetic transformation method for sorghum (Tx430) using differentiating embryogenic calli (DEC) with nodular structures, induced from immature embryos and maintained for more than a year without losing regeneration potential on modified MS media. The addition of lipoic acid (LA) to callus induction media along with optimized growth regulators increased the callus induction frequency from 61.3 ± 3.2 to 79 ± 6.5% for immature embryos (1.5-2.0 mm in length) isolated 12-15 days after pollination. Similarly, the regeneration efficiency and the number of shoots from DEC tissue were enhanced by LA. The optimized regeneration system in combination with particle bombardment resulted in an average transformation efficiency (TE) of 27.2 or 46.6% depending on the selection strategy, 25% to twofold higher than published reports for Tx430.
Up to 100% of putative transgenic shoots were positive for npt-II by PCR, and 48% of events had < 3 copies of transgenes as determined by digital droplet PCR. The reproducibility of this method was demonstrated by generating ~ 800 transgenic plants using 10 different gene constructs. This protocol demonstrates significant improvements in both efficiency and ease of use over existing sorghum transformation methods using PDS, and also enables quick hypothesis testing for the production of various high-value products in sorghum.
Multi-oriented windowed harmonic phase reconstruction for robust cardiac strain imaging.
Cordero-Grande, Lucilio; Royuela-del-Val, Javier; Sanz-Estébanez, Santiago; Martín-Fernández, Marcos; Alberola-López, Carlos
2016-04-01
The purpose of this paper is to develop a method for direct estimation of the cardiac strain tensor by extending harmonic phase reconstruction of tagged magnetic resonance images to obtain more precise and robust measurements. The extension relies on reconstructing the local phase of the image by means of the windowed Fourier transform, and on acquiring an overdetermined set of stripe orientations, in order to avoid phase interference from structures outside the myocardium and the instabilities arising from applying a gradient operator. Results show that increasing the number of acquired orientations provides a significant improvement in the reproducibility of the strain measurements, and that acquiring an extended set of orientations also improves reproducibility compared with acquiring repeated samples from a smaller set of orientations. Additionally, biases in local phase estimation under the original harmonic phase formulation are greatly diminished by the formulation proposed here. The ideas presented here allow the design of new methods for motion-sensitive magnetic resonance imaging, which could simultaneously improve the resolution, robustness and accuracy of motion estimates. Copyright © 2015 Elsevier B.V. All rights reserved.
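The local-phase idea behind harmonic phase reconstruction can be sketched in one dimension: demodulate a tagged signal with a Gaussian-windowed Fourier coefficient evaluated at the tag frequency and read off the phase. This is an illustrative toy with invented signal parameters, not the paper's 2-D multi-oriented method.

```python
import numpy as np

# Synthetic 1-D "tagged" signal: a carrier at the stripe frequency k
# modulated by a slowly varying, motion-induced phase.
N = 512
x = np.arange(N)
k = 2 * np.pi * 16 / N                          # tag (stripe) frequency
true_phase = 0.3 * np.sin(2 * np.pi * x / N)    # slowly varying local phase
signal = np.cos(k * x + true_phase)

sigma = 12.0                                    # Gaussian window width
local_phase = np.empty(N)
for i in range(N):
    w = np.exp(-0.5 * ((x - i) / sigma) ** 2)          # window centered at i
    coeff = np.sum(w * signal * np.exp(-1j * k * x))   # windowed Fourier coefficient
    local_phase[i] = np.angle(coeff)                   # demodulated local phase

# wrapped error, evaluated away from the boundaries
err = np.angle(np.exp(1j * (local_phase - true_phase)))
max_err = np.max(np.abs(err[50:-50]))
```

The window suppresses the conjugate spectral peak at -k, so the angle of the windowed coefficient tracks the true local phase closely.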
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammond, Glenn Edward; Song, Xuehang; Ye, Ming
A new approach is developed to delineate the spatial distribution of discrete facies (geological units that have unique distributions of hydraulic, physical, and/or chemical properties) conditioned not only on direct data (measurements directly related to facies properties, e.g., grain size distributions obtained from borehole samples) but also on indirect data (observations indirectly related to facies distribution, e.g., hydraulic head and tracer concentration). Our method integrates, for the first time, ensemble data assimilation with traditional transition probability-based geostatistics. The concept of a level set is introduced to build a shape parameterization that allows transformation between discrete facies indicators and continuous random variables. The spatial structure of the different facies is simulated by indicator models using conditioning points selected adaptively during the iterative process of data assimilation. To evaluate the new method, a two-dimensional semi-synthetic example is designed to estimate the spatial distribution and permeability of two distinct facies from transient head data induced by pumping tests. The example demonstrates that our new method adequately captures the spatial pattern of the facies distribution by imposing spatial continuity through conditioning points. The new method also reproduces the overall response in the hydraulic head field with better accuracy compared to data assimilation with no constraints on facies spatial continuity.
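The level-set parameterization described above can be illustrated in miniature: a continuous random field is mapped to a discrete two-facies indicator by thresholding at zero, so that data-assimilation updates and conditioning can act on the continuous field while the flow model sees discrete facies. The field, grid size and conditioning point below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
nx, ny = 20, 20
# continuous "shape" field: a smooth trend plus noise (illustrative)
phi = np.sin(np.linspace(0, 3 * np.pi, nx))[:, None] + 0.3 * rng.normal(size=(nx, ny))
facies = np.where(phi > 0, 1, 0)     # discrete facies indicator via the zero level set

# conditioning point: suppose an observation says facies 1 at cell (5, 5);
# enforce it by adjusting the continuous field, never the indicator directly
obs = (5, 5)
phi[obs] = abs(phi[obs]) + 1e-6
facies = np.where(phi > 0, 1, 0)
```

Working in the continuous variable keeps the parameter space amenable to ensemble updates while honoring discrete observations.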
Zhang, Jing; Liang, Wensheng; Yu, Wei; Yu, Shuwen; Wu, Yiliang; Guo, Xin; Liu, Shengzhong Frank; Li, Can
2018-05-28
The solvent-engineering method is widely used to fabricate top-performing perovskite solar cells, which, however, usually exhibit inferior reproducibility. Herein, a two-stage annealing (TSA) strategy is demonstrated for processing perovskite films: the intermediate phase is annealed at 60 °C in the first stage and then at 100 °C in the second. Compared to conventional direct annealing (DHA) at 100 °C, this strategy makes MAPbI3 film formation more controllable, leading to superior film uniformity and device reproducibility, with the champion device efficiency reaching 19.8%. More specifically, the coefficient of variation of efficiency for 49 cells is reduced to 5.9%, compared to 9.8% for DHA. The TSA process is carefully studied using Fourier transform infrared spectroscopy, X-ray diffraction, and UV-vis absorption spectroscopy. It is found that, in comparison with DHA, the formation of hydrogen bonding and the crystallization of the perovskite are much slower and can be better controlled when using TSA. The improvements in film uniformity and device reproducibility are attributed to: 1) controllable MAPbI3 crystal growth stemming from the progressive formation of hydrogen bonds between methylammonium and halide; 2) suppression of intermediate-phase film dewetting, which is believed to be due to its decreased mobility at the initial low-temperature annealing stage. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Joyce, Priya; Kuwahata, Melissa; Turner, Nicole; Lakshmanan, Prakash
2010-02-01
A reproducible method for transformation of sugarcane using various Agrobacterium tumefaciens strains (AGL0, AGL1, EHA105 and LBA4404) has been developed. The selection system and co-cultivation medium were the most important factors determining the success of transformation and transgenic plant regeneration. Plant regeneration at a frequency of 0.8-4.8% occurred only when callus was transformed with A. tumefaciens carrying a newly constructed superbinary plasmid containing the neomycin phosphotransferase (nptII) and beta-glucuronidase (gusA) genes, both driven by the maize ubiquitin (ubi-1) promoter. Regeneration was successful in plants carrying the nptII gene but not the hygromycin phosphotransferase (hph) gene. nptII selection was imposed at 150 mg/l paromomycin sulphate, applied either immediately or 4 days after the co-cultivation period. Co-cultivation on Murashige and Skoog (MS)-based medium for 4 days produced the highest number of transgenic plants. Over 200 independent transgenic lines were created using this protocol. Regenerated plants appeared phenotypically normal and contained both the gusA and nptII genes. Southern blot analysis revealed 1-3 randomly integrated transgene insertion events in the majority of the plants produced.
NASA Astrophysics Data System (ADS)
Haq, Quazi M. I.; Mabood, Fazal; Naureen, Zakira; Al-Harrasi, Ahmed; Gilani, Sayed A.; Hussain, Javid; Jabeen, Farah; Khan, Ajmal; Al-Sabari, Ruqaya S. M.; Al-khanbashi, Fatema H. S.; Al-Fahdi, Amira A. M.; Al-Zaabi, Ahoud K. A.; Al-Shuraiqi, Fatma A. M.; Al-Bahaisi, Iman M.
2018-06-01
Nucleic acid- and serology-based methods have revolutionized plant disease detection; however, they are not very reliable at the asymptomatic stage, especially for pathogens causing systemic infection, and they need at least 1-2 days for sample harvesting, processing, and analysis. In this study, two reflectance spectroscopies, Near-Infrared reflectance spectroscopy (NIR) and Fourier-Transform Infrared spectroscopy with Attenuated Total Reflection (FT-IR, ATR), coupled with multivariate exploratory methods such as Principal Component Analysis (PCA) and Partial Least Squares Discriminant Analysis (PLS-DA), were deployed to detect begomovirus infection in papaya leaves. The application of these techniques demonstrates that they are very useful for robust in vivo detection of plant begomovirus infection. These methods are simple, sensitive, reproducible and precise, and do not require any lengthy sample preparation procedures.
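The PCA step of such a workflow can be sketched on synthetic data: two classes of "spectra" that differ by a small extra band separate cleanly in the space of the first principal components. The band positions, widths and noise level below are invented for illustration and do not reflect the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
wavenumbers = np.linspace(900, 1800, 300)

def band(center, width, height):
    """Gaussian absorption band (illustrative)."""
    return height * np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

healthy = band(1650, 40, 1.0) + band(1050, 30, 0.6)
infected = band(1650, 40, 0.7) + band(1050, 30, 0.6) + band(1550, 25, 0.5)

# 20 noisy replicates per class
X = np.vstack([healthy + 0.05 * rng.normal(size=300) for _ in range(20)]
              + [infected + 0.05 * rng.normal(size=300) for _ in range(20)])

# PCA via SVD of the mean-centered data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                   # project onto first two PCs

pc1_healthy = scores[:20, 0].mean()
pc1_infected = scores[20:, 0].mean()
```

Because the between-class spectral difference dominates the noise, the two groups separate along the first principal component; a PLS-DA model would then be trained on such scores or on the raw spectra.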
Kaehler, G; Wagner, A J
2013-06-01
Current implementations of fluctuating ideal-gas descriptions with lattice Boltzmann methods are based on a fluctuation-dissipation theorem which, while greatly simplifying the implementation, strictly holds only for zero mean velocity and small fluctuations. We show how to derive the fluctuation-dissipation theorem for all k, which previous derivations did only for k=0. The consistent derivation requires, in principle, locally velocity-dependent multirelaxation time transforms. Such an implementation is computationally prohibitively expensive but, with a small computational trick, it is feasible to reproduce the correct FDT without overhead in computation time. It is then shown that the previous standard implementations perform poorly for non-vanishing mean velocity, as indicated by violations of Galilean invariance in measured structure factors. Results obtained with the method introduced here show a significant reduction of the Galilean invariance violations.
Effect of a core-softened O-O interatomic interaction on the shock compression of fused silica
Izvekov, Sergei; Weingarten, N. Scott; Byrd, Edward F. C.
2018-03-01
Isotropic soft-core potentials have attracted considerable attention due to their ability to reproduce thermodynamic, dynamic, and structural anomalies observed in tetrahedral network-forming compounds such as water and silica. The aim of the present work is to assess the relevance of effective core-softening pertinent to the oxygen-oxygen interaction in silica to the thermodynamics and phase change mechanisms that occur in shock compressed fused silica. We utilize the MD simulation method with a recently published numerical interatomic potential derived from an ab initio MD simulation of liquid silica via force-matching. The resulting potential indicates an effective shoulder-like core-softening of the oxygen-oxygen repulsion. To better understand the role of the core-softening we analyze two derivative force-matching potentials in which the soft-core is replaced with a repulsive core either in the three-body potential term or in all the potential terms. Our analysis is further augmented by a comparison with several popular empirical models for silica that lack an explicit core-softening. The first outstanding feature of shock compressed glass reproduced with the soft-core models but not with the other models is that the shock compression values at pressures above 20 GPa are larger than those observed under hydrostatic compression (an anomalous shock Hugoniot densification). Our calculations indicate the occurrence of a phase transformation along the shock Hugoniot that we link to the O-O repulsion core-softening. The phase transformation is associated with a Hugoniot temperature reversal similar to that observed experimentally. With the soft-core models, the phase change is an isostructural transformation between amorphous polymorphs with no associated melting event. We further examine the nature of the structural transformation by comparing it to the Hugoniot calculations for stishovite. 
For stishovite, the Hugoniot exhibits temperature reversal and associated phase transformation, which is a transition to a disordered phase (liquid or dense amorphous), regardless of whether or not the model accounts for core-softening. The onset pressures of the transformation predicted by different models show a wide scatter within 60-110 GPa; for potentials without core-softening, the onset pressure is much higher than 110 GPa. Our results show that the core-softening of the interaction in the oxygen subsystem of silica is the key mechanism for the structural transformation and thermodynamics in shock compressed silica. These results may provide an important contribution to a unified picture of anomalous response to shock compression observed in other network-forming oxides and single-component systems with core-softening of effective interactions.
Overview of Compact Toroidal Hybrid research program progress and plans
Maurer, David; Ennis, David; Hanson, James; Hartwell, Gregory; Herfindal, Jeffrey; Knowlton, Stephen; Ma, Xingxing; Pandya, Mihir; Roberds, Nicholas; Ross, Kevin; Traverso, Peter
2016-10-01
disruptive behavior on the level of applied 3D magnetic shaping; (2) test and advance the V3FIT reconstruction code and NIMROD modeling of CTH; and (3) study the implementation of an island divertor. Progress towards these goals and other developments are summarized. The disruptive density limit exceeds the Greenwald limit as the vacuum transform is increased, but a threshold for avoidance is not observed. Low-q disruptions, with 1.1 < q(a) < 2.0, cease to occur if the vacuum transform is raised above 0.07. Application of vacuum transform can reduce and eliminate the vertical drift of elongated discharges that would otherwise be vertically unstable. Reconstructions using external magnetics give accurate estimates for quantities near the plasma boundary, and internal diagnostics have been implemented to extend the range of accuracy into the plasma core. Sawtooth behavior has been reproducibly modified with external transform, and NIMROD is used to model these observations and reproduces experimental trends. An island divertor design has begun with connection-length studies to model energy deposition on divertor plates located in an edge 1/3 island, as well as the study of a non-resonant divertor configuration. This work is supported by U.S. Department of Energy Grant No. DE-FG02-00ER54610.
Gundreddy, Rohith Reddy; Tan, Maxine; Qui, Yuchen; Zheng, Bin
2015-03-01
The purpose of this study is to develop and test a new content-based image retrieval (CBIR) scheme that achieves higher reproducibility when implemented in an interactive computer-aided diagnosis (CAD) system, without significantly reducing lesion classification performance. This new Fourier-transform-based CBIR algorithm determines the similarity of two regions of interest (ROIs) from the difference of the average regional image pixel value distributions in the two Fourier-transform-mapped images under comparison. A reference image database of 227 ROIs depicting verified soft-tissue breast lesions was used. For each testing ROI, the queried lesion center was systematically shifted from 10 to 50 pixels to simulate inter-user variation in querying a suspicious lesion center when using an interactive CAD system. Lesion classification performance and reproducibility as the queried lesion center shifts were assessed and compared among three CBIR schemes based on the Fourier transform, mutual information and Pearson correlation. Each CBIR scheme retrieved the 10 most similar reference ROIs and computed a likelihood score of the queried ROI depicting a malignant lesion. The experimental results showed that the three CBIR schemes yielded very comparable lesion classification performance as measured by the areas under ROC curves, with p-values greater than 0.498. However, the CBIR scheme using the Fourier transform yielded the highest invariance to both queried lesion center shift and lesion size change. This study demonstrates the feasibility of improving the robustness of interactive CAD systems by adding a new Fourier-transform-based image feature to CBIR schemes.
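Why a Fourier-domain comparison is more tolerant of lesion-center shift can be shown in a few lines: the magnitude spectrum of an image is unchanged by a circular shift, whereas a pixel-wise comparison degrades immediately. This toy (random "ROI", made-up shift) illustrates the general principle, not the paper's exact similarity metric.

```python
import numpy as np

rng = np.random.default_rng(5)
roi = rng.random((64, 64))
shifted = np.roll(roi, (5, 5), axis=(0, 1))   # simulated lesion-center shift

def spectrum(img):
    """Normalized magnitude spectrum (shift-insensitive representation)."""
    m = np.abs(np.fft.fft2(img))
    return m / m.sum()

raw_diff = np.abs(roi - shifted).mean()                    # pixel-wise comparison
spec_diff = np.abs(spectrum(roi) - spectrum(shifted)).mean()  # spectral comparison
```

The spectral difference is zero up to floating-point error for a circular shift, while the raw pixel difference is large, mirroring the reported invariance of the Fourier-based scheme to queried-center shift.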
Lindsay, A E; Spoonmore, R T; Tzou, J C
2016-10-01
A hybrid asymptotic-numerical method is presented for obtaining an asymptotic estimate for the full probability distribution of capture times of a random walker by multiple small traps located inside a bounded two-dimensional domain with a reflecting boundary. As motivation for this study, we calculate the variance in the capture time of a random walker by a single interior trap and determine this quantity to be comparable in magnitude to the mean. This implies that the mean is not necessarily reflective of typical capture times and that the full density must be determined. To solve the underlying diffusion equation, the method of Laplace transforms is used to obtain an elliptic problem of modified Helmholtz type. In the limit of vanishing trap sizes, each trap is represented as a Dirac point source that permits the solution of the transform equation to be represented as a superposition of Helmholtz Green's functions. Using this solution, we construct asymptotic short-time solutions of the first-passage-time density, which captures peaks associated with rapid capture by the absorbing traps. When numerical evaluation of the Helmholtz Green's function is employed followed by numerical inversion of the Laplace transform, the method reproduces the density for larger times. We demonstrate the accuracy of our solution technique with a comparison to statistics obtained from a time-dependent solution of the diffusion equation and discrete particle simulations. In particular, we demonstrate that the method is capable of capturing the multimodal behavior in the capture time density that arises when the traps are strategically arranged. The hybrid method presented can be applied to scenarios involving both arbitrary domains and trap shapes.
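The last step described above, numerical inversion of a Laplace transform, can be sketched with the Gaver-Stehfest algorithm; the paper's exact inversion scheme is not specified in this summary, so this is a generic stand-in tested on a transform with a known inverse, F(s) = 1/(s + 1) with inverse exp(-t).

```python
import numpy as np
from math import factorial

def stehfest_invert(F, t, N=12):
    """Gaver-Stehfest numerical inversion of a Laplace transform F at time t.

    N must be even; accuracy is good for smooth f(t).
    """
    h = N // 2
    total = 0.0
    for k in range(1, N + 1):
        Vk = 0.0
        for j in range((k + 1) // 2, min(k, h) + 1):
            Vk += (j ** h * factorial(2 * j)
                   / (factorial(h - j) * factorial(j) * factorial(j - 1)
                      * factorial(k - j) * factorial(2 * j - k)))
        Vk *= (-1) ** (h + k)
        total += Vk * F(k * np.log(2) / t)
    return total * np.log(2) / t

approx = stehfest_invert(lambda s: 1.0 / (s + 1.0), 1.0)
exact = np.exp(-1.0)
```

For a first-passage-time density one would apply such an inversion to the transform-domain solution built from the Helmholtz Green's functions.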
Nonlinear Prediction Model for Hydrologic Time Series Based on Wavelet Decomposition
Kwon, H.; Khalil, A.; Brown, C.; Lall, U.; Ahn, H.; Moon, Y.
2005-12-01
Traditionally, forecasting and characterization of hydrologic systems are performed using many techniques. Stochastic linear methods such as AR and ARIMA, and nonlinear ones such as tools based on statistical learning theory, have been used extensively. The difficulty common to all methods is determining sufficient and necessary information and predictors for a successful prediction. Relationships between hydrologic variables are often highly nonlinear and interrelated across temporal scales. A new hybrid approach is proposed for the simulation of hydrologic time series, combining the wavelet transform with a nonlinear model and thereby drawing on the merits of both. The wavelet transform is adopted to decompose a nonlinear hydrologic process into a set of mono-component signals, which are then simulated by the nonlinear model. The hybrid methodology is formulated to improve the accuracy of long-term forecasting. The proposed hybrid model yields much better results in terms of capturing and reproducing the time-frequency properties of the system at hand, and prediction results are promising when compared to traditional univariate time series models. An application demonstrating the plausibility of the proposed methodology is provided, and the results show that a wavelet-based time series model can simulate and forecast hydrologic variables reasonably well. This will ultimately serve the purpose of integrated water resources planning and management.
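The decompose-model-recombine idea can be sketched with a one-level Haar transform and a simple AR(1) model per component; the wavelet family, model order and toy "streamflow" series below are illustrative stand-ins for the paper's choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 256
t = np.arange(n)
y = np.sin(2 * np.pi * t / 32) + 0.2 * rng.normal(size=n)  # toy "streamflow"

# one-level Haar analysis on pairs of samples
a = (y[0::2] + y[1::2]) / np.sqrt(2)   # approximation (smooth) component
d = (y[0::2] - y[1::2]) / np.sqrt(2)   # detail component

# exact Haar synthesis (sanity check: perfect reconstruction)
recon = np.empty_like(y)
recon[0::2] = (a + d) / np.sqrt(2)
recon[1::2] = (a - d) / np.sqrt(2)

def ar1_forecast(x):
    """Least-squares AR(1) one-step forecast of the next value of x."""
    phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    return phi * x[-1]

# forecast each component, then synthesize the first sample of the next pair
a_next, d_next = ar1_forecast(a), ar1_forecast(d)
y_next = (a_next + d_next) / np.sqrt(2)
```

A practical model would use a deeper decomposition and a genuinely nonlinear per-component model, but the plumbing (analyze, forecast each band, synthesize) is the same.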
Grate, Jay W; Gonzalez, Jhanis J; O'Hara, Matthew J; Kellogg, Cynthia M; Morrison, Samuel S; Koppenaal, David W; Chan, George C-Y; Mao, Xianglei; Zorba, Vassilia; Russo, Richard E
2017-09-08
Solid sampling and analysis methods, such as laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), are challenged by matrix effects and calibration difficulties. Matrix-matched standards for external calibration are seldom available and it is difficult to distribute spikes evenly into a solid matrix as internal standards. While isotopic ratios of the same element can be measured to high precision, matrix-dependent effects in the sampling and analysis process frustrate accurate quantification and elemental ratio determinations. Here we introduce a potentially general solid matrix transformation approach entailing chemical reactions in molten ammonium bifluoride (ABF) salt that enables the introduction of spikes as tracers or internal standards. Proof of principle experiments show that the decomposition of uranium ore in sealed PFA fluoropolymer vials at 230 °C yields, after cooling, new solids suitable for direct solid sampling by LA. When spikes are included in the molten salt reaction, subsequent LA-ICP-MS sampling at several spots indicate that the spikes are evenly distributed, and that U-235 tracer dramatically improves reproducibility in U-238 analysis. Precisions improved from 17% relative standard deviation for U-238 signals to 0.1% for the ratio of sample U-238 to spiked U-235, a factor of over two orders of magnitude. These results introduce the concept of solid matrix transformation (SMT) using ABF, and provide proof of principle for a new method of incorporating internal standards into a solid for LA-ICP-MS. This new approach, SMT-LA-ICP-MS, provides opportunities to improve calibration and quantification in solids based analysis. Looking forward, tracer addition to transformed solids opens up LA-based methods to analytical methodologies such as standard addition, isotope dilution, preparation of matrix-matched solid standards, external calibration, and monitoring instrument drift against external calibration standards.
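The statistical effect behind the two-orders-of-magnitude improvement can be simulated: per-shot ablated mass varies strongly, but the analyte-to-spike ratio cancels that common factor. All numbers below (shot count, variabilities, signal levels) are invented to mimic the reported magnitudes, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_shots = 200
# ~17% relative variation in ablated mass from shot to shot
mass = rng.normal(1.0, 0.17, n_shots).clip(0.3)
# analyte and evenly distributed internal-standard signals share the mass factor
u238 = 1000.0 * mass * (1 + 0.002 * rng.normal(size=n_shots))
u235 = 250.0 * mass * (1 + 0.002 * rng.normal(size=n_shots))

rsd_raw = u238.std() / u238.mean()        # raw analyte signal: dominated by mass
ratio = u238 / u235
rsd_ratio = ratio.std() / ratio.mean()    # ratio: mass factor cancels
```

The raw-signal RSD tracks the ablation variability, while the ratio's RSD collapses to the residual measurement noise, which is the rationale for distributing the spike evenly through the transformed solid.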
Derivative component analysis for mass spectral serum proteomic profiles.
Han, Henry
2014-01-01
As a promising way to transform medicine, mass spectrometry based proteomics technologies have made great progress in identifying disease biomarkers for clinical diagnosis and prognosis. However, there is a lack of effective feature selection methods able to capture essential data behaviors and achieve clinical-level disease diagnosis. Moreover, the field faces a data reproducibility challenge: no two independent studies have been found to produce the same proteomic patterns. This reproducibility issue causes identified biomarker patterns to lose repeatability and prevents their real clinical usage. In this work, we propose a novel machine-learning algorithm, derivative component analysis (DCA), for high-dimensional mass spectral proteomic profiles. As an implicit feature selection algorithm, DCA examines input proteomics data in a multi-resolution fashion, seeking its derivatives to capture latent data characteristics and conduct de-noising. We further demonstrate DCA's advantages in disease diagnosis by viewing input proteomics data as a profile biomarker, integrating it with support vector machines to tackle the reproducibility issue, and comparing it with state-of-the-art peers. Our results show that high-dimensional proteomics data are actually linearly separable under DCA. As a novel multi-resolution feature selection algorithm, DCA not only overcomes the weakness of traditional methods in subtle data behavior discovery, but also suggests an effective resolution to proteomics data's reproducibility problem and provides new techniques and insights for translational bioinformatics and machine learning. The DCA-based profile biomarker diagnosis makes clinical-level diagnostic performance reproducible across different proteomic data, which is more robust and systematic than existing biomarker-discovery-based diagnosis.
Our findings demonstrate the feasibility and power of the proposed DCA-based profile biomarker diagnosis in achieving high sensitivity and conquering the data reproducibility issue in serum proteomics. Furthermore, the analysis suggests that gleaning subtle data characteristics and de-noising are essential for separating true signals from red herrings in high-dimensional proteomic profiles, and can be more important than conventional feature selection or dimension reduction. In particular, the profile biomarker diagnosis can be generalized to other omics data owing to DCA's generic nature.
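The general intuition, that derivatives of a profile expose subtle shape differences that raw intensities miss, can be illustrated with a toy example. This is not the paper's DCA algorithm; the peak shapes, shift and noise are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
mz = np.linspace(0, 1, 200)
base = np.exp(-0.5 * ((mz - 0.5) / 0.1) ** 2)     # single toy "peak"

# two classes that differ only by a slight peak shift, plus noise
class_a = np.array([base + 0.05 * rng.normal(size=200) for _ in range(15)])
class_b = np.array([np.roll(base, 3) + 0.05 * rng.normal(size=200) for _ in range(15)])

def with_derivatives(X):
    """Augment each profile with its first and second numerical derivatives."""
    d1 = np.gradient(X, axis=1)
    d2 = np.gradient(d1, axis=1)
    return np.hstack([X, d1, d2])

Xa, Xb = with_derivatives(class_a), with_derivatives(class_b)

# class-centroid separation with and without derivative features (toy check)
sep_raw = np.linalg.norm(class_a.mean(0) - class_b.mean(0))
sep_aug = np.linalg.norm(Xa.mean(0) - Xb.mean(0))
```

The augmented representation can only add discriminative distance between the class centroids; a classifier such as an SVM would then be trained on the augmented features.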
Caliari, Marco; Zuccher, Simone
2017-04-01
Although Fourier series approximation is ubiquitous in computational physics owing to the Fast Fourier Transform (FFT) algorithm, efficient techniques for the fast evaluation of a three-dimensional truncated Fourier series at a set of arbitrary points are quite rare, especially in the MATLAB language. Here we employ the Nonequispaced Fast Fourier Transform (NFFT, by J. Keiner, S. Kunis, and D. Potts), a C library designed for this purpose, and provide a Matlab® and GNU Octave interface that makes NFFT easily available to the numerical analysis community. We test the effectiveness of our package in the framework of quantum vortex reconnections, where pseudospectral Fourier methods are commonly used and local high resolution is required in the post-processing stage. We show that the efficient evaluation of a truncated Fourier series at arbitrary points provides excellent results at a computational cost much smaller than carrying out a numerical simulation of the problem on a regular grid fine enough to reproduce comparable details of the reconnecting vortices.
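The underlying task can be shown in one dimension: given FFT coefficients of a smooth periodic function, evaluate the truncated Fourier series at arbitrary, non-grid points by direct summation. The NFFT accelerates exactly this O(N·M) reference computation; the test function below is chosen for illustration.

```python
import numpy as np

N = 32
grid = 2 * np.pi * np.arange(N) / N
f = np.exp(np.cos(grid))                 # smooth periodic function sampled on the grid
c = np.fft.fft(f) / N                    # Fourier coefficients (numpy ordering)
k = np.fft.fftfreq(N, d=1.0 / N)         # corresponding signed integer frequencies

pts = np.array([0.123, 1.0, 2.5, 5.9])   # arbitrary (non-grid) evaluation points
# direct summation of the truncated series at each point
series = np.real(np.exp(1j * np.outer(pts, k)) @ c)

exact = np.exp(np.cos(pts))
max_err = np.max(np.abs(series - exact))
```

Because exp(cos x) is analytic, its Fourier coefficients decay faster than any power and 32 modes already reproduce the function to near machine precision off the grid, which is why spectral post-processing at arbitrary points is so attractive.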
Khanmohammadi, Mohammadreza; Bagheri Garmarudi, Amir; Samani, Simin; Ghasemi, Keyvan; Ashuri, Ahmad
2011-06-01
Attenuated Total Reflectance Fourier Transform Infrared (ATR-FTIR) microspectroscopy was applied for the detection of colon cancer based on the spectral features of colon tissues. Supervised classification models can be trained to identify the tissue type from the spectroscopic fingerprint. A total of 78 colon tissues were used in the spectroscopy studies. Major spectral differences were observed in the 1,740-900 cm(-1) region. Several chemometric methods, such as analysis of variance (ANOVA), cluster analysis (CA) and linear discriminant analysis (LDA), were applied for classification of the IR spectra. Utilizing these chemometric techniques, clear and reproducible differences were observed between the spectra of normal and cancer cases, suggesting that infrared microspectroscopy in conjunction with spectral data processing would be useful for diagnostic classification. Using the LDA technique, the spectra were classified into cancer and normal tissue classes with an accuracy of 95.8%. The sensitivity and specificity were 100% and 93.1%, respectively.
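How the three reported figures relate can be made concrete with a binary confusion matrix. The counts below are hypothetical, chosen only so that the derived sensitivity and specificity come out near the reported values; they are not the study's data.

```python
# hypothetical confusion matrix for a cancer-vs-normal classifier
tp, fn = 30, 0      # cancer cases: all detected
tn, fp = 27, 2      # normal cases: two misclassified as cancer

sensitivity = tp / (tp + fn)                 # fraction of cancers detected
specificity = tn / (tn + fp)                 # fraction of normals correctly cleared
accuracy = (tp + tn) / (tp + fn + tn + fp)   # overall fraction correct
```

Sensitivity of 100% means no cancer case was missed, while specificity below 100% reflects the false positives among normal tissues.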
Azarnavid, Babak; Parand, Kourosh; Abbasbandy, Saeid
2018-06-01
This article discusses an iterative reproducing kernel method with respect to its effectiveness and capability for solving a fourth-order boundary value problem with nonlinear boundary conditions modeling beams on elastic foundations. Since there is no method for obtaining a reproducing kernel that satisfies nonlinear boundary conditions, standard reproducing kernel methods cannot be used directly to solve boundary value problems with nonlinear boundary conditions, as there is no knowledge about the existence and uniqueness of the solution. The aim of this paper is, therefore, to construct an iterative method combining the reproducing kernel Hilbert space method with a shooting-like technique to solve such problems. Error estimation for reproducing kernel Hilbert space methods for nonlinear boundary value problems has yet to be discussed in the literature. In this paper, we present error estimation for the reproducing kernel method for nonlinear boundary value problems, probably for the first time. Some numerical results are given to demonstrate the applicability of the method.
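The shooting-like idea, treating the nonlinear boundary condition as a root-finding problem in an unknown initial slope, can be sketched on a toy second-order problem (not the paper's fourth-order beam model): y'' = y, y(0) = 0, with the invented nonlinear right boundary condition y(1) + y'(1)^3 = 2.

```python
def integrate(slope, n=2000):
    """Explicit Euler integration of y'' = y from x=0 to x=1.

    Returns (y(1), y'(1)); accuracy is enough for this sketch.
    """
    h = 1.0 / n
    y, yp = 0.0, slope
    for _ in range(n):
        y, yp = y + h * yp, yp + h * y
    return y, yp

def residual(slope):
    """Mismatch in the nonlinear boundary condition y(1) + y'(1)**3 = 2."""
    y1, yp1 = integrate(slope)
    return y1 + yp1 ** 3 - 2.0

# bisection on the unknown initial slope y'(0)
lo, hi = 0.0, 2.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if residual(lo) * residual(mid) <= 0:
        hi = mid
    else:
        lo = mid
slope = 0.5 * (lo + hi)
```

In the paper's setting, the inner solve would be a reproducing kernel solution with linearized boundary data rather than a time-stepping integrator, but the outer iteration on the unknown boundary datum plays the same role.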
The Cancer Target Discovery and Development (CTD^2) Network was established to accelerate the transformation of "Big Data" into novel pharmacological targets, lead compounds, and biomarkers for rapid translation into improved patient outcomes. It rapidly became clear in this collaborative network that a key central issue was to define what constitutes sufficient computational or experimental evidence to support a biologically or clinically relevant finding.
Held, Christian; Wenzel, Jens; Webel, Rike; Marschall, Manfred; Lang, Roland; Palmisano, Ralf; Wittenberg, Thomas
2011-01-01
In order to improve the reproducibility and objectivity of fluorescence microscopy based experiments and to enable the evaluation of large datasets, flexible segmentation methods are required that can adapt to different stainings and cell types. This adaptation is usually achieved by manual adjustment of the segmentation method's parameters, which is time consuming and challenging for biologists with no knowledge of image processing. To avoid this, the parameters of the presented methods automatically adapt to user-generated ground truth to determine the best method and the optimal parameter setup. These settings can then be used for segmentation of the remaining images. As robust segmentation methods form the core of such a system, the currently used watershed-transform-based segmentation routine is replaced by a fast-marching level-set based segmentation routine which incorporates knowledge of the cell nuclei. Our evaluations reveal that incorporation of multimodal information improves segmentation quality for the presented fluorescence datasets.
Deus Ex Machina: Tradition, Technology, and the Chicanafuturist Art of Marion C. Martinez
ERIC Educational Resources Information Center
Ramirez, Catherine S.
2004-01-01
The visual art of Marion C. Martinez is examined. Through technology, Martinez reproduces and transforms traditional Indo-Hispanic art forms and, at the same time, underscores New Mexico's history as a dumping ground for technological waste.
Alegro, Maryana; Theofilas, Panagiotis; Nguy, Austin; Castruita, Patricia A; Seeley, William; Heinsen, Helmut; Ushizima, Daniela M; Grinberg, Lea T
2017-04-15
Immunofluorescence (IF) plays a major role in quantifying protein expression in situ and understanding cell function. It is widely applied in assessing disease mechanisms and in drug discovery research. Automation of IF analysis can transform studies using experimental cell models. However, IF analysis of postmortem human tissue relies mostly on manual interaction, which is often low-throughput and error-prone, leading to low inter- and intra-observer reproducibility. Human postmortem brain samples challenge neuroscientists because of the high level of autofluorescence caused by accumulation of lipofuscin pigment during aging, hindering systematic analyses. We propose a method for automating cell counting and classification in IF microscopy of human postmortem brains. Our algorithm speeds up the quantification task while improving reproducibility. Dictionary learning and sparse coding allow for constructing improved cell representations from IF images. These models serve as input to detection and segmentation methods. Classification occurs by means of color distances between cells and a learned set. Our method successfully detected and classified cells in 49 human brain images. We evaluated our results in terms of true positives, false positives, false negatives, precision, recall, false positive rate and F1 score. We also measured user experience and time saved compared to manual counting. We compared our results to four open-access IF-based cell-counting tools available in the literature. Our method showed improved accuracy for all data samples. The proposed method satisfactorily detects and classifies cells from human postmortem brain IF images, with potential to be generalized to other counting tasks. Copyright © 2017 Elsevier B.V. All rights reserved.
Kumari, Priti; Kumari, Niraj; Jha, Anal K.; Singh, K. P.; Prasad, K.
2018-05-01
Green synthesis, characterization and applications of nanoparticles have become an important branch of nanotechnology nowadays. In this paper, the green synthesis of silver nanoparticles (AgNPs) using an aqueous extract of Nyctanthes arbor-tristis as a reducing and stabilizing agent is discussed. The present synthetic method is handy, cost-effective and reproducible. Formation of AgNPs was characterized by X-ray diffraction, dynamic light scattering, scanning electron microscopy and UV-visible spectroscopy. The phytochemicals responsible for the nano-transformation were principally the flavonoids, phenols and glycosides present in the leaves. Further, a dose-dependent cytotoxicity assay of the biosynthesized AgNPs against THP-1 human leukemia cell lines showed encouraging results.
Parke, L.; Hooper, I. R.; Hicken, R. J.; Dancer, C. E. J.; Grant, P. S.; Youngs, I. J.; Sambles, J. R.; Hibbins, A. P.
2013-10-01
A cold-pressing technique has been developed for fabricating composites composed of a polytetrafluoroethylene polymer matrix and a wide range of volume fractions of MnZn-ferrite filler (0%-80%). The electromagnetic properties at centimetre wavelengths of all prepared composites exhibited good reproducibility, with the most heavily loaded composites possessing simultaneously high permittivity (180 ± 10) and permeability (23 ± 2). The natural logarithm of both the relative complex permittivity and the permeability shows an approximately linear dependence on the volume fraction of ferrite. Thus, this simple method allows for the manufacture of the bespoke materials required in the design and construction of devices based on the principles of transformation optics.
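The reported log-linear dependence on volume fraction is the form of the Lichtenecker logarithmic mixing rule, ln(eps_eff) = f·ln(eps_filler) + (1 − f)·ln(eps_matrix). The sketch below generates idealized data from that rule and recovers the parameters by a linear fit in log space; eps_matrix = 2.1 (a typical PTFE value) is assumed, and the filler permittivity is back-solved so that f = 0.8 gives eps_eff = 180. These numbers are illustrative, not the paper's measurements.

```python
import numpy as np

eps_matrix = 2.1                                   # assumed PTFE permittivity
f = np.array([0.0, 0.2, 0.4, 0.6, 0.8])            # filler volume fractions
# choose eps_filler so the f = 0.8 composite has eps_eff = 180 under the rule
eps_filler = np.exp((np.log(180.0) - 0.2 * np.log(eps_matrix)) / 0.8)
eps_eff = np.exp(f * np.log(eps_filler) + (1 - f) * np.log(eps_matrix))

# recover slope and intercept by fitting a line to ln(eps_eff) vs f
slope, intercept = np.polyfit(f, np.log(eps_eff), 1)
```

The fitted intercept exponentiates back to the matrix permittivity, which is how such a log-linear plot can be used to extrapolate or design a target permittivity at a given loading.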
Alegro, Maryana; Theofilas, Panagiotis; Nguy, Austin; Castruita, Patricia A.; Seeley, William; Heinsen, Helmut; Ushizima, Daniela M.
2017-01-01
Background: Immunofluorescence (IF) plays a major role in quantifying protein expression in situ and understanding cell function. It is widely applied in assessing disease mechanisms and in drug discovery research. Automation of IF analysis can transform studies using experimental cell models. However, IF analysis of postmortem human tissue relies mostly on manual interaction, which is often low-throughput and prone to error, leading to low inter- and intra-observer reproducibility. Human postmortem brain samples challenge neuroscientists because of the high level of autofluorescence caused by accumulation of lipofuscin pigment during aging, hindering systematic analyses. We propose a method for automating cell counting and classification in IF microscopy of human postmortem brains. Our algorithm speeds up the quantification task while improving reproducibility. New method: Dictionary learning and sparse coding allow for constructing improved cell representations using IF images. These models are input for detection and segmentation methods. Classification occurs by means of color distances between cells and a learned set. Results: Our method successfully detected and classified cells in 49 human brain images. We evaluated our results regarding true positive, false positive, false negative, precision, recall, false positive rate and F1 score metrics. We also measured user experience and time saved compared to manual counting. Comparison with existing methods: We compared our results to four open-access IF-based cell-counting tools available in the literature. Our method showed improved accuracy for all data samples. Conclusion: The proposed method satisfactorily detects and classifies cells from human postmortem brain IF images, with potential to be generalized for applications in other counting tasks. PMID:28267565
The reliability of clinical decisions based on the cervical vertebrae maturation staging method.
Sohrabi, Aydin; Babay Ahari, Sahar; Moslemzadeh, Hossein; Rafighi, Ali; Aghazadeh, Zahra
2016-02-01
Of the various techniques used to determine the optimum timing for growth modification treatments, the cervical vertebrae maturation method has great advantages, including validity and no need for extra X-ray exposure. Recently, the reproducibility of this method has been questioned. The aim of this study was to investigate the cause of the poor reproducibility of this method and to assess the reproducibility of the clinical decisions made based on it. Seventy lateral cephalograms of Iranian patients aged 9‒15 years were observed twice by five experienced orthodontists. In addition to determining the developmental stage, each parameter involved in this method was assessed in terms of inter- and intra-observer reproducibility. In order to evaluate the reproducibility of clinical decisions based on this method, cervical vertebrae maturation staging (CVMS) I and II were considered as phase 1 and CVMS IV and V were considered as phase 3. By considering the clinical approach of the CVMS method, inter-observer reproducibility increased from 0.48 to 0.61 (moderate to substantial) and intra-observer reproducibility improved from 0.72 to 0.74. 1. Complete visualization of the first four cervical vertebrae was an inclusion criterion, which also limits the clinical application of the CVMS method. 2. These results can be generalized when determining growth modification treatments solely for Class II patients. Difficulty in determining the morphology of C3 and C4 leads to the poor reproducibility of the CVMS method. Despite this, it has acceptable reproducibility in determining the timing of functional treatment for Class II patients. © The Author 2015. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com.
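The inter- and intra-observer reproducibility values quoted above (e.g. 0.48 to 0.61, "moderate to substantial") are on the scale of Cohen's kappa, which corrects observed rater agreement for the agreement expected by chance. A minimal sketch of that statistic (not the authors' code; the function name is mine):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical stages:
    kappa = (p_observed - p_chance) / (1 - p_chance)."""
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement from each rater's marginal stage frequencies.
    p_chance = sum(counts_a[k] * counts_b[k] for k in counts_a) / n ** 2
    return (p_observed - p_chance) / (1 - p_chance)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which is why the abstract's 0.48→0.61 change is described as moving from "moderate" to "substantial" on the usual interpretation scale.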
Open data and digital morphology
Davies, Thomas G.; Cunningham, John A.; Asher, Robert J.; Bates, Karl T.; Bengtson, Stefan; Benson, Roger B. J.; Boyer, Doug M.; Braga, José; Dong, Xi-Ping; Evans, Alistair R.; Friedman, Matt; Garwood, Russell J.; Goswami, Anjali; Hutchinson, John R.; Jeffery, Nathan S.; Lebrun, Renaud; Martínez-Pérez, Carlos; O'Higgins, Paul M.; Orliac, Maëva; Rowe, Timothy B.; Sánchez-Villagra, Marcelo R.; Shubin, Neil H.; Starck, J. Matthias; Stringer, Chris; Summers, Adam P.; Sutton, Mark D.; Walsh, Stig A.; Weisbecker, Vera; Witmer, Lawrence M.; Wroe, Stephen; Yin, Zongjun
2017-01-01
Over the past two decades, the development of methods for visualizing and analysing specimens digitally, in three and even four dimensions, has transformed the study of living and fossil organisms. However, the initial promise that the widespread application of such methods would facilitate access to the underlying digital data has not been fully achieved. The underlying datasets for many published studies are not readily or freely available, introducing a barrier to verification and reproducibility, and the reuse of data. There is no current agreement or policy on the amount and type of data that should be made available alongside studies that use, and in some cases are wholly reliant on, digital morphology. Here, we propose a set of recommendations for minimum standards and additional best practice for three-dimensional digital data publication, and review the issues around data storage, management and accessibility. PMID:28404779
Open data and digital morphology.
Davies, Thomas G; Rahman, Imran A; Lautenschlager, Stephan; Cunningham, John A; Asher, Robert J; Barrett, Paul M; Bates, Karl T; Bengtson, Stefan; Benson, Roger B J; Boyer, Doug M; Braga, José; Bright, Jen A; Claessens, Leon P A M; Cox, Philip G; Dong, Xi-Ping; Evans, Alistair R; Falkingham, Peter L; Friedman, Matt; Garwood, Russell J; Goswami, Anjali; Hutchinson, John R; Jeffery, Nathan S; Johanson, Zerina; Lebrun, Renaud; Martínez-Pérez, Carlos; Marugán-Lobón, Jesús; O'Higgins, Paul M; Metscher, Brian; Orliac, Maëva; Rowe, Timothy B; Rücklin, Martin; Sánchez-Villagra, Marcelo R; Shubin, Neil H; Smith, Selena Y; Starck, J Matthias; Stringer, Chris; Summers, Adam P; Sutton, Mark D; Walsh, Stig A; Weisbecker, Vera; Witmer, Lawrence M; Wroe, Stephen; Yin, Zongjun; Rayfield, Emily J; Donoghue, Philip C J
2017-04-12
Over the past two decades, the development of methods for visualizing and analysing specimens digitally, in three and even four dimensions, has transformed the study of living and fossil organisms. However, the initial promise that the widespread application of such methods would facilitate access to the underlying digital data has not been fully achieved. The underlying datasets for many published studies are not readily or freely available, introducing a barrier to verification and reproducibility, and the reuse of data. There is no current agreement or policy on the amount and type of data that should be made available alongside studies that use, and in some cases are wholly reliant on, digital morphology. Here, we propose a set of recommendations for minimum standards and additional best practice for three-dimensional digital data publication, and review the issues around data storage, management and accessibility. © 2017 The Authors.
Determining the Pressure Shift of Helium I Lines Using White Dwarf Stars
NASA Astrophysics Data System (ADS)
Camarota, Lawrence
This dissertation explores the non-Doppler shifting of helium lines in the high-pressure conditions of a white dwarf photosphere. In particular, this dissertation seeks to mathematically quantify the shift in a way that is simple to reproduce and account for in future studies without requiring prior knowledge of the star's bulk properties (mass, radius, temperature, etc.). Two main methods will be used in this analysis. First, the spectral line will be quantified with a continuous wavelet transformation, and the components will be used in a χ²-minimizing linear regression to predict the shift. Second, the position of the lines will be calculated using a best-fit Lévy-alpha line function. These techniques stand in contrast to traditional methods of quantifying the center of often broad spectral lines, which usually assume symmetry on the part of the lines.
Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.
Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda
2013-01-01
How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition the reader will already know that the answer is "with difficulty" or "not at all". In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advanced knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and a set of desiderata with our observations and guidelines for improving reproducibility. This has implications not only in reproducing the work of others from published papers, but also in reproducing work from one's own laboratory.
Asymmetric information capacities of reciprocal pairs of quantum channels
NASA Astrophysics Data System (ADS)
Rosati, Matteo; Giovannetti, Vittorio
2018-05-01
Reciprocal pairs of quantum channels are defined as completely positive transformations which admit a rigid, distance-preserving, yet not completely positive transformation that allows one to reproduce the outcome of one from the corresponding outcome of the other. From a classical perspective these transmission lines should exhibit the same communication efficiency. This is no longer the case in the quantum setting: explicit asymmetric behaviors are reported studying the classical communication capacities of reciprocal pairs of depolarizing and Weyl-covariant channels.
Wittmann, Dominik M; Krumsiek, Jan; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Klamt, Steffen; Theis, Fabian J
2009-01-01
Background: The understanding of regulatory and signaling networks has long been a core objective in Systems Biology. Knowledge about these networks is mainly of qualitative nature, which allows the construction of Boolean models, where the state of a component is either 'off' or 'on'. While often able to capture the essential behavior of a network, these models can never reproduce detailed time courses of concentration levels. Nowadays, however, experiments yield more and more quantitative data. An obvious question therefore is how qualitative models can be used to explain and predict the outcome of these experiments. Results: In this contribution we present a canonical way of transforming Boolean into continuous models, where the use of multivariate polynomial interpolation allows transformation of logic operations into a system of ordinary differential equations (ODE). The method is standardized and can readily be applied to large networks. Other, more limited approaches to this task are briefly reviewed and compared. Moreover, we discuss and generalize existing theoretical results on the relation between Boolean and continuous models. As a test case a logical model is transformed into an extensive continuous ODE model describing the activation of T-cells. We discuss how parameters for this model can be determined such that quantitative experimental results are explained and predicted, including time-courses for multiple ligand concentrations and binding affinities of different ligands. This shows that from the continuous model we may obtain biological insights not evident from the discrete one. Conclusion: The presented approach will facilitate the interaction between modeling and experiments. Moreover, it provides a straightforward way to apply quantitative analysis methods to qualitatively described systems. PMID:19785753
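The multivariate polynomial interpolation step described above replaces each Boolean update rule with the multilinear polynomial that agrees with its truth table on the corners of the unit hypercube; that continuous function then drives an ODE for the component. A minimal sketch in that spirit (function names are mine, and a plain Euler step stands in for a proper ODE solver):

```python
from itertools import product

def multilinear(bool_func, xs):
    """Multilinear interpolation of a Boolean function on [0,1]^n:
    a weighted sum over the 2^n corner values of the truth table."""
    total = 0.0
    for corner in product((0, 1), repeat=len(xs)):
        weight = 1.0
        for xi, ci in zip(xs, corner):
            weight *= xi if ci else (1 - xi)
        total += bool_func(*corner) * weight
    return total

def euler_step(x3, x1, x2, tau=1.0, dt=0.01):
    """One Euler step of dx3/dt = (B(x1, x2) - x3) / tau, with B the
    interpolated AND gate, so x3 relaxes toward the logic output."""
    target = multilinear(lambda a, b: a and b, (x1, x2))
    return x3 + dt * (target - x3) / tau
```

On the corners (0/1 inputs) the interpolated function reproduces the Boolean rule exactly, while in between it varies smoothly, which is what makes the resulting ODE system well defined.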
NASA Astrophysics Data System (ADS)
Waldman, Robin; Somot, Samuel; Herrmann, Marine; Bosse, Anthony; Caniaux, Guy; Estournel, Claude; Houpert, Loic; Prieur, Louis; Sevault, Florence; Testor, Pierre
2017-02-01
The northwestern Mediterranean Sea is a well-observed ocean deep convection site. Winter 2012-2013 was an intense and intensely documented dense water formation (DWF) event. We evaluate this DWF event in an ensemble configuration of the regional ocean model NEMOMED12. We then assess for the first time the impact of ocean intrinsic variability on DWF with a novel perturbed initial state ensemble method. Finally, we identify the main physical mechanisms driving water mass transformations. NEMOMED12 reproduces accurately the deep convection chronology between late January and March, its location off the Gulf of Lions although with a southward shift and its magnitude. It fails to reproduce the Western Mediterranean Deep Waters salinification and warming, consistently with too strong a surface heat loss. The Ocean Intrinsic Variability modulates half of the DWF area, especially in the open-sea where the bathymetry slope is low. It modulates marginally (3-5%) the integrated DWF rate, but its increase with time suggests its impact could be larger at interannual timescales. We conclude that ensemble frameworks are necessary to evaluate accurately numerical simulations of DWF. Each phase of DWF has distinct diapycnal and thermohaline regimes: during preconditioning, the Mediterranean thermohaline circulation is driven by exchanges with the Algerian basin. During the intense mixing phase, surface heat fluxes trigger deep convection and internal mixing largely determines the resulting deep water properties. During restratification, lateral exchanges and internal mixing are enhanced. Finally, isopycnal mixing was shown to play a large role in water mass transformations during the preconditioning and restratification phases.
Statistical Validation of Image Segmentation Quality Based on a Spatial Overlap Index
Zou, Kelly H.; Warfield, Simon K.; Bharatha, Aditya; Tempany, Clare M.C.; Kaus, Michael R.; Haker, Steven J.; Wells, William M.; Jolesz, Ferenc A.; Kikinis, Ron
2005-01-01
Rationale and Objectives: To examine a statistical validation method based on the spatial overlap between two sets of segmentations of the same anatomy. Materials and Methods: The Dice similarity coefficient (DSC) was used as a statistical validation metric to evaluate the performance of both the reproducibility of manual segmentations and the spatial overlap accuracy of automated probabilistic fractional segmentation of MR images, illustrated on two clinical examples. Example 1: 10 consecutive cases of prostate brachytherapy patients underwent both preoperative 1.5T and intraoperative 0.5T MR imaging. For each case, 5 repeated manual segmentations of the prostate peripheral zone were performed separately on preoperative and on intraoperative images. Example 2: A semi-automated probabilistic fractional segmentation algorithm was applied to MR imaging of 9 cases with 3 types of brain tumors. DSC values were computed and logit-transformed values were compared in the mean with the analysis of variance (ANOVA). Results: Example 1: The mean DSCs of 0.883 (range, 0.876–0.893) with 1.5T preoperative MRI and 0.838 (range, 0.819–0.852) with 0.5T intraoperative MRI (P < .001) were within and at the margin of the range of good reproducibility, respectively. Example 2: Wide ranges of DSC were observed in brain tumor segmentations: meningiomas (0.519–0.893), astrocytomas (0.487–0.972), and other mixed gliomas (0.490–0.899). Conclusion: The DSC value is a simple and useful summary measure of spatial overlap, which can be applied to studies of reproducibility and accuracy in image segmentation. We observed generally satisfactory but variable validation results in two clinical applications. This metric may be adapted for similar validation tasks. PMID:14974593
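The Dice similarity coefficient used above is twice the overlap of the two segmentations divided by the sum of their sizes, and DSC values were logit-transformed before the ANOVA comparison. A minimal sketch of both computations (treating segmentations as sets of voxel indices; function names are mine):

```python
import math

def dice(seg_a, seg_b):
    """Dice similarity coefficient between two binary segmentations:
    DSC = 2 * |A ∩ B| / (|A| + |B|), ranging from 0 (disjoint) to 1."""
    a, b = set(seg_a), set(seg_b)
    return 2 * len(a & b) / (len(a) + len(b))

def logit(p):
    """Logit transform applied to DSC values before comparing means."""
    return math.log(p / (1 - p))
```

For example, two segmentations sharing half their voxels give a DSC of 0.5, whose logit is 0 — the transform maps the (0, 1) range onto the whole real line so that normal-theory methods like ANOVA apply more comfortably.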
Reproducibility of techniques using Archimedes' principle in measuring cancellous bone volume.
Zou, L; Bloebaum, R D; Bachus, K N
1997-01-01
Researchers have been interested in developing techniques to accurately and reproducibly measure the volume fraction of cancellous bone. Historically bone researchers have used Archimedes' principle with water to measure the volume fraction of cancellous bone. Preliminary results in our lab suggested that the calibrated water technique did not provide reproducible results. Because of this difficulty, it was decided to compare the conventional water method to a water-with-surfactant method and a helium method using a micropycnometer. The water/surfactant and the helium methods were attempts to improve the fluid penetration into the small voids present in the cancellous bone structure. In order to compare the reproducibility of the new methods with the conventional water method, 16 cancellous bone specimens were obtained from femoral condyles of human and greyhound dog femora. The volume fraction measurements on each specimen were repeated three times with all three techniques. The results showed that the helium displacement method was more than an order of magnitude more reproducible than the two other water methods (p < 0.05). Statistical analysis also showed that the conventional water method produced the lowest reproducibility (p < 0.05). The data from this study indicate that the helium displacement technique is a very useful, rapid and reproducible tool for quantitatively characterizing anisotropic porous tissue structures such as cancellous bone.
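The displacement measurements compared above all rest on the same relation from Archimedes' principle: the buoyant mass loss of a submerged specimen equals the fluid density times the displaced volume. A minimal sketch of the arithmetic (function names, units, and the example numbers are illustrative, not values from the paper):

```python
def displaced_volume(mass_dry_g, mass_submerged_g, fluid_density_g_cm3=1.0):
    """Specimen volume (cm^3) via Archimedes' principle:
    the apparent mass loss in the fluid equals rho_fluid * V_displaced."""
    return (mass_dry_g - mass_submerged_g) / fluid_density_g_cm3

def volume_fraction(bone_volume_cm3, total_volume_cm3):
    """Bone volume fraction (BV/TV) of a cancellous specimen."""
    return bone_volume_cm3 / total_volume_cm3

# Illustrative numbers only: a specimen weighing 10.0 g dry and 7.5 g
# submerged in water displaces 2.5 cm^3.
v_total = displaced_volume(10.0, 7.5)
```

The accuracy of this calculation hinges on the fluid actually filling the pore space, which is exactly why the study found surfactant and helium penetration to matter so much.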
Reproducibility Between Brain Uptake Ratio Using Anatomic Standardization and Patlak-Plot Methods.
Shibutani, Takayuki; Onoguchi, Masahisa; Noguchi, Atsushi; Yamada, Tomoki; Tsuchihashi, Hiroko; Nakajima, Tadashi; Kinuya, Seigo
2015-12-01
The Patlak-plot and conventional methods of determining the brain uptake ratio (BUR) have some problems with reproducibility. We formulated a method of determining BUR using anatomic standardization (BUR-AS) in a statistical parametric mapping algorithm to improve reproducibility. The objective of this study was to demonstrate the inter- and intraoperator reproducibility of mean cerebral blood flow as determined using BUR-AS in comparison to the conventional-BUR (BUR-C) and Patlak-plot methods. The images of 30 patients who underwent brain perfusion SPECT were retrospectively used in this study. The images were reconstructed using ordered-subset expectation maximization and processed using an automatic quantitative analysis tool for cerebral blood flow of ECD. The mean SPECT count was calculated from axial basal ganglia slices of the normal side (slices 31-40) drawn using a 3-dimensional stereotactic region-of-interest template after anatomic standardization. The mean cerebral blood flow was calculated from the mean SPECT count. Reproducibility was evaluated using the coefficient of variation and Bland-Altman plotting. For both inter- and intraoperator reproducibility, the BUR-AS method had the lowest coefficient of variation and the smallest error range on the Bland-Altman plot. Mean cerebral blood flow obtained using the BUR-AS method had the highest reproducibility. Compared with the Patlak-plot and BUR-C methods, the BUR-AS method provides greater inter- and intraoperator reproducibility of cerebral blood flow measurement. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
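Reproducibility in this study was summarized with the coefficient of variation and Bland-Altman analysis. A minimal sketch of both statistics (not the authors' code; function names are mine):

```python
import statistics

def coefficient_of_variation(values):
    """CV (%) = 100 * SD / mean; lower CV means higher reproducibility
    of repeated measurements."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def bland_altman_limits(method_a, method_b):
    """Bland-Altman mean bias and 95% limits of agreement between two
    measurement methods applied to the same subjects."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

A Bland-Altman plot graphs each pair's difference against its mean; a narrow band between the limits of agreement (the "error range" in the abstract) indicates the two operators or methods agree closely.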
Formation and maintenance of tubular membrane projections: experiments and numerical calculations.
Umeda, Tamiki; Inaba, Takehiko; Ishijima, Akihiko; Takiguchi, Kingo; Hotani, Hirokazu
2008-01-01
To study the mechanical properties of lipid membranes, we manipulated liposomes by using a system comprising polystyrene beads and laser tweezers, and measured the force required to transform their shapes. When two beads pushed the membrane from inside, spherical liposomes transformed into a lemon-shape. Then a discontinuous shape transformation occurred to form a membrane tube from either end of the liposomes, and the force dropped drastically. We analyzed these processes using a mathematical model based on the bending elasticity of the membranes. Numerical calculations showed that when the bead size was taken into account, the model reproduced both the liposomal shape transformation and the force-extension relation. This result suggests that the size of the beads is responsible for the existence of a force barrier for the tube formation.
Macedo-Ojeda, Gabriela; Márquez-Sandoval, Fabiola; Fernández-Ballart, Joan; Vizmanos, Barbara
2016-01-01
The study of diet quality in a population provides information for the development of programs to improve nutritional status through better directed actions. The aim of this study was to assess the reproducibility and relative validity of a Mexican Diet Quality Index (ICDMx) for the assessment of the habitual diet of adults. The ICDMx was designed to assess the characteristics of a healthy diet using a validated semi-quantitative food frequency questionnaire (FFQ-Mx). Reproducibility was determined by comparing 2 ICDMx based on FFQs (one-year interval). Relative validity was assessed by comparing the ICDMx (2nd FFQ) with that estimated based on the intake averages from dietary records (nine days). The questionnaires were answered by 97 adults (mean age in years = 27.5, SD = 12.6). Pearson (r) and intraclass correlations (ICC) were calculated; Bland-Altman plots, Cohen's κ coefficients and blood lipid determinations complemented the analysis. Additional analysis compared ICDMx scores with nutrients derived from dietary records, using a Pearson correlation. These nutrient intakes were transformed logarithmically to improve normality (log10) and adjusted for energy prior to analyses. ICC values for the reproducibility of the ICDMx ranged from 0.33 to 0.87 (23/24 items with significant correlations; mean = 0.63), while those for relative validity ranged from 0.26 to 0.79 (mean = 0.45). Bland-Altman plots showed a high level of agreement between methods. ICDMx scores were inversely correlated (p < 0.05) with total blood cholesterol (r = −0.33) and triglycerides (r = −0.22). The ICDMx (as calculated from FFQs and dietary records) obtained positive correlations with fiber, magnesium, potassium, retinol, thiamin, riboflavin, pyridoxine, and folate. The ICDMx obtained acceptable levels of reproducibility and relative validity in this population. It can be useful for population nutritional surveillance and to assess the changes resulting from the implementation of nutritional interventions.
PMID:27563921
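The log10 transformation and energy adjustment of nutrient intakes described above match the standard residual method of energy adjustment: regress the log-transformed nutrient on total energy and keep the residuals, re-centered on the mean log intake. A minimal sketch under that assumption (the function name is mine, and whether the authors used exactly this regression form is not stated in the abstract):

```python
import math
import statistics

def energy_adjust(nutrient, energy):
    """Residual-method energy adjustment on log10-transformed intakes:
    regress log10(nutrient) on energy, return residuals + mean log intake."""
    y = [math.log10(v) for v in nutrient]
    mean_y, mean_e = statistics.mean(y), statistics.mean(energy)
    # Ordinary least-squares slope of log-intake on energy.
    beta = (sum((e - mean_e) * (v - mean_y) for e, v in zip(energy, y))
            / sum((e - mean_e) ** 2 for e in energy))
    # Residual plus mean: an energy-uncorrelated intake on the original scale.
    return [v - (mean_y + beta * (e - mean_e)) + mean_y
            for e, v in zip(energy, y)]
```

After adjustment, a nutrient whose intake merely tracks total energy contributes no independent variation, so the remaining correlations reflect diet composition rather than diet quantity.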
A new fractional wavelet transform
NASA Astrophysics Data System (ADS)
Dai, Hongzhe; Zheng, Zhibao; Wang, Wei
2017-03-01
The fractional Fourier transform (FRFT) is a potent tool for analyzing time-varying signals. However, it fails to locate the fractional Fourier domain (FRFD)-frequency content, which is required in some applications. A novel fractional wavelet transform (FRWT) is proposed to solve this problem. It displays the time and FRFD-frequency information jointly in the time-FRFD-frequency plane. The definition, basic properties, inverse transform and reproducing kernel of the proposed FRWT are considered. It has been shown that an FRWT with proper order corresponds to the classical wavelet transform (WT). The multiresolution analysis (MRA) associated with the developed FRWT, together with the construction of the orthogonal fractional wavelets, are also presented. Three applications are discussed: the analysis of signals with time-varying frequency content, the FRFD spectrum estimation of signals involving noise, and the construction of the fractional Haar wavelet. Simulations verify the validity of the proposed FRWT.
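For reference, the FRFT of order α that underlies the proposed FRWT is conventionally defined (for α not a multiple of π) through a chirp kernel; this is the standard textbook definition, not notation taken from the paper itself:

```latex
X_\alpha(u) = \int_{-\infty}^{\infty} x(t)\, K_\alpha(t,u)\, \mathrm{d}t,
\qquad
K_\alpha(t,u) = \sqrt{\frac{1 - j\cot\alpha}{2\pi}}\,
\exp\!\left( j\,\frac{t^2 + u^2}{2}\cot\alpha \;-\; j\,t u \csc\alpha \right).
```

At α = π/2 the kernel reduces to that of the ordinary Fourier transform, and as α → 0 the transform approaches the identity; the FRWT described above then applies wavelet analysis within the resulting fractional Fourier domain.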
Geometric processing of digital images of the planets
NASA Technical Reports Server (NTRS)
Edwards, Kathleen
1987-01-01
New procedures and software have been developed for geometric transformation of images to support digital cartography of the planets. The procedures involve the correction of spacecraft camera orientation of each image with the use of ground control and the transformation of each image to a Sinusoidal Equal-Area map projection with an algorithm which allows the number of transformation calculations to vary as the distortion varies within the image. When the distortion is low in an area of an image, few transformation computations are required, and most pixels can be interpolated. When distortion is extreme, the location of each pixel is computed. Mosaics are made of these images and stored as digital databases. Completed Sinusoidal databases may be used for digital analysis and registration with other spatial data. They may also be reproduced as published image maps by digitally transforming them to appropriate map projections.
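The Sinusoidal Equal-Area projection mentioned above has a simple closed form: the x coordinate scales longitude by the cosine of latitude, which is also why distortion (and hence the required density of transformation calculations) varies across an image. A minimal sketch (the function name and argument conventions are mine):

```python
import math

def sinusoidal(lat_deg, lon_deg, radius=1.0, lon0_deg=0.0):
    """Forward Sinusoidal Equal-Area projection for a spherical body:
    x = R * (lon - lon0) * cos(lat),  y = R * lat  (angles in radians)."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg - lon0_deg)
    return radius * lon * math.cos(lat), radius * lat
```

Near the equator cos(lat) is close to 1 and pixels map almost linearly, so many can be interpolated; at high latitudes cos(lat) shrinks rapidly and each pixel's position is better computed individually, which is the adaptive strategy the abstract describes.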
Pepper, sweet (Capsicum annuum).
Heidmann, Iris; Boutilier, Kim
2015-01-01
Capsicum (pepper) species are economically important crops that are recalcitrant to genetic transformation by Agrobacterium (Agrobacterium tumefaciens). A number of protocols for pepper transformation have been described but are not routinely applicable. The main bottleneck in pepper transformation is the low frequency of cells that are both susceptible to Agrobacterium infection and able to regenerate. Here, we describe a protocol for the efficient regeneration of transgenic sweet pepper (C. annuum) through inducible activation of the BABY BOOM (BBM) AP2/ERF transcription factor. Using this approach, we can routinely achieve a transformation efficiency of at least 0.6%. The main improvements in this protocol are the reproducibility in transforming different genotypes and the ability to produce fertile shoots. An added advantage of this protocol is that BBM activity can be induced subsequently in stable transgenic lines, providing a novel regeneration system for clonal propagation through somatic embryogenesis.
Application of Carbon Nanotubes for Plant Genetic Transformation
NASA Astrophysics Data System (ADS)
Burlaka, Olga M.; Pirko, Yaroslav V.; Yemets, Alla I.; Blume, Yaroslav B.
In this chapter, the current state of using carbon nanotubes (CNTs; single- and multi-walled), which have attracted great interdisciplinary interest in recent decades due to their peculiar properties, for genetic transformation of prokaryotic and eukaryotic cells will be reviewed. The covalent and non-covalent surface chemistry for CNT functionalization, with a focus on the potential applications of surface modifications in the design of biocompatible CNTs, will be discussed. The properties of CNTs that are favorable for biotechnological use, and the current status of technical approaches that increase the biocompatibility and lower the nanotoxicity of engineered CNTs, will be described. Solutions offered by non-covalent surface modification of CNTs will be discussed. Existing data concerning mechanisms of CNT cell entry and factors governing toxicity, cellular uptake, intracellular traffic, and biodegradation of CNTs, along with the bioavailability of molecular cargoes of loaded CNTs, will be discussed. Eco-friendly production of water dispersions of biologically functionalized multi-walled and single-walled CNTs for use as nano-vehicles for DNA delivery in genetic transformation of plants will be described. The background, advantages, and problems of using CNTs in developing novel methods of genetic transformation, including plant genetic transformation, will be highlighted. Special attention will be paid to the limitations of conventional gene transfer techniques and the promising features of CNT-based strategies, which offer improved efficacy, reproducibility, and accuracy along with reduced time consumption. Issues impeding manipulation of CNTs, such as entangled bundle formation, low water solubility, and the inert properties of pristine CNTs, and ways to solve these problems will be overviewed.
Connectivism: 21st Century's New Learning Theory
ERIC Educational Resources Information Center
Kropf, Dorothy C.
2013-01-01
Transformed into a large collaborative learning environment, the Internet comprises information reservoirs, namely (a) online classrooms, (b) social networks, and (c) virtual reality or simulated communities, to expeditiously create, reproduce, share, and deliver information into the hands of educators and students. Most importantly, the…
Transformation Abilities: A Reanalysis and Confirmation of SOI Theory.
ERIC Educational Resources Information Center
Khattab, Ali-Maher; And Others
1987-01-01
Confirmatory factor analysis was used to reanalyze correlational data from selected variables in Guilford's Aptitudes Research Project. Results indicated Guilford's model reproduced the original correlation matrix more closely than other models. Most of Guilford's tests indicated high loadings on their hypothesized factors. (GDC)
2016-01-01
The Cancer Target Discovery and Development (CTD2) Network was established to accelerate the transformation of “Big Data” into novel pharmacological targets, lead compounds, and biomarkers for rapid translation into improved patient outcomes. It rapidly became clear in this collaborative network that a key central issue was to define what constitutes sufficient computational or experimental evidence to support a biologically or clinically relevant finding. This manuscript represents a first attempt to delineate the challenges of supporting and confirming discoveries arising from the systematic analysis of large-scale data resources in a collaborative work environment and to provide a framework that would begin a community discussion to resolve these challenges. The Network implemented a multi-Tier framework designed to substantiate the biological and biomedical relevance as well as the reproducibility of data and insights resulting from its collaborative activities. The same approach can be used by the broad scientific community to drive development of novel therapeutic and biomarker strategies for cancer. PMID:27401613
Reproducibility and Transparency in Ocean-Climate Modeling
NASA Astrophysics Data System (ADS)
Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.
2015-12-01
Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and the variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than providing raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version-controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
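The practice of storing checksums of experiment output under version control can be sketched as follows (the file pattern and function name are illustrative; this is not the MOM6/SIS2 tooling itself, just the underlying idea):

```python
import hashlib
import pathlib

def checksum_outputs(directory, pattern="*.nc"):
    """Record a SHA-256 checksum for each experiment output file.
    Committing the resulting mapping to version control documents
    exactly when the solutions to an experiment change."""
    sums = {}
    for path in sorted(pathlib.Path(directory).glob(pattern)):
        sums[path.name] = hashlib.sha256(path.read_bytes()).hexdigest()
    return sums
```

A later run of the same experiment whose outputs hash identically is bit-for-bit reproducible; any difference pinpoints which files changed, whether due to a code update or altered input data.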
Chowdhury, Supriyo; Basu, Arpita; Kundu, Surekha
2014-09-01
In spite of the economic importance of sesame (Sesamum indicum L.) and the recent availability of its genome sequence, a high-frequency transformation protocol is still not available. The only two existing Agrobacterium-mediated transformation protocols have poor transformation efficiencies of less than 2%. In the present study, we report a high-frequency, simple, and reproducible transformation protocol for sesame. Transformation was done using de-embryonated cotyledons via somatic embryogenic stages. All the critical parameters of transformation, such as the incubation period of explants in pre-regeneration medium prior to infection by Agrobacterium tumefaciens, the cocultivation period, the concentration of acetosyringone in cocultivation medium, the kanamycin concentration, and the concentration of plant hormones, including 6-benzylaminopurine, have been optimized. This protocol is superior to the two existing protocols in its high regeneration and transformation efficiencies. The transformed sesame lines have been tested by PCR, RT-PCR for neomycin phosphotransferase II gene expression, and β-glucuronidase (GUS) assay. The regeneration frequency and transformation efficiency are 57.33 and 42.66%, respectively. T0 and T1 generation transgenic plants were analyzed, and several T1 plants homozygous for the transgenes were obtained.
Alatar, Abdulrahman A; Faisal, Mohammad; Abdel-Salam, Eslam M; Canto, Tomas; Saquib, Quaiser; Javed, Saad B; El-Sheikh, Mohamed A; Al-Khedhairy, Abdulaziz A
2017-09-01
In the present study, we develop an efficient and reproducible in vitro regeneration system for two cultivars, viz. Jamila and Tomaland, of Solanum lycopersicum L., an economically important vegetable crop throughout the world. Sterilization of seeds with 2.5% (v/v) NaOCl was found to be most effective; about 97% of seeds germinated on cotton in a magenta box moistened with sterile half-strength (½) Murashige and Skoog (MS) medium. The regeneration efficiency of cotyledonary leaf (CL) and cotyledonary node (CN) explants derived from 8-day-old aseptic seedlings was assessed on MS medium supplemented with different concentrations of auxins and cytokinins. CL explants were found to be more responsive than CN explants in both cultivars. The type of basal medium was also assessed and found to have a significant effect on shoot regeneration. The highest regeneration frequency and maximum number of shoots were obtained from CL explants on MS medium supplemented with 6-benzyl adenine (BA; 5.0 µM), indole-3-butyric acid (IBA; 2.5 µM) and kinetin (Kin; 10.0 µM). In vitro regenerated microshoots were rooted on ½MS medium containing 0.5 µM IBA. Regenerated plantlets with well-developed root and shoot systems were successfully acclimated to ex vitro conditions. The genetic uniformity of tissue-culture-raised plantlets was evaluated for the first time using flow cytometry and single primer amplification reaction (SPAR) methods, viz. DAMD and ISSR. No significant changes in ploidy level or nuclear DNA content profile were observed between in vitro propagated plants and normal plants of either cultivar. Similarly, the SPAR analysis revealed monomorphic banding patterns in regenerated plantlets of S. lycopersicum, verifying their genetic uniformity and clonal fidelity. This efficient regeneration system can be used as a fast and reproducible method for genetic transformation of this important vegetable crop.
Data to knowledge: how to get meaning from your result.
Berman, Helen M; Gabanyi, Margaret J; Groom, Colin R; Johnson, John E; Murshudov, Garib N; Nicholls, Robert A; Reddy, Vijay; Schwede, Torsten; Zimmerman, Matthew D; Westbrook, John; Minor, Wladek
2015-01-01
Structural and functional studies require the development of sophisticated 'Big Data' technologies and software to increase the knowledge derived and ensure reproducibility of the data. This paper presents summaries of the Structural Biology Knowledge Base, the VIPERdb Virus Structure Database, evaluation of homology modeling by the Protein Model Portal, the ProSMART tool for conformation-independent structure comparison, the LabDB 'super' laboratory information management system and the Cambridge Structural Database. These techniques and technologies represent important tools for the transformation of crystallographic data into knowledge and information, in an effort to address the problem of non-reproducibility of experimental results.
Vanparys, Philippe; Corvi, Raffaella; Aardema, Marilyn J; Gribaldo, Laura; Hayashi, Makoto; Hoffmann, Sebastian; Schechtman, Leonard
2012-04-11
Two-year rodent bioassays play a key role in the assessment of the carcinogenic potential of chemicals to humans. The seventh amendment to the European Cosmetics Directive will ban in 2013 the marketing of cosmetic and personal care products that contain ingredients that have been tested in animal models; thus 2-year rodent bioassays will not be available for cosmetics/personal care products. Furthermore, for large testing programs like REACH, in vivo carcinogenicity testing is impractical. Alternative approaches to carcinogenicity assessment are urgently required. In terms of standardization and validation, the most advanced in vitro tests for carcinogenicity are the cell transformation assays (CTAs). Although CTAs do not mimic the whole in vivo carcinogenesis process, they represent valuable support in identifying the transforming potential of chemicals. CTAs have been shown to detect genotoxic as well as non-genotoxic carcinogens and are helpful in the determination of thresholds for genotoxic and non-genotoxic carcinogens. The extensive review of CTAs by the OECD (OECD (2007) Environmental Health and Safety Publications, Series on Testing and Assessment, No. 31) and the proven within- and between-laboratory reproducibility of the SHE CTAs justify broader use of these methods to assess the carcinogenic potential of chemicals.
Composting in small laboratory pilots: Performance and reproducibility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.
2012-02-15
Highlights: • We design an innovative small-scale composting device including six 4-l reactors. • We investigate the performance and reproducibility of composting on a small scale. • Thermophilic conditions are established by self-heating in all replicates. • Biochemical transformations, organic matter losses and stabilisation are realistic. • The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O2 consumption and CO2 emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates, with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fraction by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or in on-site experiments, except for lignin degradation, which was less important than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures.
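The between-replicate coefficient of variation used above (sample standard deviation over the mean, as a percentage) can be computed directly. The replicate values below are hypothetical, not the study's measurements.

```python
import numpy as np

def cv_percent(values):
    """Coefficient of variation (%) across replicate reactors:
    sample standard deviation (ddof=1) divided by the mean."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# e.g. total-organic-matter loss (%) in six replicate reactors (hypothetical)
tom_loss = [44.0, 47.5, 46.2, 45.1, 47.0, 46.2]
```

A CV below the study's 19% threshold would indicate reproducibility comparable to that reported for the six pilots.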
Lutke, W Kevin
2006-01-01
Petunia hybrida genetic transformation continues to be a valuable tool for genetic research into biochemical pathways and gene expression, as well as for generating commercial products with varying floral colors. In this chapter, we describe a simple and reproducible genetic transformation protocol for generating transgenic petunia plants harboring a gene of interest and a selectable marker. The system utilizes Agrobacterium tumefaciens for transgene integration, with plant recovery via shoot organogenesis from leaf explant material. Selection for transgenic plants is achieved using the bar gene, conferring resistance to glufosinate, or the nptII gene, for resistance to kanamycin. Transformation efficiencies of around 10% are achievable, with shoots recovered about 8 weeks after transgene insertion and rooted plants transferred to the greenhouse about 12 weeks after inoculation.
Campanile Near-Field Probes Fabricated by Nanoimprint Lithography on the Facet of an Optical Fiber
Calafiore, Giuseppe; Koshelev, Alexander; Darlington, Thomas P.; ...
2017-05-10
One of the major challenges to the widespread adoption of plasmonic and nano-optical devices in real-life applications is the difficulty of mass-fabricating nano-optical antennas in a parallel and reproducible fashion, and of placing nanoantennas into devices with nanometer-scale precision. In this study, we present a solution to this challenge using state-of-the-art ultraviolet nanoimprint lithography (UV-NIL) to fabricate functional optical transformers onto the core of an optical fiber in a single step, mimicking the 'campanile' near-field probes. Imprinted probes were fabricated using a custom-built imprinter tool with co-axial alignment capability and sub-100 nm position accuracy, followed by a metallization step. Scanning electron micrographs confirm high imprint fidelity and precision, with a thin residual layer to facilitate efficient optical coupling between the fiber and the imprinted optical transformer. The imprinted optical transformer probe was used in an actual NSOM measurement performing hyperspectral photoluminescence mapping of standard fluorescent beads. The calibration scans confirmed that imprinted probes enable sub-diffraction-limited imaging with a spatial resolution consistent with the gap size. This novel nano-fabrication approach promises low-cost, high-throughput, and reproducible manufacturing of advanced nano-optical devices.
Seino, Junji; Nakai, Hiromi
In order to perform practical electron correlation calculations, the local unitary transformation (LUT) scheme at the spin-free infinite-order Douglas–Kroll–Hess (IODKH) level [J. Seino and H. Nakai, J. Chem. Phys. 136, 244102 (2012); J. Seino and H. Nakai, J. Chem. Phys. 137, 144101 (2012)], which is based on the locality of relativistic effects, has been combined with the linear-scaling divide-and-conquer (DC)-based Hartree–Fock (HF) and electron correlation methods, such as second-order Møller–Plesset perturbation theory (MP2) and coupled cluster theory with single and double excitations (CCSD). Numerical applications to hydrogen halide molecules, (HX)n (X = F, Cl, Br, and I), coinage metal chain systems, Mn (M = Cu and Ag), and a platinum-terminated polyynediyl chain, trans,trans-((p-CH3C6H4)3P)2(C6H5)Pt(C≡C)4Pt(C6H5)((p-CH3C6H4)3P)2, clarified that the present methods, namely DC-HF, MP2, and CCSD with the LUT-IODKH Hamiltonian, reproduce the results obtained using conventional methods at small computational cost. The combination of the LUT and DC techniques could be the first approach that achieves overall quasi-linear scaling with a small prefactor for relativistic electron correlation calculations.
The Potters of Mata Ortiz: Transforming a Tradition.
ERIC Educational Resources Information Center
Johnson, Mark M.
1999-01-01
Discusses the pottery revival in the village of Juan Mata Ortiz that is located in Chihuahua, Mexico. Contends that the revival was inspired by the efforts and experimentation of Juan Quezada who rediscovered the materials and techniques necessary to reproduce the Casas Grandes style polychrome pottery. (CMK)
2014-01-01
Background Peripheral quantitative computed tomography (pQCT) is an established technology that allows for the measurement of the material properties of bone. Alterations to bone architecture are associated with an increased risk of fracture. Further pQCT research is necessary to identify regions of interest that are prone to fracture risk in people with chronic diseases. The second metatarsal is a common site for the development of insufficiency fractures, and as such the aim of this study was to assess the reproducibility of a novel scanning protocol of the second metatarsal using pQCT. Methods Eleven embalmed cadaveric leg specimens were scanned six times; three times with and three times without repositioning. Each foot was positioned on a custom-designed acrylic foot plate to permit unimpeded scans of the region of interest. Sixty-six scans were obtained at 15% (distal) and 50% (mid-shaft) of the second metatarsal. Voxel size and scan speed were reduced to 0.40 mm and 25 mm/s. The reference line was positioned at the most distal portion of the 2nd metatarsal. Repeated measurements of six key variables related to bone properties were subjected to reproducibility testing. Data were log-transformed and the reproducibility of scans was assessed using intraclass correlation coefficients (ICC) and coefficients of variation (CV%). Results Reproducibility of the measurements without repositioning was estimated as: trabecular area (ICC 0.95; CV% 2.4), trabecular density (ICC 0.98; CV% 3.0), Strength Strain Index (SSI) - distal (ICC 0.99; CV% 5.6), cortical area (ICC 1.0; CV% 1.5), cortical density (ICC 0.99; CV% 0.1), SSI - mid-shaft (ICC 1.0; CV% 2.4). Reproducibility of the measurements after repositioning was estimated as: trabecular area (ICC 0.96; CV% 2.4), trabecular density (ICC 0.98; CV% 2.8), SSI - distal (ICC 1.0; CV% 3.5), cortical area (ICC 0.99; CV% 2.4), cortical density (ICC 0.98; CV% 0.8), SSI - mid-shaft (ICC 0.99; CV% 3.2).
Conclusions The scanning protocol generated excellent reproducibility for key bone properties measured at the distal and mid-shaft regions of the 2nd metatarsal. This protocol extends the capabilities of pQCT to evaluate bone quality in people who may be at an increased risk of metatarsal insufficiency fractures. PMID:25037451
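The ICC agreement statistic reported above can be sketched with the classic one-way random-effects formula. This is a minimal illustration of ICC(1,1) computed from the ANOVA decomposition; the input values in the test are synthetic, not the study's scans.

```python
import numpy as np

def icc_oneway(x):
    """One-way random-effects ICC(1,1) for repeated measurements.

    x: array of shape (n_specimens, k_repeats). ICC = (MSB - MSW) /
    (MSB + (k-1)*MSW), where MSB is the between-specimen and MSW the
    within-specimen mean square."""
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    row_means = x.mean(axis=1)
    msb = k * ((row_means - x.mean()) ** 2).sum() / (n - 1)       # between-specimen
    msw = ((x - row_means[:, None]) ** 2).sum() / (n * (k - 1))   # within-specimen
    return (msb - msw) / (msb + (k - 1) * msw)
```

Values near 1.0, as reported for the metatarsal measurements, indicate that between-specimen differences dominate repeat-scan noise.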
Focused ultrasound-mediated drug delivery through the blood-brain barrier
Burgess, Alison; Shah, Kairavi; Hough, Olivia; Hynynen, Kullervo
2015-01-01
Despite recent advances in blood-brain barrier (BBB) research, it remains a significant hurdle for the pharmaceutical treatment of brain diseases. Focused ultrasound (FUS) is one method to transiently increase permeability of the BBB to promote drug delivery to specific brain regions. An introduction to the BBB and a brief overview of the methods which can be used to circumvent the BBB to promote drug delivery is provided. In particular, we discuss the advantages and limitations of FUS technology and the efficacy of FUS-mediated drug delivery in models of disease. MRI for targeting and evaluating FUS treatments, combined with administration of microbubbles, allows for transient, reproducible BBB opening. The integration of a real-time acoustic feedback controller has improved treatment safety. Successful clinical translation of FUS has the potential to transform the treatment of brain disease worldwide without requiring the development of new pharmaceutical agents. PMID:25936845
Baba, Seiki; Hoshino, Takeshi; Ito, Len; Kumasaka, Takashi
2013-01-01
Protein crystals are fragile, and it is sometimes difficult to find conditions suitable for handling and cryocooling the crystals before conducting X-ray diffraction experiments. To overcome this issue, a protein crystal-mounting method has been developed that involves a water-soluble polymer and controlled humid air that can adjust the moisture content of a mounted crystal. By coating crystals with polymer glue and exposing them to controlled humid air, the crystals were stable at room temperature and were cryocooled under optimized humidity. Moreover, the glue-coated crystals reproducibly showed gradual transformations of their lattice constants in response to a change in humidity; thus, using this method, a series of isomorphous crystals can be prepared. This technique is valuable when working on fragile protein crystals, including membrane proteins, and will also be useful for multi-crystal data collection. PMID:23999307
Kuznetsova, G D; Gabova, A V; Lazarev, I E; Obukhov, Iu V; Obukhov, K Iu; Morozov, A A; Kulikov, M A; Shchatskova, A B; Vasil'eva, O N; Tomilovskaia, E S
2015-01-01
Frequency-temporal electroencephalogram (EEG) reactions to hypogravity were studied in 7 male subjects aged 20 to 27 years. The experiment was conducted using dry immersion (DI), the best known method of simulating the effects of space microgravity on Earth. This hypogravity model reproduces hypokinesia, i.e. the removal of weight-bearing and mechanical load, which is typical of microgravity. EEG was recorded by Neuroscan-2 (Compumedics) before the experiment (baseline data) and at the end of day 2 in DI. Comparative analysis of the EEG frequency-temporal structure was performed with the use of 2 techniques: Fourier transform and modified wavelet analysis. The Fourier transform showed that after 2 days in DI the main shifts in the EEG spectral composition are a decline in alpha power and a slight though reliable growth of theta power. Similar frequency shifts were detected in the same records analyzed using the wavelet transform. According to the wavelet analysis, during DI the shifts in the EEG frequency spectrum are accompanied by frequency disorganization of the EEG dominant rhythm and gross impairment of the overall stability of the electrical activity over time. The wavelet transform provides an opportunity to quantify changes in the frequency-temporal structure of the electrical activity of the brain. Quantitative evidence of frequency disorganization and temporal instability of EEG wavelet spectrograms may be the key to understanding the mechanisms that drive functional disorders in the brain cortex under hypogravity.
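The Fourier-based comparison of alpha and theta power described above can be sketched with a minimal periodogram. The sampling rate, duration, and simulated signal below are hypothetical illustrations, not the study's recordings.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean power of `signal` within [f_lo, f_hi] Hz, estimated from
    the magnitude-squared FFT (a minimal periodogram)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

# Hypothetical 4-second trace dominated by a 10 Hz alpha rhythm plus noise.
fs = 250.0
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.default_rng(0).standard_normal(len(t))

alpha = band_power(eeg, fs, 8, 12)   # alpha band (8-12 Hz)
theta = band_power(eeg, fs, 4, 7)    # theta band (4-7 Hz)
```

Comparing such band powers before and during immersion is the kind of spectral shift the Fourier analysis quantifies; the wavelet analysis adds time-resolved structure that the periodogram discards.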
Transforming Epidemiology for 21st Century Medicine and Public Health
Khoury, Muin J.; Lam, Tram Kim; Ioannidis, John P.A.; Hartge, Patricia; Spitz, Margaret R.; Buring, Julie E.; Chanock, Stephen J.; Croyle, Robert T.; Goddard, Katrina A.; Ginsburg, Geoffrey S.; Herceg, Zdenko; Hiatt, Robert A.; Hoover, Robert N.; Hunter, David J.; Kramer, Barnet S.; Lauer, Michael S.; Meyerhardt, Jeffrey A.; Olopade, Olufunmilayo I.; Palmer, Julie R.; Sellers, Thomas A.; Seminara, Daniela; Ransohoff, David F.; Rebbeck, Timothy R.; Tourassi, Georgia; Winn, Deborah M.; Zauber, Ann; Schully, Sheri D.
2013-01-01
In 2012, the National Cancer Institute (NCI) engaged the scientific community to provide a vision for cancer epidemiology in the 21st century. Eight overarching thematic recommendations, with proposed corresponding actions for consideration by funding agencies, professional societies, and the research community emerged from the collective intellectual discourse. The themes are (i) extending the reach of epidemiology beyond discovery and etiologic research to include multilevel analysis, intervention evaluation, implementation, and outcomes research; (ii) transforming the practice of epidemiology by moving towards more access and sharing of protocols, data, metadata, and specimens to foster collaboration, to ensure reproducibility and replication, and accelerate translation; (iii) expanding cohort studies to collect exposure, clinical and other information across the life course and examining multiple health-related endpoints; (iv) developing and validating reliable methods and technologies to quantify exposures and outcomes on a massive scale, and to assess concomitantly the role of multiple factors in complex diseases; (v) integrating “big data” science into the practice of epidemiology; (vi) expanding knowledge integration to drive research, policy and practice; (vii) transforming training of 21st century epidemiologists to address interdisciplinary and translational research; and (viii) optimizing the use of resources and infrastructure for epidemiologic studies. These recommendations can transform cancer epidemiology and the field of epidemiology in general, by enhancing transparency, interdisciplinary collaboration, and strategic applications of new technologies. They should lay a strong scientific foundation for accelerated translation of scientific discoveries into individual and population health benefits. PMID:23462917
Carasso, Alfred S; Vladár, András E
2012-01-01
Helium ion microscopes (HIM) are capable of acquiring images with better than 1 nm resolution, and HIM images are particularly rich in morphological surface details. However, such images are generally quite noisy. A major challenge is to denoise these images while preserving delicate surface information. This paper presents a powerful slow motion denoising technique, based on solving linear fractional diffusion equations forward in time. The method is easily implemented computationally, using fast Fourier transform (FFT) algorithms. When applied to actual HIM images, the method is found to reproduce the essential surface morphology of the sample with high fidelity. In contrast, such highly sophisticated methodologies as Curvelet Transform denoising, and Total Variation denoising using split Bregman iterations, are found to eliminate vital fine scale information, along with the noise. Image Lipschitz exponents are a useful image metrology tool for quantifying the fine structure content in an image. In this paper, this tool is applied to rank order the above three distinct denoising approaches, in terms of their texture preserving properties. In several denoising experiments on actual HIM images, it was found that fractional diffusion smoothing performed noticeably better than split Bregman TV, which in turn, performed slightly better than Curvelet denoising.
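The fractional-diffusion smoothing described above has a compact spectral form: evolving u_t = -(-Δ)^α u forward in time damps each Fourier mode by exp(-t·|k|^(2α)). The sketch below is a minimal FFT implementation of that idea; the parameter values are illustrative, not those used in the paper.

```python
import numpy as np

def fractional_diffusion_denoise(image, t=1.0, alpha=0.5):
    """Smooth `image` by spectrally solving the linear fractional
    diffusion equation forward in time: each Fourier mode is damped
    by exp(-t * |k|^(2*alpha)). alpha=1 recovers ordinary heat-equation
    smoothing; alpha<1 damps high frequencies more gently."""
    ny, nx = image.shape
    ky = np.fft.fftfreq(ny)[:, None]
    kx = np.fft.fftfreq(nx)[None, :]
    k2 = (2 * np.pi) ** 2 * (kx ** 2 + ky ** 2)   # |k|^2 on the grid
    damp = np.exp(-t * k2 ** alpha)               # fractional damping factor
    return np.real(np.fft.ifft2(np.fft.fft2(image) * damp))
```

Because the zero-frequency mode is untouched, the mean intensity is preserved while noise variance is reduced, which is the "slow motion" smoothing behavior the paper exploits.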
Geometry and dynamics in the fractional discrete Fourier transform.
Wolf, Kurt Bernardo; Krötzsch, Guillermo
2007-03-01
The N × N Fourier matrix is one distinguished element within the group U(N) of all N × N unitary matrices. It has the geometric property of being a fourth root of unity and is close to the dynamics of harmonic oscillators. The dynamical correspondence is exact only in the N → ∞ contraction limit for the integral Fourier transform and its fractional powers. In the finite-N case, several options have been considered in the literature. We compare their fidelity in reproducing the classical harmonic motion of discrete coherent states.
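The "fourth root of unity" property is easy to verify numerically: the unitary DFT matrix squares to the parity (index-reversal) matrix, so its fourth power is the identity. A minimal check:

```python
import numpy as np

def dft_matrix(n):
    """Unitary n x n Fourier matrix F[j, k] = exp(-2*pi*i*j*k/n) / sqrt(n)."""
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)

F = dft_matrix(8)
F4 = np.linalg.matrix_power(F, 4)   # should equal the identity matrix
```

This is what licenses fractional powers F^a of the matrix: its eigenvalues all lie in {1, -i, -1, i}, fourth roots of unity, mirroring the quarter-period rotations of the harmonic oscillator.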
Open sd-shell nuclei from first principles
Jansen, Gustav R.; Signoracci, Angelo J.; Hagen, Gaute; ...
2016-07-05
We extend the ab initio coupled-cluster effective interaction (CCEI) method to open-shell nuclei with protons and neutrons in the valence space, and compute binding energies and excited states of isotopes of neon and magnesium. We employ a nucleon-nucleon and three-nucleon interaction from chiral effective field theory evolved to a lower cutoff via a similarity renormalization group transformation. We find good agreement with experiment for binding energies and spectra, while charge radii of neon isotopes are underestimated. For the deformed nuclei 20Ne and 24Mg we reproduce rotational bands and electric quadrupole transitions within uncertainties estimated from an effective field theory for deformed nuclei, thereby demonstrating that collective phenomena in sd-shell nuclei emerge from complex ab initio calculations.
Repeatability and reproducibility of ribotyping and its computer interpretation.
Lefresne, Gwénola; Latrille, Eric; Irlinger, Françoise; Grimont, Patrick A D
2004-04-01
Many molecular typing methods are difficult to interpret because their repeatability (within-laboratory variance) and reproducibility (between-laboratory variance) have not been thoroughly studied. In the present work, ribotyping of coryneform bacteria was the basis of a study involving within-gel and between-gel repeatability and between-laboratory reproducibility (two laboratories involved). The effect of different technical protocols, different algorithms, and different software for fragment size determination was studied. Analysis of variance (ANOVA) showed, within a laboratory, that there was no significant added variance between gels. However, between-laboratory variance was significantly higher than within-laboratory variance. This may be due to the use of different protocols. An experimental function was calculated to transform the data and make them compatible (i.e., erase the between-laboratory variance). The use of different interpolation algorithms (spline, Schaffer and Sederoff) was a significant source of variation in one laboratory only. The use of either Taxotron (Institut Pasteur) or GelCompar (Applied Maths) was not a significant source of added variation when the same algorithm (spline) was used. However, the use of Bio-Gene (Vilber Lourmat) dramatically increased the error (within laboratory, within gel) in one laboratory, while decreasing the error in the other laboratory; this might be due to automatic normalization attempts. These results were taken into account for building a database and performing automatic pattern identification using Taxotron. Conversion of the data considerably improved the identification of patterns irrespective of the laboratory in which the data were obtained.
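The fragment-size determination step at the heart of the comparison above amounts to interpolating a calibration curve of migration distance versus fragment size. The sketch below uses linear interpolation in log-size as a simple stand-in for the spline and Schaffer-Sederoff algorithms the study compares; the ladder values are hypothetical.

```python
import numpy as np

# Hypothetical size ladder: migration distance (mm) vs fragment size (bp).
distance = np.array([10.0, 18.0, 25.0, 33.0, 40.0, 48.0])
size_bp = np.array([10000.0, 6000.0, 4000.0, 2500.0, 1500.0, 1000.0])

def estimate_size(d):
    """Fragment size (bp) at migration distance d (mm), interpolated
    linearly in log10(size) between ladder points. A simple stand-in
    for the spline and Schaffer-Sederoff interpolation algorithms."""
    return 10 ** np.interp(d, distance, np.log10(size_bp))
```

The study's finding that the choice of interpolation algorithm can itself be a significant source of variation is exactly a statement about how such curves differ between the ladder points.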
Guo, Qi; Shen, Shu-Ting
2016-04-29
There are two major classes of cardiac tissue models: the ionic model and the FitzHugh-Nagumo model. During computer simulation, each model entails solving a system of complex ordinary differential equations and a partial differential equation with no-flux boundary conditions. The reproducing kernel method has significant applications in solving partial differential equations. The derivative of the reproducing kernel function is a wavelet function, which has local properties and sensitivity to singularities; therefore, studying the application of reproducing kernels is advantageous. The aim is to apply new mathematical theory to the numerical solution of the ventricular muscle model so as to improve its precision in comparison with other current methods. A two-dimensional reproducing kernel function in space is constructed and applied to computing the solution of the two-dimensional cardiac tissue model, using the difference method in time and the reproducing kernel method in space. Compared with other methods, this method holds several advantages, such as high accuracy in computing solutions, insensitivity to different time steps and a slow propagation speed of error. It is suitable for disorderly scattered node systems without meshing, and can arbitrarily change the location and density of the solution on different time layers. The reproducing kernel method has higher solution accuracy and stability in the solutions of the two-dimensional cardiac tissue model.
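The FitzHugh-Nagumo model mentioned above reduces, at a single point (no spatial coupling), to two coupled ODEs that a simple explicit time-stepping scheme can integrate. The sketch below uses textbook parameter values, not the study's, and plain Euler stepping rather than the reproducing kernel method.

```python
import numpy as np

def fitzhugh_nagumo(T=500.0, dt=0.05, I_ext=0.5, a=0.7, b=0.8, eps=0.08):
    """Explicit-Euler integration of the single-cell FitzHugh-Nagumo model:
        dv/dt = v - v^3/3 - w + I_ext   (fast excitation variable)
        dw/dt = eps * (v + a - b*w)     (slow recovery variable)
    Parameter values are standard textbook choices (illustrative only)."""
    n = int(T / dt)
    v = np.empty(n); w = np.empty(n)
    v[0], w[0] = -1.0, 1.0
    for i in range(n - 1):
        v[i + 1] = v[i] + dt * (v[i] - v[i] ** 3 / 3 - w[i] + I_ext)
        w[i + 1] = w[i] + dt * eps * (v[i] + a - b * w[i])
    return v, w
```

The full tissue model adds a diffusion term coupling neighboring points, which is where the spatial discretization choice (finite differences versus the reproducing kernel method) matters.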
Lamination effects on a 3D model of the magnetic core of power transformers
NASA Astrophysics Data System (ADS)
Poveda-Lerma, Antonio; Serrano-Callergues, Guillermo; Riera-Guasp, Martin; Pineda-Sanchez, Manuel; Puche-Panadero, Ruben; Perez-Cruz, Juan
2017-12-01
In this paper, the lamination effect on the model of a power transformer core with a stacked E-I structure is analyzed. The distribution of the magnetic flux in the laminations depends on the stacking method. Using a 3D FEM model and an experimental prototype, it is shown that the non-uniform distribution of the flux in a laminated E-I core with an alternate-lap joint stack substantially increases the average magnetic flux density in the core compared with a butt joint stack. Both the simulated model and the experimental tests show that the presence of constructive air gaps in the E-I junctions gives rise to a zig-zag flux in the depth direction. This inter-lamination flux reduces the magnetic flux density in the I-pieces and substantially increases it in the E-pieces, with highly saturated points that traditional 2D analysis cannot reproduce. The relation between the number of laminations included in the model and the computational resources needed to build it is also evaluated in this work.
Wójcicki, Tomasz; Nowicki, Michał
2016-01-01
The article presents a selected area of research and development concerning methods of material analysis based on automatic image recognition of the investigated metallographic sections. The objectives of the analyses of materials for gas nitriding technology are described. The methods of preparing nitrided layers, the steps of the process, and the construction and operation of devices for gas nitriding are given. We discuss the possibility of using digital image processing methods in the analysis of the materials, as well as their essential task groups: improving image quality, segmentation, morphological transformations and image recognition. The developed analysis model of nitrided layer formation, covering image processing and analysis techniques as well as selected methods of artificial intelligence, is presented. The model is divided into stages, which are formalized in order to better reproduce their actions. Validation of the presented method is performed. The advantages and limitations of the developed solution, as well as the possibilities of its practical use, are listed. PMID:28773389
Mission Possible: Transforming Women and Building Communities
ERIC Educational Resources Information Center
Flemming, Monica; Nelson, Barbara Mullins
2007-01-01
Although women make up over half of the U.S. workforce, they continue to encounter difficulties, including male domination as well as inequities in pay, benefits, and other rewards (Bierema, 2001). Billett (2002) suggests opportunities to advance are "subject to workplace practices that reproduce inequities through contested workplace relations."…
Rethinking Educational Purpose: The Socialist Challenge
ERIC Educational Resources Information Center
Malott, Curry
2012-01-01
In this essay Malott makes a case for a Marxist reading of education's role in expanding and reproducing capitalist societies. In the process he challenges the proposition that cognitive capitalism has fundamentally transformed the way in which capitalism operates. That is, rather than being guided by an internal capitalist logic, proponents of…
Zhou, Bin; Zhang, Zhendong; Wang, Ji; Yu, Y Eric; Liu, Xiaowei Sherry; Nishiyama, Kyle K; Rubin, Mishaela R; Shane, Elizabeth; Bilezikian, John P; Guo, X Edward
2016-06-01
Trabecular plate and rod microstructure plays a dominant role in the apparent mechanical properties of trabecular bone. With high-resolution computed tomography (CT) images, digital topological analysis (DTA) including skeletonization and topological classification was applied to transform the trabecular three-dimensional (3D) network into surface and curve skeletons. Using the DTA-based topological analysis and a new reconstruction/recovery scheme, individual trabecula segmentation (ITS) was developed to segment individual trabecular plates and rods and quantify the trabecular plate- and rod-related morphological parameters. High-resolution peripheral quantitative computed tomography (HR-pQCT) is an emerging in vivo imaging technique to visualize 3D bone microstructure. Based on HR-pQCT images, ITS was applied to various HR-pQCT datasets to examine trabecular plate- and rod-related microstructure and has demonstrated great potential in cross-sectional and longitudinal clinical applications. However, the reproducibility of ITS has not been fully determined. The aim of the current study is to quantify the precision errors of ITS plate-rod microstructural parameters. In addition, we utilized three different frequently used contour techniques to separate trabecular and cortical bone and to evaluate their effect on ITS measurements. Overall, good reproducibility was found for the standard HR-pQCT parameters, with precision errors between 0.2% and 2.0% for volumetric BMD and bone size and between 4.9% and 6.7% for trabecular bone microstructure at the radius and tibia. High reproducibility was also achieved for ITS measurements using all three different contour techniques.
For example, using automatic contour technology, low precision errors were found for plate and rod trabecular number (pTb.N, rTb.N, 0.9% and 3.6%), plate and rod trabecular thickness (pTb.Th, rTb.Th, 0.6% and 1.7%), plate trabecular surface (pTb.S, 3.4%), rod trabecular length (rTb.ℓ, 0.8%), and plate-plate junction density (P-P Junc.D, 2.3%) at the tibia. The precision errors at the radius were similar to those at the tibia. In addition, precision errors were affected by the contour technique. At the tibia, precision error by the manual contour method was significantly different from automatic and standard contour methods for pTb.N, rTb.N and rTb.Th. Precision error using the manual contour method was also significantly different from the standard contour method for rod trabecular number (rTb.N), rod trabecular thickness (rTb.Th), rod-rod and plate-rod junction densities (R-R Junc.D and P-R Junc.D) at the tibia. At the radius, the precision error was similar between the three different contour methods. Image quality was also found to significantly affect the ITS reproducibility. We concluded that ITS parameters are highly reproducible, giving assurance that future cross-sectional and longitudinal clinical HR-pQCT studies are feasible in the context of limited sample sizes.
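Precision errors of the kind quoted above are conventionally reported as root-mean-square coefficients of variation (RMS-CV%) over repeat scans. A minimal sketch of that convention, using synthetic numbers rather than the study's data:

```python
# Sketch of RMS-CV% precision error: per-subject CV of repeat
# measurements, pooled as a root mean square across subjects.
import math

def rms_cv_percent(repeat_scans):
    """repeat_scans: list of per-subject lists of repeated measurements.
    Returns the root-mean-square coefficient of variation, in percent."""
    cvs = []
    for scans in repeat_scans:
        n = len(scans)
        mean = sum(scans) / n
        sd = math.sqrt(sum((x - mean) ** 2 for x in scans) / (n - 1))
        cvs.append(100.0 * sd / mean)
    return math.sqrt(sum(c ** 2 for c in cvs) / len(cvs))
```

With this definition, a reported precision error of, say, 0.9% means the pooled scan-rescan variation is under one percent of the measured value.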
Automated multi-day tracking of marked mice for the analysis of social behaviour.
Ohayon, Shay; Avni, Ofer; Taylor, Adam L; Perona, Pietro; Roian Egnor, S E
2013-09-30
A quantitative description of animal social behaviour is informative for behavioural biologists and clinicians developing drugs to treat social disorders. Social interaction in a group of animals has been difficult to measure because behaviour develops over long periods of time and requires tedious manual scoring, which is subjective and often non-reproducible. Computer-vision systems with the ability to measure complex social behaviour automatically would have a transformative impact on biology. Here, we present a method for tracking group-housed mice individually as they freely interact over multiple days. Each mouse is bleach-marked with a unique fur pattern. The patterns are automatically learned by the tracking software and used to infer identities. Trajectories are analysed to measure behaviour as it develops over days, beyond the range of acute experiments. We demonstrate how our system may be used to study the development of place preferences, associations and social relationships by tracking four mice continuously for five days. Our system enables accurate and reproducible characterisation of wild-type mouse social behaviour and paves the way for high-throughput long-term observation of the effects of genetic, pharmacological and environmental manipulations. Published by Elsevier B.V.
Spatial acoustic signal processing for immersive communication
NASA Astrophysics Data System (ADS)
Atkins, Joshua
Computing is rapidly becoming ubiquitous as users expect devices that can augment and interact naturally with the world around them. In these systems it is necessary to have an acoustic front-end that is able to capture and reproduce natural human communication. Whether the end point is a speech recognizer or another human listener, reducing noise, reverberation, and acoustic echoes presents necessary and complex challenges. The focus of this dissertation is to provide a general method for approaching these problems using spherical microphone and loudspeaker arrays. In this work, a theory of capturing and reproducing three-dimensional acoustic fields is introduced from a signal processing perspective. In particular, the decomposition of the spatial part of the acoustic field into an orthogonal basis of spherical harmonics provides not only a general framework for analysis, but also many processing advantages. The spatial sampling error limits the upper frequency range with which a sound field can be accurately captured or reproduced. In broadband arrays, the cost and complexity of using multiple transducers is an issue. This work provides a flexible optimization method for determining the location of array elements to minimize the spatial aliasing error. The low frequency array processing ability is also limited by the SNR, mismatch, and placement error of transducers. To address this, a robust processing method is introduced and used to design a reproduction system for rendering over arbitrary loudspeaker arrays or binaurally over headphones. In addition to the beamforming problem, the multichannel acoustic echo cancellation (MCAEC) issue is also addressed. A MCAEC must adaptively estimate and track the constantly changing loudspeaker-room-microphone response to remove the sound field presented over the loudspeakers from that captured by the microphones.
In the multichannel case, the system is overdetermined and many adaptive schemes fail to converge to the true impulse response. This forces the need to track both the near and far end room responses. A transform domain method that mitigates this problem is derived and implemented. Results with a real system using a 16-channel loudspeaker array and 32-channel microphone array are presented.
Ultralow-fatigue shape memory alloy films
NASA Astrophysics Data System (ADS)
Chluba, Christoph; Ge, Wenwei; Lima de Miranda, Rodrigo; Strobel, Julian; Kienle, Lorenz; Quandt, Eckhard; Wuttig, Manfred
2015-05-01
Functional shape memory alloys need to operate reversibly and repeatedly. Quantitative measures of reversibility include the relative volume change of the participating phases and compatibility matrices for twinning. But no similar argument is known for repeatability. This is especially crucial for many future applications, such as artificial heart valves or elastocaloric cooling, in which more than 10 million transformation cycles will be required. We report on the discovery of an ultralow-fatigue shape memory alloy film system based on TiNiCu that allows at least 10 million transformation cycles. We found that these films contain Ti2Cu precipitates embedded in the base alloy that serve as sentinels to ensure complete and reproducible transformation in the course of each memory cycle.
Data to knowledge: how to get meaning from your result
Berman, Helen M.; Gabanyi, Margaret J.; Groom, Colin R.; Johnson, John E.; Murshudov, Garib N.; Nicholls, Robert A.; Reddy, Vijay; Schwede, Torsten; Zimmerman, Matthew D.; Westbrook, John; Minor, Wladek
2015-01-01
Structural and functional studies require the development of sophisticated ‘Big Data’ technologies and software to increase the knowledge derived and ensure reproducibility of the data. This paper presents summaries of the Structural Biology Knowledge Base, the VIPERdb Virus Structure Database, evaluation of homology modeling by the Protein Model Portal, the ProSMART tool for conformation-independent structure comparison, the LabDB ‘super’ laboratory information management system and the Cambridge Structural Database. These techniques and technologies represent important tools for the transformation of crystallographic data into knowledge and information, in an effort to address the problem of non-reproducibility of experimental results. PMID:25610627
Droplet microfluidics for synthetic biology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gach, Philip Charles; Iwai, Kosuke; Kim, Peter Wonhee
Here, synthetic biology is an interdisciplinary field that aims to engineer biological systems for useful purposes. Organism engineering often requires the optimization of individual genes and/or entire biological pathways (consisting of multiple genes). Advances in DNA sequencing and synthesis have recently begun to enable the possibility of evaluating thousands of gene variants and hundreds of thousands of gene combinations. However, such large-scale optimization experiments remain cost-prohibitive to researchers following traditional molecular biology practices, which are frequently labor-intensive and suffer from poor reproducibility. Liquid handling robotics may reduce labor and improve reproducibility, but are themselves expensive and thus inaccessible to most researchers. Microfluidic platforms offer a lower entry price point alternative to robotics, and maintain high throughput and reproducibility while further reducing operating costs through diminished reagent volume requirements. Droplet microfluidics have shown exceptional promise for synthetic biology experiments, including DNA assembly, transformation/transfection, culturing, cell sorting, phenotypic assays, artificial cells and genetic circuits.
Distinguishing Provenance Equivalence of Earth Science Data
NASA Technical Reports Server (NTRS)
Tilmes, Curt; Yesha, Ye; Halem, M.
2010-01-01
Reproducibility of scientific research relies on accurate and precise citation of data and the provenance of that data. Earth science data are often the result of applying complex data transformation and analysis workflows to vast quantities of data. Provenance information of data processing is used for a variety of purposes, including understanding the process and auditing as well as reproducibility. Certain provenance information is essential for producing scientifically equivalent data. Capturing and representing that provenance information, and assigning identifiers suitable for precisely distinguishing data granules and datasets, is needed for accurate comparisons. This paper discusses scientific equivalence and essential provenance for scientific reproducibility. We use the example of an operational earth science data processing system to illustrate the application of the technique of cascading digital signatures, or hash chains, to precisely identify sets of granules and to serve as provenance equivalence identifiers distinguishing data made in an equivalent manner.
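The cascading-hash idea can be sketched minimally as follows. This is an illustrative construction, not the paper's implementation: SHA-256 and the empty-chain seed are arbitrary choices here. Each step folds the previous chain value together with the next granule's content hash, so two datasets receive the same identifier only if they contain equivalent granules in the same order.

```python
# Sketch of a hash chain over granule contents as a provenance
# equivalence identifier.
import hashlib

def granule_hash(granule_bytes):
    """Content hash of a single data granule."""
    return hashlib.sha256(granule_bytes).hexdigest()

def chain_identifier(granules):
    """Fold granule hashes into one cascading identifier."""
    chain = hashlib.sha256(b"").hexdigest()   # seed for an empty chain
    for g in granules:
        combined = (chain + granule_hash(g)).encode()
        chain = hashlib.sha256(combined).hexdigest()
    return chain
```

Two processing runs that produce byte-equivalent granules yield identical identifiers, while any change in content or ordering changes the identifier, which is what makes the scheme usable for distinguishing provenance equivalence.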
NASA Astrophysics Data System (ADS)
Xue, Yuejun; Ge, Tiantian; Wang, Xuchen
2015-12-01
Radiocarbon (14C) measurement of dissolved organic carbon (DOC) is a very powerful tool to study the sources, transformation and cycling of carbon in the ocean. The technique, however, still faces great challenges in the complete and successful oxidation of sufficient DOC with low blanks for high-precision carbon isotopic analysis, largely due to the overwhelming proportion of salts and low DOC concentrations in the ocean. In this paper, we report an effective UV-oxidation method for oxidizing DOC in natural waters for radiocarbon analysis by accelerator mass spectrometry (AMS). The UV-oxidation system and method show 95%±4% oxidation efficiency and high reproducibility for DOC in both river and seawater samples. The blank associated with the method was also low (about 3 µg C), which is critical for 14C analysis. As a great advantage of the method, multiple water samples can be oxidized at the same time, reducing the sample processing time substantially compared with other UV-oxidation methods currently being used in other laboratories. We have used the system and method for 14C studies of DOC in rivers, estuaries, and oceanic environments and have obtained promising results.
Boamah, Sheila A; Tremblay, Paul
2018-05-01
The Multifactor Leadership Questionnaire (MLQ) is the most widely used instrument for assessing dimensions of leadership style; yet, most studies have failed to reproduce the original MLQ factor structure. The current study evaluates the dimensionality and nomological validity of Bass's transactional and transformational leadership model using the MLQ in a sample of registered nurses working in acute care hospitals in Canada. A combination of exploratory and confirmatory factor analyses was used to evaluate the hypothesized factor structure of the MLQ, consisting of five transformational factors and three transactional factors. Results suggest that the eight-factor solution displayed the best fit indices; however, two transactional factors should be removed due to high interscale correlations and a lack of differential relationships with the two leadership variables. The findings support a scale refinement and the need for new theory concerning the five transformational leadership and contingent reward dimensions of the MLQ.
Wang, Xuemei; Wang, Huan; Huang, Pengfei; Ma, Xiaomin; Lu, Xiaoquan; Du, Xinzhen
2017-01-06
A superior solid-phase microextraction (SPME) fiber-coating material, three-dimensional ordered mesoporous polymers with Ia-3d bicontinuous cubic structure (3D-OMPs), was coated in situ on a stainless steel wire by solvent evaporation induced self-assembly (EISA) and thermo-polymerization. Fourier-transform infrared spectrometry (FTIR), transmission electron microscopy (TEM), scanning electron microscopy (SEM), small-angle X-ray diffraction (SAXRD), N2 adsorption-desorption measurements, and thermogravimetric analysis (TGA) were applied to characterize the synthesized 3D-OMPs coating. The performance and feasibility of the homemade fiber were evaluated through direct-immersion (DI) SPME followed by high-performance liquid chromatography with UV detection (HPLC-UV) for the simultaneous extraction of seven chlorophenols (CPs) in water samples. Under the optimum conditions, the prepared fiber exhibited excellent extraction properties compared to three commercial fibers; the DI-SPME-HPLC-UV method showed low limits of detection (0.32-1.85 μg L-1), wide linear ranges (5.0-1000 μg L-1), and acceptable reproducibility (relative standard deviation, RSD<7.6% for one fiber, RSD<8.9% fiber to fiber). Moreover, the method was successfully applied to the analysis of the seven CPs in real samples with good recoveries (80.5-99.5%) and satisfactory precision (RSD<9.2%). It was confirmed that the proposed method has high sensitivity, outstanding selectivity and good reproducibility for the determination of trace CPs in environmental water. Copyright © 2016 Elsevier B.V. All rights reserved.
A Tool for Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John
2005-01-01
Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort either to manual application of formal methods or to system testing (manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., the cost of gearing up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and therefore is not the answer. Ensuring that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understand the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, along with a prototype tool to support it. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.
Eike, David M; Maginn, Edward J
2006-04-28
A method recently developed to rigorously determine solid-liquid equilibrium using a free-energy-based analysis has been extended to analyze multiatom molecular systems. This method is based on using a pseudosupercritical transformation path to reversibly transform between solid and liquid phases. Integration along this path yields the free energy difference at a single state point, which can then be used to determine the free energy difference as a function of temperature and therefore locate the coexistence temperature at a fixed pressure. The primary extension reported here is the introduction of an external potential field capable of inducing center of mass order along with secondary orientational order for molecules. The method is used to calculate the melting point of 1-H-1,2,4-triazole and benzene. Despite the fact that the triazole model gives accurate bulk densities for the liquid and crystal phases, it is found to do a poor job of reproducing the experimental crystal structure and heat of fusion. Consequently, it yields a melting point that is 100 K lower than the experimental value. On the other hand, the benzene model has been parametrized extensively to match a wide range of properties and yields a melting point that is only 20 K lower than the experimental value. Previous work in which a simple "direct heating" method was used actually found that the melting point of the benzene model was 50 K higher than the experimental value. This demonstrates the importance of using proper free energy methods to compute phase behavior. It also shows that the melting point is a very sensitive measure of force field quality that should be considered in parametrization efforts. The method described here provides a relatively simple approach for computing melting points of molecular systems.
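The last step of the workflow the abstract describes, going from a free energy difference at one state point to a coexistence temperature, can be sketched under the common simplifying assumption of a temperature-independent enthalpy of fusion. The numbers below are illustrative, not the paper's triazole or benzene data:

```python
# Sketch: Gibbs-Helmholtz extrapolation of dG(T)/T from a single
# reference point, then bisection for the temperature where the
# solid-liquid free energy difference crosses zero.

def delta_g_over_t(t, t_ref, dg_ref, dh_fus):
    """dG(T)/T with dG = G_liquid - G_solid, assuming constant dH_fus:
    d(dG/T)/d(1/T) = dH, so dG/T is linear in 1/T."""
    return dg_ref / t_ref + dh_fus * (1.0 / t - 1.0 / t_ref)

def melting_point(t_ref, dg_ref, dh_fus, lo=100.0, hi=1000.0, tol=1e-8):
    """Bisection for the coexistence temperature where dG = 0."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if delta_g_over_t(lo, t_ref, dg_ref, dh_fus) * \
           delta_g_over_t(mid, t_ref, dg_ref, dh_fus) <= 0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

In the paper's method the reference-point free energy difference itself comes from integration along the pseudosupercritical transformation path; this sketch only covers locating the zero crossing.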
The Insight ToolKit image registration framework
Avants, Brian B.; Tustison, Nicholas J.; Stauffer, Michael; Song, Gang; Wu, Baohua; Gee, James C.
2014-01-01
Publicly available scientific resources help establish evaluation standards, provide a platform for teaching and improve reproducibility. Version 4 of the Insight ToolKit (ITK4) seeks to establish new standards in publicly available image registration methodology. ITK4 makes several advances in comparison to previous versions of ITK. ITK4 supports both multivariate images and objective functions; it also unifies high-dimensional (deformation field) and low-dimensional (affine) transformations with metrics that are reusable across transform types and with composite transforms that allow arbitrary series of geometric mappings to be chained together seamlessly. Metrics and optimizers take advantage of multi-core resources, when available. Furthermore, ITK4 reduces the parameter optimization burden via principled heuristics that automatically set scaling across disparate parameter types (rotations vs. translations). A related approach also constrains step sizes for gradient-based optimizers. The result is that tuning for different metrics and/or image pairs is rarely necessary, allowing the researcher to focus more easily on the design and comparison of registration strategies. In total, the ITK4 contribution is intended as a structure to support reproducible research practices, will provide a more extensive foundation against which to evaluate new work in image registration, and will also give application-level programmers a broad suite of tools on which to build. Finally, we contextualize this work with a reference registration evaluation study with application to pediatric brain labeling. PMID:24817849
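The composite-transform idea, chaining an arbitrary series of geometric mappings so they act as one transform, can be illustrated outside ITK with plain 2D homogeneous matrices. This is a conceptual sketch, not ITK4 API code:

```python
# Sketch of transform composition: a rotation followed by a translation
# collapses into a single 3x3 homogeneous matrix.

def mat_mul(a, b):
    """3x3 matrix product a @ b."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(m, p):
    """Apply a homogeneous 2D transform m to point p = (x, y)."""
    x, y = p
    v = [x, y, 1.0]
    out = [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]
    return (out[0], out[1])

rotate90 = [[0.0, -1.0, 0.0],     # 90-degree rotation about the origin
            [1.0,  0.0, 0.0],
            [0.0,  0.0, 1.0]]
translate = [[1.0, 0.0, 2.0],     # shift by +2 along x
             [0.0, 1.0, 0.0],
             [0.0, 0.0, 1.0]]

# Composite transform: rotate first, then translate (left-multiply).
composite = mat_mul(translate, rotate90)
```

ITK4's composite transforms generalize the same composition principle to mixed chains of affine and deformation-field mappings.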
How to Transform Teaching with Tablets
ERIC Educational Resources Information Center
Daccord, Tom; Reich, Justin
2015-01-01
Without a change in our technology integration strategies, there's no reason to expect that a new device will magically create new teaching practices. In some iPad classrooms, students are engaged in truly innovative work. On the whole, however, tablets are most often used to reproduce existing practices. To make the most of their investment in…
ERIC Educational Resources Information Center
Cabraal, Liyana M. C.
The behaviorist world view, influential in many social-science disciplines, is challenged by theories of action. With steady developments in nonbehaviorist thinking and related social-action conceptions, the study of school organizational structure can be transformed into a field centered about the dynamics of individuals' practical actions. This…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baba, Seiki; Hoshino, Takeshi; Ito, Len
A new crystal-mounting method has been developed that involves a combination of controlled humid air and polymer glue for crystal coating. This method is particularly useful when applied to fragile protein crystals that are known to be sensitive to subtle changes in their physicochemical environment. Protein crystals are fragile, and it is sometimes difficult to find conditions suitable for handling and cryocooling the crystals before conducting X-ray diffraction experiments. To overcome this issue, a protein crystal-mounting method has been developed that involves a water-soluble polymer and controlled humid air that can adjust the moisture content of a mounted crystal. By coating crystals with polymer glue and exposing them to controlled humid air, the crystals were stable at room temperature and were cryocooled under optimized humidity. Moreover, the glue-coated crystals reproducibly showed gradual transformations of their lattice constants in response to a change in humidity; thus, using this method, a series of isomorphous crystals can be prepared. This technique is valuable when working on fragile protein crystals, including membrane proteins, and will also be useful for multi-crystal data collection.
Hydrogen concentration analysis in clinopyroxene using proton-proton scattering analysis
NASA Astrophysics Data System (ADS)
Weis, Franz A.; Ros, Linus; Reichart, Patrick; Skogby, Henrik; Kristiansson, Per; Dollinger, Günther
2018-02-01
Traditional methods to measure water in nominally anhydrous minerals (NAMs) are, for example, Fourier transform infrared (FTIR) spectroscopy and secondary ion mass spectrometry (SIMS). Both well-established methods provide a low detection limit as well as high spatial resolution, yet may require elaborate sample orientation or destructive sample preparation. Here we analyze the water content in erupted volcanic clinopyroxene phenocrysts by proton-proton scattering and reproduce water contents measured by FTIR spectroscopy. We show that this technique provides significant advantages over other methods, as it can provide a three-dimensional distribution of hydrogen within a crystal, making possible the identification of potential inclusions as well as the elimination of surface contamination. The sample analysis is also independent of crystal structure and orientation, and independent of matrix effects other than sample density. The results are used to validate the accuracy of wavenumber-dependent vs. mineral-specific molar absorption coefficients in FTIR spectroscopy. In addition, we present a new method for the preparation of very thin crystals suitable for proton-proton scattering analysis using relatively low accelerator potentials.
NASA Astrophysics Data System (ADS)
Papalexiou, Simon Michael
2018-05-01
Hydroclimatic processes come in all "shapes and sizes". They are characterized by different spatiotemporal correlation structures and probability distributions that can be continuous, mixed-type, discrete or even binary. Simulating such processes by reproducing precisely their marginal distribution and linear correlation structure, including features like intermittency, can greatly improve hydrological analysis and design. Traditionally, modelling schemes are case specific and typically attempt to preserve few statistical moments providing inadequate and potentially risky distribution approximations. Here, a single framework is proposed that unifies, extends, and improves a general-purpose modelling strategy, based on the assumption that any process can emerge by transforming a specific "parent" Gaussian process. A novel mathematical representation of this scheme, introducing parametric correlation transformation functions, enables straightforward estimation of the parent-Gaussian process yielding the target process after the marginal back transformation, while it provides a general description that supersedes previous specific parameterizations, offering a simple, fast and efficient simulation procedure for every stationary process at any spatiotemporal scale. This framework, also applicable for cyclostationary and multivariate modelling, is augmented with flexible parametric correlation structures that parsimoniously describe observed correlations. Real-world simulations of various hydroclimatic processes with different correlation structures and marginals, such as precipitation, river discharge, wind speed, humidity, extreme events per year, etc., as well as a multivariate example, highlight the flexibility, advantages, and complete generality of the method.
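The parent-Gaussian strategy the abstract generalizes can be sketched minimally as follows. The AR(1) correlation structure and exponential marginal below are illustrative stand-ins, not the paper's parametric correlation transformation functions: a correlated standard-Gaussian process is simulated, and each value is back-transformed through the Gaussian CDF and the target marginal's quantile function, so the output keeps the target distribution while inheriting temporal correlation from the Gaussian parent.

```python
# Sketch: simulate an exponential-marginal process by transforming a
# "parent" Gaussian AR(1) process through CDF / inverse-CDF mapping.
import math, random

def simulate(n, rho, lam, seed=0):
    """n values with exponential(rate=lam) marginal and AR(1)-type
    correlation inherited from a parent Gaussian with lag-1 rho."""
    rng = random.Random(seed)
    z = rng.gauss(0.0, 1.0)                   # parent Gaussian state
    out = []
    for _ in range(n):
        u = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # Gaussian CDF
        u = min(max(u, 1e-12), 1.0 - 1e-12)              # guard the tails
        out.append(-math.log(1.0 - u) / lam)  # exponential quantile function
        z = rho * z + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
    return out
```

Note that the lag-1 correlation of the transformed series is not exactly `rho`; accounting for that distortion is precisely what the paper's correlation transformation functions handle.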
NASA Astrophysics Data System (ADS)
Scott, Jill R.; Tremblay, Paul L.
2002-03-01
Traditionally, mass spectrometry has relied on manipulating the sample target to provide scanning capabilities for laser desorption microprobes. This has been problematic for an internal source laser desorption Fourier transform mass spectrometer (LD-FTMS) because of the high magnetic field (7 Tesla) and geometric constraints of the superconducting magnet bore. To overcome these limitations, we have implemented a unique external laser scanning mechanism for an internal source LD-FTMS. This mechanism provides adjustable resolution enhancement so that the spatial resolution at the target is not limited to that of the stepper motors at the light source (~5 μm/step). The spatial resolution is now limited by the practical optical diffraction limit of the final focusing lens. The scanning mechanism employs a virtual source that is wavelength independent up to the final focusing lens, which can be controlled remotely to account for focal length dependence on wavelength. A binary index provides an automatic alignment feature. The virtual source is located ~9 ft from the sample; therefore, it is completely outside of the vacuum system and beyond the 50 G line of the fringing magnetic field. To eliminate reproducibility problems associated with vacuum pump vibrations, we have taken advantage of the magnetic field inherent to the FTMS to utilize Lenz's law for vibrational dampening. The LD-FTMS microprobe has exceptional reproducibility, which enables successive mapping sequences for depth-profiling studies.
Colour calibration of a laboratory computer vision system for quality evaluation of pre-sliced hams.
Valous, Nektarios A; Mendoza, Fernando; Sun, Da-Wen; Allen, Paul
2009-01-01
Due to the high variability and complex colour distribution in meats and meat products, the colour signal calibration of any computer vision system used for colour quality evaluations represents an essential condition for objective and consistent analyses. This paper compares two methods for CIE colour characterization using a computer vision system (CVS) based on digital photography: the polynomial transform procedure and the transform proposed by the sRGB standard. It also presents a procedure for evaluating the colour appearance and the presence of pores and fat-connective tissue on pre-sliced hams made from pork, turkey and chicken. Our results showed high precision in colour matching for device characterization when the polynomial transform was used to match the CIE tristimulus values, in comparison with the sRGB standard approach, as indicated by their ΔE*ab values. The [3×20] polynomial transfer matrix yielded a modelling accuracy averaging below 2.2 ΔE*ab units. Using the sRGB transform, high variability was observed among the computed ΔE*ab values (8.8±4.2). The calibrated laboratory CVS, implemented with a low-cost digital camera, exhibited reproducible colour signals over a wide range of colours, capable of pinpointing regions-of-interest, and allowed the extraction of quantitative information from the overall ham slice surface with high accuracy. The extracted colour and morphological features showed potential for characterizing the appearance of ham slice surfaces. CVS is a tool that can objectively specify the colour and appearance properties of non-uniformly coloured commercial ham slices.
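The polynomial-transform characterization amounts to a least-squares fit of a transfer matrix mapping polynomial terms of device RGB to measured CIE tristimulus values. The sketch below uses a reduced 11-term expansion and synthetic training patches (the study used a [3×20] matrix fitted to real colour-chart measurements), so it only illustrates the fitting mechanic:

```python
import numpy as np

rng = np.random.default_rng(5)

def poly_terms(rgb):
    # Reduced 11-term polynomial expansion of device RGB values.
    r, g, b = rgb.T
    return np.column_stack([
        np.ones_like(r), r, g, b,
        r * g, r * b, g * b,
        r**2, g**2, b**2, r * g * b,
    ])

# Synthetic training data standing in for colour-chart patches: RGB
# values and XYZ "measurements" generated from a known mapping + noise.
rgb = rng.uniform(0.0, 1.0, size=(60, 3))
true_M = rng.uniform(-0.5, 1.0, size=(11, 3))
xyz = poly_terms(rgb) @ true_M + rng.normal(0.0, 1e-3, size=(60, 3))

# Least-squares fit of the polynomial transfer matrix (11 x 3 here).
M, *_ = np.linalg.lstsq(poly_terms(rgb), xyz, rcond=None)
rms_err = np.sqrt(((poly_terms(rgb) @ M - xyz) ** 2).mean())
```

In a real characterization the residual would instead be summarized as ΔE*ab, after converting predicted and measured XYZ to CIELAB.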
Time and space integrating acousto-optic folded spectrum processing for SETI
NASA Technical Reports Server (NTRS)
Wagner, K.; Psaltis, D.
1986-01-01
Time and space integrating folded spectrum techniques utilizing acousto-optic devices (AOD) as 1-D input transducers are investigated for a potential application as wideband, high resolution, large processing gain spectrum analyzers in the search for extra-terrestrial intelligence (SETI) program. The space integrating Fourier transform performed by a lens channels the coarse spectral components diffracted from an AOD onto an array of time integrating narrowband fine resolution spectrum analyzers. The pulsing action of a laser diode samples the interferometrically detected output, aliasing the fine resolution components to baseband, as required for the subsequent charge coupled devices (CCD) processing. The raster scan mechanism incorporated into the readout of the CCD detector array is used to unfold the 2-D transform, reproducing the desired high resolution Fourier transform of the input signal.
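Digitally, the folded spectrum is a factorization of a long DFT into a fine stage, a twiddle-factor multiplication and a coarse stage (Cooley-Tukey); the raster-scan readout then unfolds the 2-D array back into the 1-D high-resolution spectrum. A minimal numerical sketch (the mapping of these stages onto the optical space- and time-integrating arms is only loosely analogous):

```python
import numpy as np

def folded_spectrum(x, n1, n2):
    """Length n1*n2 DFT computed as a 2-D 'folded' transform."""
    a = x.reshape(n2, n1)                  # a[m, p] = x[n1*m + p]
    # Fine stage: n2-point DFT down each column.
    s = np.fft.fft(a, axis=0)
    # Twiddle factors coupling the two stages.
    r = np.arange(n2)[:, None]
    p = np.arange(n1)[None, :]
    s = s * np.exp(-2j * np.pi * r * p / (n1 * n2))
    # Coarse stage: n1-point DFT along each row.
    return np.fft.fft(s, axis=1)           # F[r, q] = X[q*n2 + r]

rng = np.random.default_rng(8)
x = rng.standard_normal(64) + 1j * rng.standard_normal(64)
F = folded_spectrum(x, n1=8, n2=8)
X = F.T.ravel()                            # "raster-scan" unfolding
```

The unfolded result equals the direct 64-point FFT of the input, which is the sense in which the 2-D transform "reproduces the desired high resolution Fourier transform".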
Integrating models that depend on variable data
NASA Astrophysics Data System (ADS)
Banks, A. T.; Hill, M. C.
2016-12-01
Models of human-Earth systems are often developed with the goal of predicting the behavior of one or more dependent variables from multiple independent variables, processes, and parameters. Often dependent variable values range over many orders of magnitude, which complicates evaluation of the fit of the dependent variable values to observations. Many metrics and optimization methods have been proposed to address dependent variable variability, with little consensus being achieved. In this work, we evaluate two such methods: log transformation (based on the dependent variable being log-normally distributed with a constant variance) and error-based weighting (based on a multi-normal distribution with variances that tend to increase as the dependent variable value increases). Error-based weighting has the advantage of encouraging model users to carefully consider data errors, such as measurement and epistemic errors, while log-transformations can be a black box for typical users. Placing the log-transformation into the statistical perspective of error-based weighting has not formerly been considered, to the best of our knowledge. To make the evaluation as clear and reproducible as possible, we use multiple linear regression (MLR). Simulations are conducted with MATLAB. The example represents stream transport of nitrogen with up to eight independent variables. The single dependent variable in our example has values that range over 4 orders of magnitude. Results are applicable to any problem for which individual or multiple data types produce a large range of dependent variable values. For this problem, the log transformation produced good model fit, while some formulations of error-based weighting worked poorly. Results support previous suggestions that error-based weighting derived from a constant coefficient of variation overemphasizes low values and degrades model fit to high values.
Applying larger weights to the high values is inconsistent with the log-transformation. Greater consistency is obtained by imposing smaller (by up to a factor of 1/35) weights on the smaller dependent-variable values. From an error-based perspective, the small weights are consistent with large standard deviations. This work considers the consequences of these two common ways of addressing variable data.
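A toy numerical contrast of the two weighting regimes, with invented data whose errors have a constant coefficient of variation (standard deviation proportional to the value), the case the study flags as problematic; error-based weighting is implemented as weighted least squares with weights 1/σᵢ²:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented data: dependent variable spans ~4 orders of magnitude with a
# constant 10% coefficient of variation on the errors.
n = 500
x = 10 ** rng.uniform(0.0, 4.0, n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([2.0, 0.5])
y = X @ beta_true * (1.0 + 0.1 * rng.standard_normal(n))

# Ordinary least squares: the fit is dominated by the largest y values.
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Error-based weighting: weights 1/sigma_i^2 with sigma_i = 0.1 * y_i
# (constant coefficient of variation), via weighted least squares.
sw = 1.0 / (0.1 * y)                      # square roots of the weights
b_wls, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
```

Because the weights scale as 1/y², the smallest observations carry weights many orders of magnitude larger than the largest, which is the overemphasis of low values referred to above.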
Wang, Guanghui; Wu, Wells W; Zeng, Weihua; Chou, Chung-Lin; Shen, Rong-Fong
2006-05-01
A critical step in protein biomarker discovery is the ability to contrast proteomes, a process referred to generally as quantitative proteomics. While stable-isotope labeling (e.g., ICAT, 18O- or 15N-labeling, or AQUA) remains the core technology used in mass spectrometry-based proteomic quantification, increasing efforts have been directed to the label-free approach that relies on direct comparison of peptide peak areas between LC-MS runs. This latter approach is attractive to investigators for its simplicity as well as cost effectiveness. In the present study, the reproducibility and linearity of using a label-free approach with highly complex proteomes were evaluated. Various amounts of proteins from different proteomes were subjected to repeated LC-MS analyses using an ion trap or Fourier transform mass spectrometer. Highly reproducible data were obtained between replicated runs, as evidenced by nearly ideal Pearson's correlation coefficients (for ion peak areas or retention times) and average peak area ratios. In general, more than 50% and nearly 90% of the peptide ion ratios deviated less than 10% and 20%, respectively, from the average in duplicate runs. In addition, the ratios of the protein amounts used correlated well with the observed average peak-area ratios calculated from detected peptides. Furthermore, the removal of abundant proteins from the samples led to an improvement in reproducibility and linearity. A computer program was written to automate the processing of data sets from experiments with groups of multiple samples for statistical analysis. Algorithms for outlier-resistant mean estimation and for adjusting the statistical significance threshold under multiple testing were incorporated to minimize the rate of false positives. The program was applied to quantify changes in the proteomes of parental and p53-deficient HCT-116 human cells and was found to yield reproducible results.
Overall, this study demonstrates an alternative approach that allows global quantification of differentially expressed proteins in complex proteomes. Its utility for biomarker discovery is likely to grow with future improvements in the detection sensitivity of mass spectrometers.
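The outlier-resistant mean estimation mentioned for the peak-area ratios can be illustrated with a trimmed mean (the abstract does not specify the exact estimator, so this is a stand-in); all numbers are simulated:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Simulated peptide peak areas for two replicate LC-MS runs:
# log-normal areas with ~5% run-to-run variation.
n_peptides = 1000
area1 = rng.lognormal(mean=12.0, sigma=1.5, size=n_peptides)
area2 = area1 * rng.lognormal(0.0, 0.05, n_peptides)
area2[:5] *= 20.0            # a few grossly mismatched peaks (outliers)

ratios = area2 / area1

# A plain mean is dragged upward by the outliers; a 10%-trimmed mean
# (outlier-resistant) stays near the true run-to-run ratio of ~1.
plain_mean = ratios.mean()
robust_mean = stats.trim_mean(ratios, proportiontocut=0.1)
```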
de Groot, Marius; Vernooij, Meike W; Klein, Stefan; Ikram, M Arfan; Vos, Frans M; Smith, Stephen M; Niessen, Wiro J; Andersson, Jesper L R
2013-08-01
Anatomical alignment in neuroimaging studies is of such importance that considerable effort is put into improving the registration used to establish spatial correspondence. Tract-based spatial statistics (TBSS) is a popular method for comparing diffusion characteristics across subjects. TBSS establishes spatial correspondence using a combination of nonlinear registration and a "skeleton projection" that may break topological consistency of the transformed brain images. We therefore investigated the feasibility of replacing the two-stage registration-projection procedure in TBSS with a single, regularized, high-dimensional registration. To optimize registration parameters and to evaluate registration performance in diffusion MRI, we designed an evaluation framework that uses native-space probabilistic tractography for 23 white matter tracts and quantifies tract similarity across subjects in standard space. We optimized parameters for two registration algorithms on two diffusion datasets of different quality. We investigated reproducibility of the evaluation framework, and of the optimized registration algorithms. Next, we compared registration performance of the regularized registration methods and TBSS. Finally, the feasibility and effect of incorporating the improved registration in TBSS were evaluated in an example study. The evaluation framework was highly reproducible for both algorithms (R² = 0.993; 0.931). The optimal registration parameters depended on the quality of the dataset in a graded and predictable manner. At optimal parameters, both algorithms outperformed the registration of TBSS, showing the feasibility of adopting such approaches in TBSS. This was further confirmed in the example experiment. Copyright © 2013 Elsevier Inc. All rights reserved.
Electrochemical Solution Growth of Magnetic Nitrides
DOE Office of Scientific and Technical Information (OSTI.GOV)
Monson, Todd C.; Pearce, Charles
Magnetic nitrides, if manufactured in bulk form, would provide designers of transformers and inductors with a new class of better performing and affordable soft magnetic materials. According to experimental results from thin films and/or theoretical calculations, magnetic nitrides would have magnetic moments well in excess of current state-of-the-art soft magnets. Furthermore, magnetic nitrides would have higher resistivities than current transformer core materials and therefore not require the use of laminates of inactive material to limit eddy current losses. However, almost all of the magnetic nitrides have been elusive except in difficult-to-reproduce thin films or as inclusions in another material. Now, through its ability to reduce atmospheric nitrogen, the electrochemical solution growth (ESG) technique can bring highly sought after (and previously inaccessible) new magnetic nitrides into existence in bulk form. This method utilizes a molten salt as a solvent to solubilize metal cations and nitrogen ions produced electrochemically and form nitrogen compounds. Unlike other growth methods, the scalable ESG process can sustain high growth rates (~mm/hr) even under reasonable operating conditions (atmospheric pressure and 500 °C). Ultimately, this translates into a high-throughput, low-cost manufacturing process. The ESG process has already been used successfully to grow high quality GaN. Below, the experimental results of an exploratory express LDRD project to assess the viability of the ESG technique to grow magnetic nitrides will be presented.
Muthu, S; Prasath, M; Paulraj, E Isac; Balaji, R Arun
2014-01-01
The Fourier Transform infrared and Fourier Transform Raman spectra of 7-chloro-5 (2-chlorophenyl)-3-hydroxy-2,3-dihydro-1H-1,4-benzodiazepin-2-one (7C3D4B) were recorded in the regions 4000-400 and 4000-100 cm(-1), respectively. The appropriate theoretical spectrograms for the IR and Raman spectra of the title molecule were also constructed. The calculated results show that the predicted geometry can well reproduce the structural parameters. Predicted vibrational frequencies have been assigned and compared with experimental IR spectra and they supported each other. Stability of the molecule arising from hyperconjugative interactions, charge delocalization and intramolecular hydrogen bond-like weak interaction has been analyzed using natural bond orbital (NBO) analysis by using B3LYP/6-31G(d,p) method. The results show that electron density (ED) in the σ* and π* antibonding orbitals and second-order delocalization energies E(2) confirm the occurrence of intramolecular charge transfer (ICT) within the molecule. The first order hyperpolarizability (βtotal) of this molecular system and related properties (β, μ, and Δα) are calculated using HF/6-31G(d,p) and B3LYP/6-31G(d,p) methods based on the finite-field approach. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Turki, Imen; Laignel, Benoit; Kakeh, Nabil; Chevalier, Laetitia; Costa, Stephane
2015-04-01
This research is carried out in the framework of the Surface Water and Ocean Topography (SWOT) program, a partnership between NASA and CNES. Here, a new hybrid model is implemented for filling gaps and forecasting the hourly sea level variability by combining classical harmonic analyses with advanced statistical methods to reproduce the deterministic and stochastic processes, respectively. After simulating the mean sea level trend and astronomical tides, the nontidal residual surges are investigated using autoregressive moving average (ARMA) methods in two ways: (1) applying a purely statistical approach and (2) introducing the sea level pressure (SLP) in ARMA as the main physical process driving the residual sea level. The new hybrid model is applied to the western Atlantic and the eastern English Channel. Using the ARMA model and considering the SLP, results show that the hourly sea level observations of the gauges are well reproduced, with a root mean square error (RMSE) ranging between 4.5 and 7 cm for 1 to 30 days of gaps and an explained variance of more than 80%. For larger gaps of months, the RMSE reaches 9 cm. The negative and positive extreme sea levels are also well reproduced, with a mean explained variance between 70 and 85%. The statistical behavior of the 1-year modeled residual component shows good agreement with observations. Frequency analysis using the discrete wavelet transform illustrates strong correlations between the observed and modeled energy spectra and bands of variability. Accordingly, the proposed model is a coherent, simple, and easy tool to estimate the total sea level at timescales from days to months. The ARMA model seems promising for filling gaps and estimating the sea level at larger scales of years by introducing more physical processes driving its stochastic variability.
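A dependency-free sketch of the second approach (SLP as a physical driver of the residual surge): the ARMA formulation is simplified here to an ARX(1) regression fitted by least squares, with all series and coefficients invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic hourly series: the residual (non-tidal) surge is driven by
# sea level pressure (SLP) plus an AR(1) stochastic component.
n = 2000
slp = 5.0 * np.sin(2 * np.pi * np.arange(n) / 240) + rng.standard_normal(n)
ar_noise = np.zeros(n)
for t in range(1, n):
    ar_noise[t] = 0.8 * ar_noise[t - 1] + 0.05 * rng.standard_normal()
surge = -0.3 * slp + ar_noise

# ARX(1) regression (an autoregressive simplification of the paper's
# ARMA-with-SLP model): surge_t on surge_{t-1}, slp_t and slp_{t-1}.
Xr = np.column_stack([surge[:-1], slp[1:], slp[:-1]])
coef, *_ = np.linalg.lstsq(Xr, surge[1:], rcond=None)
phi, b0, b1 = coef
```

Because the synthetic residual is exactly AR(1) with SLP forcing, the regression should recover roughly φ ≈ 0.8, b₀ ≈ -0.3 and b₁ ≈ φ·0.3; gap filling then amounts to iterating the fitted recursion over the missing hours.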
Where next for the reproducibility agenda in computational biology?
Lewis, Joanna; Breeze, Charles E; Charlesworth, Jane; Maclaren, Oliver J; Cooper, Jonathan
2016-07-15
The concept of reproducibility is a foundation of the scientific method. With the arrival of fast and powerful computers over the last few decades, there has been an explosion of results based on complex computational analyses and simulations. The reproducibility of these results has been addressed mainly in terms of exact replicability or numerical equivalence, ignoring the wider issue of the reproducibility of conclusions through equivalent, extended or alternative methods. We use case studies from our own research experience to illustrate how concepts of reproducibility might be applied in computational biology. Several fields have developed 'minimum information' checklists to support the full reporting of computational simulations, analyses and results, and standardised data formats and model description languages can facilitate the use of multiple systems to address the same research question. We note the importance of defining the key features of a result to be reproduced, and the expected agreement between original and subsequent results. Dynamic, updatable tools for publishing methods and results are becoming increasingly common, but sometimes come at the cost of clear communication. In general, the reproducibility of computational research is improving but would benefit from additional resources and incentives. We conclude with a series of linked recommendations for improving reproducibility in computational biology through communication, policy, education and research practice. More reproducible research will lead to higher quality conclusions, deeper understanding and more valuable knowledge.
NASA Astrophysics Data System (ADS)
Nikitin, A. V.; Krishna, B. M.; Rey, M.; Tashkun, S. A.; Tyuterev, Vl. G.
2017-09-01
Table 4 of Ref [1] did not contain enough digits to reproduce the fitted Q(T) values for practical applications. The corrected table is given below. This does not affect other Tables and Figures or the conclusions of [1].
ERIC Educational Resources Information Center
Han, Sophia; Blank, Jolyn; Berson, Ilene R.
2017-01-01
The purpose of this research is to examine how and to what extent preservice teachers (PSTs) who engage in teacher inquiry develop a critical reflective stance towards their teaching practices in early childhood classrooms. As we, a team of early childhood teacher educators, incorporated teacher inquiry into our reformed early childhood teacher…
ERIC Educational Resources Information Center
Chavarria, Karina
2017-01-01
Social reproduction scholars and the literature on critical race theory and student resistance contend that schools are not neutral institutions existing in a vacuum free of the political and social struggles for rights and resources (Delgado Bernal, 1998; Fine, 1991). Instead, schools can be institutions that reproduce dominant ideologies and…
Measurement of limb volume: laser scanning versus volume displacement.
McKinnon, John Gregory; Wong, Vanessa; Temple, Walley J; Galbraith, Callum; Ferry, Paul; Clynch, George S; Clynch, Colin
2007-10-01
Determining the prevalence and treatment success of surgical lymphedema requires accurate and reproducible measurement. A new method of measurement of limb volume is described. A series of inanimate objects of known and unknown volume was measured using digital laser scanning and water displacement. A similar comparison was made with 10 human volunteers. Digital scanning was evaluated by comparison to the established method of water displacement, then to itself to determine reproducibility of measurement. (1) Objects of known volume: Laser scanning accurately measured the calculated volume but water displacement became less accurate as the size of the object increased. (2) Objects of unknown volume: As average volume increased, there was an increasing bias of underestimation of volume by the water displacement method. The coefficient of reproducibility of water displacement was 83.44 ml. In contrast, the reproducibility of the digital scanning method was 19.0 ml. (3) Human data: The mean difference between water displacement volume and laser scanning volume was 151.7 ml (SD +/- 189.5). The coefficient of reproducibility of water displacement was 450.8 ml whereas for laser scanning it was 174 ml. Laser scanning is an innovative method of measuring tissue volume that combines precision and reproducibility and may have clinical utility for measuring lymphedema. 2007 Wiley-Liss, Inc
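The coefficient of reproducibility quoted above is commonly computed as 1.96 × the standard deviation of the differences between paired repeat measurements (the Bland-Altman repeatability coefficient); whether this study used exactly that formula is an assumption. Illustration with invented limb volumes:

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented repeated limb-volume measurements (mL) on 30 subjects.
true_vol = rng.uniform(2000.0, 5000.0, 30)
meas1 = true_vol + rng.normal(0.0, 60.0, 30)   # repeat 1
meas2 = true_vol + rng.normal(0.0, 60.0, 30)   # repeat 2

# Coefficient of reproducibility: 95% of differences between repeat
# measurements are expected to fall within this value.
diff = meas1 - meas2
cr = 1.96 * diff.std(ddof=1)
```

A smaller coefficient (as reported for laser scanning versus water displacement) means tighter agreement between repeats.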
Reproducibility measurements of three methods for calculating in vivo MR-based knee kinematics.
Lansdown, Drew A; Zaid, Musa; Pedoia, Valentina; Subburaj, Karupppasamy; Souza, Richard; Benjamin, C; Li, Xiaojuan
2015-08-01
To describe three quantification methods for magnetic resonance imaging (MRI)-based knee kinematic evaluation and to report on the reproducibility of these algorithms. T2 -weighted, fast-spin echo images were obtained of the bilateral knees in six healthy volunteers. Scans were repeated for each knee after repositioning to evaluate protocol reproducibility. Semiautomatic segmentation defined regions of interest for the tibia and femur. The posterior femoral condyles and diaphyseal axes were defined using the previously defined tibia and femur. All segmentation was performed twice to evaluate segmentation reliability. Anterior tibial translation (ATT) and internal tibial rotation (ITR) were calculated using three methods: a tibial-based registration system, a combined tibiofemoral-based registration method with all manual segmentation, and a combined tibiofemoral-based registration method with automatic definition of condyles and axes. Intraclass correlation coefficients and standard deviations across multiple measures were determined. Reproducibility of segmentation was excellent (ATT = 0.98; ITR = 0.99) for both combined methods. ATT and ITR measurements were also reproducible across multiple scans in the combined registration measurements with manual (ATT = 0.94; ITR = 0.94) or automatic (ATT = 0.95; ITR = 0.94) condyles and axes. The combined tibiofemoral registration with automatic definition of the posterior femoral condyle and diaphyseal axes allows for improved knee kinematics quantification with excellent in vivo reproducibility. © 2014 Wiley Periodicals, Inc.
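Reproducibility figures like the intraclass correlation coefficients above can be computed directly from ANOVA mean squares; the sketch below uses the one-way random-effects form ICC(1,1) on invented ATT values (mm), since the abstract does not state which ICC form was used:

```python
import numpy as np

# Invented anterior tibial translation (mm) for six knees, two scans.
scan1 = np.array([2.1, 3.4, 1.8, 2.9, 3.1, 2.5])
scan2 = np.array([2.0, 3.5, 1.9, 2.8, 3.3, 2.4])

data = np.vstack([scan1, scan2]).T      # subjects x repeated scans
n, k = data.shape

# One-way random-effects ICC(1,1) from the ANOVA mean squares.
grand = data.mean()
ms_between = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

With the small repeat differences invented here, the ICC comes out in the "excellent" range, matching the qualitative result reported above.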
NASA Astrophysics Data System (ADS)
Wong, Jessina; Jahn, David A.; Giovambattista, Nicolas
2015-08-01
We study the pressure-induced transformations between low-density amorphous (LDA) and high-density amorphous (HDA) ice by performing out-of-equilibrium molecular dynamics (MD) simulations. We employ the TIP4P/2005 water model and show that this model reproduces qualitatively the LDA-HDA transformations observed experimentally. Specifically, the TIP4P/2005 model reproduces remarkably well the (i) structure (OO, OH, and HH radial distribution functions) and (ii) densities of LDA and HDA at P = 0.1 MPa and T = 80 K, as well as (iii) the qualitative behavior of ρ(P) during compression-induced LDA-to-HDA and decompression-induced HDA-to-LDA transformations. At the rates explored, the HDA-to-LDA transformation is less pronounced than in experiments. By studying the LDA-HDA transformations for a broad range of compression/decompression temperatures, we construct a "P-T phase diagram" for glassy water that is consistent with experiments and remarkably similar to that reported previously for ST2 water. This phase diagram is not inconsistent with the possibility of TIP4P/2005 water exhibiting a liquid-liquid phase transition at low temperatures. A comparison with previous MD simulation studies of SPC/E and ST2 water as well as experiments indicates that, overall, the TIP4P/2005 model performs better than the SPC/E and ST2 models. The effects of cooling and compression rates as well as aging on our MD simulations results are also discussed. The MD results are qualitatively robust under variations of cooling/compression rates (accessible in simulations) and are not affected by aging the hyperquenched glass for at least 1 μs. A byproduct of this work is the calculation of TIP4P/2005 water's diffusion coefficient D(T) at P = 0.1 MPa. It is found that, for T ≥ 210 K, D(T) ≈ (T - TMCT)-γ as predicted by mode coupling theory and in agreement with experiments. 
For TIP4P/2005 water, TMCT = 209 K and γ = 2.14, very close to the corresponding experimental values TMCT = 221 K and γ = 2.2.
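The quoted mode-coupling power law D(T) ≈ A (T − TMCT)^(−γ) is conveniently fitted in log space. Below, synthetic diffusion data are generated from the reported TIP4P/2005 values (TMCT = 209 K, γ = 2.14) with 2% noise, so the fit approximately recovers them; the prefactor A and the temperature grid are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)

# Synthetic diffusion coefficients following the MCT power law with the
# paper's fitted parameters plus small multiplicative noise.
T = np.linspace(215.0, 280.0, 14)
D = 1e-5 * (T - 209.0) ** (-2.14) * (1 + rng.normal(0.0, 0.02, T.size))

def log_mct(T, lnA, T_mct, gamma):
    # log D = ln A - gamma * ln(T - T_MCT)
    return lnA - gamma * np.log(T - T_mct)

# Bound T_MCT below the lowest data temperature so the logarithm stays
# defined throughout the optimization.
popt, _ = curve_fit(
    log_mct, T, np.log(D), p0=[np.log(1e-5), 200.0, 2.0],
    bounds=([-30.0, 150.0, 0.5], [0.0, 214.0, 5.0]),
)
lnA_fit, T_mct_fit, gamma_fit = popt
```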
Atomistic Modeling of Diffusion and Phase Transformations in Metals and Alloys
NASA Astrophysics Data System (ADS)
Purja Pun, Ganga Prasad
Dissertation consists of multiple works. The first part is devoted to self-diffusion along dislocation cores in aluminum, followed by the development of embedded atom method potentials for the Co, NiAl, CoAl and CoNi systems. The last part focuses on the martensitic phase transformation (MPT) in NixAl1-x and AlxCoyNi1-x-y alloys. New calculation methods were developed to predict diffusion coefficients in metals as functions of temperature. Self-diffusion along screw and edge dislocations in aluminum was studied by molecular dynamics (MD) simulations. Three types of simulations were performed: without pre-existing defects (intrinsic), and with pre-existing vacancies or interstitials in the dislocation core. We found that diffusion along the screw dislocation was dominated by the intrinsic mechanism, whereas diffusion along the edge dislocation was dominated by the vacancy mechanism. Diffusion along the screw dislocation was found to be significantly faster than along the edge dislocation, and both diffusivities were in reasonable agreement with experimental data. The intrinsic diffusion mechanism can be associated with the formation of dynamic Frenkel pairs, possibly activated by thermal jogs and/or kinks. The simulations show that at high temperatures the dislocation core becomes an effective source/sink of point defects and the effect of pre-existing defects on the core diffusivity diminishes. The first and foremost ingredient needed in all atomistic computer simulations is the description of the interaction between atoms. Interatomic potentials for the Co, NiAl, CoAl and CoNi systems were developed within the Embedded Atom Method (EAM) formalism. The binary potentials were based on previously developed accurate potentials for pure Ni and pure Al, and on the pure Co potential developed in this work. Together, the binaries constitute a version of an EAM potential for the Al-Co-Ni ternary system. The NiAl potential accurately reproduces a variety of physical properties of the B2-NiAl and L12-Ni3Al phases.
The potential is expected to be especially suitable for simulations of hetero-phase interfaces and the mechanical behavior of NiAl alloys. Apart from properties of HCP Co, the new Co potential is accurate enough to reproduce several properties of FCC Co that were not included in the potential fit, showing good transferability. The CoAl potential was fitted to the properties of the B2-CoAl phase, as in the NiAl fitting, whereas the NiCo potential was fitted to ab initio formation energies of several imaginary phases and structures. The effect of chemical composition and uniaxial mechanical stress on the martensitic phase transformation was studied in B2-type Ni-rich NiAl and AlCoNi alloys. The martensitic phase has a tetragonal crystal structure and can contain multiple twins arranged in domains and plates. Twinned martensites always formed under uniaxial compression, whereas single-variant martensites resulted from uniaxial tension. The transformation was reversible and characterized by a significant temperature hysteresis, whose magnitude depends on the chemical composition and stress.
Neuhaus, Sonja; Padeste, Celestino; Spencer, Nicholas D
2011-06-07
A method to create a wettability gradient by variation of the chemical functionality in a polymer brush is presented. A poly(N-methyl-vinylpyridinium) (QP4VP) brush was created on a poly(ethylene-alt-tetrafluoroethylene) (ETFE) foil by the grafting of 4-vinylpyridine and subsequent quaternization. The instability of QP4VP, a strong polyelectrolyte, in alkaline media was exploited to transform it to the neutral poly(vinyl(N-methyl-2-pyridone)) (PVMP), as confirmed with ATR-IR spectroscopy. The slow transformation resulted in a substantial, time-dependent decrease in wettability. A nearly linear gradient in water contact angle (CA) was created by immersion of a QP4VP brush modified sample into a sodium hydroxide solution, resulting in CAs ranging from 10° to 60°. The concurrent decrease in the number of charged functional groups along the gradient was characterized by loading an anionic dye into the polymer brush and measuring the UV transmittance of the sample. The versatility of the wettability gradient was demonstrated by exchanging the counterions of the N-methyl-vinylpyridinium groups, whereby a reversal of gradient direction was reproducibly achieved.
Lanfer, A; Hebestreit, A; Ahrens, W; Krogh, V; Sieri, S; Lissner, L; Eiben, G; Siani, A; Huybrechts, I; Loit, H-M; Papoutsou, S; Kovács, E; Pala, V
2011-04-01
To investigate the reproducibility of food consumption frequencies derived from the food frequency section of the Children's Eating Habits Questionnaire (CEHQ-FFQ) that was developed and used in the IDEFICS (Identification and prevention of dietary- and lifestyle-induced health effects in children and infants) project to assess food habits in 2- to 9-year-old European children. From a subsample of 258 children who participated in the IDEFICS baseline examination, parental questionnaires of the CEHQ were collected twice to assess reproducibility of questionnaire results from 0 to 354 days after the first examination. Weighted Cohen's kappa coefficients (κ) and Spearman's correlation coefficients (r) were calculated to assess agreement between the first and second questionnaires for each food item of the CEHQ-FFQ. Stratification was performed for sex, age group, geographical region and length of period between the first and second administrations. Fisher's Z transformation was applied to test correlation coefficients for significant differences between strata. For all food items analysed, weighted Cohen's kappa coefficients (κ) and Spearman's correlation coefficients (r) were significant and positive (P<0.001). Reproducibility was lowest for diet soft drinks (κ=0.23, r=0.32) and highest for sweetened milk (κ=0.68, r=0.76). Correlation coefficients were comparable to those of previous studies on FFQ reproducibility in children and adults. Stratification did not reveal systematic differences in reproducibility by sex and age group. Spearman's correlation coefficients differed significantly between northern and southern European countries for 10 food items. In nine of them, the lower respective coefficient was still high enough to conclude acceptable reproducibility. As expected, longer time (>128 days) between the first and second administrations resulted in a generally lower, yet still acceptable, reproducibility. 
Results indicate that the CEHQ-FFQ gives reproducible estimates of the consumption frequency of 43 food items from 14 food groups in European children.
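Fisher's z transformation, used above to test whether stratum-specific correlation coefficients differ, works as follows; the r values and sample sizes in this example are invented, not taken from the study:

```python
import numpy as np
from scipy import stats

def fisher_z_test(r1, n1, r2, n2):
    """Two-sided test of H0: rho1 == rho2 for independent samples."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)        # Fisher z transform
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))  # SE of z1 - z2
    z = (z1 - z2) / se
    p = 2.0 * stats.norm.sf(abs(z))
    return z, p

# E.g. comparing a hypothetical northern-stratum r of 0.76 (n = 120)
# with a southern-stratum r of 0.55 (n = 110).
z, p = fisher_z_test(0.76, 120, 0.55, 110)
```

The z transform makes the sampling distribution of a correlation coefficient approximately normal, so the stratum comparison reduces to an ordinary z-test.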
Aveiro method in reproducing kernel Hilbert spaces under complete dictionary
NASA Astrophysics Data System (ADS)
Mai, Weixiong; Qian, Tao
2017-12-01
The Aveiro Method is a sparse representation method in reproducing kernel Hilbert spaces (RKHS) that gives orthogonal projections onto linear combinations of reproducing kernels over uniqueness sets. It suffers, however, from the difficulty of determining uniqueness sets in the underlying RKHS. In general spaces, uniqueness sets are not easy to identify, let alone to control the convergence speed of the Aveiro Method. To avoid these difficulties we propose a new Aveiro Method based on a dictionary and the matching pursuit idea. In fact, we do more: the new Aveiro Method is related to the recently proposed Pre-Orthogonal Greedy Algorithm (P-OGA), which involves completion of a given dictionary. The new method is called Aveiro Method Under Complete Dictionary (AMUCD). The complete dictionary consists of all directional derivatives of the underlying reproducing kernels. We show that, under the boundary vanishing condition, which holds for the classical Hardy and Paley-Wiener spaces, the complete dictionary enables an efficient expansion of any given element of the Hilbert space. The proposed method reveals new and advanced aspects of both the Aveiro Method and the greedy algorithm.
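The matching pursuit idea underlying AMUCD can be illustrated with a minimal finite-dimensional sketch: plain matching pursuit over a normalized dictionary in R^n. The pre-orthogonalization step of P-OGA and the kernel-derivative dictionary are deliberately omitted; this only shows the greedy atom-selection loop.

```python
# Plain matching pursuit: at each step, pick the dictionary atom with the
# largest inner product with the residual and subtract its projection.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, dictionary, n_iter=10):
    """Greedy sparse approximation of `signal` over unit-norm `dictionary` atoms."""
    residual = list(signal)
    coeffs = [0.0] * len(dictionary)
    for _ in range(n_iter):
        # greedy selection: atom best correlated with the current residual
        k = max(range(len(dictionary)),
                key=lambda i: abs(dot(residual, dictionary[i])))
        c = dot(residual, dictionary[k])
        coeffs[k] += c
        residual = [r - c * a for r, a in zip(residual, dictionary[k])]
    return coeffs, residual
```

With a redundant dictionary (here two basis vectors plus a diagonal atom), the residual is driven to zero in a few iterations.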
Zheng, Yongping; Zhang, Tingwei; Wu, Songjie; Zhang, Jue; Fang, Jing
2018-01-01
Molecularly imprinted polymer (MIP) films prepared by bulk polymerization suffer from numerous deficiencies, including poor mass transfer and difficulty in controlling the reaction rate and film thickness, which usually result in poor repeatability. In contrast, polymer films synthesized by electropolymerization benefit from high reproducibility and from simplicity and rapidity of preparation. In the present study, an Au film served simultaneously as the refractive-index-sensitive metal film coupling with the light leaking out of the optical fiber core and as the electrode for electropolymerizing the MIP film. The manufactured probe exhibited satisfactory sensitivity and specificity. Furthermore, the surface morphology and functional groups of the synthesized MIP film were characterized by atomic force microscopy (AFM) and Fourier transform infrared microspectroscopy (FTIR) for further insight into the adsorption and desorption processes. Given its low cost, label-free detection, simple preparation process and fast response, this method has potential for monitoring substances in complex real samples in future out-of-lab tests. PMID:29522472
Comparison of parameter-adapted segmentation methods for fluorescence micrographs.
Held, Christian; Palmisano, Ralf; Häberle, Lothar; Hensel, Michael; Wittenberg, Thomas
2011-11-01
Interpreting images from fluorescence microscopy is often a time-consuming task with poor reproducibility. Various image processing routines that can help investigators evaluate the images are therefore useful. The critical aspect of a reliable automatic image analysis system is a robust segmentation algorithm that can perform accurate segmentation for different cell types. In this study, several image segmentation methods were therefore compared and evaluated in order to identify the most appropriate segmentation schemes, ones usable with little new parameterization and robust across different types of fluorescence-stained cells for various biological and biomedical tasks. The study investigated, compared, and enhanced four different methods for segmentation of cultured epithelial cells: the maximum-intensity linking (MIL) method, an improved MIL, a watershed method, and an improved watershed method based on morphological reconstruction. Three manually annotated datasets consisting of 261, 817, and 1,333 HeLa or L929 cells were used to compare the different algorithms. The comparisons and evaluations showed that the segmentation performance of methods based on the watershed transform was significantly superior to that of the MIL method. The results also indicate that using morphological opening by reconstruction can improve the segmentation of cells stained with a marker that produces a dotted pattern on the cell surface. Copyright © 2011 International Society for Advancement of Cytometry.
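The morphological opening by reconstruction credited above with improving segmentation can be sketched in one dimension, assuming a flat structuring element of width 3; real implementations operate on 2-D images, but the principle is identical: an erosion removes structures narrower than the element, and reconstruction by dilation (constrained by the original signal) restores the surviving structures to their full height.

```python
# 1-D grayscale opening by reconstruction with a width-3 structuring element.

def erode(sig, size=3):
    h = size // 2
    return [min(sig[max(0, i - h):i + h + 1]) for i in range(len(sig))]

def dilate(sig, size=3):
    h = size // 2
    return [max(sig[max(0, i - h):i + h + 1]) for i in range(len(sig))]

def opening_by_reconstruction(sig, size=3):
    """Erode, then iteratively dilate the marker under the original mask
    until stability (geodesic reconstruction by dilation)."""
    marker = erode(sig, size)
    while True:
        nxt = [min(d, m) for d, m in zip(dilate(marker, size), sig)]
        if nxt == marker:
            return marker
        marker = nxt
```

A narrow spike is removed while a wide plateau survives at its original height, which is the property that suppresses spurious intensity peaks before segmentation.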
Mulder, Emma R; de Jong, Remko A; Knol, Dirk L; van Schijndel, Ronald A; Cover, Keith S; Visser, Pieter J; Barkhof, Frederik; Vrenken, Hugo
2014-05-15
To measure hippocampal volume change in Alzheimer's disease (AD) or mild cognitive impairment (MCI), expert manual delineation is often used because of its supposed accuracy. It has been suggested that expert outlining yields poorer reproducibility as compared to automated methods, but this has not been investigated. To determine the reproducibilities of expert manual outlining and two common automated methods for measuring hippocampal atrophy rates in healthy aging, MCI and AD. From the Alzheimer's Disease Neuroimaging Initiative (ADNI), 80 subjects were selected: 20 patients with AD, 40 patients with mild cognitive impairment (MCI) and 20 healthy controls (HCs). Left and right hippocampal volume change between baseline and month-12 visit was assessed by using expert manual delineation, and by the automated software packages FreeSurfer (longitudinal processing stream) and FIRST. To assess reproducibility of the measured hippocampal volume change, both back-to-back (BTB) MPRAGE scans available for each visit were analyzed. Hippocampal volume change was expressed in μL, and as a percentage of baseline volume. Reproducibility of the 1-year hippocampal volume change was estimated from the BTB measurements by using linear mixed model to calculate the limits of agreement (LoA) of each method, reflecting its measurement uncertainty. Using the delta method, approximate p-values were calculated for the pairwise comparisons between methods. Statistical analyses were performed both with inclusion and exclusion of visibly incorrect segmentations. Visibly incorrect automated segmentation in either one or both scans of a longitudinal scan pair occurred in 7.5% of the hippocampi for FreeSurfer and in 6.9% of the hippocampi for FIRST. After excluding these failed cases, reproducibility analysis for 1-year percentage volume change yielded LoA of ±7.2% for FreeSurfer, ±9.7% for expert manual delineation, and ±10.0% for FIRST. 
Methods ranked the same for reproducibility of 1-year μL volume change, with LoA of ±218 μL for FreeSurfer, ±319 μL for expert manual delineation, and ±333 μL for FIRST. Approximate p-values indicated that reproducibility was better for FreeSurfer than for manual or FIRST, and that manual and FIRST did not differ. Inclusion of failed automated segmentations led to worsening of reproducibility of both automated methods for 1-year raw and percentage volume change. Quantitative reproducibility values of 1-year microliter and percentage hippocampal volume change were roughly similar between expert manual outlining, FIRST and FreeSurfer, but FreeSurfer reproducibility was statistically significantly superior to both manual outlining and FIRST after exclusion of failed segmentations. Copyright © 2014 Elsevier Inc. All rights reserved.
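The limits of agreement above were estimated with a linear mixed model; a simplified Bland-Altman-style sketch of the same idea, computing LoA directly from the back-to-back difference scores, is given below. This is an illustrative shortcut, not the paper's estimation procedure.

```python
import math

def limits_of_agreement(back_to_back_pairs):
    """Classic Bland-Altman limits of agreement: mean difference +/- 1.96 SD
    of the differences between paired back-to-back measurements."""
    diffs = [a - b for a, b in back_to_back_pairs]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean - 1.96 * sd, mean + 1.96 * sd
```

Narrower limits indicate a smaller measurement uncertainty and thus better reproducibility of the volume-change estimate.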
NASA Astrophysics Data System (ADS)
Seko, Hiromu; Kunii, Masaru; Yokota, Sho; Tsuyuki, Tadashi; Miyoshi, Takemasa
2015-12-01
Experiments simulating intense vortices associated with tornadoes that occurred on 6 May 2012 on the Kanto Plain, Japan, were performed with a nested local ensemble transform Kalman filter (LETKF) system. Intense vortices were reproduced by downscale experiments with a 12-member ensemble in which the initial conditions were obtained from the nested LETKF system analyses. The downscale experiments successfully generated intense vortices in three regions similar to the observed vortices, whereas only one tornado was reproduced by a deterministic forecast. The intense vorticity of the strongest tornado, which was observed in the southernmost region, was successfully reproduced by 10 of the 12 ensemble members. An examination of the results of the ensemble downscale experiments showed that the duration of intense vorticities tended to be longer when the vertical shear of the horizontal wind was larger and the lower airflow was more humid. Overall, the study results show that ensemble forecasts have the following merits: (1) probabilistic forecasts of the outbreak of intense vortices associated with tornadoes are possible; (2) the miss rate of outbreaks should decrease; and (3) environmental factors favoring outbreaks can be obtained by comparing the multiple possible scenarios of the ensemble forecasts.
NASA Astrophysics Data System (ADS)
Zhang, Ke; Cao, Ping; Ma, Guowei; Fan, Wenchen; Meng, Jingjing; Li, Kaihui
2016-07-01
Using the Chengmenshan Copper Mine as a case study, a new methodology for open pit slope design in karst-prone ground conditions is presented based on integrated stochastic-limit equilibrium analysis. The numerical modeling and optimization design procedure comprises drill core data collection, karst cave stochastic model generation, SLIDE simulation and bisection method optimization. Borehole investigations are performed, and the statistical results show that the length of the karst caves fits a negative exponential distribution model, whereas the length of the carbonatite does not follow any standard distribution. The inverse transform method and the acceptance-rejection method are therefore used to reproduce the lengths of the karst caves and the carbonatite, respectively. A code for karst cave stochastic model generation, named KCSMG, is developed. The stability of the rock slope containing the stochastically generated karst caves is analyzed by combining the KCSMG code with the SLIDE program. This approach is then applied to study the effect of the karst caves on the stability of the open pit slope, and a procedure to optimize the open pit slope angle is presented.
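The two sampling techniques named above can be sketched as follows. The exponential rate and the triangular target density used in the usage example are illustrative stand-ins for the fitted cave and carbonatite length models, not the paper's parameters.

```python
import math
import random

def sample_length_inverse(lam, rng):
    """Inverse transform sampling from the negative exponential model
    F(x) = 1 - exp(-lam * x): invert the CDF at a uniform random u."""
    u = rng.random()
    return -math.log(1.0 - u) / lam

def sample_acceptance_rejection(pdf, x_max, pdf_max, rng):
    """Acceptance-rejection sampling for a density with no closed-form
    inverse CDF: propose uniformly, accept with probability pdf(x)/pdf_max."""
    while True:
        x = rng.uniform(0.0, x_max)
        if rng.uniform(0.0, pdf_max) <= pdf(x):
            return x
```

With rate lam = 0.5 the exponential samples average about 2; with the triangular density f(x) = 2x on [0, 1] the accepted samples average about 2/3.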
Seiber, J N; Glotfelty, D E; Lucas, A D; McChesney, M M; Sagebiel, J C; Wehner, T A
1990-01-01
A multiresidue analytical method is described for pesticides, transformation products, and related toxicants, based upon high performance liquid chromatographic (HPLC) fractionation of extracted residue on a Partisil silica gel normal phase column followed by selective-detector gas chromatographic (GC) determination of the components in each fraction. The HPLC mobile phase gradient (hexane to methyl t-butyl ether) gave good chromatographic efficiency, resolution, reproducibility and recovery for 61 test compounds, and allowed for collection in four fractions spanning polarities from low-polarity organochlorine compounds (fraction 1) to polar N-methylcarbamates and organophosphorus oxons (fraction 4). The multiresidue method was developed for use with air samples collected on XAD-4 and related trapping agents, and with water samples extracted with methylene chloride. Detection limits estimated from spiking experiments were generally 0.3-1 ng/m3 for high-volume air samples, and 0.01-0.1 microgram/L for one-liter water samples. The method was applied to the determination of pesticides in fogwater and air samples.
Force Density Function Relationships in 2-D Granular Media
NASA Technical Reports Server (NTRS)
Youngquist, Robert C.; Metzger, Philip T.; Kilts, Kelly N.
2004-01-01
An integral transform relationship is developed to convert between two important probability density functions (distributions) used in the study of contact forces in granular physics. Developing this transform has now made it possible to compare and relate various theoretical approaches with one another and with the experimental data despite the fact that one may predict the Cartesian probability density and another the force magnitude probability density. Also, the transforms identify which functional forms are relevant to describe the probability density observed in nature, and so the modified Bessel function of the second kind has been identified as the relevant form for the Cartesian probability density corresponding to exponential forms in the force magnitude distribution. Furthermore, it is shown that this transform pair supplies a sufficient mathematical framework to describe the evolution of the force magnitude distribution under shearing. Apart from the choice of several coefficients, whose evolution of values must be explained in the physics, this framework successfully reproduces the features of the distribution that are taken to be an indicator of jamming and unjamming in a granular packing. Key words. Granular Physics, Probability Density Functions, Fourier Transforms
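The stated correspondence between an exponential force-magnitude distribution and a modified-Bessel-function Cartesian density can be checked numerically. For an isotropic 2-D packing with magnitude density P(F) = exp(-F), marginalizing the joint density over one force component gives p(f_x) = (1/pi) K0(|f_x|); the substitution F = |f_x| cosh(u) turns that marginalization integral into the standard integral representation of K0, which the sketch below evaluates by the midpoint rule. This is an illustrative check, not the paper's transform machinery.

```python
import math

def cartesian_density_exponential(fx, n=4000, u_max=12.0):
    """Cartesian marginal density p(fx) = (1/pi) * K0(|fx|) for an isotropic
    2-D force distribution with exponential magnitude pdf P(F) = exp(-F).
    K0 is computed from its integral representation
    K0(x) = integral_0^inf exp(-x * cosh(u)) du via the midpoint rule."""
    x = abs(fx)
    du = u_max / n
    k0 = sum(math.exp(-x * math.cosh((i + 0.5) * du)) for i in range(n)) * du
    return k0 / math.pi
```

At f_x = 1 the recovered K0 value matches the known K0(1) = 0.421024... to several digits.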
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kutz, Helmut; Reisbach, Gilbert; Schultheiss, Ute
The latent membrane protein 1 (LMP1) of Epstein-Barr virus (EBV) transforms cells by activating signal transduction pathways such as NF-κB, PI3-kinase, or c-Jun N-terminal kinase (JNK). Here, we investigated the functional role of the LMP1-induced JNK pathway in cell transformation. Expression of a novel dominant-negative JNK1 allele caused a block of proliferation in LMP1-transformed Rat1 fibroblasts. The JNK-specific inhibitor SP600125 reproduced this effect in Rat1-LMP1 cells and efficiently interfered with proliferation of EBV-transformed lymphoblastoid cells (LCLs). Inhibition of the LMP1-induced JNK pathway in LCLs caused the downregulation of c-Jun and Cdc2, the essential G2/M cell cycle kinase, which was accompanied by a cell cycle arrest of LCLs at the G2/M phase transition. Moreover, SP600125 retarded tumor growth of LCLs in a xenograft model in SCID mice. Our data support a critical role of the LMP1-induced JNK pathway for proliferation of LMP1-transformed cells and characterize JNK as a potential target for intervention against EBV-induced malignancies.
Droplet microfluidics for synthetic biology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gach, PC; Iwai, K; Kim, PW
2017-01-01
© 2017 The Royal Society of Chemistry. Synthetic biology is an interdisciplinary field that aims to engineer biological systems for useful purposes. Organism engineering often requires the optimization of individual genes and/or entire biological pathways (consisting of multiple genes). Advances in DNA sequencing and synthesis have recently begun to enable the possibility of evaluating thousands of gene variants and hundreds of thousands of gene combinations. However, such large-scale optimization experiments remain cost-prohibitive to researchers following traditional molecular biology practices, which are frequently labor-intensive and suffer from poor reproducibility. Liquid handling robotics may reduce labor and improve reproducibility, but are themselves expensive and thus inaccessible to most researchers. Microfluidic platforms offer a lower entry price point alternative to robotics, and maintain high throughput and reproducibility while further reducing operating costs through diminished reagent volume requirements. Droplet microfluidics have shown exceptional promise for synthetic biology experiments, including DNA assembly, transformation/transfection, culturing, cell sorting, phenotypic assays, artificial cells and genetic circuits.
Toh, Su San; Treves, David S; Barati, Michelle T; Perlin, Michael H
2016-10-01
Microbotryum lychnidis-dioicae is a member of a species complex infecting host plants in the Caryophyllaceae. It is used as a model system in many areas of research, but attempts to make this organism tractable for reverse genetic approaches have not been fruitful. Here, we exploited the recently obtained genome sequence and transcriptome analysis to inform our design of constructs for use in Agrobacterium-mediated transformation techniques currently available for other fungi. Reproducible transformation was demonstrated at the genomic, transcriptional and functional levels. Moreover, these initial proof-of-principle experiments provide evidence that supports the findings from initial global transcriptome analysis regarding expression from the respective promoters under different growth conditions of the fungus. The technique thus provides for the first time the ability to stably introduce transgenes and over-express target M. lychnidis-dioicae genes.
[Attachment theory and baby slings/carriers: technological network formation].
Lu, Zxy-Yann Jane; Lin, Wan-Shiuan
2011-12-01
Healthcare providers recognize the important role played by attachment theory in explaining the close relationship between mental health and social behavior in mothers and their children. This paper uses attachment theory in a socio-cultural context to ascertain the mechanism by which baby slings/carriers, a new technology, produced and reproduced scientific motherhood. It further applies a social-history-of-technology perspective to understand how baby carriers and attachment theory are socially constructed and historically contingent on three major transformations. These transformations include the use of attachment theory-based baby carriers to further scientific motherhood; the use of baby slings/carriers to further the medicalization of breastfeeding and enhance mother-infant attachment; and the use of baby slings/carriers to transform women's identities by integrating scientific motherhood, independence and fashion. Implications for nursing clinical policy are suggested.
NASA Astrophysics Data System (ADS)
Deng, Xian-qin; Fang, Hua; Li, Min-xian
2017-07-01
A GC-PDD system with valve-cutting technology and a pulsed discharge helium ionization detector was used to analyze the dissolved gases in ultra-high-voltage (UHV) and extra-high-voltage (EHV) transformer oil. The detection limit (DL) reached the ppb level; in particular, for the characteristic fault gases C2H2 and H2, the DL reached 5 ppb and 11 ppb, respectively. The test reproducibility of the instrument was about 1%, and the correlation coefficient (r) of the standard curve was greater than or equal to 0.99, a clear advantage over conventional GC. In addition, no auxiliary H2 gas was used in this instrument, which substantially improves safety. Thus, the application of GC-PDD is of significant value for the early warning of potential faults inside ultra-high-voltage transformers.
Gennaro, G; Ballaminut, A; Contento, G
2017-09-01
This study aims to illustrate a multiparametric automatic method for monitoring the long-term reproducibility of digital mammography systems, and its application on a large scale. Twenty-five digital mammography systems employed within a regional screening programme were controlled weekly using the same type of phantom, whose images were analysed by an automatic software tool. To assess system reproducibility levels, 15 image quality indices (IQIs) were extracted and compared with the corresponding indices previously determined by a baseline procedure. The coefficients of variation (COVs) of the IQIs were used to assess the overall variability. A total of 2553 phantom images were collected from the 25 digital mammography systems from March 2013 to December 2014. Most of the systems showed excellent image quality reproducibility over the surveillance interval, with mean variability below 5%. Variability of each IQI was below 5%, with the exception of one index associated with the smallest phantom objects (0.25 mm), which was below 10%. The method applied for the reproducibility tests (multi-detail phantoms, a cloud-based automatic software tool measuring multiple image quality indices, and statistical process control) was proven to be effective, applicable on a large scale and usable with any type of digital mammography system. • Reproducibility of mammography image quality should be monitored by appropriate quality controls. • Use of automatic software tools allows image quality evaluation by multiple indices. • System reproducibility can be assessed by comparing current index values with baseline data. • Overall system reproducibility of modern digital mammography systems is excellent. • The method proposed and applied is cost-effective and easily scalable.
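A minimal sketch of this style of statistical process control follows. The 5% tolerance on both the COV and the drift from baseline is an assumption for illustration; the paper's actual control rules are not reproduced here.

```python
import math

def cov_percent(values):
    """Coefficient of variation in percent: 100 * SD / mean."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return 100.0 * sd / mean

def flag_drift(weekly_iqi, baseline, tol_percent=5.0):
    """Flag an IQI series whose variability, or whose mean deviation from the
    baseline value, exceeds the tolerance (illustrative control rule)."""
    mean = sum(weekly_iqi) / len(weekly_iqi)
    return (cov_percent(weekly_iqi) > tol_percent
            or abs(mean - baseline) / baseline * 100.0 > tol_percent)
```

A stable weekly series stays unflagged, while a steadily drifting one trips both criteria.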
"How to Be a Rural Man": Young Men's Performances and Negotiations of Rural Masculinities
ERIC Educational Resources Information Center
Bye, Linda Marie
2009-01-01
This paper is concerned with young rural men and how they "do" identity politics living in a rural area of Norway. Focusing on how masculinity and rurality are constructed and interrelated in young men's narratives of living in a remote community, it is identified that young rural men reproduce, negotiate and transform local discourses…
Waterside Security 2010 (WSS 2010) Conference: Post Conference Report
2011-02-01
Memorandum Report NURC-MR-2011-002, Waterside Security 2010 (WSS2010) Conference: post conference report, Ronald Kessel and... In NATO, NURC conducts maritime research in support of NATO's operational and transformation requirements. Reporting to the Supreme Allied Commander... independent business process certification. Copyright © NURC 2011. NATO member nations have unlimited rights to use, modify, reproduce, release
Can Education Change Society? Du Bois, Woodson and the Politics of Social Transformation
ERIC Educational Resources Information Center
Apple, Michael W.
2013-01-01
Most nations--and nations to be--have a history of people asking critical questions about schooling and about the politics of knowledge in which it participates. Is it simply reproducing the ideological goals and cultural forms and content of dominant groups? Could schooling be used to raise serious issues about existing societies? Could it go…
Adaptive Multilinear Tensor Product Wavelets
Weiss, Kenneth; Lindstrom, Peter
2015-08-12
Many foundational visualization techniques including isosurfacing, direct volume rendering and texture mapping rely on piecewise multilinear interpolation over the cells of a mesh. However, there has not been much focus within the visualization community on techniques that efficiently generate and encode globally continuous functions defined by the union of multilinear cells. Wavelets provide a rich context for analyzing and processing complicated datasets. In this paper, we exploit adaptive regular refinement as a means of representing and evaluating functions described by a subset of their nonzero wavelet coefficients. We analyze the dependencies involved in the wavelet transform and describe how to generate and represent the coarsest adaptive mesh with nodal function values such that the inverse wavelet transform is exactly reproduced via simple interpolation (subdivision) over the mesh elements. This allows for an adaptive, sparse representation of the function with on-demand evaluation at any point in the domain. In conclusion, we focus on the popular wavelets formed by tensor products of linear B-splines, resulting in an adaptive, nonconforming but crack-free quadtree (2D) or octree (3D) mesh that allows reproducing globally continuous functions via multilinear interpolation over its cells.
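For the 1-D linear B-spline case, the claim that the inverse transform reduces to simple interpolation can be illustrated with one lifting step: details are the deviations of odd samples from the linear interpolation of their even neighbors (with the boundary handled by clamping), so reconstruction is literally "interpolate and add back". This is a generic textbook lifting sketch, not the paper's adaptive mesh code.

```python
# One level of the linear-predict (linear B-spline) wavelet via lifting.

def forward_linear_wavelet(signal):
    """Split into even samples (coarse) and prediction residuals (details)."""
    even = signal[0::2]
    odd = signal[1::2]
    details = [o - 0.5 * (even[i] + even[min(i + 1, len(even) - 1)])
               for i, o in enumerate(odd)]
    return even, details

def inverse_linear_wavelet(even, details):
    """Reconstruct by interpolating the even samples and adding the details."""
    signal = []
    for i, d in enumerate(details):
        pred = 0.5 * (even[i] + even[min(i + 1, len(even) - 1)])
        signal.extend([even[i], d + pred])
    if len(even) > len(details):  # odd-length input keeps a trailing even sample
        signal.append(even[-1])
    return signal
```

The roundtrip is exact, and on linear data the interior details vanish, which is what makes the representation sparse for smooth functions.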
Demodulation algorithm for optical fiber F-P sensor.
Yang, Huadong; Tong, Xinglin; Cui, Zhang; Deng, Chengwei; Guo, Qian; Hu, Pan
2017-09-10
The demodulation algorithm is very important to improving the measurement accuracy of a sensing system. In this paper, the variable-step-size hill climbing search method is used for the first time in an optical fiber Fabry-Perot (F-P) sensing demodulation algorithm. Compared with the traditional discrete gap transformation demodulation algorithm, the computation is greatly reduced by varying the step size of each climb, which achieves nano-scale resolution, high measurement accuracy, high demodulation rates, and a large dynamic demodulation range. An optical fiber F-P pressure sensor based on a micro-electro-mechanical system (MEMS) was fabricated to carry out the experiment. The results show that the resolution of the algorithm can reach the nano-scale level and that the sensor's sensitivity is about 2.5 nm/kPa, close to the theoretical value; the sensor also shows good reproducibility.
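A generic variable-step-size hill climbing sketch follows. This is not the authors' implementation: the quadratic objective stands in for the interference-spectrum correlation peak, and the initial step and stopping resolution are illustrative.

```python
# Variable-step hill climbing: keep moving in the current direction while the
# objective improves, doubling the step; on a failed move, reverse direction
# and halve the step; stop once the step falls below the target resolution.

def hill_climb(objective, x0, step=100.0, resolution=1e-3):
    x, fx = x0, objective(x0)
    direction = 1.0
    while step > resolution:
        x_new = x + direction * step
        f_new = objective(x_new)
        if f_new > fx:
            x, fx = x_new, f_new
            step *= 2.0        # accelerate while still climbing
        else:
            direction = -direction
            step *= 0.5        # overshoot: turn around with a finer step
    return x
```

Because the step shrinks geometrically near the peak, the search reaches a resolution-limited estimate in far fewer evaluations than a fixed fine-grained scan.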
Enhanced antimicrobial activity in biosynthesized ZnO nanoparticles
NASA Astrophysics Data System (ADS)
Kumari, Niraj; Kumari, Priti; Jha, Anal K.; Prasad, K.
2018-05-01
Biological synthesis of different metallic and/or oxide nanoparticles and their applications especially in agriculture and biomedical sciences are gaining prominence nowadays due to their handy and reproducible synthetic protocols which are cost-effective and eco-friendly. In this work, green synthesis of zinc oxide nanoparticles (ZnO NPs) using the alcoholic extract of Azadirachta indica as a reducing and stabilizing agent has been presented. Formation of ZnO NPs was confirmed by X-ray diffraction, scanning and transmission electron microscopy techniques. The phytochemicals responsible for nano-transformation were principally alkaloids, flavanoids, terpenoids, tannins and organic acids present in the Azadirachta indica leaves. The synthesized ZnO NPs were used for antimicrobial assays by disc diffusion method against Staphylococcus aureus and Candida albicans. Results showed that ZnO NPs may act as antimicrobial agent especially against skin infections.
Unsupervised pattern recognition methods in ciders profiling based on GCE voltammetric signals.
Jakubowska, Małgorzata; Sordoń, Wanda; Ciepiela, Filip
2016-07-15
This work presents a complete methodology for distinguishing between different brands of cider and degrees of ageing, based on voltammetric signals, utilizing dedicated data preprocessing procedures and unsupervised multivariate analysis. It was demonstrated that voltammograms recorded on a glassy carbon electrode in Britton-Robinson buffer at pH 2 are reproducible for each brand. By application of clustering algorithms and principal component analysis, clearly visible homogeneous clusters were obtained. An advanced signal processing strategy, which included automatic baseline correction, interval scaling and a continuous wavelet transform with a dedicated mother wavelet, was a key step in the correct recognition of the objects. The results show that voltammetry combined with optimized univariate and multivariate data processing is a sufficient tool to distinguish between ciders from various brands and to evaluate their freshness. Copyright © 2016 Elsevier Ltd. All rights reserved.
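The principal component step of such a workflow can be sketched with a dependency-free power iteration that extracts the first PCA direction. This is a minimal stand-in for the full multivariate pipeline (no wavelet preprocessing, one component only).

```python
import math

def first_principal_component(rows, n_iter=200):
    """First PCA direction by power iteration on the sample covariance,
    computed as X^T (X v) without ever forming the covariance matrix."""
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    x = [[r[j] - means[j] for j in range(p)] for r in rows]  # mean-center
    v = [1.0] * p
    for _ in range(n_iter):
        s = [sum(xi[j] * v[j] for j in range(p)) for xi in x]       # X v
        w = [sum(x[i][j] * s[i] for i in range(n)) for j in range(p)]  # X^T (X v)
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    return v
```

Projecting each voltammogram onto the leading directions is what produces the cluster plots used for brand recognition.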
Sánchez-Ayala, Alfonso; Vilanova, Larissa Soares Reis; Costa, Marina Abrantes; Farias-Neto, Arcelino
2014-01-01
The aim of this study was to evaluate the reproducibility of the condensation silicone Optosil Comfort® as an artificial test food for masticatory performance evaluation. Twenty dentate subjects with a mean age of 23.3±0.7 years were selected. Masticatory performance was evaluated using the simple (MPI), double (IME) and multiple sieve methods. Trials were carried out five times by three examiners: three times by the first, and once each by the second and third examiners. Friedman's test was used to find differences among time trials. Reproducibility was determined by the intra-class correlation (ICC) test (α=0.05). No differences among time trials were found, except for MPI-4 mm (p=0.022) in the first examiner's results. The intra-examiner reproducibility (ICC) of almost all data was high (ICC≥0.92, p<0.001), being moderate only for MPI-0.50 mm (ICC=0.89, p<0.001). The inter-examiner reproducibility was high (ICC>0.93, p<0.001) for all results. For the multiple sieve method, the average absolute difference between repeated measurements was lower than 1 mm. This trend was observed only from MPI-0.50 to MPI-1.4 for the single sieve method, and from IME-0.71/0.50 to IME-1.40/1.00 for the double sieve method. The results suggest that regardless of the method used, the reproducibility of Optosil Comfort® is high.
Ballanger, Bénédicte; Tremblay, Léon; Sgambato-Faure, Véronique; Beaudoin-Gobert, Maude; Lavenne, Franck; Le Bars, Didier; Costes, Nicolas
2013-08-15
MRI templates and digital atlases are needed for automated and reproducible quantitative analysis of non-human primate PET studies. Segmenting brain images via multiple atlases outperforms single-atlas labelling in humans. We present a set of atlases manually delineated on brain MRI scans of the monkey Macaca fascicularis. We use this multi-atlas dataset to evaluate two automated methods in terms of accuracy, robustness and reliability in segmenting brain structures on MRI and extracting regional PET measures. Twelve individual Macaca fascicularis high-resolution 3DT1 MR images were acquired. Four individual atlases were created by manually drawing 42 anatomical structures, including cortical and sub-cortical structures, white matter regions, and ventricles. To create the MRI template, we first chose one MRI to define a reference space, and then performed a two-step iterative procedure: affine registration of individual MRIs to the reference MRI, followed by averaging of the twelve resampled MRIs. Automated segmentation in native space was obtained in two ways: 1) Maximum probability atlases were created by decision fusion of two to four individual atlases in the reference space, and transformation back into the individual native space (MAXPROB)(.) 2) One to four individual atlases were registered directly to the individual native space, and combined by decision fusion (PROPAG). Accuracy was evaluated by computing the Dice similarity index and the volume difference. The robustness and reproducibility of PET regional measurements obtained via automated segmentation was evaluated on four co-registered MRI/PET datasets, which included test-retest data. Dice indices were always over 0.7 and reached maximal values of 0.9 for PROPAG with all four individual atlases. There was no significant mean volume bias. The standard deviation of the bias decreased significantly when increasing the number of individual atlases. 
MAXPROB performed better when increasing the number of atlases used. When all four atlases were used for the MAXPROB creation, the accuracy of morphometric segmentation approached that of the PROPAG method. PET measures extracted either via automatic methods or via the manually defined regions were strongly correlated, with no significant regional differences between methods. Intra-class correlation coefficients for test-retest data were over 0.87. Compared to single atlas extractions, multi-atlas methods improve the accuracy of region definition. They also perform comparably to manually defined regions for PET quantification. Multiple atlases of Macaca fascicularis brains are now available and allow reproducible and simplified analyses. Copyright © 2013 Elsevier Inc. All rights reserved.
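The Dice similarity index used above to score segmentation accuracy has a one-line definition. A sketch with segmentations represented as sets of voxel coordinates (the 3-D image context is abstracted away):

```python
def dice_index(seg_a, seg_b):
    """Dice similarity index between two binary segmentations given as sets
    of voxel coordinates: 2|A ∩ B| / (|A| + |B|); 1.0 = perfect overlap."""
    a, b = set(seg_a), set(seg_b)
    if not a and not b:
        return 1.0  # convention: two empty segmentations agree perfectly
    return 2.0 * len(a & b) / (len(a) + len(b))
```

Values above 0.7 are conventionally read as good spatial overlap, which matches the threshold the authors report exceeding for all structures.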
Culture: copying, compression, and conventionality.
Tamariz, Mónica; Kirby, Simon
2015-01-01
Through cultural transmission, repeated learning by new individuals transforms cultural information, which tends to become increasingly compressible (Kirby, Cornish, & Smith; Smith, Tamariz, & Kirby). Existing diffusion chain studies include in their design two processes that could be responsible for this tendency: learning (storing patterns in memory) and reproducing (producing the patterns again). This paper manipulates the presence of learning in a simple iterated drawing design experiment. We find that learning seems to be the causal factor behind the increase in compressibility observed in the transmitted information, while reproducing is a source of random heritable innovations. Only a theory invoking these two aspects of cultural learning will be able to explain human culture's fundamental balance between stability and innovation. Copyright © 2014 Cognitive Science Society, Inc.
Testing promotes effector transfer.
Boutin, Arnaud; Panzer, Stefan; Salesse, Robin N; Blandin, Yannick
2012-11-01
The retrieval of information from memory during testing has recently been shown to promote transfer in the verbal domain. Motor-related research, however, has ignored testing as a relevant method to enhance motor transfer. We thus investigated whether testing has the potential to induce generalised motor memories by favouring effector transfer. Participants were required to reproduce a spatial-temporal pattern of elbow extensions and flexions with their dominant right arm. We tested the ability of participants to transfer the original pattern (extrinsic transformation; i.e., goal-based configuration) or the mirrored pattern (intrinsic transformation; i.e., movement-based configuration) to the unpractised non-dominant left arm. To evaluate how testing affects motor transfer at 24-h testing, participants were either administered an initial testing session during early practice (early testing group) or shortly after the end of practice (late testing group; i.e., no alternation between practice and testing sessions). No initial testing session was completed for the control group. We found better effector transfer at 24-h testing for the early testing group for both extrinsic and intrinsic transformations of the movement pattern when compared with the control group, while no testing benefit was observed for the late testing group. This indicates that testing positively affects motor learning, yielding enhanced long-term transfer capabilities. We thus demonstrate the critical role of retrieval practice via testing during the process of motor memory encoding, and provide the conditions under which testing effectively contributes to the generalisation of motor memories. Copyright © 2012 Elsevier B.V. All rights reserved.
Development of delineator testing standard.
DOT National Transportation Integrated Search
2015-02-01
The objective of this project was to develop a new test method for evaluating the impact performance of delineators for given applications. The researchers focused on developing a test method that was reproducible and attempted to reproduce failu...
Comparison of reproducibility of natural head position using two methods.
Khan, Abdul Rahim; Rajesh, R N G; Dinesh, M R; Sanjay, N; Girish, K S; Venkataraghavan, Karthik
2012-01-01
Lateral cephalometric radiographs have become virtually indispensable to orthodontists in the treatment of patients. They are important in orthodontic growth analysis, diagnosis, treatment planning, monitoring of therapy and evaluation of final treatment outcome. The purpose of this study was to evaluate and compare the reproducibility and variability of natural head position obtained using two methods: the mirror method and the fluid level device method. The study included two sets of 40 lateral cephalograms taken using the two methods of obtaining natural head position, (1) the mirror method and (2) the fluid level device method, with a time interval of 2 months. Inclusion criteria: subjects were randomly selected, aged between 18 and 26 years. Exclusion criteria: history of orthodontic treatment; any history of respiratory tract problems or chronic mouth breathing; any congenital deformity; history of traumatically-induced deformity; history of myofascial pain syndrome; any previous history of head and neck surgery. The results showed that the two methods were comparable, but reproducibility was higher with the fluid level device, as shown by Dahlberg's coefficient and the Bland-Altman plot, and variance was lower with the fluid level device method, as shown by precision and Pearson correlation. In conclusion, the mirror method and the fluid level device method were comparable without any significant difference, but the fluid level device method was more reproducible and showed less variance than the mirror method for obtaining natural head position.
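Dahlberg's coefficient and Bland-Altman limits of agreement, the measures used in this comparison, follow standard formulas; a minimal sketch with generic paired repeat measurements (not the study's cephalometric data):

```python
import math

def dahlberg_error(first, second):
    """Dahlberg's formula: sqrt(sum(d_i^2) / (2n)) over paired repeat measurements."""
    assert len(first) == len(second)
    n = len(first)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(first, second)) / (2 * n))

def bland_altman_limits(first, second):
    """Bias (mean difference) and 95% limits of agreement (bias ± 1.96 SD of differences)."""
    diffs = [a - b for a, b in zip(first, second)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

A smaller Dahlberg error and narrower Bland-Altman limits across the two cephalogram sets would indicate the more reproducible head-positioning method.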
NASA Astrophysics Data System (ADS)
Stegen, Ronald; Gassmann, Matthias
2017-04-01
The use of a broad variety of agrochemicals is essential for modern industrialized agriculture. During the last decades, awareness of the side effects of their use has grown, and with it the requirement to reproduce, understand and predict the behaviour of these agrochemicals in the environment, in order to optimize their use and minimize side effects. Modern modelling has made great progress in understanding and predicting the fate of these chemicals with numerical methods. However, while the behaviour of the applied chemicals is often investigated and modelled, most studies simulate only the parent compounds, assuming complete dissipation of the substance. In reality, due to a diversity of chemical, physical and biological processes, the substances are instead transformed into new chemicals, which are themselves transformed until, at the end of the chain, the substance is completely mineralized. During this process, the fate of each transformation product is determined by its own environmental characteristics, and the pathways and products of transformation can differ markedly by substance and by environmental conditions, which may vary between compartments of the same site. Simulating transformation products introduces additional model uncertainties, so the calibration effort increases compared to simulating the transport and degradation of the parent substance alone. The simulation of the necessary physical processes also requires substantial calculation time. As a consequence, few physically-based models offer the possibility to simulate transformation products at all, and mostly at the field scale. The few models available for the catchment scale are not optimized for this task, i.e. they can simulate only a single parent compound and up to two transformation products. Thus, for simulations spanning large physico-chemical parameter spaces, the calculation time of the underlying hydrological model diminishes the overall performance.
In this study, the structure of the model ZIN-AGRITRA was re-designed for the transport and transformation of an unlimited number of agrochemicals in the soil-water-plant system at catchment scale. The focus is, besides a sound hydrological basis, on a flexible representation of transformation processes and on optimization for large numbers of different substances. The new design achieves a reduction of the calculation time per tested substance, allowing faster testing of parameter spaces. Additionally, the new concept allows for different transformation processes and products in different environmental compartments. A first test of the calculation time improvements and the flexible transformation pathways was performed in a Mediterranean meso-scale catchment, using the insecticide chlorpyrifos and two of its transformation products, which emerge from different transformation processes, as test substances.
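The transformation chains described above can be sketched as coupled first-order kinetics; this is a generic illustration with invented rate constants, not the process representation of ZIN-AGRITRA or fitted chlorpyrifos parameters:

```python
# Hypothetical chain: parent -> TP1 -> TP2 -> mineralized, each step first-order.
# Assumes full yield transfer at each step; real models use formation fractions
# and compartment-specific rates. Forward-Euler integration for simplicity.
def simulate_chain(c0, k_parent, k_tp1, k_tp2, dt=0.01, t_end=50.0):
    parent, tp1, tp2 = c0, 0.0, 0.0
    t = 0.0
    while t < t_end:
        d_parent = -k_parent * parent
        d_tp1 = k_parent * parent - k_tp1 * tp1
        d_tp2 = k_tp1 * tp1 - k_tp2 * tp2
        parent += d_parent * dt
        tp1 += d_tp1 * dt
        tp2 += d_tp2 * dt
        t += dt
    return parent, tp1, tp2
```

Because each transformation product carries its own rate constant, the products can persist long after the parent has degraded, which is why simulating the parent alone underestimates environmental exposure.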
Ouadah, S.; Stayman, J. W.; Gang, G.; Uneri, A.; Ehtiati, T.; Siewerdsen, J. H.
2015-01-01
Purpose: Robotic C-arm systems are capable of general noncircular orbits whose trajectories can be driven by the particular imaging task. However, obtaining accurate calibrations for reconstruction in such geometries can be a challenging problem. This work proposes a method to perform a unique geometric calibration of an arbitrary C-arm orbit by registering 2D projections to a previously acquired 3D image to determine the transformation parameters representing the system geometry. Methods: Experiments involved a cone-beam CT (CBCT) bench system, a robotic C-arm, and three phantoms. A robust 3D-2D registration process was used to compute the nine-degree-of-freedom (9-DOF) transformation between each projection and an existing 3D image by maximizing normalized gradient information with a digitally reconstructed radiograph (DRR) of the 3D volume. The quality of the resulting “self-calibration” was evaluated in terms of agreement with an established calibration method using a BB phantom as well as image quality in the resulting CBCT reconstruction. Results: The self-calibration yielded CBCT images without significant difference in spatial resolution from the standard (“true”) calibration methods (p > 0.05 for all three phantoms), and the differences between CBCT images reconstructed using the “self” and “true” calibration methods were on the order of 10⁻³ mm⁻¹. Maximum error in magnification was 3.2%, and back-projection ray placement was within 0.5 mm. Conclusion: The proposed geometric “self” calibration provides a means for 3D imaging on general noncircular orbits in CBCT systems for which a geometric calibration is either not available or not reproducible. The method forms the basis of advanced “task-based” 3D imaging methods now in development for robotic C-arms. PMID:26388661
Saha, Punam K; Xu, Yan; Duan, Hong; Heiner, Anneliese; Liang, Guoyuan
2010-11-01
Trabecular bone (TB) is a complex quasi-random network of interconnected plates and rods. TB constantly remodels to adapt to the stresses to which it is subjected (Wolff's law). In osteoporosis, this dynamic equilibrium between bone formation and resorption is perturbed, leading to bone loss and structural deterioration; both increase fracture risk. Bone's mechanical behavior can only be partially explained by variations in bone mineral density, which led to the notion of bone structural quality. Previously, we developed digital topological analysis (DTA), which classifies plates, rods, profiles, edges, and junctions in a TB skeletal representation. Although the method has become quite popular, a major limitation of DTA is that it provides only hard classifications of different topological entities, failing to distinguish between narrow and wide plates. Here, we present a new method called volumetric topological analysis (VTA) for regional quantification of TB topology. At each TB location, the method uniquely classifies its topology on the continuum between perfect plates and perfect rods, facilitating early detection of TB alterations from plates to rods according to the known etiology of osteoporotic bone loss. Several new ideas, including manifold distance transform, manifold scale, and feature propagation, have been introduced here and combined with existing DTA and distance transform methods, leading to the new VTA technology. This method has been applied to multidetector computed tomography (CT) and micro-computed tomography (μCT) images of four cadaveric distal tibia and five distal radius specimens. Both intra- and inter-modality reproducibility of the method has been examined using repeat CT and μCT scans of the distal tibia specimens. Also, the method's ability to predict experimental biomechanical properties of TB via CT imaging under in vivo conditions has been quantitatively examined, and the results are very encouraging.
Palacios, Cristina; Trak, Maria Angelica; Betancourt, Jesmari; Joshipura, Kaumudi; Tucker, Katherine L
2015-10-01
We aimed to assess the relative validity and reproducibility of a semi-quantitative FFQ in Puerto Rican adults. Participants completed an FFQ, followed by a 6 d food record and a second administration of the FFQ 30 d later. All nutrients were log transformed and adjusted for energy intake. Statistical analyses included correlations, paired t tests, cross-classification and Bland-Altman plots. Setting: Medical Sciences Campus, University of Puerto Rico. Convenience sample of students, employees and faculty members (n 100, ≥21 years). Data were collected in 2010. A total of ninety-two participants completed the study. Most were young overweight females. All nutrients were significantly correlated between the two FFQ administrations, with an average correlation of 0·61 (range 0·43-0·73) and an average difference of 4·8 % between them. Most energy-adjusted nutrients showed significant correlations between the FFQ and food record, which improved with de-attenuation and averaged 0·38 (range 0·11-0·63). The lowest, non-significant correlations (≤0·20) were for trans-fat, n-3 fatty acids, thiamin and vitamin E. Intakes assessed by the FFQ were higher than those from the food record by a mean of 19 % (range 4-44 %). Bland-Altman plots showed a systematic trend towards higher estimates with the FFQ, particularly for energy, carbohydrate and Ca. Most participants were correctly classified into the same or adjacent quintile (average 66 %) by both methods, with only 3 % gross misclassification. This semi-quantitative FFQ offers relatively valid and reproducible estimates of energy and certain nutrient intakes in this group of mostly female Puerto Ricans.
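The de-attenuation mentioned here is commonly performed with the Rosner-Willett correction, which inflates an observed FFQ-record correlation by the within-person variability of the replicate food records. A sketch with illustrative variance components (not values from this study):

```python
import math

def deattenuate(r_observed, within_var, between_var, n_replicates):
    """Rosner-Willett de-attenuation: r_true ≈ r_obs * sqrt(1 + (s_w^2 / s_b^2) / n),
    where n is the number of replicate record days per participant."""
    lam = within_var / between_var  # within- to between-person variance ratio
    return r_observed * math.sqrt(1 + lam / n_replicates)
```

For example, with a 6 d food record and a within/between variance ratio of 2, an observed correlation of 0.30 corrects to about 0.35, consistent in direction with the improvement the abstract reports.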
Morris, Olivia; McMahon, Adam; Boutin, Herve; Grigg, Julian; Prenant, Christian
2016-06-15
[18F]Fluoroacetaldehyde is a biocompatible prosthetic group that has been implemented pre-clinically using a semi-automated remotely controlled system. Automation of radiosyntheses permits use of higher levels of [18F]fluoride whilst minimising radiochemist exposure and enhancing reproducibility. In order to achieve full automation of [18F]fluoroacetaldehyde peptide radiolabelling, a customised GE Tracerlab FX-FN with a fully programmed automated synthesis was developed. The automated synthesis of [18F]fluoroacetaldehyde is carried out using a commercially available precursor, with reproducible yields of 26% ± 3 (decay-corrected, n = 10) within 45 min. Fully automated radiolabelling of a protein, recombinant human interleukin-1 receptor antagonist (rhIL-1RA), with [18F]fluoroacetaldehyde was achieved within 2 h. Radiolabelling efficiency of rhIL-1RA with [18F]fluoroacetaldehyde was confirmed using HPLC and reached 20% ± 10 (n = 5). Overall RCY of [18F]rhIL-1RA was 5% ± 2 (decay-corrected, n = 5) within 2 h starting from 35-40 GBq of [18F]fluoride. Specific activity measurements of 8.11-13.5 GBq/µmol were attained (n = 5), a near three-fold improvement over those achieved using the semi-automated approach. The strategy can be applied to radiolabelling a range of peptides and proteins with [18F]fluoroacetaldehyde analogous to other aldehyde-bearing prosthetic groups, yet automation of the method provides reproducibility, thereby aiding translation to Good Manufacturing Practice manufacture and the transformation from pre-clinical to clinical production. Copyright © 2016 The Authors. Journal of Labelled Compounds and Radiopharmaceuticals published by John Wiley & Sons, Ltd.
Takács, Péter
2016-01-01
We compared the repeatability, reproducibility (intra- and inter-measurer similarity), separative power and subjectivity (measurer effect on results) of four morphometric methods frequently used in ichthyological research: the “traditional” caliper-based (TRA) and truss-network (TRU) distance methods, and two geometric methods that compare landmark coordinates on the body (GMB) and scales (GMS). In each case, measurements were performed three times by three measurers on the same specimens of three common cyprinid species (roach Rutilus rutilus (Linnaeus, 1758), bleak Alburnus alburnus (Linnaeus, 1758) and Prussian carp Carassius gibelio (Bloch, 1782)) collected from three closely-situated sites in the Lake Balaton catchment (Hungary) in 2014. TRA measurements were made on conserved specimens using a digital caliper, while TRU, GMB and GMS measurements were undertaken on digital images of the bodies and scales. In most cases, intra-measurer repeatability was similar. While all four methods were able to differentiate the source populations, significant differences were observed in their repeatability, reproducibility and subjectivity. GMB displayed the highest overall repeatability and reproducibility and was least burdened by measurer effect. While GMS showed similar repeatability to GMB when fish scales had a characteristic shape, it showed significantly lower reproducibility (compared with its repeatability) for each species than the other methods. TRU showed similar repeatability to GMS. TRA was the least applicable method, as measurements were obtained from the fish itself, resulting in poor repeatability and reproducibility. Although all four methods showed some degree of subjectivity, TRA was the only method where population-level separation was entirely overwritten by measurer effect. Based on these results, we recommend (a) avoiding the aggregation of different measurers' datasets when using the TRA and GMS methods; and (b) the use of image-based methods for morphometric surveys. Automation of the morphometric workflow would also reduce any measurer effect and eliminate measurement and data-input errors. PMID:27327896
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Zhou; Adams, Rachel M; Chourey, Karuna
2012-01-01
A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, the LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially in quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification, with isobaric chemical labeling surpassing metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.
Yang, Hyeri; Na, Jihye; Jang, Won-Hee; Jung, Mi-Sook; Jeon, Jun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Lim, Kyung-Min; Bae, SeungJin
2015-05-05
The mouse local lymph node assay (LLNA, OECD TG429) is an alternative test replacing the conventional guinea pig tests (OECD TG406) for skin sensitization testing, but the use of a radioisotopic agent, 3H-thymidine, deters its active dissemination. A new non-radioisotopic LLNA, LLNA:BrdU-FCM, employs a non-radioisotopic analog, 5-bromo-2'-deoxyuridine (BrdU), and flow cytometry. For an analogous method, the OECD TG429 performance standard (PS) advises that two reference compounds be tested repeatedly and that the ECt (threshold) values obtained fall within acceptable ranges to prove within- and between-laboratory reproducibility. However, these criteria are somewhat arbitrary and the sample size for ECt is less than 5, raising concerns about insufficient reliability. Here, we explored various statistical methods to evaluate the reproducibility of LLNA:BrdU-FCM with the stimulation index (SI), the raw data for ECt calculation, produced by 3 laboratories. Descriptive statistics along with graphical representation of SI were presented. For inferential statistics, parametric and non-parametric methods were applied to test the reproducibility of the SI of a concurrent positive control, and the robustness of the results was investigated. Descriptive statistics and graphical representation of SI alone could illustrate the within- and between-laboratory reproducibility. Inferential statistics employing parametric and nonparametric methods drew similar conclusions. While all labs passed the within- and between-laboratory reproducibility criteria given by the OECD TG429 PS based on ECt values, statistical evaluation based on SI values showed that only two labs succeeded in achieving within-laboratory reproducibility. For the two labs that satisfied within-lab reproducibility, between-laboratory reproducibility could also be attained based on inferential as well as descriptive statistics. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Two-Lens, Anamorphic, Brewster-Angle, Fourier-Transform Relay
NASA Astrophysics Data System (ADS)
Berggren, Ralph R.
1987-06-01
A two-lens system provides a simple and versatile means to relay a laser beam. The pair of lenses can provide true volume imaging, reproducing both amplitude and phase of the input beam. By using cylindrical lenses it is possible to change the aspect ratio of the beam. By adjusting the cylindrical curvatures, it is possible to minimize reflections by tilting the lenses at the Brewster angle.
ERIC Educational Resources Information Center
Desjardins, Richard
2015-01-01
This article briefly reviews the evolving role of major institutions thought to form, reproduce and transform individual as well as collective identities and values, with an emphasis on the impact of state vs market forces via educational systems. This is accompanied by a discussion of various pressures against the state to exert social control on…
The synthesis of active pharmaceutical ingredients (APIs) using continuous flow chemistry
2015-01-01
The implementation of continuous flow processing as a key enabling technology has transformed the way we conduct chemistry and has expanded our synthetic capabilities. As a result many new preparative routes have been designed towards commercially relevant drug compounds achieving more efficient and reproducible manufacture. This review article aims to illustrate the holistic systems approach and diverse applications of flow chemistry to the preparation of pharmaceutically active molecules, demonstrating the value of this strategy towards every aspect ranging from synthesis, in-line analysis and purification to final formulation and tableting. Although this review will primarily concentrate on large scale continuous processing, additional selected syntheses using micro or meso-scaled flow reactors will be exemplified for key transformations and process control. It is hoped that the reader will gain an appreciation of the innovative technology and transformational nature that flow chemistry can leverage to an overall process. PMID:26425178
NASA Astrophysics Data System (ADS)
Marqués, Diego; Nuñez, Carmen A.
2015-10-01
We construct an O(d,d) invariant universal formulation of the first-order α'-corrections of the string effective actions involving the dilaton, metric and two-form fields. Two free parameters interpolate between four-derivative terms that are even and odd with respect to a Z2-parity transformation that changes the sign of the two-form field. The Z2-symmetric model reproduces the closed bosonic string, and the heterotic string effective action is obtained through a Z2-parity-breaking choice of parameters. The theory is an extension of the generalized frame formulation of Double Field Theory, in which the gauge transformations are deformed by a first-order generalized Green-Schwarz transformation. This deformation defines a duality covariant gauge principle that requires and fixes the four-derivative terms. We discuss the O(d,d) structure of the theory and the (non-)covariance of the required field redefinitions.
Effect of Slag Composition on the Crystallization Kinetics of Synthetic CaO-SiO2-Al2O3-MgO Slags
NASA Astrophysics Data System (ADS)
Esfahani, Shaghayegh; Barati, Mansoor
2018-04-01
The crystallization kinetics of CaO-SiO2-Al2O3-MgO (CSAM) slags was studied with the aid of the single hot thermocouple technique (SHTT). Kinetic parameters such as the Avrami exponent (n), rate coefficient (K), and effective activation energy of crystallization (E_A) were obtained by kinetic analysis of data from in situ observation of the glassy-to-crystalline transformation and image analysis. Also, the nucleation and growth rates of the crystalline phases were quantified as functions of time, temperature, and slag basicity. Together with observations of the crystallization front, these results facilitated establishing the dominant mechanisms of crystallization. In an attempt to predict the crystallization rate under non-isothermal conditions, a mathematical model was developed that employs the rate data of the isothermal transformation. The model was validated by reproducing an experimental continuous cooling transformation diagram purely from isothermal data.
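The Avrami analysis named here follows the standard JMAK form X(t) = 1 - exp(-K t^n), with n and K recoverable from the linearized plot ln(-ln(1-X)) versus ln t. A sketch with illustrative parameter values, not the fitted slag kinetics of this study:

```python
import math

def avrami_fraction(t, K, n):
    """Avrami (JMAK) equation: crystallized fraction X(t) = 1 - exp(-K * t**n)."""
    return 1.0 - math.exp(-K * t ** n)

def fit_avrami(times, fractions):
    """Estimate n and K by least squares on the linearized form
    ln(-ln(1 - X)) = ln K + n * ln t."""
    xs = [math.log(t) for t in times]
    ys = [math.log(-math.log(1.0 - X)) for X in fractions]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    n = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    lnK = my - n * mx
    return n, math.exp(lnK)
```

The exponent n obtained this way is what distinguishes candidate nucleation-and-growth mechanisms, which is how such fits feed the mechanistic conclusions the abstract describes.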
The synthesis of active pharmaceutical ingredients (APIs) using continuous flow chemistry.
Baumann, Marcus; Baxendale, Ian R
2015-01-01
Glass polymorphism in amorphous germanium probed by first-principles computer simulations
NASA Astrophysics Data System (ADS)
Mancini, G.; Celino, M.; Iesari, F.; Di Cicco, A.
2016-01-01
The low-density amorphous (LDA) to high-density amorphous (HDA) transformation in amorphous Ge at high pressure is studied by first-principles molecular dynamics simulations in the framework of density functional theory. Previous experiments are accurately reproduced, including the presence of a well-defined LDA-HDA transition above 8 GPa. The LDA-HDA density increase is found to be about 14%. Pair and bond-angle distributions are obtained in the 0-16 GPa pressure range and allowed a detailed analysis of the transition. The local fourfold coordination is transformed into an average sixfold HDA coordination associated with different local geometries, as confirmed by coordination number analysis and the shape of the bond-angle distributions.
Method for reproducibly preparing a low-melting high-carbon yield precursor
Smith, Wesley E.; Napier, Jr., Bradley
1978-01-01
The present invention is directed to a method for preparing a reproducible synthetic carbon precursor by the autoclave polymerization of indene (C9H8) at a temperature in the range of 470-485 °C and at a pressure in the range of about 1000 to about 4300 psi. Volatiles in the resulting liquid indene polymer are removed by vacuum outgassing to form a solid carbon precursor characterized by a relatively low melting temperature, high carbon yield, and high reproducibility, which provide for the fabrication of carbon and graphite composites having strict requirements for reproducible properties.
An open science peer review oath.
Aleksic, Jelena; Alexa, Adrian; Attwood, Teresa K; Chue Hong, Neil; Dahlö, Martin; Davey, Robert; Dinkel, Holger; Förstner, Konrad U; Grigorov, Ivo; Hériché, Jean-Karim; Lahti, Leo; MacLean, Dan; Markie, Michael L; Molloy, Jenny; Schneider, Maria Victoria; Scott, Camille; Smith-Unna, Richard; Vieira, Bruno Miguel
2014-01-01
One of the foundations of the scientific method is to be able to reproduce experiments and corroborate the results of research that has been done before. However, with the increasing complexities of new technologies and techniques, coupled with the specialisation of experiments, reproducing research findings has become a growing challenge. Clearly, scientific methods must be conveyed succinctly, and with clarity and rigour, in order for research to be reproducible. Here, we propose steps to help increase the transparency of the scientific method and the reproducibility of research results: specifically, we introduce a peer-review oath and accompanying manifesto. These have been designed to offer guidelines to enable reviewers (with the minimum friction or bias) to follow and apply open science principles, and support the ideas of transparency, reproducibility and ultimately greater societal impact. Introducing the oath and manifesto at the stage of peer review will help to check that the research being published includes everything that other researchers would need to successfully repeat the work. Peer review is the lynchpin of the publishing system: encouraging the community to consciously (and conscientiously) uphold these principles should help to improve published papers, increase confidence in the reproducibility of the work and, ultimately, provide strategic benefits to authors and their institutions.
NASA Astrophysics Data System (ADS)
Lee, Taesam
2018-05-01
Multisite stochastic simulations of daily precipitation have been widely employed in hydrologic analyses for climate change assessment and as agricultural model inputs. Recently, a copula model with a gamma marginal distribution has become one of the common approaches for simulating precipitation at multiple sites. Here, we tested the correlation structure of the copula modeling. The results indicate that there is a significant underestimation of the correlation in the simulated data compared to the observed data. Therefore, we proposed an indirect method for estimating the cross-correlations when simulating precipitation at multiple stations, using the full relationship between the correlation of the observed data and that of the normally transformed data. Although this indirect method offers certain improvements in preserving the cross-correlations between sites in the original domain, the method was not reliable in application. Therefore, we further improved a simulation-based method (SBM) that was originally developed to model multisite precipitation occurrence. The SBM preserved the cross-correlations of the original domain well, yielding cross-correlations around 0.2 higher than the direct method and around 0.1 higher than the indirect method. The three models were applied to stations in the Nakdong River basin, and the SBM was the best alternative for reproducing the historical cross-correlations. The direct method significantly underestimates the correlations among the observed data, and the indirect method appeared to be unreliable.
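The underestimation the abstract describes can be demonstrated directly: when correlated normals are pushed through a Gaussian copula onto skewed gamma marginals, the Pearson correlation in the original (gamma) domain falls below the normal-domain correlation. This is a minimal sketch with illustrative gamma parameters, not values fitted to precipitation data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def gamma_copula_sample(rho, shape=0.7, scale=8.0, size=100_000):
    """Correlated gamma pairs from a Gaussian copula with normal-domain correlation rho."""
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=size)
    u = stats.norm.cdf(z)                            # transform to uniform marginals
    return stats.gamma.ppf(u, a=shape, scale=scale)  # then to skewed gamma marginals

x = gamma_copula_sample(0.6)
r_gamma = np.corrcoef(x[:, 0], x[:, 1])[0, 1]
# r_gamma falls below the normal-domain value of 0.6: the attenuation that
# motivates correcting, or indirectly re-estimating, the cross-correlations.
```

By Lancaster's theorem, no monotone transform of a bivariate normal can exceed the underlying correlation, so some attenuation is unavoidable with this construction; correction methods compensate for it.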
Chemical Topic Modeling: Exploring Molecular Data Sets Using a Common Text-Mining Approach.
Schneider, Nadine; Fechner, Nikolas; Landrum, Gregory A; Stiefl, Nikolaus
2017-08-28
Big data is one of the key transformative factors which increasingly influences all aspects of modern life. Although this transformation brings vast opportunities it also generates novel challenges, not the least of which is organizing and searching this data deluge. The field of medicinal chemistry is not different: more and more data are being generated, for instance, by technologies such as DNA encoded libraries, peptide libraries, text mining of large literature corpora, and new in silico enumeration methods. Handling those huge sets of molecules effectively is quite challenging and requires compromises that often come at the expense of the interpretability of the results. In order to find an intuitive and meaningful approach to organizing large molecular data sets, we adopted a probabilistic framework called "topic modeling" from the text-mining field. Here we present the first chemistry-related implementation of this method, which allows large molecule sets to be assigned to "chemical topics" and investigating the relationships between those. In this first study, we thoroughly evaluate this novel method in different experiments and discuss both its disadvantages and advantages. We show very promising results in reproducing human-assigned concepts using the approach to identify and retrieve chemical series from sets of molecules. We have also created an intuitive visualization of the chemical topics output by the algorithm. This is a huge benefit compared to other unsupervised machine-learning methods, like clustering, which are commonly used to group sets of molecules. Finally, we applied the new method to the 1.6 million molecules of the ChEMBL22 data set to test its robustness and efficiency. In about 1 h we built a 100-topic model of this large data set in which we could identify interesting topics like "proteins", "DNA", or "steroids". 
Along with this publication we provide our data sets and an open-source implementation of the new method (CheTo) which will be part of an upcoming version of the open-source cheminformatics toolkit RDKit.
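The workflow described above (each molecule as a "document" whose "words" are substructure fragments, with LDA grouping them into chemical topics) can be sketched with generic tools. The fragment tokens below are invented placeholders, not the Morgan-fragment vocabulary CheTo actually uses:

```python
# Toy analogue of chemical topic modeling: four "molecules" as fragment-token
# documents, two hypothetical chemical series, a 2-topic LDA model.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "phenyl amide amide sulfonamide",    # hypothetical series A
    "phenyl amide sulfonamide phenyl",
    "steroid hydroxyl steroid ketone",   # hypothetical series B
    "steroid ketone hydroxyl steroid",
]
counts = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
theta = lda.transform(counts)  # per-molecule topic distribution, rows sum to 1
```

Inspecting the highest-weight fragment "words" per topic is what yields the interpretable labels ("proteins", "steroids") mentioned in the abstract, an interpretability advantage over hard clustering.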
Cho, M-J; Yano, H; Okamoto, D; Kim, H-K; Jung, H-R; Newcomb, K; Le, V K; Yoo, H S; Langham, R; Buchanan, B B; Lemaux, P G
2004-02-01
A highly efficient and reproducible transformation system for rice (Oryza sativa L. cv. Taipei 309) was developed using microprojectile bombardment of highly regenerative, green tissues. These tissues were induced from mature seeds on NB-based medium containing 2,4-dichlorophenoxyacetic acid (2,4-D), 6-benzylaminopurine (BAP) and high concentrations of cupric sulfate under dim light conditions; germinating shoots and roots were completely removed. Highly regenerative, green tissues were proliferated on the same medium and used as transformation targets. From 431 explants bombarded with transgenes [i.e. a hygromycin phosphotransferase (hpt) gene plus one of a wheat thioredoxin h (wtrxh), a barley NADP-thioredoxin reductase (bntr), a maize Mutator transposable element (mudrB) or beta-glucuronidase (uidA; gus) gene], 28 independent transgenic events were obtained after an 8- to 12-week selection period, giving a 6.5% transformation frequency. Of the 28 independent events, 17 (61%) were regenerable. Co-transformation of the second introduced transgene was detected in 81% of the transgenic lines tested. Stable integration and expression of the foreign genes in T(0) plants and T(1) progeny were confirmed by DNA hybridization, western blot analyses and germination tests.
Program Correctness, Verification and Testing for Exascale (Corvette)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Koushik; Iancu, Costin; Demmel, James W
The goal of this project is to provide tools to assess the correctness of parallel programs written using hybrid parallelism. There is a dire lack of both theoretical and engineering know-how in the area of finding bugs in hybrid or large scale parallel programs, which our research aims to change. In the project we have demonstrated novel approaches in several areas: 1. Low overhead automated and precise detection of concurrency bugs at scale. 2. Using low overhead bug detection tools to guide speculative program transformations for performance. 3. Techniques to reduce the concurrency required to reproduce a bug using partial program restart/replay. 4. Techniques to provide reproducible execution of floating point programs. 5. Techniques for tuning the floating point precision used in codes.
Skills-Based Learning for Reproducible Expertise: Looking Elsewhere for Guidance
ERIC Educational Resources Information Center
Roessger, Kevin M.
2016-01-01
Despite the prevalence of adult skills-based learning, adult education researchers continue to ignore effective interdisciplinary skills-based methods. Prominent researchers dismiss empirically supported teaching guidelines, preferring situational, emancipatory methods with no demonstrable effect on skilled performance or reproducible expertise.…
Quantitative fiber-optic Raman spectroscopy for tissue Raman measurements
NASA Astrophysics Data System (ADS)
Duraipandian, Shiyamala; Bergholt, Mads; Zheng, Wei; Huang, Zhiwei
2014-03-01
Molecular profiling of tissue using near-infrared (NIR) Raman spectroscopy has shown great promise for in vivo detection and prognostication of cancer. The Raman spectra measured from tissue generally contain fundamental information about the absolute biomolecular concentrations in tissue and their changes associated with disease transformation. However, producing analogous tissue Raman spectra presents a great technical challenge. In this preliminary study, we propose a method to ensure reproducible tissue Raman measurements and validate it with in vivo Raman spectra (n=150) of the inner lip acquired using different laser powers (i.e., 30 and 60 mW). A rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe was utilized for tissue Raman measurements. The results showed that the variations between the spectra measured with different laser powers are almost negligible, facilitating the quantitative analysis of tissue Raman measurements in vivo.
von Aderkas, Eleanor L; Barsan, Mirela M; Gilson, Denis F R; Butler, Ian S
2010-12-01
Fourier-transform photoacoustic infrared (PAIR) spectroscopy has been used in the analysis of 12 inorganic pigments commonly in use by artists today, viz., cobalt blue, ultramarine blue, Prussian blue, azurite, malachite, chromium oxide, viridian, cadmium yellow, chrome yellow, iron oxide, yellow ochre and Mars orange. The authenticity of these 12 commercial pigments was first established by recording their Raman spectra. The subsequent PAIR spectra were highly reproducible and matched well in the mid-IR region with previously published data for these pigments. A number of additional overtone and combination bands were also detected that will prove useful in the identification of the pigments in the future. The PAIR technique is a promising and reliable method for the analysis of inorganic pigments, especially since it involves much simpler preparation than is required for conventional IR measurements. Copyright © 2010. Published by Elsevier B.V.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amiot, Fabien; Roger, Jean Paul
2006-10-20
We propose to use a Nomarski imaging interferometer to measure the out-of-plane displacement field of micro-electro-mechanical systems. It is shown that the measured optical phase arises from both height and slope gradients. By using four integrating buckets, a more efficient approach to unwrap the measured phase is presented, thus making the method well suited for highly curved objects. Slope and height effects are then decoupled by expanding the displacement field on a basis of functions, and the inverse transformation is applied to obtain a displacement field from the measured change in the optical phase map under mechanical loading. A measurement reproducibility of approximately 10 pm is achieved, and typical results are shown on a microcantilever under thermal actuation, thereby proving the ability of such a setup to provide a reliable full-field kinematic measurement without surface modification.
Malmström, Erik; Kilsgård, Ola; Hauri, Simon; Smeds, Emanuel; Herwald, Heiko; Malmström, Lars; Malmström, Johan
2016-01-01
The plasma proteome is highly dynamic and variable, composed of proteins derived from surrounding tissues and cells. To investigate the complex processes that control the composition of the plasma proteome, we developed a mass spectrometry-based proteomics strategy to infer the origin of proteins detected in murine plasma. The strategy relies on the construction of a comprehensive protein tissue atlas from cells and highly vascularized organs using shotgun mass spectrometry. The protein tissue atlas was transformed to a spectral library for highly reproducible quantification of tissue-specific proteins directly in plasma using SWATH-like data-independent mass spectrometry analysis. We show that the method can determine drastic changes of tissue-specific protein profiles in blood plasma from mouse animal models with sepsis. The strategy can be extended to several other species advancing our understanding of the complex processes that contribute to the plasma proteome dynamics. PMID:26732734
Automated Tumor Volumetry Using Computer-Aided Image Segmentation
Bilello, Michel; Sadaghiani, Mohammed Salehi; Akbari, Hamed; Atthiah, Mark A.; Ali, Zarina S.; Da, Xiao; Zhan, Yiqang; O'Rourke, Donald; Grady, Sean M.; Davatzikos, Christos
2015-01-01
Rationale and Objectives Accurate segmentation of brain tumors, and quantification of tumor volume, is important for diagnosis, monitoring, and planning therapeutic intervention. Manual segmentation is not widely used because of time constraints. Previous efforts have mainly produced methods that are tailored to a particular type of tumor or acquisition protocol and have mostly failed to produce a method that functions on different tumor types and is robust to changes in scanning parameters, resolution, and image quality, thereby limiting their clinical value. Herein, we present a semiautomatic method for tumor segmentation that is fast, accurate, and robust to a wide variation in image quality and resolution. Materials and Methods A semiautomatic segmentation method based on the geodesic distance transform was developed and validated by using it to segment 54 brain tumors. Glioblastomas, meningiomas, and brain metastases were segmented. Qualitative validation was based on physician ratings provided by three clinical experts. Quantitative validation was based on comparing semiautomatic and manual segmentations. Results Tumor segmentations obtained using manual and automatic methods were compared quantitatively using the Dice measure of overlap. Subjective evaluation was performed by having human experts rate the computerized segmentations on a 0–5 rating scale where 5 indicated perfect segmentation. Conclusions The proposed method addresses a significant, unmet need in the field of neuro-oncology. Specifically, this method enables clinicians to obtain accurate and reproducible tumor volumes without the need for manual segmentation. PMID:25770633
Grams, Samantha Torres; Kimoto, Karen Yumi Mota; Azevedo, Elen Moda de Oliveira; Lança, Marina; Albuquerque, André Luis Pereira de; Brito, Christina May Moran de; Yamaguti, Wellington Pereira
2015-01-01
Maximal Inspiratory Pressure (MIP) is considered an effective method to estimate strength of inspiratory muscles, but still leads to false positive diagnosis. Although MIP assessment with the unidirectional expiratory valve method has been used in patients undergoing mechanical ventilation, no previous studies have investigated the application of this method in subjects with spontaneous breathing without an artificial airway. This study aimed to compare the MIP values assessed by the standard method (MIPsta) and by the unidirectional expiratory valve method (MIPuni) in subjects with spontaneous breathing without an artificial airway. MIPuni reproducibility was also evaluated. This was a crossover design study, and 31 subjects performed MIPsta and MIPuni in random order. MIPsta measured MIP while maintaining negative pressure for at least one second after forceful expiration. MIPuni evaluated MIP using a unidirectional expiratory valve attached to a face mask and was conducted by two evaluators (A and B) at two moments (Tests 1 and 2) to determine interobserver and intraobserver reproducibility of MIP values. The intraclass correlation coefficient (ICC[2,1]) was used to determine intraobserver and interobserver reproducibility. The mean values for MIPuni were 14.3% higher (-117.3 ± 24.8 cmH2O) than the mean values for MIPsta (-102.5 ± 23.9 cmH2O) (p<0.001). Interobserver reproducibility assessment showed very high correlation for Test 1 (ICC[2,1] = 0.91) and high correlation for Test 2 (ICC[2,1] = 0.88). Assessment of intraobserver reproducibility showed high correlation for evaluator A (ICC[2,1] = 0.86) and evaluator B (ICC[2,1] = 0.77). MIPuni presented higher values than MIPsta and proved to be reproducible in subjects with spontaneous breathing without an artificial airway.
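The ICC(2,1) agreement statistic reported above can be computed from a subjects-by-raters matrix via the usual two-way ANOVA decomposition. A minimal sketch following the Shrout and Fleiss formulation (the example ratings are invented for illustration, not taken from the study):

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater
    (Shrout & Fleiss). `ratings` is an (n subjects x k raters) matrix."""
    r = np.asarray(ratings, dtype=float)
    n, k = r.shape
    grand = r.mean()
    ss_rows = k * ((r.mean(axis=1) - grand) ** 2).sum()    # between subjects
    ss_cols = n * ((r.mean(axis=0) - grand) ** 2).sum()    # between raters
    ss_err = ((r - grand) ** 2).sum() - ss_rows - ss_cols  # residual
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Two evaluators in perfect agreement yield ICC(2,1) = 1.
perfect = [[100.0, 100.0], [110.0, 110.0], [120.0, 120.0], [90.0, 90.0]]
```

Near-identical ratings with small disagreements give values just below 1, matching the high ICCs reported in the abstract.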
Discrete fourier transform (DFT) analysis for applications using iterative transform methods
NASA Technical Reports Server (NTRS)
Dean, Bruce H. (Inventor)
2012-01-01
According to various embodiments, a method is provided for determining aberration data for an optical system. The method comprises collecting a data signal and generating a pre-transformation algorithm. The data are pre-transformed by multiplying them with the pre-transformation algorithm. A discrete Fourier transform of the pre-transformed data is performed in an iterative loop. The method further comprises back-transforming the data to generate aberration data.
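The overall shape of such an iterative-transform loop resembles Gerchberg-Saxton-style phase retrieval; a sketch under that assumption is below. The circular pupil, the quadratic "aberration", and the defocus-like pre-transformation phase factor are illustrative choices of ours, not the patented algorithm.

```python
import numpy as np

n = 64
yy, xx = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
pupil = (xx**2 + yy**2) <= 1.0                      # circular aperture support
true_phase = 0.5 * (xx**2 - yy**2) * pupil          # "aberration" to recover
measured = np.abs(np.fft.fft2(pupil * np.exp(1j * true_phase)))  # focal amplitude

# Pre-transformation: multiply the pupil-plane data by a known phase factor
# before each DFT (an illustrative defocus-like quadratic term).
pre = np.exp(1j * 0.3 * (xx**2 + yy**2))

field = pupil * np.exp(1j * np.zeros_like(true_phase))  # initial guess
for _ in range(200):
    F = np.fft.fft2(field * pre)                  # forward DFT of pre-transformed data
    F = measured * np.exp(1j * np.angle(F))       # impose the measured amplitude
    field = np.fft.ifft2(F) / pre                 # back-transform, undo pre-transform
    field = pupil * np.exp(1j * np.angle(field))  # re-impose pupil constraints

recovered = np.angle(field) * pupil               # aberration (phase) estimate
```

Each pass through the loop reduces the mismatch between the model's focal-plane amplitude and the measured one, which is the convergence behavior iterative-transform methods rely on.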
AlBarakati, SF; Kula, KS; Ghoneima, AA
2012-01-01
Objective The aim of this study was to assess the reliability and reproducibility of angular and linear measurements of conventional and digital cephalometric methods. Methods A total of 13 landmarks and 16 skeletal and dental parameters were defined and measured on pre-treatment cephalometric radiographs of 30 patients. The conventional and digital tracings and measurements were performed twice by the same examiner with a 6 week interval between measurements. The reliability within the method was determined using Pearson's correlation coefficient (r2). The reproducibility between methods was calculated by paired t-test. The level of statistical significance was set at p < 0.05. Results All measurements for each method were above 0.90 r2 (strong correlation) except maxillary length, which had a correlation of 0.82 for conventional tracing. Significant differences between the two methods were observed in most angular and linear measurements except for ANB angle (p = 0.5), angle of convexity (p = 0.09), anterior cranial base (p = 0.3) and the lower anterior facial height (p = 0.6). Conclusion In general, both methods of conventional and digital cephalometric analysis are highly reliable. Although the reproducibility of the two methods showed some statistically significant differences, most differences were not clinically significant. PMID:22184624
NASA Astrophysics Data System (ADS)
Du, Peijun; Tan, Kun; Xing, Xiaoshi
2010-12-01
Combining Support Vector Machine (SVM) with wavelet analysis, we constructed a wavelet SVM (WSVM) classifier based on wavelet kernel functions in Reproducing Kernel Hilbert Space (RKHS). In conventional kernel theory, SVM faces the bottleneck of kernel parameter selection, which results in time-consuming training and low classification accuracy. The wavelet kernel in RKHS is a kind of multidimensional wavelet function that can approximate arbitrary nonlinear functions. Implications for semiparametric estimation are also discussed in this paper. Airborne Operational Modular Imaging Spectrometer II (OMIS II) hyperspectral remote sensing imagery with 64 bands and Reflective Optics System Imaging Spectrometer (ROSIS) data with 115 bands were used to test the performance and accuracy of the proposed WSVM classifier. The experimental results indicate that the WSVM classifier obtains the highest accuracy when using the Coiflet kernel function in the wavelet transform. Compared with traditional classifiers, including Spectral Angle Mapping (SAM) and Minimum Distance Classification (MDC), and with an SVM classifier using the Radial Basis Function kernel, the proposed wavelet SVM classifier using the wavelet kernel function in Reproducing Kernel Hilbert Space clearly improves classification accuracy.
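A wavelet kernel of the kind described can be plugged into a standard SVM as a custom Gram-matrix callable. The sketch below uses the commonly cited Morlet-type wavelet kernel; the dilation parameter a=2 and the toy two-class "spectra" are illustrative assumptions, and the paper's Coiflet kernel is not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC

def wavelet_kernel(a=1.0):
    """Morlet-type wavelet kernel K(x, z) = prod_i h((x_i - z_i) / a),
    with mother wavelet h(u) = cos(1.75 u) * exp(-u**2 / 2)."""
    def gram(X, Z):
        d = (X[:, None, :] - Z[None, :, :]) / a
        return np.prod(np.cos(1.75 * d) * np.exp(-d**2 / 2.0), axis=-1)
    return gram

# Toy two-class "pixels" with 8 spectral bands, separated by mean level.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (30, 8)), rng.normal(2.0, 1.0, (30, 8))])
y = np.repeat([0, 1], 30)

clf = SVC(kernel=wavelet_kernel(a=2.0)).fit(X, y)
G = wavelet_kernel(a=2.0)(X, X)   # Gram matrix; K(x, x) = 1 on the diagonal
```

Because the kernel is a product of per-band translation-invariant wavelet responses, it is admissible as an SVM kernel, and the classifier separates the toy classes readily.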
NASA Astrophysics Data System (ADS)
Schizas, Dimitrios; Papatheodorou, Efimia; Stamou, George
2017-04-01
This study conducts a textbook analysis within the framework of the following working hypothesis: the transformation of scientific knowledge into school knowledge is expected to reproduce the problems encountered with the scientific knowledge itself or to generate additional problems, both of which may induce misconceptions in textbook users. Specifically, we describe four epistemological problems associated with how the concept of "ecosystem" is elaborated within ecological science, and we examine how each problem is reproduced in the biology textbook used by Greek students in the 12th grade and the teacher and student misunderstandings that may result. Our research demonstrates that the authors of the textbook address these problems by appealing simultaneously to holistic and reductionist ideas. This results in a meaningless and confused depiction of "ecosystem" and may provoke many serious misconceptions on the part of textbook users, for example, that an ecosystem is a system that can be applied to every set of interrelated ecological objects irrespective of the organizational level to which these entities belong or how these entities are related to each other. The implications of these phenomena for science education research are discussed from a perspective that stresses the role of background assumptions in the understanding of declarative knowledge.
Vertical structure of mean cross-shore currents across a barred surf zone
Haines, John W.; Sallenger, Asbury H.
1994-01-01
Mean cross-shore currents observed across a barred surf zone are compared to model predictions. The model is based on a simplified momentum balance with a turbulent boundary layer at the bed. Turbulent exchange is parameterized by an eddy viscosity formulation, with the eddy viscosity Aυ independent of time and the vertical coordinate. Mean currents result from gradients due to wave breaking and shoaling, and the presence of a mean setup of the free surface. Descriptions of the wave field are provided by the wave transformation model of Thornton and Guza [1983]. The wave transformation model adequately reproduces the observed wave heights across the surf zone. The mean current model successfully reproduces the observed cross-shore flows. Both observations and predictions show predominantly offshore flow with onshore flow restricted to a relatively thin surface layer. Successful application of the mean flow model requires an eddy viscosity which varies horizontally across the surf zone. Attempts are made to parameterize this variation with some success. The data does not discriminate between alternative parameterizations proposed. The overall variability in eddy viscosity suggested by the model fitting should be resolvable by field measurements of the turbulent stresses. Consistent shortcomings of the parameterizations, and the overall modeling effort, suggest avenues for further development and data collection.
ggCyto: Next Generation Open-Source Visualization Software for Cytometry.
Van, Phu; Jiang, Wenxin; Gottardo, Raphael; Finak, Greg
2018-06-01
Open source software for computational cytometry has gained in popularity over the past few years. Efforts such as FlowCAP, the Lyoplate and Euroflow projects have highlighted the importance of efforts to standardize both experimental and computational aspects of cytometry data analysis. The R/BioConductor platform hosts the largest collection of open source cytometry software covering all aspects of data analysis and providing infrastructure to represent and analyze cytometry data with all relevant experimental, gating, and cell population annotations enabling fully reproducible data analysis. Data visualization frameworks to support this infrastructure have lagged behind. ggCyto is a new open-source BioConductor software package for cytometry data visualization built on ggplot2 that enables ggplot-like functionality with the core BioConductor flow cytometry data structures. Amongst its features are the ability to transform data and axes on-the-fly using cytometry-specific transformations, plot faceting by experimental meta-data variables, and partial matching of channel, marker and cell populations names to the contents of the BioConductor cytometry data structures. We demonstrate the salient features of the package using publicly available cytometry data with complete reproducible examples in a supplementary material vignette. https://bioconductor.org/packages/devel/bioc/html/ggcyto.html. gfinak@fredhutch.org. Supplementary data are available at Bioinformatics online and at http://rglab.org/ggcyto/.
Kremer, Kristin; Arnold, Catherine; Cataldi, Angel; Gutiérrez, M. Cristina; Haas, Walter H.; Panaiotov, Stefan; Skuce, Robin A.; Supply, Philip; van der Zanden, Adri G. M.; van Soolingen, Dick
2005-01-01
In recent years various novel DNA typing methods have been developed which are faster and easier to perform than the current internationally standardized IS6110 restriction fragment length polymorphism typing method. However, there has been no overview of the utility of these novel typing methods, and it is largely unknown how they compare to previously published methods. In this study, the discriminative power and reproducibility of nine recently described PCR-based typing methods for Mycobacterium tuberculosis were investigated using the strain collection of the interlaboratory study of Kremer et al. (J. Clin. Microbiol. 37:2607-2618, 1999). This strain collection contains 90 M. tuberculosis complex and 10 non-M. tuberculosis complex mycobacterial strains, as well as 31 duplicated DNA samples to assess reproducibility. The highest reproducibility was found with variable numbers of tandem repeat typing using mycobacterial interspersed repetitive units (MIRU VNTR) and fast ligation-mediated PCR (FLiP), followed by second-generation spoligotyping, ligation-mediated PCR (LM-PCR), VNTR typing using five repeat loci identified at the Queens University of Belfast (QUB VNTR), and the Amadio speciation PCR. Poor reproducibility was associated with fluorescent amplified fragment length polymorphism typing, which was performed in three different laboratories. The methods were ordered from highest discrimination to lowest by the Hunter-Gaston discriminative index as follows: QUB VNTR typing, MIRU VNTR typing, FLiP, LM-PCR, and spoligotyping. We conclude that both VNTR typing methods and FLiP typing are rapid, highly reliable, and discriminative epidemiological typing methods for M. tuberculosis and that VNTR typing is the epidemiological typing method of choice for the near future. PMID:16272496
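The Hunter-Gaston discriminative index used to rank the typing methods above is the probability that two isolates drawn at random from the collection have different types. A minimal sketch (the type labels are illustrative):

```python
from collections import Counter

def hunter_gaston(type_labels):
    """Hunter-Gaston discriminatory index:
    D = 1 - [1 / (N (N - 1))] * sum_j n_j (n_j - 1),
    the probability that two isolates sampled at random without
    replacement from the collection belong to different types."""
    counts = Counter(type_labels).values()
    n = sum(counts)
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))
```

All-distinct types give D = 1 (maximal discrimination); a single shared type gives D = 0.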
An Open Science Peer Review Oath
Aleksic, Jelena; Alexa, Adrian; Attwood, Teresa K; Chue Hong, Neil; Dahlö, Martin; Davey, Robert; Dinkel, Holger; Förstner, Konrad U; Grigorov, Ivo; Hériché, Jean-Karim; Lahti, Leo; MacLean, Dan; Markie, Michael L; Molloy, Jenny; Schneider, Maria Victoria; Scott, Camille; Smith-Unna, Richard; Vieira, Bruno Miguel
2015-01-01
One of the foundations of the scientific method is to be able to reproduce experiments and corroborate the results of research that has been done before. However, with the increasing complexities of new technologies and techniques, coupled with the specialisation of experiments, reproducing research findings has become a growing challenge. Clearly, scientific methods must be conveyed succinctly, and with clarity and rigour, in order for research to be reproducible. Here, we propose steps to help increase the transparency of the scientific method and the reproducibility of research results: specifically, we introduce a peer-review oath and accompanying manifesto. These have been designed to offer guidelines to enable reviewers (with the minimum friction or bias) to follow and apply open science principles, and support the ideas of transparency, reproducibility and ultimately greater societal impact. Introducing the oath and manifesto at the stage of peer review will help to check that the research being published includes everything that other researchers would need to successfully repeat the work. Peer review is the lynchpin of the publishing system: encouraging the community to consciously (and conscientiously) uphold these principles should help to improve published papers, increase confidence in the reproducibility of the work and, ultimately, provide strategic benefits to authors and their institutions. PMID:25653839
Enriched reproducing kernel particle method for fractional advection-diffusion equation
NASA Astrophysics Data System (ADS)
Ying, Yuping; Lian, Yanping; Tang, Shaoqiang; Liu, Wing Kam
2018-06-01
The reproducing kernel particle method (RKPM) has been efficiently applied to problems with large deformations, high gradients and high modal density. In this paper, it is extended to solve a nonlocal problem modeled by a fractional advection-diffusion equation (FADE), which exhibits a boundary layer with low regularity. We formulate this method on a moving least-square approach. Via the enrichment of fractional-order power functions to the traditional integer-order basis for RKPM, leading terms of the solution to the FADE can be exactly reproduced, which guarantees a good approximation to the boundary layer. Numerical tests are performed to verify the proposed approach.
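The moving least-squares construction underlying RKPM can be sketched in 1-D: with a linear basis, the resulting shape functions exactly reproduce constant and linear fields, which is the reproduction property that the paper's enrichment extends to fractional-order powers. The cubic-spline window, node spacing, and support size below are illustrative choices; the fractional enrichment itself is not implemented.

```python
import numpy as np

def rkpm_shape(x, nodes, support=0.3):
    """1-D reproducing kernel (moving least-squares) shape functions at x:
    N_I(x) = p(x)^T M(x)^{-1} p(x_I) w(x - x_I), with linear basis
    p(s) = [1, s] and a cubic-spline window w."""
    def window(r):
        r = np.abs(r)
        return np.where(r < 0.5, 2 / 3 - 4 * r**2 + 4 * r**3,
                        np.where(r < 1.0, 4 / 3 * (1.0 - r)**3, 0.0))
    w = window((x - nodes) / support)
    P = np.column_stack([np.ones_like(nodes), nodes])  # rows are p(x_I)
    M = (P.T * w) @ P                                  # moment matrix
    p_x = np.array([1.0, x])
    return (p_x @ np.linalg.solve(M, P.T)) * w         # N_I(x) for all nodes

nodes = np.linspace(0.0, 1.0, 11)
x = 0.47
N = rkpm_shape(x, nodes)   # shape function values at x
```

Partition of unity (sum N_I = 1) and exact linear reproduction (sum N_I x_I = x) hold by construction, mirroring how the enriched basis reproduces the leading fractional-order terms of the FADE solution.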
Qian, Feizhong; Zhu, Libo; Xu, Nengbin; Feng, Jiayong; Hong, Zhengfang; Xu, Lihong; Chen, Zhongquan; Wang, Shengle
2014-05-01
An ultra performance liquid chromatography/tandem mass spectrometry (UPLC-MS/MS) method was developed for the determination of picric acid and its reductive transformation product picramic acid in aqueous samples. A hydrophilic interaction liquid chromatography (HILIC) column (Acquity UPLC BEH HILIC; 100 mm x 2.1 mm, 1.7 microm) was used for the separation. Surface water samples could be injected into the UPLC system directly after being filtered through a 0.2 microm membrane. The recoveries of picric acid and picramic acid were satisfactory, in the range of 89%-107%. Waste water samples were purified by solid phase extraction (SPE) and then analyzed. The recoveries of picric acid and picramic acid in waste water were 72%-101%. The reproducibility of the method was good, with RSDs of 4.9%-14.7%. The limits of detection (LODs) of picric acid and picramic acid were 0.1 microg/L and 0.3 microg/L, respectively. The proposed method is rapid, highly specific and suitable for the confirmation and quantitative determination of picric acid and picramic acid in surface water and waste water.
Automated tumor volumetry using computer-aided image segmentation.
Gaonkar, Bilwaj; Macyszyn, Luke; Bilello, Michel; Sadaghiani, Mohammed Salehi; Akbari, Hamed; Atthiah, Mark A; Ali, Zarina S; Da, Xiao; Zhan, Yiqang; O'Rourke, Donald; Grady, Sean M; Davatzikos, Christos
2015-05-01
Accurate segmentation of brain tumors, and quantification of tumor volume, is important for diagnosis, monitoring, and planning therapeutic intervention. Manual segmentation is not widely used because of time constraints. Previous efforts have mainly produced methods that are tailored to a particular type of tumor or acquisition protocol and have mostly failed to produce a method that functions on different tumor types and is robust to changes in scanning parameters, resolution, and image quality, thereby limiting their clinical value. Herein, we present a semiautomatic method for tumor segmentation that is fast, accurate, and robust to a wide variation in image quality and resolution. A semiautomatic segmentation method based on the geodesic distance transform was developed and validated by using it to segment 54 brain tumors. Glioblastomas, meningiomas, and brain metastases were segmented. Qualitative validation was based on physician ratings provided by three clinical experts. Quantitative validation was based on comparing semiautomatic and manual segmentations. Tumor segmentations obtained using manual and automatic methods were compared quantitatively using the Dice measure of overlap. Subjective evaluation was performed by having human experts rate the computerized segmentations on a 0-5 rating scale where 5 indicated perfect segmentation. The proposed method addresses a significant, unmet need in the field of neuro-oncology. Specifically, this method enables clinicians to obtain accurate and reproducible tumor volumes without the need for manual segmentation. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
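The geodesic distance transform at the heart of the method can be sketched as Dijkstra front propagation on the image grid, with a step cost that penalizes intensity change so the front fills homogeneous regions before crossing edges. The intensity weight beta, the toy disc image, and the distance threshold below are illustrative assumptions, not the authors' implementation.

```python
import heapq
import numpy as np

def geodesic_distance(image, seeds, beta=10.0):
    """Geodesic distance transform via Dijkstra on the pixel grid: each
    4-neighbour step costs 1 plus beta times the intensity difference, so
    the front expands cheaply inside homogeneous regions and pays a large
    penalty to cross object boundaries."""
    h, w = image.shape
    dist = np.full((h, w), np.inf)
    heap = [(0.0, r, c) for r, c in seeds]
    for _, r, c in heap:
        dist[r, c] = 0.0
    heapq.heapify(heap)
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w:
                step = 1.0 + beta * abs(image[rr, cc] - image[r, c])
                if d + step < dist[rr, cc]:
                    dist[rr, cc] = d + step
                    heapq.heappush(heap, (d + step, rr, cc))
    return dist

# A bright disc ("tumor") on a dark background, seeded at its centre;
# thresholding the distance map recovers the disc.
yy, xx = np.mgrid[:32, :32]
img = ((yy - 16) ** 2 + (xx - 16) ** 2 < 36).astype(float)
seg = geodesic_distance(img, [(16, 16)]) < 12.0
```

In the semiautomatic setting, the seeds come from a quick user scribble inside the tumor, and the same machinery yields reproducible volumes without full manual delineation.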
Projection-free approximate balanced truncation of large unstable systems
NASA Astrophysics Data System (ADS)
Flinois, Thibault L. B.; Morgans, Aimee S.; Schmid, Peter J.
2015-08-01
In this article, we show that the projection-free, snapshot-based, balanced truncation method can be applied directly to unstable systems. We prove that even for unstable systems, the unmodified balanced proper orthogonal decomposition algorithm theoretically yields a converged transformation that balances the Gramians (including the unstable subspace). We then apply the method to a spatially developing unstable system and show that it results in reduced-order models of similar quality to the ones obtained with existing methods. Due to the unbounded growth of unstable modes, a practical restriction on the final impulse response simulation time appears, which can be adjusted depending on the desired order of the reduced-order model. Recommendations are given to further reduce the cost of the method if the system is large and to improve the performance of the method if it does not yield acceptable results in its unmodified form. Finally, the method is applied to the linearized flow around a cylinder at Re = 100 to show that it actually is able to accurately reproduce impulse responses for more realistic unstable large-scale systems in practice. The well-established approximate balanced truncation numerical framework therefore can be safely applied to unstable systems without any modifications. Additionally, balanced reduced-order models can readily be obtained even for large systems, where the computational cost of existing methods is prohibitive.
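The projection-free, snapshot-based balanced truncation (balanced POD) workflow can be sketched as follows: collect impulse-response snapshots of the direct and adjoint systems, take an SVD of their correlation matrix, and form biorthogonal balancing modes. The toy discrete-time random system below is an illustrative assumption; the article's unstable continuous-time setting is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, r = 50, 30, 6                                # state dim, snapshots, ROM order
A = rng.standard_normal((n, n)) / np.sqrt(n)       # toy discrete-time system
B = rng.standard_normal((n, 1))
C = rng.standard_normal((2, n))

# Impulse-response snapshots of the direct system (X) and adjoint system (Y).
X = np.column_stack([np.linalg.matrix_power(A, k) @ B for k in range(m)])
Y = np.column_stack([np.linalg.matrix_power(A.T, k) @ C.T for k in range(m)])

# Balance via an SVD of the snapshot correlation matrix -- no Gramian
# factorizations or stable/unstable subspace projections are required,
# which is the "projection-free" aspect emphasized in the article.
U, s, Vt = np.linalg.svd(Y.T @ X, full_matrices=False)
Phi = X @ Vt[:r].T / np.sqrt(s[:r])                # direct balancing modes
Psi = Y @ U[:, :r] / np.sqrt(s[:r])                # adjoint balancing modes
Ar, Br, Cr = Psi.T @ A @ Phi, Psi.T @ B, C @ Phi   # reduced-order model
```

The two mode sets are biorthogonal by construction (Psi^T Phi = I), which is what makes the reduced-order model a balanced truncation of the snapshot data.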
Longo, F; Nicetto, T; Banzato, T; Savio, G; Drigo, M; Meneghello, R; Concheri, G; Isola, M
2018-02-01
The aim of this ex vivo study was to test a novel three-dimensional (3D) automated computer-aided design (CAD) method (aCAD) for the computation of femoral angles in dogs from 3D reconstructions of computed tomography (CT) images. The repeatability and reproducibility of three manual radiography, manual CT reconstructions and the aCAD method for the measurement of three femoral angles were evaluated: (1) anatomical lateral distal femoral angle (aLDFA); (2) femoral neck angle (FNA); and (3) femoral torsion angle (FTA). Femoral angles of 22 femurs obtained from 16 cadavers were measured by three blinded observers. Measurements were repeated three times by each observer for each diagnostic technique. Femoral angle measurements were analysed using a mixed effects linear model for repeated measures to determine the levels of intra-observer agreement (repeatability) and inter-observer agreement (reproducibility). Repeatability and reproducibility of measurements using the aCAD method were excellent (intra-class coefficients, ICCs≥0.98) for all three angles assessed. Manual radiography and CT exhibited excellent agreement for the aLDFA measurement (ICCs≥0.90). However, FNA repeatability and reproducibility were poor (ICCs<0.8), whereas FTA measurement showed slightly higher ICCs values, except for the radiographic reproducibility, which was poor (ICCs<0.8). The computation of the 3D aCAD method provided the highest repeatability and reproducibility among the tested methodologies. Copyright © 2017 Elsevier Ltd. All rights reserved.
Rodriguez-Saona, L E; Khambaty, F M; Fry, F S; Dubois, J; Calvey, E M
2004-11-01
The use of Fourier transform-near infrared (FT-NIR) spectroscopy combined with multivariate pattern recognition techniques was evaluated to address the need for a fast and sensitive method for the detection of bacterial contamination in liquids. The complex cellular composition of bacteria produces FT-NIR vibrational transitions (overtone and combination bands), forming the basis for identification and subtyping. A database including strains of Escherichia coli, Pseudomonas aeruginosa, Bacillus subtilis, Bacillus cereus, and Bacillus thuringiensis was built, with special care taken to optimize sample preparation. The bacterial cells were treated with 70% (vol/vol) ethanol to enhance safe handling of pathogenic strains and then concentrated on an aluminum oxide membrane to obtain a thin bacterial film. This simple membrane filtration procedure generated reproducible FT-NIR spectra that allowed for rapid discrimination among closely related strains. Principal component analysis and soft independent modeling of class analogy of transformed spectra in the region 5,100 to 4,400 cm(-1) were able to discriminate between bacterial species. Spectroscopic analysis of apple juices inoculated with different strains of E. coli at approximately 10(5) CFU/ml showed that FT-NIR spectral features are consistent with bacterial contamination, and soft independent modeling of class analogy correctly predicted the identity of the contaminant as strains of E. coli. FT-NIR in conjunction with multivariate techniques can be used for the rapid and accurate evaluation of potential bacterial contamination in liquids with minimal sample manipulation, and hence limited exposure of the laboratory worker to the agents.
NASA Astrophysics Data System (ADS)
Darazi, R.; Gouze, A.; Macq, B.
2009-01-01
Reproducing the natural, real scenes we see in the everyday world is becoming more and more popular, and stereoscopic and multi-view techniques are used to this end. However, because more information is displayed, supporting technologies such as digital compression are required to ensure the storage and transmission of the sequences. In this paper, a new scheme for stereo image coding is proposed. The original left and right images are jointly coded. The main idea is to optimally exploit the existing correlation between the two images. This is done through the design of an efficient transform that reduces the existing redundancy in the stereo image pair. The approach was inspired by the Lifting Scheme (LS). The novelty in our work is that the prediction step is replaced by a hybrid step that consists of disparity compensation followed by luminance correction and an optimized prediction step. The proposed scheme can be used for both lossless and lossy coding. Experimental results show improvement in terms of performance and complexity compared to recently proposed methods.
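The lifting scheme mentioned above factors a wavelet transform into predict and update steps that are trivially invertible, which is what makes it suitable for lossless coding. A minimal Haar-style 1-D sketch is below; the paper's hybrid predict step (disparity compensation plus luminance correction between the left and right images) is not reproduced.

```python
import numpy as np

def lifting_forward(x):
    """One level of Haar-style lifting: split into even/odd samples,
    predict each odd sample from its even neighbour, then update the
    evens with the detail so the coarse signal keeps the original mean."""
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    detail = odd - even          # predict step (prediction residual)
    approx = even + detail / 2   # update step
    return approx, detail

def lifting_inverse(approx, detail):
    """Invert by undoing the lifting steps in reverse order:
    reconstruction is exact, enabling lossless coding."""
    even = approx - detail / 2
    odd = detail + even
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x
```

In the stereo setting, one image plays the role of the "even" channel and the predict step becomes the disparity-compensated estimate of the other image, with the residual taking the place of `detail`.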
Bellot, Pau; Olsen, Catharina; Salembier, Philippe; Oliveras-Vergés, Albert; Meyer, Patrick E
2015-09-29
In the last decade, a great number of methods for reconstructing gene regulatory networks from expression data have been proposed. However, very few tools and datasets allow these methods to be evaluated accurately and reproducibly. Hence, we propose here a new tool, able to perform a systematic, yet fully reproducible, evaluation of transcriptional network inference methods. Our open-source and freely available Bioconductor package aggregates a large set of tools to assess the robustness of network inference algorithms against different simulators, topologies, sample sizes and noise intensities. The benchmarking framework, which uses various datasets, highlights the specialization of some methods toward particular network types and data. As a result, it is possible to identify the techniques that perform broadly well overall.
Imamura, Hitoshi; Tabuchi, Hitoshi; Nakakura, Shunsuke; Nagasato, Daisuke; Baba, Hiroaki; Kiuchi, Yoshiaki
2018-04-01
To investigate the usability and the reproducibility of tear meniscus values via swept-source optical coherence tomography (SS-OCT) and the conventional slit lamp microscope method with a graticule. The right eye was examined in 90 healthy adult subjects who were grouped according to age (group 1: 20-39 years; group 2: 40-59 years; group 3: ≥60 years). The tear meniscus height (TMH) and tear meniscus area were measured using SS-OCT, and TMH by the slit lamp microscope method. The reproducibility of each method was calculated using intraclass correlation coefficients (ICCs) in an additional 30 healthy young subjects. We also evaluated TMH at 3 mm from the corneal center in both temporal and nasal directions using SS-OCT. The mean TMH measured by SS-OCT was significantly higher than that measured by the slit lamp method (328 vs. 212 μm, P < 0.001). High reproducibility was observed for each method (ICC > 0.75 for both). No statistically significant differences were found in TMH among the age groups using either the SS-OCT or slit lamp method (P = 0.985 and 0.380, respectively). TMH values at both sides of the corneal center were significantly smaller than those at the corneal center (P < 0.0001). TMH values obtained by the slit lamp method were lower than those obtained by SS-OCT. However, both methods yielded highly reproducible TMH measurements, suggesting that they are clinically useful. Tear meniscus values did not vary by age but by measurement point in our cohort.
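A minimal sketch of the reproducibility statistic used here, assuming the commonly used two-way random-effects ICC(2,1) form (the abstract does not state which ICC variant was computed), on hypothetical repeated tear-meniscus-height data:

```python
import numpy as np

def icc_2_1(Y):
    """Two-way random-effects ICC(2,1) for an n-subjects x k-sessions matrix."""
    n, k = Y.shape
    grand = Y.mean()
    row = Y.mean(axis=1, keepdims=True)   # per-subject means
    col = Y.mean(axis=0, keepdims=True)   # per-session means
    msr = k * ((row - grand) ** 2).sum() / (n - 1)                    # subjects
    msc = n * ((col - grand) ** 2).sum() / (k - 1)                    # sessions
    mse = ((Y - row - col + grand) ** 2).sum() / ((n - 1) * (k - 1))  # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical repeated TMH measurements (micrometers): 30 subjects, 3 sessions.
rng = np.random.default_rng(1)
truth = rng.normal(300, 60, size=(30, 1))        # between-subject spread
Y = truth + rng.normal(0, 10, size=(30, 3))      # small within-subject error
print(icc_2_1(Y) > 0.75)  # high reproducibility, per the study's ICC criterion
```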
Thomas, Joseph Palathinkal; Srivastava, Saurabh; Zhao, Liyan; Abd-Ellah, Marwa; McGillivray, Donald; Kang, Jung Soo; Rahman, Md Anisur; Moghimi, Nafiseh; Heinig, Nina F; Leung, Kam Tong
2015-04-15
Hybrid solar cells made of poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) (PEDOT:PSS) and appropriate amounts of a cosolvent and a fluorosurfactant on planar n-type silicon substrates showed a photoconversion efficiency (PCE) of above 13%. These cells also exhibited stable, reproducible, and high external quantum efficiency (EQE) that was not sensitive to light-bias intensity (LBI). In contrast, solar cells made of pristine PEDOT:PSS showed low PCE and high EQE only under certain measurement conditions. The EQE was found to degrade with increasing LBI. Here we report that the LBI-sensitive variation of EQE of the low-PCE cells is related to a reversible structural transformation from a quinoid to a benzoid structure of PEDOT.
Interplay between bulk and edge-bound topological defects in a square micromagnet
Sloetjes, Sam D.; Digernes, Einar; Olsen, Fredrik K.; ...
2018-01-22
A field-driven transformation of a domain pattern in a square micromagnet, defined in a thin film of La0.7Sr0.3MnO3, is discussed in terms of creation and annihilation of bulk vortices and edge-bound topological defects with half-integer winding numbers. The evolution of the domain pattern was mapped with soft x-ray photoemission electron microscopy and magnetic force microscopy. Micromagnetic modeling, permitting detailed analysis of the spin texture, accurately reproduces the measured domain state transformation. The simulations also helped estimate the energy barriers associated with the creation and annihilation of the topological charges and thus to assess the stability of the domain states in this magnetic microstructure.
Takano, Nami K; Tsutsumi, Takeshi; Suzuki, Hiroshi; Okamoto, Yoshiwo; Nakajima, Toshiaki
2012-02-01
We evaluated whether frequency analysis could detect the development of interstitial fibrosis in rats. SHR/Izm and age-matched WKY/Izm rats were used. Limb lead II electrocardiograms were recorded. The continuous wavelet transform (CWT) was applied for time-frequency analysis. The integrated time-frequency power (ITFP) between QRS complexes was measured and compared between groups. The ITFP at low-frequency bands (≤125 Hz) was significantly higher in SHR/Izm. The percent change of ITFP showed different patterns between the groups. Prominent interstitial fibrosis with an increase in TIMP-1 mRNA expression was also observed in SHR/Izm. These results were partly reproduced in a computer simulation.
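The ITFP computation can be sketched as a Morlet continuous wavelet transform whose squared magnitude is integrated over time for each frequency band. The wavelet parameters, sampling rate, and test signal below are illustrative, not the study's ECG data:

```python
import numpy as np

def morlet_cwt_power(x, fs, freqs, w0=6.0):
    """Time-frequency power via a Morlet continuous wavelet transform."""
    power = np.empty((len(freqs), x.size))
    for i, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)                  # scale for this frequency
        tau = np.arange(-4 * s, 4 * s, 1 / fs)    # wavelet support
        psi = np.exp(1j * w0 * tau / s) * np.exp(-((tau / s) ** 2) / 2)
        psi /= np.sqrt(s)                         # scale normalization
        coef = np.convolve(x, np.conj(psi)[::-1], mode="same") / fs
        power[i] = np.abs(coef) ** 2
    return power

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 50 * t)          # a 50 Hz component, within the low band
freqs = np.array([25.0, 50.0, 100.0, 200.0])
itfp = morlet_cwt_power(x, fs, freqs).sum(axis=1)   # integrated power per band
print(np.argmax(itfp) == 1)             # peak integrated power at 50 Hz
```

In the study's setting, the same integration would be restricted to the window between QRS complexes before comparing groups.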
An improved method to estimate reflectance parameters for high dynamic range imaging
NASA Astrophysics Data System (ADS)
Li, Shiying; Deguchi, Koichiro; Li, Renfa; Manabe, Yoshitsugu; Chihara, Kunihiro
2008-01-01
Two methods are described to accurately estimate diffuse and specular reflectance parameters for colors, gloss intensity and surface roughness, over the dynamic range of the camera used to capture input images. Neither method needs to segment color areas on an image, or to reconstruct a high dynamic range (HDR) image. The second method improves on the first, bypassing the requirement for specific separation of diffuse and specular reflection components. For the latter method, diffuse and specular reflectance parameters are estimated separately, using the least squares method. Reflection values are initially assumed to be diffuse-only reflection components, and are subjected to the least squares method to estimate diffuse reflectance parameters. Specular reflection components, obtained by subtracting the computed diffuse reflection components from reflection values, are then subjected to a logarithmically transformed equation of the Torrance-Sparrow reflection model, and specular reflectance parameters for gloss intensity and surface roughness are finally estimated using the least squares method. Experiments were carried out using both methods, with simulation data at different saturation levels, generated according to the Lambert and Torrance-Sparrow reflection models, and the second method, with spectral images captured by an imaging spectrograph and a moving light source. Our results show that the second method can estimate the diffuse and specular reflectance parameters for colors, gloss intensity and surface roughness more accurately and faster than the first one, so that colors and gloss can be reproduced more efficiently for HDR imaging.
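The second method's two-stage estimation (a least-squares diffuse fit, subtraction, then a log-linear fit of the specular lobe) can be sketched in one dimension. The Gaussian lobe below stands in for the Torrance-Sparrow model, and all parameter values and the angle-masking choices are made up for illustration:

```python
import numpy as np

# Angles of incidence (radians); half-angle equals incidence in this toy setup.
theta = np.linspace(0.05, 1.2, 120)
kd_true, ks_true, m_true = 0.6, 0.9, 0.15
I = kd_true * np.cos(theta) + ks_true * np.exp(-((theta / m_true) ** 2))

# Step 1: treat values as diffuse-only and fit kd by least squares. Using
# only large angles, where the specular lobe has died out, keeps the fit
# unbiased (a simplification of the paper's separation of components).
mask = theta > 0.6
kd_est = np.linalg.lstsq(np.cos(theta[mask])[:, None], I[mask], rcond=None)[0][0]

# Step 2: subtract the diffuse part, log-transform the specular residual,
# and fit gloss intensity (ks) and roughness (m) linearly.
resid = np.clip(I - kd_est * np.cos(theta), 1e-12, None)
use = theta < 0.4                       # region where the specular lobe dominates
A = np.column_stack([np.ones(use.sum()), -theta[use] ** 2])
b0, b1 = np.linalg.lstsq(A, np.log(resid[use]), rcond=None)[0]
ks_est, m_est = np.exp(b0), 1 / np.sqrt(b1)

print(abs(kd_est - kd_true) < 0.05, abs(ks_est - ks_true) < 0.1)
```

The log transform is what turns the nonlinear lobe fit into an ordinary linear least-squares problem, mirroring the logarithmically transformed Torrance-Sparrow equation in the abstract.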
Examining the Reproducibility of 6 Published Studies in Public Health Services and Systems Research.
Harris, Jenine K; B Wondmeneh, Sarah; Zhao, Yiqiang; Leider, Jonathon P
2018-02-23
Research replication, or repeating a study de novo, is the scientific standard for building evidence and identifying spurious results. While replication is ideal, it is often expensive and time consuming. Reproducibility, or reanalysis of data to verify published findings, is one proposed minimum alternative standard. While a lack of research reproducibility has been identified as a serious and prevalent problem in biomedical research and a few other fields, little work has been done to examine the reproducibility of public health research. We examined reproducibility in 6 studies from the public health services and systems research subfield of public health research. Following the methods described in each of the 6 papers, we computed the descriptive and inferential statistics for each study. We compared our results with the original study results and examined the percentage differences in descriptive statistics and differences in effect size, significance, and precision of inferential statistics. All project work was completed in 2017. We found consistency between original and reproduced results for each paper in at least 1 of the 4 areas examined. However, we also found some inconsistency. We identified incorrect transcription of results and omitting detail about data management and analyses as the primary contributors to the inconsistencies. Increasing reproducibility, or reanalysis of data to verify published results, can improve the quality of science. Researchers, journals, employers, and funders can all play a role in improving the reproducibility of science through several strategies including publishing data and statistical code, using guidelines to write clear and complete methods sections, conducting reproducibility reviews, and incentivizing reproducible science.
Microbiologic tests in epidemiologic studies: are they reproducible?
Aass, A M; Preus, H R; Zambon, J J; Gjermo, P
1994-12-01
Microbiologic assessments are often included in longitudinal studies to elucidate the significance of the association of certain Gram-negative bacteria and the development of periodontal diseases. In such studies, the reliability of methods is crucial. There are several methods to identify putative pathogens, and some of them are commercially available. The purpose of the present study was to compare the reproducibility of four different methods for detecting Actinobacillus actinomycetemcomitans, Porphyromonas gingivalis, and Prevotella intermedia in order to evaluate their usefulness in epidemiologic studies. The test panel consisted of 10 young subjects and 10 adult periodontitis patients. Subgingival plaque was sampled from sites showing bone loss and "healthy" control sites. The four different methods for detecting the target bacteria were 1) cultivation, 2) Evalusite (a chair-side kit based on ELISA), 3) OmniGene, Inc, based on DNA probes, and 4) indirect immunofluorescence (IIF). The test procedure was repeated after a 1-wk interval and was performed by one examiner. Sites reported to be positive for a microorganism by any of the four methods at one or both examinations were considered to be positive for that organism and included in the analysis. The reproducibility of the four methods was low. The IIF and the cultivation methods showed somewhat higher reproducibility than did the commercial systems. A second test was done for Evalusite, three paper points for sampling being used instead of one as described in the manual. The reproducibility of the second test was improved, indicating that the detection level of the system may influence the reliability.
Nyaboga, Evans; Tripathi, Jaindra N.; Manoharan, Rajesh; Tripathi, Leena
2014-01-01
Although genetic transformation of clonally propagated crops has been widely studied as a tool for crop improvement and as a vital part of the development of functional genomics resources, there has been no report of any existing Agrobacterium-mediated transformation of yam (Dioscorea spp.) with evidence of stable integration of T-DNA. Yam is an important crop in the tropics and subtropics providing food security and income to over 300 million people. However, yam production remains constrained by increasing levels of field and storage pests and diseases. A major constraint to the development of biotechnological approaches for yam improvement has been the lack of an efficient and robust transformation and regeneration system. In this study, we developed an Agrobacterium-mediated transformation of Dioscorea rotundata using axillary buds as explants. Two cultivars of D. rotundata were transformed using Agrobacterium tumefaciens harboring the binary vectors containing selectable marker and reporter genes. After selection with appropriate concentrations of antibiotic, shoots were developed on shoot induction and elongation medium. The elongated antibiotic-resistant shoots were subsequently rooted on medium supplemented with selection agent. Successful transformation was confirmed by polymerase chain reaction, Southern blot analysis, and reporter genes assay. Expression of gusA gene in transgenic plants was also verified by reverse transcription polymerase chain reaction analysis. Transformation efficiency varied from 9.4 to 18.2% depending on the cultivars, selectable marker genes, and the Agrobacterium strain used for transformation. It took 3–4 months from Agro-infection to regeneration of complete transgenic plant. Here we report an efficient, fast and reproducible protocol for Agrobacterium-mediated transformation of D. rotundata using axillary buds as explants, which provides a useful platform for future genetic engineering studies in this economically important crop.
3-D surface profilometry based on modulation measurement by applying wavelet transform method
NASA Astrophysics Data System (ADS)
Zhong, Min; Chen, Feng; Xiao, Chao; Wei, Yongchao
2017-01-01
A new analysis of 3-D surface profilometry based on the modulation measurement technique using the Wavelet Transform method is proposed. As a tool that excels at multi-resolution analysis and localization in the time and frequency domains, the Wavelet Transform method, with its good localized time-frequency analysis ability and effective de-noising capacity, can extract the modulation distribution more accurately than the Fourier Transform method. Especially for the analysis of complex objects, more details of the measured object are retained. In this paper, the theoretical derivation of the Wavelet Transform method that obtains the modulation values from a captured fringe pattern is given. Both computer simulation and an elementary experiment are used to show the validity of the proposed method by making a comparison with the results of the Fourier Transform method. The results show that the Wavelet Transform method has better performance than the Fourier Transform method in modulation value retrieval.
Lahiani, Amal; Klaiman, Eldad; Grimm, Oliver
2018-01-01
Context: Medical diagnosis and clinical decisions rely heavily on the histopathological evaluation of tissue samples, especially in oncology. Historically, classical histopathology has been the gold standard for tissue evaluation and assessment by pathologists. The most widely and commonly used dyes in histopathology are hematoxylin and eosin (H&E), as the diagnosis of most malignancies is largely based on this protocol. H&E staining has been used for more than a century to identify the tissue characteristics and structural morphologies that are needed for tumor diagnosis. In many cases, as tissue is scarce in clinical studies, fluorescence imaging is necessary to allow staining of the same specimen with multiple biomarkers simultaneously. Since fluorescence imaging is a relatively new technology in the pathology landscape, histopathologists are not used to or trained in annotating or interpreting these images. Aims, Settings and Design: To allow pathologists to annotate these images without the need for additional training, we designed an algorithm for the conversion of fluorescence images to brightfield H&E images. Subjects and Methods: In this algorithm, we use fluorescent nuclei staining to reproduce the hematoxylin information and natural tissue autofluorescence to reproduce the eosin information, avoiding the necessity to specifically stain the proteins or intracellular structures with an additional fluorescence stain. Statistical Analysis Used: Our method is based on optimizing a transform function from fluorescence to H&E images using least mean square optimization. Results: It results in high-quality virtual H&E digital images that can easily and efficiently be analyzed by pathologists. We validated our results with pathologists by having them annotate tumor in real and virtual H&E whole slide images, and we obtained promising results.
Conclusions: Hence, we provide a solution that enables pathologists to assess tissue and annotate specific structures based on multiplexed fluorescence images.
[Dental arch form reverting by four-point method].
Pan, Xiao-Gang; Qian, Yu-Fen; Weng, Si-En; Feng, Qi-Ping; Yu, Quan
2008-04-01
To explore a simple method of reverting an individual dental arch form template for wire bending. The individual dental arch form was reverted by the four-point method. By defining the central point of the bracket on the bilateral lower second premolars and first molars, a certain individual dental arch form could be generated. The arch form generating procedure was then developed into computer software for printing the arch form. The four-point method arch form was evaluated by comparison with direct model measurement on linear and angular parameters. The accuracy and reproducibility were assessed by paired t test and concordance correlation coefficient with the Medcalc 9.3 software package. The arch form produced by the four-point method showed good accuracy and reproducibility (linear concordance correlation coefficient was 0.9909 and angular concordance correlation coefficient was 0.8419). The dental arch form reverted by the four-point method could reproduce the individual dental arch form.
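Since four points determine a cubic exactly, one way to sketch the four-point template generation is a cubic interpolant through the four bracket centers. The abstract does not specify the curve family used, so the cubic and the coordinates below are assumptions for illustration:

```python
import numpy as np

# Hypothetical bracket-center coordinates (mm) for the bilateral lower second
# premolars and first molars: (x, y) pairs, x lateral, y antero-posterior.
pts = np.array([[-22.0, 18.0], [-16.0, 8.0], [16.0, 8.0], [22.0, 18.0]])

# A cubic has exactly four coefficients, so it interpolates the four points.
coeffs = np.polyfit(pts[:, 0], pts[:, 1], 3)
arch = np.poly1d(coeffs)

# The template should pass through each defining point (within round-off).
print(np.allclose(arch(pts[:, 0]), pts[:, 1]))
```

Evaluating `arch` on a dense x grid would then yield the printable template curve.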
NASA Astrophysics Data System (ADS)
Kroffe, K.
2017-12-01
The mission of the Public Library of Science is to accelerate progress in science and medicine by leading a transformation in research communication. Researchers' ability to share their work without restriction is essential, but critical to sharing is open data, transparency in peer review, and an open approach to science assessment. In this session, we will discuss how PLOS ONE collaborates with many different scientific communities to help create, share, and preserve the scholarly works produced by their researchers with emphasis on current common difficulties faced by communities, practical solutions, and a broader view of the importance of open data and reproducibility.
ERIC Educational Resources Information Center
Maisonneuve, Roland
1978-01-01
The whole universe enters the poet's being to be eventually transformed by him/her into musical language. It is this music that the translator must reproduce. Excerpts from the poems of Patmore, Auden, Donne, Joyce and Sloate illustrate the discussion. Translation of mystical and religious poetry is given special attention. (Text is in French.)…
Missel, P J
2000-01-01
Four methods are proposed for modeling diffusion in heterogeneous media where diffusion and partition coefficients take on differing values in each subregion. The exercise was conducted to validate finite element modeling (FEM) procedures in anticipation of modeling drug diffusion with regional partitioning into ocular tissue, though the approach can be useful for other organs, or for modeling diffusion in laminate devices. Partitioning creates a discontinuous value in the dependent variable (concentration) at an intertissue boundary that is not easily handled by available general-purpose FEM codes, which allow for only one value at each node. The discontinuity is handled using a transformation on the dependent variable based upon the region-specific partition coefficient. Methods were evaluated by their ability to reproduce a known exact result, for the problem of the infinite composite medium (Crank, J. The Mathematics of Diffusion, 2nd ed. New York: Oxford University Press, 1975, pp. 38-39.). The most physically intuitive method is based upon the concept of chemical potential, which is continuous across an interphase boundary (method III). This method makes the equation of the dependent variable highly nonlinear. This can be linearized easily by a change of variables (method IV). Results are also given for a one-dimensional problem simulating bolus injection into the vitreous, predicting time disposition of drug in vitreous and retina.
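The variable transformation described (dividing concentration by the region's partition coefficient to obtain a quantity, akin to chemical potential, that is continuous at the interface) can be sketched with a small explicit finite-difference model rather than FEM. The grid, coefficients, and tolerance are illustrative:

```python
import numpy as np

# Two-region 1-D diffusion with partitioning. The change of variables
# u = C / K (standing in for chemical potential) is continuous across the
# interface even though the concentration C is not.
n = 50
K = np.where(np.arange(n) < n // 2, 1.0, 3.0)   # partition coefficient per cell
D = 1e-2                                        # diffusivity (uniform here)
dx, dt = 1.0 / n, 5e-3

u = np.zeros(n)
mob = 0.5 * (D * K[:-1] + D * K[1:])            # face mobility ~ D*K
for _ in range(40000):
    u[0], u[-1] = 1.0, 0.0                      # fixed-potential boundaries
    flux = -mob * np.diff(u) / dx               # flux = -D*K du/dx at faces
    u[1:-1] -= dt / dx * np.diff(flux) / K[1:-1]

C = K * u                                       # back-transform to concentration
# At steady state u is continuous, so C jumps across the interface by
# roughly the partition ratio K2/K1 = 3.
print(abs(C[n // 2] / C[n // 2 - 1] - 3.0) < 0.3)
```

Solving for the continuous variable u sidesteps the one-value-per-node limitation of general-purpose codes that the abstract describes; the discontinuous concentration is recovered afterwards by multiplying by K.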
Exploratory Development of Corrosion Inhibiting Primers
1977-05-01
far superior in reproducibility and uniformity. The developed C-5301 electroprimer is readily adaptable to automated processing methods and can provide uniform, reproducible films which are cost effective.
New method for evaluating astringency in red wine.
Llaudy, María C; Canals, Roser; Canals, Joan-Miquel; Rozés, Nicolas; Arola, Lluís; Zamora, Fernando
2004-02-25
Astringency is an important sensory attribute of red wine. It is usually estimated by tasting and is subject to a certain subjectivity. It can also be estimated by using the gelatin index. This procedure is not very reproducible because there are many gelatins on the market with a heterogeneous composition. Furthermore, the gelatin index determines procyanidin concentration by acid hydrolysis that gives only an approximate result. This paper proposes a new and reproducible method that determines astringency by using ovalbumin as the precipitation agent and tannic acid solutions as standards. Statistical analysis of the results indicates that this method is more reproducible (RSD = 5%) than the gelatin index (RSD = 12%) and correlates better with sensorial analysis.
Cremonini, F; Houghton, L A; Camilleri, M; Ferber, I; Fell, C; Cox, V; Castillo, E J; Alpers, D H; Dewit, O E; Gray, E; Lea, R; Zinsmeister, A R; Whorwell, P J
2005-12-01
We assessed reproducibility of measurements of rectal compliance and sensation in health in studies conducted at two centres. We estimated samples size necessary to show clinically meaningful changes in future studies. We performed rectal barostat tests three times (day 1, day 1 after 4 h and 14-17 days later) in 34 healthy participants. We measured compliance and pressure thresholds for first sensation, urgency, discomfort and pain using ascending method of limits and symptom ratings for gas, urgency, discomfort and pain during four phasic distensions (12, 24, 36 and 48 mmHg) in random order. Results obtained at the two centres differed minimally. Reproducibility of sensory end points varies with type of sensation, pressure level and method of distension. Pressure threshold for pain and sensory ratings for non-painful sensations at 36 and 48 mmHg distension were most reproducible in the two centres. Sample size calculations suggested that crossover design is preferable in therapeutic trials: for each dose of medication tested, a sample of 21 should be sufficient to demonstrate 30% changes in all sensory thresholds and almost all sensory ratings. We conclude that reproducibility varies with sensation type, pressure level and distension method, but in a two-centre study, differences in observed results of sensation are minimal and pressure threshold for pain and sensory ratings at 36-48 mmHg of distension are reproducible.
Consistency and reproducibility of the VMAT plan delivery using three independent validation methods
Chandraraj, Varatharaj; Manickam, Ravikumar; Esquivel, Carlos; Supe, Sanjay S; Papanikolaou, Nikos
2010-01-01
The complexity of VMAT delivery requires new methods and potentially new tools for the commissioning of these systems. Great consideration is needed for quality assurance (QA) of these treatments, since few devices are dedicated to the QA of rotational delivery. In this study, we evaluated the consistency and reproducibility of one prostate and one lung VMAT plan for 31 consecutive days using three different approaches: 1) MLC DynaLog files, 2) in vivo measurements using the multiwire ionization chamber DAVID, and 3) the PTW seven29 2D ARRAY with the OCTAVIUS phantom at our Varian Clinac linear accelerator. Overall, the three methods of testing the reproducibility and consistency of the VMAT delivery were in agreement with each other. All methods showed minimal daily deviations that contributed to clinically insignificant dose variations from day to day. Based on our results, we conclude that VMAT delivery using a Varian 2100CD linear accelerator equipped with a 120-leaf MLC is highly reproducible. PACS numbers: 87.55.Qr and 87.56.Fc
Di Leo, Giovanni; D'Angelo, Ida Daniela; Alì, Marco; Cannaò, Paola Maria; Mauri, Giovanni; Secchi, Francesco; Sardanelli, Francesco
2017-03-01
The aim of our study was to estimate the intra- and inter-reader reproducibility of blood flow measurements in the ascending aorta and main pulmonary artery using cardiac magnetic resonance (CMR) and a semi-automated segmentation method. The ethics committee approved this retrospective study. A total of 50 consecutive patients (35 males and 15 females; mean age ± standard deviation 27±13 years) affected by congenital heart disease were reviewed. They underwent CMR for flow analysis of the ascending aorta and main pulmonary artery (1.5 T, through-plane phase-contrast sequences). Two independent readers (R1, trained radiology resident; R2, lower-trained technician student) obtained segmented images twice (>10-day interval), using a semi-automated method of segmentation. Peak velocity, forward and backward flows were obtained. Bland-Altman analysis was used, and reproducibility was reported as the complement to 100% of the ratio between the coefficient of repeatability and the mean. R1 intra-reader reproducibility for the aorta was 99% (peak velocity), 95% (forward flow) and 49% (backward flow); for the pulmonary artery, 99%, 91% and 90%, respectively. R2 intra-reader reproducibility was 92%, 91% and 38%; 98%, 86% and 87%, respectively. Inter-reader reproducibility for the aorta was 91%, 85% and 20%; for the pulmonary artery 96%, 75% and 82%, respectively. Our results showed good to excellent reproducibility of blood flow measurements with CMR together with a semi-automated method of segmentation, for all variables except the backward flow of the ascending aorta, with a limited impact of operator training.
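The reported statistic, reproducibility = 100% x (1 - CR/mean), can be sketched assuming the usual Bland-Altman coefficient of repeatability CR = 1.96 x SD of the paired differences (the exact CR formula used in the paper may differ), on invented paired flow readings:

```python
import numpy as np

def reproducibility_pct(m1, m2):
    """Reproducibility as the complement to 100% of the ratio between the
    Bland-Altman coefficient of repeatability (assumed 1.96 * SD of the
    paired differences) and the overall mean."""
    diff = m1 - m2
    cr = 1.96 * diff.std(ddof=1)            # coefficient of repeatability
    return 100.0 * (1.0 - cr / np.concatenate([m1, m2]).mean())

rng = np.random.default_rng(2)
flow = rng.normal(100.0, 15.0, 50)           # hypothetical forward-flow values
read1 = flow + rng.normal(0, 1.0, 50)        # reader 1, small measurement error
read2 = flow + rng.normal(0, 1.0, 50)        # reader 2
r = reproducibility_pct(read1, read2)
print(85.0 < r < 100.0)                      # "good to excellent" range
```

Larger within-pair disagreement inflates CR and drives the percentage down, which is how a noisy quantity like aortic backward flow can score far below the others.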
Negative pressure driven phase transformation in Sr doped SmCoO₃.
Arshad Farhan, M; Javed Akhtar, M
2010-02-24
Atomistic computer simulation techniques based on energy minimization procedures are utilized for the structural investigation of perovskite-type SmCoO₃. A reliable potential model is derived which reproduces both the cubic and the orthorhombic phases of SmCoO₃. We observe a negative chemical pressure induced structural phase transformation from distorted perovskite (orthorhombic) to perfect perovskite (cubic) due to the substitution of Sr²⁺ at the Sm³⁺ sites. However, external hydrostatic pressure shows isotropic compression, and no pressure-induced structural transformation is observed up to 100 GPa. To maintain the electroneutrality of the system, charge compensation is through oxygen vacancies, which results in the brownmillerite-type structure. A defect model is proposed, which is consistent with experimental results. The solution energies for divalent and trivalent cations are also calculated. These results show that cations having ionic radii less than 0.75 Å will occupy the Co sites and those with ionic radii larger than 0.75 Å will substitute at the Sm sites.
The Gradual Transformation of the Polish Public Science System
Heinecke, Steffi
2016-01-01
This paper investigates institutional change in the Polish public science system (PPSS) in the past twenty years. Employing macro-statistical data, the paper argues that this change process has unfolded stepwise and relatively late despite major political and economic transformations in post-socialist Poland. Using a historical-institutionalist perspective, the paper focuses on processes of institutional change, including layering, displacement, and dismantling. One major finding is that the speed and depth of the gradual transformation differs considerably between the three research performing sectors of the Polish public science system. As the Polish Academy of Sciences was reproduced institutionally, the former governmental units for applied R&D were partly dismantled and displaced by private sector R&D units. In contrast, the Higher Education sector underwent a strong expansion and, thus, layering of new research activities and fields. Since policy shifts within the PPSS occurred relatively late, the more than two decades following the collapse of communism are of special interest to scholars of incremental, yet cumulative, institutional change.
Sampling-based ensemble segmentation against inter-operator variability
NASA Astrophysics Data System (ADS)
Huo, Jing; Okada, Kazunori; Pope, Whitney; Brown, Matthew
2011-03-01
Inconsistency and a lack of reproducibility are commonly associated with semi-automated segmentation methods. In this study, we developed an ensemble approach to improve reproducibility and applied it to glioblastoma multiforme (GBM) brain tumor segmentation on T1-weighted contrast-enhanced MR volumes. The proposed approach combines sampling-based simulations and ensemble segmentation into a single framework; it generates a set of segmentations by perturbing user initialization and user-specified internal parameters, then fuses the set of segmentations into a single consensus result. Three combination algorithms were applied: majority voting, averaging and expectation-maximization (EM). The reproducibility of the proposed framework was evaluated by a controlled experiment on 16 tumor cases from a multicenter drug trial. The ensemble framework had significantly better reproducibility than the individual base Otsu thresholding method (p<.001).
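Of the three fusion rules, majority voting is the simplest to sketch. The perturbed square "segmentations" below are synthetic stand-ins for the runs generated by perturbing initialization and parameters:

```python
import numpy as np

def majority_vote(segmentations):
    """Fuse a stack of binary segmentations (k, H, W) by per-voxel majority."""
    votes = np.sum(segmentations, axis=0)
    return (votes * 2 > len(segmentations)).astype(np.uint8)

# Simulated perturbed segmentations of a square "tumor": each run shifts the
# boundary by up to two pixels, mimicking varied initialization/parameters.
rng = np.random.default_rng(3)
runs = []
for _ in range(7):
    seg = np.zeros((32, 32), dtype=np.uint8)
    r0, r1 = 8 + rng.integers(-2, 3), 24 + rng.integers(-2, 3)
    seg[r0:r1, r0:r1] = 1
    runs.append(seg)

consensus = majority_vote(np.stack(runs))
# The core (well inside every perturbed boundary) is always labeled tumor,
# while regions no run labeled stay background.
print(bool(consensus[12:20, 12:20].all()) and not consensus[0:4, 0:4].any())
```

Averaging and EM-based fusion (as in STAPLE-style approaches) replace the hard vote with soft, possibly weighted, per-voxel probabilities.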
Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian
2017-03-01
To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
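The final weighting step can be sketched directly: posterior tissue-class probabilities act as weights on per-class linear attenuation coefficients. The 511 keV coefficients below are approximate textbook values, and the posterior rows are invented for illustration:

```python
import numpy as np

# Approximate 511 keV linear attenuation coefficients (cm^-1) per tissue
# class: air ~0, soft tissue ~0.096, bone ~0.151.
mu_class = np.array([0.0, 0.096, 0.151])

# Hypothetical posterior probability maps for (air, soft tissue, bone) at
# three voxels; each row sums to 1 (atlas prior combined with MR likelihood).
post = np.array([
    [0.9, 0.1, 0.0],    # mostly air (e.g., a sinus voxel)
    [0.0, 0.8, 0.2],    # soft tissue with some bone probability
    [0.0, 0.1, 0.9],    # skull voxel
])

# Continuous-valued mu-map: probability-weighted mix of class coefficients,
# yielding intermediate values rather than three discrete ones.
mu_map = post @ mu_class
print(np.round(mu_map, 4))
```

This is what makes the map "continuous-valued": voxels with mixed tissue evidence receive intermediate attenuation coefficients instead of a hard class assignment.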
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Gwang-Se; Cheong, Cheolung, E-mail: ccheong@pusan.ac.kr
Despite increasing concern about low-frequency noise of modern large horizontal-axis wind turbines (HAWTs), few studies have focused on its origin or its prediction methods. In this paper, infra- and low-frequency (ILF) wind turbine noise is closely examined and an efficient method is developed for its prediction. Although most previous studies have assumed that the ILF noise consists primarily of blade passing frequency (BPF) noise components, these tonal noise components are seldom identified in the measured noise spectrum, except for the case of downwind wind turbines. In reality, since modern HAWTs are very large, during rotation a single blade of the turbine experiences inflow with variation in wind speed in time as well as in space, breaking the periodic perturbations of the BPF. Consequently, this transforms acoustic contributions at the BPF harmonics into broadband noise components. In this study, the ILF noise of wind turbines is predicted by combining Lowson’s acoustic analogy with a stochastic wind model, which is employed to reproduce realistic wind speed conditions. In order to predict the effects of these wind conditions on pressure variation on the blade surface, unsteadiness in the incident wind speed is incorporated into the XFOIL code by varying incident flow velocities on each blade section, which depend on the azimuthal locations of the rotating blade. The calculated surface pressure distribution is subsequently used to predict acoustic pressure at an observer location by using Lowson’s analogy. These predictions are compared with measured data, which confirms that the present method can reproduce the broadband characteristics of the measured low-frequency noise spectrum. Further investigations are carried out to characterize the ILF noise in terms of pressure loading on the blade surface, narrow-band noise spectrum and noise maps around the turbine.
Methods for genetic transformation of filamentous fungi.
Li, Dandan; Tang, Yu; Lin, Jun; Cai, Weiwen
2017-10-03
Filamentous fungi have been of great interest because of their excellent ability as cell factories to manufacture useful products for human beings. The development of genetic transformation techniques is a precondition that enables scientists to target and modify genes efficiently and may reveal the function of target genes. The method used to deliver foreign nucleic acid into cells is the sticking point for fungal genome modification. To date, several general methods of genetic transformation have been developed for fungi, including protoplast-mediated transformation, Agrobacterium-mediated transformation, electroporation, the biolistic method and shock-wave-mediated transformation. This article reviews the basic protocols and principles of these transformation methods, as well as their advantages and disadvantages.
Making Early Modern Medicine: Reproducing Swedish Bitters.
Ahnfelt, Nils-Otto; Fors, Hjalmar
2016-05-01
Historians of science and medicine have rarely applied themselves to reproducing the experiments and practices of medicine and pharmacy. This paper delineates our efforts to reproduce "Swedish Bitters," an early modern composite medicine in wide European use from the 1730s to the present. In its original formulation, it was made from seven medicinal simples: aloe, rhubarb, saffron, myrrh, gentian, zedoary and agarikon. These were mixed in alcohol together with some theriac, a composite medicine of classical origin. The paper delineates the compositional history of Swedish Bitters and the medical rationale underlying its composition. It also describes how we went about reproducing the medicine in a laboratory using early modern pharmaceutical methods, and analysing it using contemporary methods of pharmaceutical chemistry. Our aim is twofold: first, to show how reproducing medicines may provide a path towards a deeper understanding of the role of sensual and practical knowledge in the wider context of early modern medical culture; and second, how it may yield interesting results from the point of view of contemporary pharmaceutical science.
An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science.
2012-11-01
Reproducibility is a defining feature of science. However, because of strong incentives for innovation and weak incentives for confirmation, direct replication is rarely practiced or published. The Reproducibility Project is an open, large-scale, collaborative effort to systematically examine the rate and predictors of reproducibility in psychological science. So far, 72 volunteer researchers from 41 institutions have organized to openly and transparently replicate studies published in three prominent psychological journals in 2008. Multiple methods will be used to evaluate the findings, calculate an empirical rate of replication, and investigate factors that predict reproducibility. Whatever the result, a better understanding of reproducibility will ultimately improve confidence in scientific methodology and findings. © The Author(s) 2012.
Planar heterojunction perovskite solar cells with superior reproducibility
Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu
2014-01-01
Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method. PMID:25377945
Weis, Jared A.; Flint, Katelyn M.; Sanchez, Violeta; Yankeelov, Thomas E.; Miga, Michael I.
2015-01-01
Cancer progression has been linked to mechanics. Therefore, there has been recent interest in developing noninvasive imaging tools for cancer assessment that are sensitive to changes in tissue mechanical properties. We have developed one such method, modality independent elastography (MIE), that estimates the relative elastic properties of tissue by fitting anatomical image volumes acquired before and after the application of compression to biomechanical models. The aim of this study was to assess the accuracy and reproducibility of the method using phantoms and a murine breast cancer model. Magnetic resonance imaging data were acquired, and the MIE method was used to estimate relative volumetric stiffness. Accuracy was assessed using phantom data by comparing to gold-standard mechanical testing of elasticity ratios. Validation error was <12%. Reproducibility analysis was performed on animal data, and within-subject coefficients of variation ranged from 2 to 13% at the bulk level and 32% at the voxel level. To our knowledge, this is the first study to assess the reproducibility of an elasticity imaging metric in a preclinical cancer model. Our results suggest that the MIE method can reproducibly generate accurate estimates of the relative mechanical stiffness and provide guidance on the degree of change needed in order to declare biological changes rather than experimental error in future therapeutic studies. PMID:26158120
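The within-subject coefficient of variation reported above can be computed as follows. This is a hedged sketch: the paper does not state its exact formula, so we use a common root-mean-square pooling of per-subject CoVs over repeated measurements:

```python
import numpy as np

def within_subject_cv(repeats):
    """repeats: (subjects x repeated measurements) array of stiffness
    estimates. Returns the RMS within-subject coefficient of variation:
    per-subject SD over per-subject mean, pooled in quadrature."""
    x = np.asarray(repeats, dtype=float)
    per_cv = x.std(axis=1, ddof=1) / x.mean(axis=1)
    return float(np.sqrt(np.mean(per_cv ** 2)))

# Toy repeated stiffness estimates for three subjects
data = [[10.0, 10.5], [20.0, 19.0], [5.0, 5.2]]
cv = within_subject_cv(data)  # ≈ 0.033, i.e. ~3.3%
```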
Xu, Kedong; Huang, Xiaohui; Wu, Manman; Wang, Yan; Chang, Yunxia; Liu, Kun; Zhang, Ju; Zhang, Yi; Zhang, Fuli; Yi, Liming; Li, Tingting; Wang, Ruiyue; Tan, Guangxuan; Li, Chengwei
2014-01-01
Transient transformation is simpler, more efficient and more economical for analyzing protein subcellular localization than stable transformation. Fluorescent fusion proteins are often used in transient transformation to follow the in vivo behavior of proteins. Onion epidermis, which has large, living and transparent cells in a monolayer, is well suited to visualizing fluorescent fusion proteins. Commonly used transient transformation methods include particle bombardment, protoplast transfection and Agrobacterium-mediated transformation. Particle bombardment of onion epidermis was successfully established; however, it is expensive, dependent on biolistic equipment and has low transformation efficiency. We developed a highly efficient in planta transient transformation method in onion epidermis using a special agroinfiltration method, which can be completed within 5 days from the pretreatment of the onion bulb to the optimal time-point for analyzing gene expression. The transformation conditions were optimized to achieve 43.87% transformation efficiency in living onion epidermis. The developed method has advantages in cost, time requirements, equipment dependency and transformation efficiency compared with particle bombardment in onion epidermal cells, protoplast transfection and Agrobacterium-mediated transient transformation in leaf epidermal cells of other plants. It will facilitate the analysis of protein subcellular localization on a large scale.
Nyarko, Esmond B; Puzey, Kenneth A; Donnelly, Catherine W
2014-06-01
The objectives of this study were to determine if Fourier transform infrared (FT-IR) spectroscopy and multivariate statistical analysis (chemometrics) could be used to rapidly differentiate epidemic clones (ECs) of Listeria monocytogenes, as well as their intact compared with heat-killed populations. FT-IR spectra were collected from dried thin smears on infrared slides prepared from aliquots of 10 μL of each L. monocytogenes ECs (ECIII: J1-101 and R2-499; ECIV: J1-129 and J1-220), and also from intact and heat-killed cell populations of each EC strain using 250 scans at a resolution of 4 cm(-1) in the mid-infrared region in a reflectance mode. Chemometric analysis of spectra involved the application of the multivariate discriminant method for canonical variate analysis (CVA) and linear discriminant analysis (LDA). CVA of the spectra in the wavelength region 4000 to 600 cm(-1) separated the EC strains while LDA resulted in a 100% accurate classification of all spectra in the data set. Further, CVA separated intact and heat-killed cells of each EC strain and there was 100% accuracy in the classification of all spectra when LDA was applied. FT-IR spectral wavenumbers 1650 to 1390 cm(-1) were used to separate heat-killed and intact populations of L. monocytogenes. The FT-IR spectroscopy method allowed discrimination between strains that belong to the same EC. FT-IR is a highly discriminatory and reproducible method that can be used for the rapid subtyping of L. monocytogenes, as well as for the detection of live compared with dead populations of the organism. Fourier transform infrared (FT-IR) spectroscopy and multivariate statistical analysis can be used for L. monocytogenes source tracking and for clinical case isolate comparison during epidemiological investigations since the method is capable of differentiating epidemic clones and it uses a library of well-characterized strains. 
The FT-IR method is potentially less expensive and more rapid compared to genetic subtyping methods, and can be used for L. monocytogenes strain typing by food industries and public health agencies to enable faster response and intervention to listeriosis outbreaks. FT-IR can also be applied for routine monitoring of the pathogen in food processing plants and for investigating postprocessing contamination because it is capable of differentiating heat-killed and viable L. monocytogenes populations. © 2014 Institute of Food Technologists®
Grau, P; Vanrolleghem, P; Ayesa, E
2007-01-01
In this paper, a new methodology for integrated modelling of the WWTP has been used for the construction of the Benchmark Simulation Model No. 2 (BSM2). The transformations approach proposed in this methodology does not require the development of specific transformers to interface unit process models and allows the construction of tailored models for a particular WWTP, guaranteeing mass and charge continuity for the whole model. The BSM2 PWM, constructed as a case study, is evaluated by means of simulations under different scenarios, and its validity in reproducing the water and sludge lines of a WWTP is demonstrated. Furthermore, the advantages that this methodology presents compared to other approaches for integrated modelling are verified in terms of flexibility and coherence.
ACCURATE ORBITAL INTEGRATION OF THE GENERAL THREE-BODY PROBLEM BASED ON THE D'ALEMBERT-TYPE SCHEME
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minesaki, Yukitaka
2013-03-15
We propose an accurate orbital integration scheme for the general three-body problem that retains all conserved quantities except angular momentum. The scheme is provided by an extension of the d'Alembert-type scheme for constrained autonomous Hamiltonian systems. Although the proposed scheme is merely second-order accurate, it can precisely reproduce some periodic, quasiperiodic, and escape orbits. The Levi-Civita transformation plays a role in designing the scheme.
Imaging has enormous untapped potential to improve cancer research through software to extract and process morphometric and functional biomarkers. In the era of non-cytotoxic treatment agents, multi-modality image-guided ablative therapies and rapidly evolving computational resources, quantitative imaging software can be transformative in enabling minimally invasive, objective and reproducible evaluation of cancer treatment response. Post-processing algorithms are integral to high-throughput analysis and fine-grained differentiation of multiple molecular targets.
Reproducible, high performance patch antenna array apparatus and method of fabrication
Strassner, II, Bernd H.
2007-01-23
A reproducible, high-performance patch antenna array apparatus includes a patch antenna array provided on a unitary dielectric substrate, and a feed network provided on the same unitary substrate and proximity coupled to the patch antenna array. The reproducibility is enhanced by using photolithographic patterning and etching to produce both the patch antenna array and the feed network.
Reproducibility of dynamically represented acoustic lung images from healthy individuals
Maher, T M; Gat, M; Allen, D; Devaraj, A; Wells, A U; Geddes, D M
2008-01-01
Background and aim: Acoustic lung imaging offers a unique method for visualising the lung. This study was designed to demonstrate reproducibility of acoustic lung images recorded from healthy individuals at different time points and to assess intra- and inter-rater agreement in the assessment of dynamically represented acoustic lung images. Methods: Recordings from 29 healthy volunteers were made on three separate occasions using vibration response imaging. Reproducibility was measured using quantitative, computerised assessment of vibration energy. Dynamically represented acoustic lung images were scored by six blinded raters. Results: Quantitative measurement of acoustic recordings was highly reproducible with an intraclass correlation score of 0.86 (very good agreement). Intraclass correlations for inter-rater agreement and reproducibility were 0.61 (good agreement) and 0.86 (very good agreement), respectively. There was no significant difference found between the six raters at any time point. Raters ranged from 88% to 95% in their ability to identically evaluate the different features of the same image presented to them blinded on two separate occasions. Conclusion: Acoustic lung imaging is reproducible in healthy individuals. Graphic representation of lung images can be interpreted with a high degree of accuracy by the same and by different reviewers. PMID:18024534
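Intraclass correlation scores like those above are typically computed from a subjects × raters matrix. A minimal sketch using the one-way random-effects formulation ICC(1,1); the abstract does not state which ICC variant was used, so this is an illustrative stand-in:

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for a (subjects x raters) matrix:
    (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB/MSW are the between-
    and within-subject mean squares."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    subj_means = x.mean(axis=1)
    msb = k * np.sum((subj_means - x.mean()) ** 2) / (n - 1)
    msw = np.sum((x - subj_means[:, None]) ** 2) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Toy image scores from three raters for four volunteers
scores = [[9, 8, 9], [5, 6, 5], [2, 2, 3], [7, 7, 8]]
icc = icc_oneway(scores)
```

Values above 0.8 are conventionally read as "very good agreement", matching the interpretation used in the abstract.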
Xanthopoulos, Emily; Hutchinson, Charles E; Adams, Judith E; Bruce, Ian N; Nash, Anthony F P; Holmes, Andrew P; Taylor, Christopher J; Waterton, John C
2007-01-01
Contrast-enhanced MRI is of value in assessing rheumatoid pannus in the hand, but the images are not always easy to quantitate. To develop and evaluate an improved measurement of volume of enhancing pannus (VEP) in the hand in human rheumatoid arthritis (RA). MR images of the hand and wrist were obtained for 14 patients with RA at 0, 1 and 13 weeks. Volume of enhancing pannus was measured on images created by subtracting precontrast T1-weighted images from contrast-enhanced T1-weighted images using a shuffle transformation technique. Maximum intensity projection (MIP) and 3D volume rendering of the images were used as a guide to identify the pannus and any contrast-enhanced veins. Visualisation of pannus was much improved following the shuffle transform. Between 0 weeks and 1 week, the mean value of the within-subject coefficient of variation (CoV) was 0.13 and the estimated total CoV was 0.15. There was no evidence of significant increased variability within the 13-week interval for the complete sample of patients. Volume of enhancing pannus can be measured reproducibly in the rheumatoid hand using 3D contrast-enhanced MRI and shuffle transform.
NO-producing compounds transform neuron responses to glutamate.
D'yakonova, T L
2000-01-01
We have previously shown that NO increases the excitatory effects of glutamate and blocks the desensitization of neurons to glutamate in the brain of the common snail. The aim of the present work was to identify the possible effect of NO on inhibitory responses to glutamate in the neurons of this mollusk. Electrophysiological investigations were performed on three identified neurons. The results showed that glutamate (0.05-0.1 mM) initially induced hyperpolarization and blocked the spike activity of these neurons. Simultaneous exposure to glutamate and the NO donor nitroprusside, or preincubation with an NO donor, caused the cells to respond to glutamate with depolarization and excitation instead. The transformed excitatory response lasted several minutes and could be reproduced even after 24 h of washing. The NO synthase blocker monomethylarginine blocked the excitatory response to glutamate. Another agonist of glutamate receptors, N-methyl-D-aspartate (NMDA, 0.1-1 mM), initially had excitatory effects on these neurons; this effect was significantly enhanced after transformation of the response to glutamate by NO donors. The results obtained here show that NO is involved in transforming the inhibitory responses to glutamate into excitatory responses, and that this effect may be mediated by NMDA-type receptors.
Box-Cox transformation for QTL mapping.
Yang, Runqing; Yi, Nengjun; Xu, Shizhong
2006-01-01
The maximum likelihood method of QTL mapping assumes that the phenotypic values of a quantitative trait follow a normal distribution. If the assumption is violated, some forms of transformation should be taken to make the assumption approximately true. The Box-Cox transformation is a general transformation method which can be applied to many different types of data. The flexibility of the Box-Cox transformation is due to a variable, called transformation factor, appearing in the Box-Cox formula. We developed a maximum likelihood method that treats the transformation factor as an unknown parameter, which is estimated from the data simultaneously along with the QTL parameters. The method makes an objective choice of data transformation and thus can be applied to QTL analysis for many different types of data. Simulation studies show that (1) Box-Cox transformation can substantially increase the power of QTL detection; (2) Box-Cox transformation can replace some specialized transformation methods that are commonly used in QTL mapping; and (3) applying the Box-Cox transformation to data already normally distributed does not harm the result.
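The transform and its profile log-likelihood are standard; below is a self-contained sketch that grid-searches the transformation factor λ rather than estimating it jointly with the QTL parameters as in the paper (the grid search is our simplification):

```python
import numpy as np

def boxcox(y, lam):
    """Box-Cox transform: (y**lam - 1)/lam, or log(y) as lam -> 0."""
    y = np.asarray(y, dtype=float)
    if abs(lam) < 1e-8:
        return np.log(y)
    return (y ** lam - 1.0) / lam

def boxcox_loglik(y, lam):
    """Profile log-likelihood of lam under the normality assumption,
    constant terms dropped: -n/2*log(var(z)) + (lam-1)*sum(log y)."""
    y = np.asarray(y, dtype=float)
    z = boxcox(y, lam)
    return -0.5 * y.size * np.log(z.var()) + (lam - 1.0) * np.log(y).sum()

def best_lambda(y, grid=np.linspace(-2, 2, 401)):
    """Pick the lam on the grid maximizing the profile log-likelihood."""
    return max(grid, key=lambda lam: boxcox_loglik(y, lam))

# Log-normal data should drive lam toward 0 (the log transform)
rng = np.random.default_rng(0)
y = np.exp(rng.normal(size=500))
lam_hat = best_lambda(y)
```

With λ = 1 the transform is a simple shift, and λ = 0 recovers the log transform, which is why it subsumes several specialized transformations mentioned in the abstract.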
NASA Astrophysics Data System (ADS)
Tomita, Motohiro; Ogasawara, Masataka; Terada, Takuya; Watanabe, Takanobu
2018-04-01
We provide the parameters of Stillinger-Weber potentials for GeSiSn ternary mixed systems. These parameters can be used in molecular dynamics (MD) simulations to reproduce phonon properties and thermal conductivities. The phonon dispersion relation is derived from the dynamical structure factor, which is calculated by the space-time Fourier transform of atomic trajectories in an MD simulation. The phonon properties and thermal conductivities of GeSiSn ternary crystals calculated using these parameters mostly reproduced both the findings of previous experiments and earlier calculations made using MD simulations. The atomic composition dependence of these properties in GeSiSn ternary crystals obtained by previous studies (both experimental and theoretical) and the calculated data were almost exactly reproduced by our proposed parameters. Moreover, the results of the MD simulation agree with the previous calculations made using a time-independent phonon Boltzmann transport equation with complicated scattering mechanisms. These scattering mechanisms are very important in complicated nanostructures, as they allow the heat-transfer properties to be more accurately calculated by MD simulations. This work enables us to predict the phonon- and heat-related properties of bulk group IV alloys, especially ternary alloys.
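The dynamical-structure-factor computation described above (a space-time Fourier transform of atomic trajectories) can be illustrated on a toy displacement field: a single travelling wave should produce a peak at its wavevector/frequency pair. All sizes and mode indices here are arbitrary illustrations, not the paper's setup:

```python
import numpy as np

# Toy stand-in for MD trajectories: a 1-D chain whose displacement
# field carries one travelling wave with mode indices (k_idx, w_idx).
n_atoms, n_steps = 64, 256
k_idx, w_idx = 5, 12  # spatial and temporal mode of the injected wave
x = np.arange(n_atoms)[None, :]
t = np.arange(n_steps)[:, None]
u = np.cos(2 * np.pi * (k_idx * x / n_atoms - w_idx * t / n_steps))

# Space-time FFT -> |S(omega, k)|^2; the travelling wave shows up as
# a peak at (w_idx, k_idx), up to the usual FFT conjugate symmetry.
S = np.abs(np.fft.fft2(u)) ** 2
w_peak, k_peak = np.unravel_index(np.argmax(S), S.shape)
```

Scanning such peaks over all k yields the phonon dispersion relation that the paper extracts from its MD runs.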
van der Leij, Christiaan; Lavini, Cristina; van de Sande, Marleen G H; de Hair, Marjolein J H; Wijffels, Christophe; Maas, Mario
2015-12-01
To compare the between-session reproducibility of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) combined with time-intensity curve (TIC)-shape analysis in arthritis patients, within one scanner and between two different scanners, and to compare this method with qualitative analysis and pharmacokinetic modeling (PKM). Fifteen knee joint arthritis patients were included and scanned twice on a closed-bore 1.5T scanner (n = 9, group 1), or on a closed-bore 1.5T and on an open-bore 1.0T scanner (n = 6, group 2). DCE-MRI data were postprocessed using in-house developed software ("Dynamo"). Disease activity was assessed. Disease activity was comparable between the two visits. In group 1 qualitative analysis showed the highest reproducibility with intraclass correlation coefficients (ICCs) between 0.78 and 0.98 and root mean square-coefficients of variation (RMS-CoV) of 8.0%-14.9%. TIC-shape analysis showed a slightly lower reproducibility with similar ICCs (0.78-0.97) but higher RMS-CoV (18.3%-42.9%). The PKM analysis showed the lowest reproducibility with ICCs between 0.39 and 0.64 (RMS-CoV 21.5%-51.9%). In group 2 TIC-shape analysis of the two most important TIC-shape types showed the highest reproducibility with ICCs of 0.78 and 0.71 (RMS-CoV 29.8% and 59.4%) and outperformed the reproducibility of the most important qualitative parameter (ICC 0.31, RMS-CoV 45.1%) and the within-scanner reproducibility of PKM analysis. TIC-shape analysis is a robust postprocessing method within one scanner, almost as reproducible as the qualitative analysis. Between scanners, the reproducibility of the most important TIC-shapes outperform that of the most important qualitative parameter and the within-scanner reproducibility of PKM analysis. © 2015 Wiley Periodicals, Inc.
Tissue-scale, personalized modeling and simulation of prostate cancer growth
NASA Astrophysics Data System (ADS)
Lorenzo, Guillermo; Scott, Michael A.; Tew, Kevin; Hughes, Thomas J. R.; Zhang, Yongjie Jessica; Liu, Lei; Vilanova, Guillermo; Gomez, Hector
2016-11-01
Recently, mathematical modeling and simulation of diseases and their treatments have enabled the prediction of clinical outcomes and the design of optimal therapies on a personalized (i.e., patient-specific) basis. This new trend in medical research has been termed “predictive medicine.” Prostate cancer (PCa) is a major health problem and an ideal candidate to explore tissue-scale, personalized modeling of cancer growth for two main reasons: First, it is a small organ, and, second, tumor growth can be estimated by measuring serum prostate-specific antigen (PSA, a PCa biomarker in blood), which may enable in vivo validation. In this paper, we present a simple continuous model that reproduces the growth patterns of PCa. We use the phase-field method to account for the transformation of healthy cells to cancer cells and use diffusion-reaction equations to compute nutrient consumption and PSA production. To accurately and efficiently compute tumor growth, our simulations leverage isogeometric analysis (IGA). Our model is shown to reproduce a known shape instability from a spheroidal pattern to fingered growth. Results of our computations indicate that such shift is a tumor response to escape starvation, hypoxia, and, eventually, necrosis. Thus, branching enables the tumor to minimize the distance from inner cells to external nutrients, contributing to cancer survival and further development. We have also used our model to perform tissue-scale, personalized simulation of a PCa patient, based on prostatic anatomy extracted from computed tomography images. This simulation shows tumor progression similar to that seen in clinical practice.
ERIC Educational Resources Information Center
Grimm, C. A.
This document contains two units that examine integral transforms and series expansions. In the first module, the user is expected to learn how to use the unified method presented to obtain Laplace transforms, Fourier transforms, complex Fourier series, real Fourier series, and half-range sine series for given piecewise continuous functions. In…
Telfer, Scott; Gibson, Kellie S; Hennessy, Kym; Steultjens, Martijn P; Woodburn, Jim
2012-05-01
To determine, for a number of techniques used to obtain foot shape based around plaster casting, foam box impressions, and 3-dimensional scanning, (1) the effect the technique has on the overall reproducibility of custom foot orthoses (FOs) in terms of inter- and intracaster reliability and (2) the reproducibility of FO design by using computer-aided design (CAD) software in terms of inter- and intra-CAD operator reliability for all these techniques. Cross-sectional study. University laboratory. Convenience sample of individuals (N=22) with noncavus foot types. Not applicable. Parameters of the FO design (length, width at forefoot, width at rearfoot, and peak medial arch height), the forefoot to rearfoot angle of the foot shape, and overall volume match between device designs. For intra- and intercaster reliability of the different methods of obtaining the foot shape, all methods fell below the reproducibility quality threshold for the medial arch height of the device, and volume matching was <80% for all methods. The more experienced CAD operator was able to achieve excellent reliability (intraclass correlation coefficients >0.75) for all variables with the exception of forefoot to rearfoot angle, with overall volume matches of >87% of the devices. None of the techniques for obtaining foot shape met all the criteria for excellent reproducibility, with the peak arch height being particularly variable. Additional variability is added at the CAD stage of the FO design process, although with adequate operator experience good to excellent reproducibility may be achieved at this stage. Taking only basic linear or angular measurement parameters from the device may fail to fully capture the variability in FO design. Copyright © 2012 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Emre, Baris; Yüce, Süheyla; Stern-Taulats, Enric; Planes, Antoni; Fabbrici, Simone; Albertini, Franca; Mañosa, Lluís
2013-06-01
Calorimetry under magnetic field has been used to study the inverse magnetocaloric effect in Ni-Co-Mn-Ga-In magnetic shape memory alloys. It is shown that the energy dissipated during a complete transformation loop only represents a small fraction (5% to 7%) of the latent heat of the martensitic transition. It is found that the entropy values obtained from isofield temperature scans agree well with those obtained from isothermal magnetic field scans. The reproducibility of the magnetocaloric effect has been studied from isothermal measurements. Reproducible entropy values under field cycling have been found within a temperature interval bounded by the start temperature of the forward transition at zero field and the start temperature of the reverse transition under applied field. Large reversible entropy changes around 11 J/kg K have been found for fields up to 6 T.
Zhang, Liqin; Yan, Ye; Han, Cha; Xue, Fengxia
2018-01-01
Objective To evaluate the diagnostic accuracy of the 2011 International Federation for Cervical Pathology and Colposcopy (IFCPC) colposcopic terminology. Methods The clinicopathological data of 2262 patients who underwent colposcopy from September 2012 to September 2016 were reviewed. The colposcopic findings, colposcopic impression, and cervical histopathology of the patients were analyzed. Correlations between variables were evaluated using cervical histopathology as the gold standard. Results Colposcopic diagnosis matched biopsy histopathology in 1482 patients (65.5%), and the weighted kappa strength of agreement was 0.480 (P<0.01). Colposcopic diagnoses more often underestimated (22.1%) than overestimated (12.3%) cervical pathology. There was no significant difference between the colposcopic diagnosis and cervical pathology agreement among the various grades of lesions (P=0.282). The sensitivity, specificity for detecting high-grade lesions/carcinoma was 71.6% and 98.0%, respectively. Multivariate analysis showed that major changes were independent factors in predicting high-grade lesion/carcinoma, whereas transformation zone, lesion size, and non-stained were not statistically related to high-grade lesion/carcinoma. Conclusions The 2011 IFCPC terminology can improve the diagnostic accuracy for all lesion severities. The categorization of major changes and minor changes is appropriate. However, colposcopic diagnosis remains unsatisfactory. Poor reproducibility of type 2 transformation zone and the significance of leukoplakia require further study. PMID:29507681
Synthesis and thermal stability of zirconia and yttria-stabilized zirconia microspheres.
Leib, Elisabeth W; Vainio, Ulla; Pasquarelli, Robert M; Kus, Jonas; Czaschke, Christian; Walter, Nils; Janssen, Rolf; Müller, Martin; Schreyer, Andreas; Weller, Horst; Vossmeyer, Tobias
2015-06-15
Zirconia microparticles produced by sol-gel synthesis have great potential for photonic applications. To this end, identifying synthetic methods that yield reproducible control over size uniformity is important. Phase transformations during thermal cycling can disintegrate the particles. Therefore, understanding the parameters driving these transformations is essential for enabling high-temperature applications. Particle morphology is expected to influence particle processability and stability. Yttria-doping should improve the thermal stability of the particles, as it does in bulk zirconia. Zirconia and YSZ particles were synthesized by improved sol-gel approaches using fatty acid stabilizers. The particles were heated to 1500 °C, and structural and morphological changes were monitored by SEM, ex situ XRD and high-energy in situ XRD. Zirconia particles (0.4-4.3 μm in diameter, 5-10% standard deviation) synthesized according to the modified sol-gel approaches yielded significantly improved monodispersities. As-synthesized amorphous particles transformed to the tetragonal phase at ∼450 °C with a volume decrease of up to ∼75% and then to monoclinic after heating from ∼650 to 850 °C. Submicron particles disintegrated at ∼850 °C and microparticles at ∼1200 °C due to grain growth. In situ XRD revealed that the transition from the amorphous to tetragonal phase was accompanied by relief in microstrain and the transition from tetragonal to monoclinic was correlated with the tetragonal grain size. Early crystallization and smaller initial grain sizes, which depend on the precursors used for particle synthesis, coincided with higher stability. Yttria-doping reduced grain growth, stabilized the tetragonal phase, and significantly improved the thermal stability of the particles. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Composting in small laboratory pilots: performance and reproducibility.
Lashermes, G; Barriuso, E; Le Villio-Poitrenaud, M; Houot, S
2012-02-01
Small-scale reactors (<10 l) have been employed in composting research, but few studies have assessed composting performance in terms of the transformations of organic matter. Moreover, composting at small scales is often performed at an imposed, fixed temperature, creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study were to design an innovative small-scale composting device that preserves self-heating to drive the composting process, and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup comprised six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O2 consumption and CO2 emissions, and by characterising the biochemical evolution of organic matter. Good reproducibility was found for the six replicates, with coefficients of variation generally below 19% for all parameters. Intense self-heating ensured a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot-water-soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fraction by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or on-site experiments, except for lignin degradation, which was less extensive than in full-scale systems. The reproducibility of the process and the quality of the final compost support the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures. Copyright © 2011 Elsevier Ltd. All rights reserved.
Xu, Kedong; Huang, Xiaohui; Wu, Manman; Wang, Yan; Chang, Yunxia; Liu, Kun; Zhang, Ju; Zhang, Yi; Zhang, Fuli; Yi, Liming; Li, Tingting; Wang, Ruiyue; Tan, Guangxuan; Li, Chengwei
2014-01-01
Transient transformation is simpler, more efficient and more economical for analyzing protein subcellular localization than stable transformation. Fluorescent fusion proteins are often used in transient transformation to follow the in vivo behavior of proteins. Onion epidermis, which consists of a monolayer of large, living and transparent cells, is well suited for visualizing fluorescent fusion proteins. Commonly used transient transformation methods include particle bombardment, protoplast transfection and Agrobacterium-mediated transformation. Particle bombardment of onion epidermis has been established successfully; however, it is expensive, depends on biolistic equipment and has low transformation efficiency. We developed a highly efficient in planta transient transformation method for onion epidermis using a special agroinfiltration procedure, which can be completed within 5 days from the pretreatment of the onion bulb to the optimal time point for analyzing gene expression. The transformation conditions were optimized to achieve 43.87% transformation efficiency in living onion epidermis. The developed method has advantages in cost, time, equipment dependency and transformation efficiency over particle bombardment of onion epidermal cells, protoplast transfection, and Agrobacterium-mediated transient transformation of leaf epidermal cells of other plants. It will facilitate large-scale analysis of protein subcellular localization. PMID:24416168
Xu, Wenjun; Tang, Chen; Gu, Fan; Cheng, Jiajia
2017-04-01
Removing the heavy speckle noise in electronic speckle pattern interferometry (ESPI) fringe patterns is a key step. Among spatial-domain filtering methods, oriented partial differential equations have proven to be a powerful tool; among transform-domain filtering methods, the shearlet transform is a state-of-the-art approach. In this paper, we propose a filtering method for ESPI fringe pattern denoising that combines a second-order oriented partial differential equation (SOOPDE) with the shearlet transform, named SOOPDE-Shearlet. Here, the shearlet transform is introduced into ESPI fringe pattern denoising for the first time. The combination exploits the fact that the spatial-domain SOOPDE and the transform-domain shearlet transform complement each other. We test the proposed SOOPDE-Shearlet on five experimentally obtained ESPI fringe patterns of poor quality and compare our method with SOOPDE, the shearlet transform, windowed Fourier filtering (WFF), and coherence-enhancing diffusion (CEDPDE). Among these, WFF and CEDPDE are the state-of-the-art methods for ESPI fringe pattern denoising in the transform and spatial domains, respectively. The experimental results demonstrate the good performance of the proposed SOOPDE-Shearlet.
Wang, Hai-yi; Su, Zi-hua; Xu, Xiao; Sun, Zhi-peng; Duan, Fei-xue; Song, Yuan-yuan; Li, Lu; Wang, Ying-wei; Ma, Xin; Guo, Ai-tao; Ma, Lin; Ye, Hui-yi
2016-01-01
Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate tumor vessel permeability. Histogram metrics are a recognized, promising method of quantitative MR imaging that has recently been introduced into the analysis of DCE-MRI pharmacokinetic parameters in oncology on account of tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. The extended Tofts model and a population-based arterial input function were used to calculate kinetic parameters of RCC tumors. The mean value and histogram metrics (mode, skewness and kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan-rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and the coefficient of variation (CoV). Our results demonstrated that the histogram metrics (mode, skewness and kurtosis) were not superior to the conventional mean value in the reproducibility evaluation of DCE-MRI pharmacokinetic parameters (Ktrans and Ve) in renal cell carcinoma; skewness and kurtosis in particular showed lower intra-observer, inter-observer and scan-rescan reproducibility than the mean value. Our findings suggest that additional studies are necessary before histogram metrics are widely incorporated into quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733
Aggarwal, Priya; Gupta, Anubha
2017-12-01
A number of reconstruction methods have been proposed recently for accelerated functional magnetic resonance imaging (fMRI) data collection. However, existing methods suffer from greater artifacts at high acceleration factors. This paper addresses the issue of accelerating fMRI collection via undersampled k-space measurements combined with a proposed method based on l1-l1 norm constraints, wherein we impose the first l1-norm sparsity on the voxel time series (temporal data) in the transformed domain and the second l1-norm sparsity on the successive differences of the same temporal data. Hence, we name the proposed method the Double Temporal Sparsity based Reconstruction (DTSR) method. The robustness of the proposed DTSR method has been thoroughly evaluated both at the subject level and at the group level on real fMRI data. Results are presented at various acceleration factors. Quantitative analysis in terms of peak signal-to-noise ratio (PSNR) and other metrics, and qualitative analysis in terms of the reproducibility of brain resting-state networks (RSNs), demonstrate that the proposed method is accurate and robust. In addition, the proposed DTSR method preserves brain networks that are important for studying fMRI data. Compared to existing methods, the DTSR method shows promising potential, with an improvement of 10-12 dB in PSNR at acceleration factors up to 3.5 on resting-state fMRI data. Simulation results on real data demonstrate that the DTSR method can be used to acquire accelerated fMRI with accurate detection of RSNs. Copyright © 2017 Elsevier Ltd. All rights reserved.
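The paper's reconstruction solves a constrained optimization problem; as an illustration only, the sketch below evaluates the two l1 penalties for a single voxel time series. A DCT is assumed here as the sparsifying transform (the abstract does not name the transform), and the weights `lam1`/`lam2` are hypothetical.

```python
import numpy as np

def dtsr_penalty(x, lam1=1.0, lam2=1.0):
    """Evaluate the double temporal sparsity prior for one voxel time
    series x: an l1 norm of the series in a sparsifying transform domain
    (orthonormal DCT-II assumed) plus an l1 norm of the successive
    differences (temporal total variation)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Orthonormal DCT-II matrix as the assumed sparsifying transform
    k = np.arange(n)
    dct = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    dct[0] *= 1 / np.sqrt(2)
    dct *= np.sqrt(2 / n)
    transform_sparsity = np.abs(dct @ x).sum()       # first l1-norm term
    difference_sparsity = np.abs(np.diff(x)).sum()   # second l1-norm term
    return lam1 * transform_sparsity + lam2 * difference_sparsity
```

A smooth, slowly varying series scores low on both terms, which is why minimizing this prior under k-space data consistency favors physiologically plausible time courses.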
Characterization of Residues from the Detonation of Insensitive Munitions
Unfortunately, many energetic compounds are toxic or harmful to the environment and human health. The US Army Cold Regions Research and Engineering Laboratory and Defence Research and Development Canada Valcartier have developed methods through SERDP and ESTCP programs that enable a reproducible method for energetics residues characterization research. SERDP Project ER-2219 is focused on three areas: determining mass deposition and …
Azuaje, Francisco; Zheng, Huiru; Camargo, Anyela; Wang, Haiying
2011-08-01
The discovery of novel disease biomarkers is a crucial challenge for translational bioinformatics. Demonstrating both their classification power and their reproducibility across independent datasets is an essential requirement for assessing their potential clinical relevance. Small datasets and the multiplicity of putative biomarker sets may explain the lack of predictive reproducibility. Studies based on pathway-driven discovery approaches have suggested that, despite such discrepancies, the resulting putative biomarkers tend to be implicated in common biological processes. Investigations of this problem have mainly focused on datasets derived from cancer research. We investigated the predictive and functional concordance of five methods for discovering putative biomarkers in four independently generated datasets from the cardiovascular disease domain. A diversity of biosignatures was identified by the different methods. However, we found strong biological-process concordance between them, especially for methods based on gene set analysis. With a few exceptions, we observed a lack of classification reproducibility on independent datasets. Partial overlaps exist between our putative sets of biomarkers and those of the primary studies. Despite the observed limitations, pathway-driven or gene set analysis can predict potentially novel biomarkers and can jointly point to biomedically relevant underlying molecular mechanisms. Copyright © 2011 Elsevier Inc. All rights reserved.
Sachdeva, Veena; Hooda, Vinita
2015-08-01
Epoxy-glued silver nanoparticles were used as an immobilization support for nitrate reductase (NR). The resulting epoxy/AgNPs/NR conjugates were characterized at successive stages of fabrication by scanning electron microscopy and Fourier transform infrared spectroscopy. The immobilized enzyme system exhibited a reasonably high conjugation yield (37.6±0.01 μg/cm(2)), with 93.54±0.88% retention of specific activity. The most favorable working conditions of pH, temperature and substrate concentration were ascertained to optimize the performance of the epoxy/AgNPs/NR conjugates for soil nitrate quantification. The analytical results for soil nitrate determination were consistent, reliable and reproducible. The minimum detection limit of the method was 0.05 mM, with linearity from 0.1 to 11.0 mM. The recoveries of added nitrate (0.1 and 0.2 mM) were <95.0%, and the within-day and between-day coefficients of variation were 0.556% and 1.63%, respectively. The method showed good correlation (R(2)=0.998) with the popular Griess reaction method. Epoxy/AgNPs-bound NR had a half-life of 18 days at 4 °C and retained 50% activity after 15 reuses. Copyright © 2015 Elsevier B.V. All rights reserved.
Siddiqui, M F; Reza, A W; Kanesan, J; Ramiah, H
2014-01-01
There is wide interest in low-power, area-efficient hardware designs for the discrete cosine transform (DCT) algorithm. This research work proposes a novel Common Subexpression Elimination (CSE) based pipelined architecture for the DCT, aimed at reducing the cost metrics of power and area while maintaining high speed and accuracy in DCT applications. The proposed design combines Canonical Signed Digit (CSD) representation and CSE to implement multiplier-less fixed-constant multiplication of the DCT coefficients. Furthermore, symmetry in the DCT coefficient matrix is exploited with CSE to further decrease the number of arithmetic operations. The architecture needs a single-port memory to feed the inputs instead of a multiport memory, which reduces hardware cost and area. Analysis of the experimental results and performance comparisons shows that the proposed scheme uses minimal logic, utilizing a mere 340 slices and 22 adders. Moreover, the design meets the real-time constraints of different video/image coders and peak signal-to-noise ratio (PSNR) requirements. Furthermore, while maintaining accuracy, the proposed technique improves on recent well-known methods in power reduction, silicon area usage, and maximum operating frequency by 41%, 15%, and 15%, respectively. PMID:25133249
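The CSD idea behind multiplier-less constant multiplication can be illustrated in software; the sketch below is not the paper's hardware design, only a hypothetical demonstration that any fixed-constant multiply reduces to shifts and adds/subtracts over a non-adjacent signed-digit form.

```python
def csd_digits(k):
    """Return the canonical signed-digit (CSD) form of a positive integer k
    as (bit_position, sign) pairs; no two adjacent digits are nonzero."""
    digits = []
    power = 0
    while k != 0:
        if k % 2:
            d = 2 - (k % 4)  # k % 4 == 1 -> digit +1; k % 4 == 3 -> digit -1
            digits.append((power, d))
            k -= d           # remainder is now divisible by 4
        k //= 2
        power += 1
    return digits

def multiplierless_mul(x, k):
    """Multiply x by the constant k using only shifts and adds/subtracts,
    the software analogue of multiplier-less DCT coefficient scaling."""
    return sum(sign * (x << p) for p, sign in csd_digits(k))
```

For example, 7 = 8 - 1, so multiplying by 7 costs one shift and one subtraction instead of three adds; in the paper's architecture, CSE additionally shares such shift-add terms across coefficients.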
Rezvani, Seyyed Ahmad; Soleymanpour, Ahmad
2016-03-04
A very convenient, sensitive and precise solid-phase extraction (SPE) system was developed for the enrichment and determination of ultra-trace amounts of cadmium in water and plant samples. The method is based on the retention of cadmium(II) ions by l-cystine adsorbed on Y-zeolite and is carried out in a packed mini-column. The retained cadmium ions were then eluted and determined by flame atomic absorption spectrometry. Scanning electron microscopy (SEM), powder X-ray diffraction (XRD) and Fourier transform infrared (FT-IR) spectroscopy were applied to characterize the cystine-modified zeolite (CMZ). Experimental conditions affecting the analytical performance, such as pH, eluent type, sample concentration, eluent flow rate and the presence of interfering ions, were investigated. The calibration graph was linear within the range 0.1-7.5 ng mL(-1), and the limit of detection was 0.04 ng mL(-1) with a preconcentration factor of 400. The relative standard deviation (RSD) was 1.4%, indicating the excellent reproducibility of the method. The proposed method was successfully applied to the extraction and determination of cadmium(II) in black tea, cigarette tobacco and various water samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Pattern recall skills of talented soccer players: Two new methods applied.
van Maarseveen, Mariëtte J J; Oudejans, Raôul R D; Savelsbergh, Geert J P
2015-06-01
In this study we analyzed the pattern recall skills of talented soccer players by means of two innovative methods of analysis and gaze behavior data. Twenty-two young female soccer players watched video clips of 3 vs. 3 small-sided games and, after occlusion, had to reproduce the positions of the players. Recall performance was measured by calculating the spatial error of the recalled player positions at the moment of occlusion and at consecutive 33 ms increments. We analyzed player positions relative to each other, by assessing geometric pattern features in terms of angles between players, and we transformed the data into real-world coordinates to exclude the effects of the 2D perspective in the video clips. The results showed that the participants anticipated the movements of the patterns. In real-world coordinates, the more experienced players anticipated the pattern further in advance than the less experienced players and demonstrated a higher search rate, a shorter fixation duration and a higher fixation order. The differences in recall accuracy between the defensive and offensive elements were not consistent across the methods of analysis and, therefore, we propose that perspective effects of the video clip should be taken into account in further research. Copyright © 2015 Elsevier B.V. All rights reserved.
Improved analytical techniques of sulfur isotopic composition in nanomole quantities by MC-ICP-MS.
Yu, Tsai-Luen; Wang, Bo-Shian; Shen, Chuan-Chou; Wang, Pei-Ling; Yang, Tsanyao Frank; Burr, George S; Chen, Yue-Gau
2017-10-02
We propose an improved method for precise sulfur isotopic measurements by multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS) in conjunction with a membrane desolvation nebulization system. The problems of sulfur loss through the membrane desolvation apparatus are carefully quantified and resolved. The method overcomes low intrinsic sulfur transmission through the instrument, which was initially 1% when operating at a desolvation temperature of 160 °C. Sulfur loss through the membrane desolvation apparatus was resolved by doping with sodium: a Na/S molar ratio of 2 produced sulfur transmission with 98% recovery. Samples of 3 nmol (100 ng) sulfur achieved an external precision of ±0.18‰ (2 SD) for δ34S and ±0.10‰ (2 SD) for Δ33S (uppercase delta expresses the extent of mass-independent isotopic fractionation). Measurements made on certified reference materials and in-house standards demonstrate analytical accuracy and reproducibility. We applied the method to examine microbially induced sulfur transformation in marine sediment pore waters from the sulfate-methane transition zone. The technique is quite versatile and can be applied to a range of materials, including natural waters and minerals. Copyright © 2017 Elsevier B.V. All rights reserved.
Reproducibility of myelin content-based human habenula segmentation at 3 Tesla.
Kim, Joo-Won; Naidich, Thomas P; Joseph, Joshmi; Nair, Divya; Glasser, Matthew F; O'halloran, Rafael; Doucet, Gaelle E; Lee, Won Hee; Krinsky, Hannah; Paulino, Alejandro; Glahn, David C; Anticevic, Alan; Frangou, Sophia; Xu, Junqian
2018-03-26
In vivo morphological study of the human habenula, a pair of small epithalamic nuclei adjacent to the dorsomedial thalamus, has recently gained significant interest for its role in reward and aversion processing. However, segmenting the habenula from in vivo magnetic resonance imaging (MRI) is challenging due to the habenula's small size and low anatomical contrast. Although manual and semi-automated habenula segmentation methods have been reported, the test-retest reproducibility of the segmented habenula volume and the consistency of the boundaries of habenula segmentation have not been investigated. In this study, we evaluated the intra- and inter-site reproducibility of in vivo human habenula segmentation from 3T MRI (0.7-0.8 mm isotropic resolution) using our previously proposed semi-automated myelin contrast-based method and its fully-automated version, as well as a previously published manual geometry-based method. The habenula segmentation using our semi-automated method showed consistent boundary definition (high Dice coefficient, low mean distance, and moderate Hausdorff distance) and reproducible volume measurement (low coefficient of variation). Furthermore, the habenula boundary in our semi-automated segmentation from 3T MRI agreed well with that in the manual segmentation from 7T MRI (0.5 mm isotropic resolution) of the same subjects. Overall, our proposed semi-automated habenula segmentation showed reliable and reproducible habenula localization, while its fully-automated version offers an efficient way for large sample analysis. © 2018 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Tsai, Ming-Rung; Chiu, Yu-Wei; Lo, Men Tzung; Sun, Chi-Kuang
2010-03-01
Atrial fibrillation (AF) is the most common irregular heart rhythm, and the mortality rate for patients with AF is approximately twice that for patients with normal sinus rhythm (NSR). Some research has indicated that myocardial fibrosis plays an important role in predisposing patients to AF; understanding the relationship between myocardial collagen fibrosis and AF is therefore important. Second-harmonic generation (SHG) is an optically nonlinear coherent process for imaging the collagen network. We perform SHG microscopic imaging of the collagen fibers in the human atrial myocardium. Using the SHG images, we can identify differences in the morphology and arrangement of collagen fibers between NSR and AF tissues. We also quantify the arrangement of the collagen fibers by Fourier transforming the images and calculating values of angle entropy. We show that SHG imaging, a nondestructive and reproducible method for analyzing the arrangement of collagen fibers, can provide explicit information about the relationship between myocardial fibrosis and AF.
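The angle-entropy measure can be sketched generically: bin the 2D FFT power spectrum of an image by orientation and take the Shannon entropy of the angular energy distribution, so that well-aligned fibers give low entropy and disordered fibers give high entropy. This is a hypothetical illustration; the paper's exact binning and preprocessing are not specified in the abstract.

```python
import numpy as np

def angle_entropy(image, n_bins=36):
    """Quantify orientation disorder: bin the 2D FFT power spectrum by
    angle and return the Shannon entropy (bits) of the angular energy
    distribution. Aligned textures -> low entropy; isotropic -> high."""
    f = np.fft.fftshift(np.fft.fft2(image))
    power = np.abs(f) ** 2
    h, w = image.shape
    y, x = np.mgrid[0:h, 0:w]
    ang = np.arctan2(y - h // 2, x - w // 2) % np.pi  # orientations fold into [0, pi)
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    energy = np.bincount(bins.ravel(), weights=power.ravel(), minlength=n_bins)
    p = energy / energy.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```

On a striped (fiber-like) patch the spectral energy collapses into one angular bin, while random texture spreads energy over all bins.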
Simple mass production of zinc oxide nanostructures via low-temperature hydrothermal synthesis
NASA Astrophysics Data System (ADS)
Ghasaban, Samaneh; Atai, Mohammad; Imani, Mohammad
2017-03-01
The specific properties of zinc oxide (ZnO) nanoparticles have attracted much attention within the scientific community as a useful material for biomedical applications. Hydrothermal synthesis is known as a useful method to produce nanostructures with well-defined particle size and morphology; however, scaling up the reaction remains a challenging task. In this research, large-scale hydrothermal synthesis of ZnO nanostructures (60 g) was performed in a 5 l stainless steel autoclave by reaction between anionic (ammonia or sodium hydroxide) and cationic (zinc acetate dihydrate) precursors at low temperature. The hydrothermal reaction temperature and time were decreased to 115 °C and 2 or 6 h. In batch repetitions, the same morphologies (plate- and needle-like) with reproducible particle sizes were obtained. The nanostructures formed were analyzed by powder X-ray diffraction, Fourier transform infrared spectroscopy, energy-dispersive X-ray analysis, scanning electron microscopy and BET analysis, and showed antibacterial activity against Staphylococcus aureus.
NASA Astrophysics Data System (ADS)
Radtke, J.; Sponner, J.; Jakobi, C.; Schneider, J.; Sommer, M.; Teichmann, T.; Ullrich, W.; Henniger, J.; Kormoll, T.
2018-01-01
Single-photon detection applied to optically stimulated luminescence (OSL) dosimetry is a promising approach due to the low level of luminescence light and the known statistical behavior of single-photon events. Time-resolved detection allows a variety of different and independent data analysis methods to be applied. Furthermore, using amplitude-modulated stimulation imprints time and frequency information onto the OSL light and therefore allows additional means of analysis. Given the imprinted frequency information, data analysis using Fourier transform algorithms or other digital filters can separate the OSL signal from unwanted light or from events generated by other phenomena. This potentially lowers the detection limits of low-dose measurements and might improve the reproducibility and stability of the obtained data. In this work, an OSL system based on a single-photon detector, a fast and accurate stimulation unit and an FPGA is presented. Different analysis algorithms applied to the single-photon data are discussed.
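The frequency-domain separation the abstract describes can be sketched as reading out the Fourier component at the stimulation's modulation frequency, so that unmodulated background light and dark counts, which land mainly in other bins, are rejected. A minimal sketch, assuming binned photon counts and an integer number of modulation periods in the record; the system's actual filters are not reproduced here.

```python
import numpy as np

def modulated_amplitude(counts, fs, f_mod):
    """Recover the amplitude of the signal component modulated at f_mod
    from a binned photon-count time series sampled at rate fs, by reading
    the FFT bin at the modulation frequency."""
    counts = np.asarray(counts, dtype=float)
    n = len(counts)
    spectrum = np.fft.rfft(counts - counts.mean())  # remove DC (steady background)
    k = int(round(f_mod * n / fs))                  # FFT bin of the modulation frequency
    return 2.0 * np.abs(spectrum[k]) / n            # amplitude of the f_mod component
```

Adding a constant background to the input leaves the recovered amplitude unchanged, which is the point of impressing frequency information onto the stimulation.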
A method to identify and analyze biological programs through automated reasoning
Yordanov, Boyan; Dunn, Sara-Jane; Kugler, Hillel; Smith, Austin; Martello, Graziano; Emmott, Stephen
2016-01-01
Predictive biology is elusive because rigorous, data-constrained, mechanistic models of complex biological systems are difficult to derive and validate. Current approaches tend to construct and examine either static interaction network models, which are descriptively rich but often lack explanatory and predictive power, or dynamic models that can be simulated to reproduce known behavior. However, such approaches introduce implicit assumptions, as typically only one mechanism is considered, and exhaustively investigating all scenarios is impractical using simulation. To address these limitations, we present a methodology based on automated formal reasoning, which permits the synthesis and analysis of the complete set of logical models consistent with experimental observations. We test hypotheses against all candidate models, and remove the need for simulation by characterizing and simultaneously analyzing all mechanistic explanations of observed behavior. Our methodology transforms knowledge of complex biological processes from sets of possible interactions and experimental observations into precise, predictive biological programs governing cell function. PMID:27668090
An Automated Classification Technique for Detecting Defects in Battery Cells
NASA Technical Reports Server (NTRS)
McDowell, Mark; Gray, Elizabeth
2006-01-01
Battery cell defect classification is primarily done manually by a human conducting a visual inspection to determine whether the battery cell is acceptable for a particular use or device. Human visual inspection is time consuming compared to an inspection process conducted by a machine vision system, and it is also subject to human error and fatigue over time. We present a machine vision technique that can automatically identify defective sections of battery cells via a morphological feature-based classifier using an adaptive two-dimensional fast Fourier transformation technique. The initial area of interest is automatically classified as either an anode or a cathode cell view, and as either an acceptable or a defective battery cell. Each battery cell is labeled and cataloged for comparison and analysis. The result is an automated machine vision technique that provides a highly repeatable and reproducible method of identifying and quantifying defects in battery cells.
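One way a 2D-FFT feature can flag defects, sketched hypothetically (this is not the NTRS system's classifier): a defect-free, periodic electrode texture concentrates spectral energy in a few peaks, whereas scratches or voids break the periodicity and spread energy across the spectrum.

```python
import numpy as np

def spectral_defect_score(image, n_peaks=8):
    """A 2D-FFT texture feature for defect screening: the fraction of
    spectral energy lying outside the top n_peaks bins. A perfectly
    periodic surface scores near 0; broadband disruptions raise the score."""
    f = np.fft.fft2(image - np.mean(image))  # remove DC before transforming
    power = np.abs(f).ravel() ** 2
    top = np.sort(power)[-n_peaks:]          # energy in the dominant peaks
    return 1.0 - top.sum() / power.sum()
```

A threshold on this score (tuned on known-good cells) would then feed the acceptable/defective decision; the morphological labeling step would localize which sections triggered it.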
NASA Astrophysics Data System (ADS)
Oluwaniyi, Omolara O.; Adegoke, Haleemat I.; Adesuji, Elijah T.; Alabi, Aderemi B.; Bodede, Sunday O.; Labulo, Ayomide H.; Oseghale, Charles O.
2016-08-01
Biosynthesis of silver nanoparticles using microorganisms or various plant parts has proven more environmentally friendly, cost-effective, energy-saving and reproducible than chemical and physical methods. This investigation demonstrates the plant-mediated synthesis of silver nanoparticles using an aqueous leaf extract of Thevetia peruviana. A UV-Visible spectrophotometer was used to measure the surface plasmon resonance of the nanoparticles at 460 nm. Fourier transform infrared spectroscopy showed that the glycosidic -OH and carbonyl functional groups present in the extract were responsible for the reduction and stabilization of the silver nanoparticles. X-ray diffraction, scanning electron microscopy, transmission electron microscopy and selected-area electron diffraction analyses were used to confirm the nature, morphology and shape of the nanoparticles. The silver nanoparticles are spherical in shape with an average size of 18.1 nm, and showed activity against fungal pathogens and bacteria. The zone of inhibition observed in the antimicrobial study ranged between 10 and 20 mm.
A call for virtual experiments: accelerating the scientific process.
Cooper, Jonathan; Vik, Jon Olav; Waltemath, Dagmar
2015-01-01
Experimentation is fundamental to the scientific method, whether for exploration, description or explanation. We argue that promoting the reuse of virtual experiments (the in silico analogues of wet-lab or field experiments) would vastly improve the usefulness and relevance of computational models, encouraging critical scrutiny of models and serving as a common language between modellers and experimentalists. We review the benefits of reusable virtual experiments: in specifying, assaying, and comparing the behavioural repertoires of models; as prerequisites for reproducible research; to guide model reuse and composition; and for quality assurance in the translational application of models. A key step towards achieving this is that models and experimental protocols should be represented separately, but annotated so as to facilitate the linking of models to experiments and data. Lastly, we outline how the rigorous, streamlined confrontation between experimental datasets and candidate models would enable a "continuous integration" of biological knowledge, transforming our approach to systems biology. Copyright © 2014 Elsevier Ltd. All rights reserved.
Rangreez, Tauseef Ahmad; Alhogbi, Basma G.; Naushad, Mu.
2017-01-01
In this study, graphene Th(IV) phosphate was prepared by a sol–gel precipitation method. The ion-exchange behavior of this cation exchanger was studied by investigating properties such as the ion-exchange capacity (IEC) for various metal ions, the effect of eluent concentration, elution behavior, and the effect of heating on the IEC. Several physicochemical characterizations, including Fourier transform infrared (FTIR) spectroscopy, X-ray diffraction (XRD), thermal studies, scanning electron microscopy (SEM) and transmission electron microscopy (TEM), were also carried out. The material possessed an IEC of 1.56 meq·g−1 (dry basis) and was found to be a nanocomposite. Selectivity studies showed that the material is selective towards Pb(II) ions, and this selectivity was demonstrated in the binary separation of Pb(II) from mixtures with other metal ions. The recovery was found to be both quantitative and reproducible. PMID:28737717
High- and Reproducible-Performance Graphene/II-VI Semiconductor Film Hybrid Photodetectors
Huang, Fan; Jia, Feixiang; Cai, Caoyuan; Xu, Zhihao; Wu, Congjun; Ma, Yang; Fei, Guangtao; Wang, Min
2016-01-01
High-performance, reproducible photodetectors are critical to the development of many technologies; existing devices mainly comprise one-dimensional (1D) nanostructure-based and film-based photodetectors. The former suffer from huge performance variation because their performance is quite sensitive to the synthesis microenvironment of the 1D nanostructure. Herein, we show that graphene/semiconductor film hybrid photodetectors not only possess high performance but also perform reproducibly. As a demonstration, the as-produced graphene/ZnS film hybrid photodetector shows a high responsivity of 1.7 × 10⁷ A/W and a fast response speed of 50 ms, together with highly reproducible performance in terms of the narrow distributions of photocurrent (38-65 μA) and response speed (40-60 ms) across 20 devices. Graphene/ZnSe film and graphene/CdSe film hybrid photodetectors fabricated by this method also show high and reproducible performance. The general method is compatible with the conventional planar process and could easily be standardized, paving the way for photodetector applications. PMID:27349692
The Coordinate Transformation Method of High Resolution DEM Data
NASA Astrophysics Data System (ADS)
Yan, Chaode; Guo, Wang; Li, Aimin
2018-04-01
Coordinate transformation methods for DEM data fall into two categories: one reconstructs the DEM from the original vector elevation data, while the other transforms blocks of DEM data using transformation parameters. However, the former does not work in the absence of the original vector data, and the latter may introduce errors at the joints between adjoining blocks of high resolution DEM data. In view of this problem, a method for coordinate transformation of high resolution DEM data is proposed. The method converts the DEM into discrete vector elevation points and then adjusts the position of each point by bilinear interpolation. Finally, a TIN is generated from the transformed points, and the new DEM in the target coordinate system is reconstructed from the TIN. An algorithm that finds blocks and performs the transformation automatically is given in this paper. The method was tested on different terrains and proved to be feasible and valid.
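The described pipeline (raster to points, per-point transformation, TIN-based regridding) can be sketched as follows. This is a schematic illustration, not the authors' algorithm: the grid parameters, the similarity transform, and the use of SciPy's Delaunay-based linear interpolation as the TIN step are all assumptions made for the example.

```python
import numpy as np
from scipy.interpolate import griddata

def transform_dem(dem, x0, y0, dx, dy, transform):
    """Transform a gridded DEM into a target coordinate system.

    Steps, following the outline in the abstract:
    1. convert the raster to discrete (x, y, z) elevation points,
    2. move each point with the coordinate `transform`,
    3. re-interpolate the scattered points onto a regular grid in the
       target system; griddata with method="linear" interpolates over a
       Delaunay triangulation of the points, effectively a TIN.
    """
    ny, nx = dem.shape
    xs = x0 + dx * np.arange(nx)
    ys = y0 + dy * np.arange(ny)
    X, Y = np.meshgrid(xs, ys)
    # step 2: transform every grid node
    Xt, Yt = transform(X.ravel(), Y.ravel())
    pts = np.column_stack([Xt, Yt])
    # step 3: target grid spanning the transformed point cloud
    gx = np.linspace(Xt.min(), Xt.max(), nx)
    gy = np.linspace(Yt.min(), Yt.max(), ny)
    GX, GY = np.meshgrid(gx, gy)
    return griddata(pts, dem.ravel(), (GX, GY), method="linear")

# hypothetical example: a 4-parameter similarity transform
def similarity(x, y, s=1.0, theta=np.deg2rad(10), tx=100.0, ty=-50.0):
    c, si = np.cos(theta), np.sin(theta)
    return s * (c * x - si * y) + tx, s * (si * x + c * y) + ty

dem = np.fromfunction(lambda i, j: 100 + i + 0.5 * j, (40, 40))
out = transform_dem(dem, x0=0, y0=0, dx=30, dy=30, transform=similarity)
```

Cells of the target grid that fall outside the rotated footprint come back as NaN; a production version would clip to the valid area or mosaic adjoining blocks before regridding.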
S-ProvFlow: provenance model and tools for scalable and adaptive analysis pipelines in geoscience.
NASA Astrophysics Data System (ADS)
Spinuso, A.; Mihajlovski, A.; Atkinson, M.; Filgueira, R.; Klampanos, I.; Sanchez, S.
2017-12-01
The reproducibility of scientific findings is essential to improve the quality and application of modern data-driven research. Delivering such reproducibility is challenging in the context of systems handling large data-streams with sophisticated computational methods. Similarly, the SKA (Square Kilometre Array) will collect an unprecedented volume of radio-wave signals that will have to be reduced and transformed into derived products, with impact on space-weather research. This highlights the importance of having cross-disciplinary mechanisms on the producer's side that rely on usable lineage data to support validation and traceability of the new artifacts. To be informative, provenance has to describe each method's abstractions and their implementation as mappings onto distributed platforms and their concurrent execution, capturing relevant internal dependencies at runtime. Producers and intelligent toolsets should be able to exploit the produced provenance, steering real-time monitoring activities and inferring adaptations of methods at runtime. We present a model of provenance (S-PROV) that extends W3C PROV and ProvONE, broadening coverage of provenance to aspects related to distribution, scale-up and steering of stateful streaming operators in analytic pipelines. This is supported by a technical framework for tuneable and actionable lineage, ensuring its relevance to the users' interests and fostering its rapid exploitation to facilitate research practices. By applying concepts such as provenance typing and profiling, users define rules to capture common provenance patterns and activate selective controls based on domain metadata. The traces are recorded in a document store with index optimisation, and a web API serves advanced interactive tools (S-ProvFlow, https://github.com/KNMI/s-provenance). These allow different classes of consumers to rapidly explore the provenance data.
The system, which contributes to the SKA-Link initiative within technology and knowledge transfer events, will be discussed in the context of an existing data-intensive service for seismology (VERCE) and the newly funded project DARE (Delivering Agile Research Excellence), which aims at a generic solution for extreme data and methods in geosciences that domain experts can understand, change and use effectively.
Tracking maize pollen development by the Leaf Collar Method.
Begcy, Kevin; Dresselhaus, Thomas
2017-12-01
An easy and highly reproducible nondestructive method named the Leaf Collar Method is described to identify and characterize the different stages of pollen development in maize. In plants, many cellular events such as meiosis, asymmetric cell division, cell cycle regulation, cell fate determination, nucleus movement, vacuole formation, chromatin condensation and epigenetic modifications take place during pollen development. In maize, pollen development occurs in tassels that are confined within the internal stalk of the plant. Hence, identification of the different pollen developmental stages as a tool to investigate the above biological processes is impossible without dissecting the entire plant. Therefore, an efficient and reproducible method is necessary to isolate homogeneous cell populations at individual stages throughout pollen development without destroying the plant. Here, we describe a method to identify the various stages of pollen development in maize. Using the Leaf Collar Method in the maize inbred line B73, we have determined the duration of each stage from pollen mother cells before meiosis to mature tricellular pollen. Anther and tassel size as well as the percentage of pollen stages were correlated with vegetative stages, which are easily recognized. The identification of stage-specific genes indicates the reproducibility of the method. In summary, we present an easy and highly reproducible nondestructive method to identify and characterize the different stages of pollen development in maize. This method now opens the way for many subsequent physiological, morphological and molecular analyses to study, for instance, transcriptomics, metabolomics, DNA methylation and chromatin patterns during normal and stressful conditions throughout pollen development in one of the economically most important grass species.
Cupping - is it reproducible? Experiments about factors determining the vacuum.
Huber, R; Emerich, M; Braeunig, M
2011-04-01
Cupping is a traditional method for treating pain that is now being investigated in clinical studies. Because the methods for producing the vacuum vary considerably, we tested their reproducibility. In a first set of experiments (study 1), four methods for producing the vacuum (lighter flame 2 cm (LF1), lighter flame 4 cm (LF2), alcohol flame (AF) and mechanical suction with a balloon (BA)) were compared in 50 trials each. The cupping glass was prepared with an outlet and stop-cock, and the vacuum was measured with a pressure gauge after the cup was set to a soft rubber pad. In a second series of experiments (study 2), we investigated the stability of pressures in 20 consecutive trials in two experienced cupping practitioners and ten beginners using method AF. In study 1, all four methods yielded consistent pressures. Large differences in magnitude were, however, observed between methods (mean pressures -200±30 hPa with LF1, -310±30 hPa with LF2, -560±30 hPa with AF, and -270±16 hPa with BA). With method BA the standard deviation was reduced by a factor of 2 compared to the flame methods. In study 2, beginners had considerably more difficulty obtaining stable pressures than advanced cupping practitioners, showing a distinct learning curve before reaching expert levels after about 10-20 trials. Cupping is reproducible if the exact method is described in detail. Mechanical suction with a balloon has the best reproducibility. Beginners need at least 10-20 trials to produce stable pressures. Copyright © 2010 Elsevier Ltd. All rights reserved.
Koch, Iris; Reimer, Kenneth J; Bakker, Martine I; Basta, Nicholas T; Cave, Mark R; Denys, Sébastien; Dodd, Matt; Hale, Beverly A; Irwin, Rob; Lowney, Yvette W; Moore, Margo M; Paquin, Viviane; Rasmussen, Pat E; Repaso-Subang, Theresa; Stephenson, Gladys L; Siciliano, Steven D; Wragg, Joanna; Zagury, Gerald J
2013-01-01
Bioaccessibility is a measurement of a substance's solubility in the human gastro-intestinal system, and is often used in the risk assessment of soils. The present study was designed to determine the variability among laboratories using different methods to measure the bioaccessibility of 24 inorganic contaminants in one standardized soil sample, the standard reference material NIST 2710. Fourteen laboratories used a total of 17 bioaccessibility extraction methods. The variability between methods was assessed by calculating the reproducibility relative standard deviations (RSDs), where reproducibility is the sum of within-laboratory and between-laboratory variability. Whereas within-laboratory repeatability was usually better than (<) 15% for most elements, reproducibility RSDs were much higher, indicating more variability, although for many elements they were comparable to typical uncertainties (e.g., 30% in commercial laboratories). For five trace elements of interest, reproducibility RSDs were: arsenic (As), 22-44%; cadmium (Cd), 11-41%; Cu, 15-30%; lead (Pb), 45-83%; and Zn, 18-56%. Only one method variable, pH, was found to correlate significantly with bioaccessibility for aluminum (Al), Cd, copper (Cu), manganese (Mn), Pb and zinc (Zn) but other method variables could not be examined systematically because of the study design. When bioaccessibility results were directly compared with bioavailability results for As (swine and mouse) and Pb (swine), four methods returned results within uncertainty ranges for both elements: two that were defined as simpler (gastric phase only, limited chemicals) and two were more complex (gastric + intestinal phases, with a mixture of chemicals).
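The repeatability/reproducibility decomposition behind these RSDs can be illustrated with a standard one-way random-effects calculation (ISO 5725 style): repeatability variance is the pooled within-lab variance, and reproducibility variance adds the between-lab component. The function and the simulated data below are a hedged sketch with hypothetical lab counts and variances, not the study's actual computation:

```python
import numpy as np

def repeatability_reproducibility_rsd(data):
    """One-way random-effects decomposition of inter-lab data.

    `data` is an (n_labs, n_replicates) array of results for one element.
    Returns (repeatability RSD %, reproducibility RSD %).
    """
    data = np.asarray(data, dtype=float)
    p, n = data.shape
    lab_means = data.mean(axis=1)
    grand_mean = data.mean()
    s_r2 = data.var(axis=1, ddof=1).mean()           # within-lab variance
    s_L2 = max(lab_means.var(ddof=1) - s_r2 / n, 0)  # between-lab component
    s_R2 = s_r2 + s_L2                               # reproducibility variance
    return (100 * np.sqrt(s_r2) / grand_mean,
            100 * np.sqrt(s_R2) / grand_mean)

rng = np.random.default_rng(0)
# hypothetical data: 14 labs, 3 replicates each, large between-lab spread
labs = rng.normal(50, 15, size=(14, 1)) + rng.normal(0, 3, size=(14, 3))
rsd_r, rsd_R = repeatability_reproducibility_rsd(labs)
```

With a large between-lab component, the reproducibility RSD greatly exceeds the repeatability RSD, mirroring the pattern reported in the study.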
Blancett, Candace D; Fetterer, David P; Koistinen, Keith A; Morazzani, Elaine M; Monninger, Mitchell K; Piper, Ashley E; Kuehl, Kathleen A; Kearney, Brian J; Norris, Sarah L; Rossi, Cynthia A; Glass, Pamela J; Sun, Mei G
2017-10-01
A method for accurate quantitation of virus particles has long been sought, but a perfect method still eludes the scientific community. Electron Microscopy (EM) quantitation is a valuable technique because it provides direct morphology information and counts of all viral particles, whether or not they are infectious. In the past, EM negative stain quantitation methods have been cited as inaccurate, non-reproducible, and with detection limits that were too high to be useful. To improve accuracy and reproducibility, we have developed a method termed Scanning Transmission Electron Microscopy - Virus Quantitation (STEM-VQ), which simplifies sample preparation and uses a high throughput STEM detector in a Scanning Electron Microscope (SEM) coupled with commercially available software. In this paper, we demonstrate STEM-VQ with an alphavirus stock preparation to present the method's accuracy and reproducibility, including a comparison of STEM-VQ to viral plaque assay and the ViroCyt Virus Counter. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Vonk Noordegraaf, A; Kunst, P W; Janse, A; Smulders, R A; Heethaar, R M; Postmus, P E; Faes, T J; de Vries, P M
1997-03-01
The Sheffield electrical impedance tomography (EIT) system produces images of changes in the distribution of resistivity within tissue. The paper reports on the application of electrical impedance tomography in monitoring volume changes in the limb during venous occlusion. The aim of the study is to assess the feasibility, reproducibility and validity of calf blood flow measurements by EIT. In 14 healthy volunteers, calf blood flow determined in a calf segment by strain-gauge plethysmography (SGP) is compared with the impedance changes measured by EIT during rest and post-ischaemic hyperaemia. The measurements are repeated to assess reproducibility. The reproducibility for the EIT, assessed from the repeated measurements and expressed as a reproducibility coefficient, is 0.88 during rest and 0.89 during hyperaemia. The reproducibility coefficient for SGP data is 0.83 at rest and 0.67 during hyperaemia. Flow measurements assessed by the two methods correlate well at rest (r = 0.89), but only moderately during hyperaemia (r = 0.51). The correlation coefficient for the pooled flow measurements is 0.98. It is concluded that EIT is a valid and reliable method for assessing blood flow in the limb. Possible applications of EIT in localising fluid changes are discussed.
Reusable hydroxyapatite nanocrystal sensors for protein adsorption.
Tagaya, Motohiro; Ikoma, Toshiyuki; Hanagata, Nobutaka; Chakarov, Dinko; Kasemo, Bengt; Tanaka, Junzo
2010-08-01
The repeatability of the adsorption and removal of fibrinogen and fetal bovine serum on hydroxyapatite (HAp) nanocrystal sensors was investigated by Fourier transform infrared (FTIR) spectroscopy and quartz crystal microbalance with dissipation (QCM-D) monitoring technique. The HAp nanocrystals were coated on a gold-coated quartz sensor by electrophoretic deposition. Proteins adsorbed on the HAp sensors were removed by (i) ammonia/hydrogen peroxide mixture (APM), (ii) ultraviolet light (UV), (iii) UV/APM, (iv) APM/UV and (v) sodium dodecyl sulfate (SDS) treatments. FTIR spectra of the reused surfaces revealed that the APM and SDS treatments left peptide fragments or the proteins adsorbed on the surfaces, whereas the other methods successfully removed the proteins. The QCM-D measurements indicated that in the removal treatments, fibrinogen was slowly adsorbed in the first cycle because of the change in surface wettability revealed by contact angle measurements. The SDS treatment was not effective in removing proteins. The APM or UV treatment decreased the frequency shifts for the reused HAp sensors. The UV/APM treatment did not induce the frequency shifts but decreased the dissipation shifts. Therefore, we conclude that the APM/UV treatment is the most useful method for reproducing protein adsorption behavior on HAp sensors.
The Kondo problem. II. Crossover from asymptotic freedom to infrared slavery
NASA Astrophysics Data System (ADS)
Schlottmann, P.
1982-04-01
In the preceding paper we transformed the s-d Hamiltonian onto a resonance level with a large perturbation and derived the scaling equations for the vertices, the invariant coupling, and the resonance width. The scaling equations are integrated under the assumption that the energy dependence of the resonance width can be neglected. The transcendental equation obtained in this way for the renormalized resonance width is solved in the relevant limits and allows a calculation of the static and dynamical susceptibility. At high temperatures the perturbation expansion for the relaxation rate and the susceptibility is reproduced up to third order in Jρ. At low temperatures the lifetime and χ0 remain finite and vary according to a Fermi-liquid theory. The approximation scheme interpolates in this way between the asymptotic freedom and the infrared slavery, yielding a smooth crossover. The present results are in quantitative agreement with previous ones obtained with the relaxation-kernel method by Götze and Schlottmann. The advantages and drawbacks of the method are discussed. The calculation of the dynamical susceptibility is extended to nonzero external magnetic fields. The quasielastic peak of χ''(ω)/ω is suppressed at low temperatures and large magnetic fields and shoulders develop at ω = ±B.
Modern separation techniques coupled to high performance mass spectrometry for glycolipid analysis.
Sarbu, Mirela; Zamfir, Alina Diana
2018-01-21
Glycolipids (GLs), involved in biological processes and pathologies such as viral, neurodegenerative and oncogenic transformations, are in the focus of research related to method development for structural analysis. This review highlights modern separation techniques coupled to mass spectrometry (MS) for the investigation of GLs from various biological matrices. The first section is dedicated to methods which, although they provide separation in a non-liquid phase, are able to supply important data on the composition of complex mixtures. While classical thin layer chromatography (TLC) is useful for MS analyses of fractionated samples, ultramodern ion mobility separation (IMS), characterized by high reproducibility, makes it possible to discover minor species from low sample amounts, in addition to providing conformational separation with isomer discrimination. The second section highlights the advantages, applications and limitations of liquid-based separation techniques such as high performance liquid chromatography (HPLC) and hydrophilic interaction liquid chromatography (HILIC) in direct or indirect coupling to MS for glycolipidomics surveys. On- and off-line capillary electrophoresis (CE) MS, offering a remarkable separation efficiency for GLs, is also presented and critically assessed from the technical and application perspectives in the final part of the review. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Thomas, Karluss; Herouet-Guicheney, Corinne; Ladics, Gregory; McClain, Scott; MacIntosh, Susan; Privalle, Laura; Woolhiser, Mike
2008-09-01
The International Life Science Institute's Health and Environmental Sciences Institute's Protein Allergenicity Technical Committee hosted an international workshop October 23-25, 2007, in Nice, France, to review and discuss existing and emerging methods and techniques for improving the current weight-of-evidence approach for evaluating the potential allergenicity of novel proteins. The workshop included over 40 international experts from government, industry, and academia. Their expertise represented a range of disciplines including immunology, chemistry, molecular biology, bioinformatics, and toxicology. Among participants, there was consensus that (1) current bioinformatic approaches are highly conservative; (2) advances in bioinformatics using structural comparisons of proteins may be helpful as the availability of structural data increases; (3) proteomics may prove useful for monitoring the natural variability in a plant's proteome and assessing the impact of biotechnology transformations on endogenous levels of allergens, but only when analytical techniques have been standardized and additional data are available on the natural variation of protein expression in non-transgenic bred plants; (4) basophil response assays are promising techniques, but need additional evaluation around specificity, sensitivity, and reproducibility; (5) additional research is required to develop and validate an animal model for the purpose of predicting protein allergenicity.
Samuei, Sara; Fakkar, Jila; Rezvani, Zolfaghar; Shomali, Ashkan; Habibi, Biuck
2017-03-15
In the present work, a novel nanocomposite based on graphene quantum dots and CoNiAl-layered double hydroxide was successfully synthesized by a co-precipitation method. To obtain morphological, structural and compositional information, the resulting nanocomposite was characterized by scanning electron microscopy, X-ray diffraction, thermal gravimetric analysis, Fourier transform infrared spectroscopy, and photoluminescence. The nanocomposite was then used as a modifier to fabricate a modified carbon paste electrode as a non-enzymatic sensor for glucose determination. The electrochemical behavior and determination of glucose at the nanocomposite-modified carbon paste electrode were investigated by cyclic voltammetry and chronoamperometry, respectively. The prepared sensor offered good electrocatalytic properties, fast response time, and high reproducibility and stability. Under optimum conditions, the constructed sensor exhibits a wide linear range of 0.01-14.0 mM with a detection limit of 6 μM (S/N = 3) and a high sensitivity of 48.717 μA mM⁻¹. Finally, the sensor was successfully applied to determine glucose in real samples, which demonstrated its applicability. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Schanz, Martin; Ye, Wenjing; Xiao, Jinyou
2016-04-01
Transient problems can often be solved with transformation methods, where the inverse transformation is usually performed numerically. Here, the discrete Fourier transform in combination with the exponential window method is compared with the convolution quadrature method formulated as an inverse transformation. Both are inverse Laplace transforms, which are formally identical but use different complex frequencies. A numerical study is performed, first with simple convolution integrals and, second, with a boundary element method (BEM) for elastodynamics. Essentially, when combined with the BEM, the discrete Fourier transform needs fewer frequency calculations but a finer mesh than the convolution quadrature method to obtain the same level of accuracy. If fast methods like the fast multipole method are further used to accelerate the boundary element method, the convolution quadrature method is better, because the iterative solver needs far fewer iterations to converge. This is caused by the larger real part of the complex frequencies necessary for the calculation, which improves the conditioning of the system matrix.
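The exponential window variant of the inverse transformation can be sketched in a few lines: sample the Laplace-domain function on a contour Re(s) = a shifted into the right half-plane, invert with an FFT, and undo the damping. The parameter choices below (window length, sample count, damping level) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def inverse_laplace_expwindow(F, T, N, eps=1e-8):
    """Numerical inverse Laplace transform via the exponential window
    (shifted-contour) FFT method. F is the Laplace-domain function, T the
    time window, N the number of samples; the shift a damps the
    periodization (time-aliasing) error to roughly eps.
    """
    a = -np.log(eps) / T                    # contour shift Re(s) = a
    omega = 2 * np.pi * np.fft.fftfreq(N, d=T / N)
    Fk = F(a + 1j * omega)                  # samples on the shifted line
    t = np.arange(N) * T / N
    # inverse DFT, then undo the exponential window e^{-a t}
    f = (N / T) * np.exp(a * t) * np.real(np.fft.ifft(Fk))
    return t, f

# check against a known transform pair: F(s) = 1/(s+1)  <->  f(t) = exp(-t)
t, f = inverse_laplace_expwindow(lambda s: 1.0 / (s + 1.0), T=20.0, N=4096)
```

Note the trade-off the abstract alludes to: all N complex frequencies lie on one shifted line here, whereas convolution quadrature distributes them differently, which changes both the number of frequency solves and the conditioning of the resulting BEM systems.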
Dekkers, A L M; Slob, W
2012-10-01
In dietary exposure assessment, statistical methods exist for estimating the usual intake distribution from daily intake data. These methods transform the dietary intake data to normal observations, eliminate the within-person variance, and then back-transform the data to the original scale. We propose Gaussian Quadrature (GQ), a numerical integration method, as an efficient way of performing the back-transformation. We compare GQ with six published methods. One method uses a log-transformation, while the other methods, including GQ, use a Box-Cox transformation. This study shows that, for various parameter choices, the methods with a Box-Cox transformation estimate the theoretical usual intake distributions quite well, although one method, a Taylor approximation, is less accurate. Two applications, on folate intake and fruit consumption, confirmed these results. In one extreme case, some methods, including GQ, could not be applied for low percentiles. We solved this problem by modifying GQ. One method is based on the assumption that the daily intakes are log-normally distributed. Even if this condition is not fulfilled, the log-transformation performs well as long as the within-individual variance is small compared to the mean. We conclude that the modified GQ is an efficient, fast and accurate method for estimating the usual intake distribution. Copyright © 2012 Elsevier Ltd. All rights reserved.
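The back-transformation step amounts to evaluating E[g⁻¹(Z)] for a normal Z, which Gauss-Hermite quadrature handles with a handful of nodes. Below is a minimal sketch of the idea, not the authors' implementation, using a log transform for which the back-transformed mean is known in closed form:

```python
import numpy as np

def backtransform_mean(mu, sigma, ginv, n=20):
    """Approximate E[ginv(Z)] for Z ~ N(mu, sigma^2) by Gauss-Hermite
    quadrature. ginv is the inverse of the normalizing transformation,
    e.g. np.exp for a log transform.
    """
    x, w = np.polynomial.hermite.hermgauss(n)   # nodes/weights for e^{-x^2}
    # change of variables: z = mu + sqrt(2) * sigma * x
    return np.sum(w * ginv(mu + np.sqrt(2.0) * sigma * x)) / np.sqrt(np.pi)

# sanity check against the closed form E[exp(Z)] = exp(mu + sigma^2 / 2)
mu, sigma = 1.2, 0.4
gq = backtransform_mean(mu, sigma, np.exp)
exact = np.exp(mu + sigma**2 / 2)
```

For a Box-Cox transform one would substitute the appropriate inverse (with care near the lower tail, where the inverse is undefined for some node values; this is the low-percentile issue the authors address with their modified GQ).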
Determination of rare-earth elements in Luna 16 regolith sample by chemical spectral method
NASA Technical Reports Server (NTRS)
Stroganova, N. S.; Ryabukhin, V. A.; Laktinova, N. V.; Ageyeva, L. V.; Galkina, I. P.; Gatinskaya, N. G.; Yermakov, A. N.; Karyakin, A. V.
1974-01-01
An analysis was made of regolith from layer A of the Luna 16 sample for rare-earth elements by a chemical spectral method. Chemical and ion-exchange preconcentration were used to determine the content of 12 rare-earth elements and Y at the level of 0.001 to 0.0001 percent, with 10 to 15 percent reproducibility of the emission determination. Results agree, within the limits of reproducibility, with data obtained by mass-spectral, activation, and X-ray fluorescence methods.
Peng, Ying; Chen, Xin; Sato, Takuya; Rankin, Scott A; Tsuji, Ryohei F; Ge, Ying
2012-04-03
Human salivary α-amylase (HSAMY) is a major component of salivary secretions, possessing multiple important biological functions. Here we have established three methods to purify HSAMY from human saliva for comprehensive characterization of HSAMY by high-resolution top-down mass spectrometry (MS). Among the three purification methods, the affinity method based on the enzyme-substrate interaction between amylase and glycogen is preferred, providing the highest-purity HSAMY with high reproducibility. Subsequently, we employed Fourier transform ion cyclotron resonance MS to analyze the purified HSAMY. The predominant form of α-amylase purified from the saliva of donors of various races and genders is nonglycosylated, with the same molecular weight of 55,881.2 Da, which is 1885.8 Da lower than the value calculated from the DNA-predicted sequence. High-resolution MS revealed the truncation of the first 15 N-terminal amino acids (-1858.96) and the subsequent formation of pyroglutamic acid at the new N-terminal Gln (-17.03). More importantly, five disulfide bonds in HSAMY were identified (-10.08) and effectively localized by tandem MS in conjunction with complete and partial reduction by tris(2-carboxyethyl)phosphine. Overall, this study demonstrates that top-down MS combined with affinity purification and partial reduction is a powerful method for rapid purification and complete characterization of large proteins with complex and overlapping disulfide bond patterns.
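As a consistency check, the reported mass deltas should sum to approximately the 1885.8 Da difference between the DNA-predicted and measured masses. A quick arithmetic sketch using only values quoted in the abstract (the residual of about 0.3 Da reflects rounding of the individual deltas):

```python
# mass budget for the purified HSAMY form, using the deltas reported above
predicted_minus_observed = 1885.8   # Da, DNA-predicted minus measured mass
truncation = 1858.96                # loss of the 15 N-terminal residues
pyroglutamate = 17.03               # Gln -> pyroGlu at the new N-terminus
disulfides = 10.08                  # 5 S-S bonds, ~2.016 Da (2 H) each

accounted = truncation + pyroglutamate + disulfides
residual = predicted_minus_observed - accounted
```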
Fast frequency domain method to detect skew in a document image
NASA Astrophysics Data System (ADS)
Mehta, Sunita; Walia, Ekta; Dutta, Maitreyee
2015-12-01
In this paper, a new fast frequency domain method based on the Discrete Wavelet Transform and the Fast Fourier Transform has been implemented for determining the skew angle of a document image. First, the image size is reduced using a two-dimensional Discrete Wavelet Transform, and then the skew angle is computed using the Fast Fourier Transform. The skew angle error is almost negligible. The proposed method was tested on a large number of documents with skew between -90° and +90°, and the results are compared with the Moments with Discrete Wavelet Transform method and other commonly used existing methods. The method works more efficiently than the existing methods, and handles typed and picture documents with different fonts and resolutions. It thereby overcomes the drawback of the recently proposed Moments with Discrete Wavelet Transform method, which does not work with picture documents.
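The frequency-domain core of such methods is that periodic structure (e.g. text lines) produces a peak in the 2D FFT magnitude spectrum whose angular position gives the skew. The sketch below illustrates only that FFT step on a synthetic striped pattern; the paper's DWT-based size reduction and its handling of arbitrary document content are omitted, and the test pattern is an assumption for the example.

```python
import numpy as np

def estimate_skew_fft(img):
    """Estimate the dominant orientation (degrees, mod 180) of periodic
    structure from the peak of the 2D FFT magnitude spectrum.
    """
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    c = img.shape[0] // 2
    spec[c, c] = 0.0                      # suppress the DC component
    ky, kx = np.unravel_index(np.argmax(spec), spec.shape)
    # the spectrum is conjugate-symmetric, so fold onto [0, 180)
    return np.degrees(np.arctan2(ky - c, kx - c)) % 180.0

# synthetic pattern: stripes whose wavefront normal points at a known
# angle, loosely mimicking skewed text lines
n, kx, ky = 128, 14, 8                    # true angle = atan2(8, 14)
yy, xx = np.mgrid[0:n, 0:n]
img = np.sin(2 * np.pi * (kx * xx + ky * yy) / n)
angle = estimate_skew_fft(img)
```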
NASA Astrophysics Data System (ADS)
Rao, T. R. Ramesh
2018-04-01
In this paper, we study an analytical method based on the reduced differential transform method coupled with the Sumudu transform through Padé approximants. The proposed method may be considered an alternative approach for finding exact solutions of the gas dynamics equation in an effective manner. The method does not require any discretization, linearization or perturbation.
Ghaste, Manoj; Mistrik, Robert; Shulaev, Vladimir
2016-05-25
Metabolomics, along with other "omics" approaches, is rapidly becoming one of the major approaches aimed at understanding the organization and dynamics of metabolic networks. Mass spectrometry is often a technique of choice for metabolomics studies due to its high sensitivity, reproducibility and wide dynamic range. High resolution mass spectrometry (HRMS) is a widely practiced technique in analytical and bioanalytical sciences. It offers exceptionally high resolution and the highest degree of structural confirmation. Many metabolomics studies have been conducted using HRMS over the past decade. In this review, we will explore the latest developments in Fourier transform mass spectrometry (FTMS) and Orbitrap based metabolomics technology, its advantages and drawbacks for using in metabolomics and lipidomics studies, and development of novel approaches for processing HRMS data.
Crystallization Dynamics of Organolead Halide Perovskite by Real-Time X-ray Diffraction.
Miyadera, Tetsuhiko; Shibata, Yosei; Koganezawa, Tomoyuki; Murakami, Takurou N; Sugita, Takeshi; Tanigaki, Nobutaka; Chikamatsu, Masayuki
2015-08-12
We analyzed the crystallization process of the CH3NH3PbI3 perovskite by observing real-time X-ray diffraction immediately after combining a PbI2 thin film with a CH3NH3I solution. A detailed analysis of the transformation kinetics demonstrated the fractal diffusion of the CH3NH3I solution into the PbI2 film. Moreover, the perovskite crystal was found to be initially oriented based on the PbI2 crystal orientation but to gradually transition to a random orientation. The fluctuating characteristics of the crystallization process of perovskites, such as fractal penetration and orientational transformation, should be controlled to allow the fabrication of high-quality perovskite crystals. The characteristic reaction dynamics observed in this study should assist in establishing reproducible fabrication processes for perovskite solar cells.
Magnetism in graphene oxide induced by epoxy groups
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Dongwook; Division of Physics and Applied Physics, Nanyang Technological University, Singapore 637371; Seo, Jiwon
2015-04-27
We have engineered magnetism in graphene oxide. Our approach transforms graphene into a magnetic insulator while maintaining graphene's structure. Fourier transform infrared spectroscopy spectra reveal that graphene oxide has various chemical groups (including epoxy, ketone, hydroxyl, and C-O groups) on its surface. Destroying the epoxy group with heat treatment or chemical treatment diminishes magnetism in the material. Local density approximation calculation results reproduce well the magnetic moments obtained from experiments, and these results indicate that the unpaired spin induced by the presence of epoxy groups is the origin of the magnetism. The calculation results also explain the magnetic properties, which are generated by the interaction between separated magnetic regions and domains. Our results demonstrate tunable magnetism in graphene oxide based on controlling the epoxy group with heat or chemical treatment.
Magnetism in graphene oxide induced by epoxy groups
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Dongwook; Seo, Jiwon; Zhu, Xi
2015-04-27
We have engineered magnetism in graphene oxide. Our approach transforms graphene into a magnetic insulator while maintaining graphene's structure. Fourier transform infrared spectroscopy spectra reveal that graphene oxide has various chemical groups (including epoxy, ketone, hydroxyl, and C-O groups) on its surface. Destroying the epoxy group with heat treatment or chemical treatment diminishes magnetism in the material. Local density approximation calculation results well reproduce the magnetic moments obtained from experiments, and these results indicate that the unpaired spin induced by the presence of epoxy groups is the origin of the magnetism. The calculation results also explain the magnetic properties, which are generated by the interaction between separated magnetic regions and domains. Our results demonstrate tunable magnetism in graphene oxide based on controlling the epoxy group with heat or chemical treatment.
NASA Astrophysics Data System (ADS)
Ma, Long; Zhao, Deping
2011-12-01
Spectral imaging technology has been used mostly in remote sensing but has recently been extended to new areas requiring high-fidelity color reproduction, such as telemedicine and e-commerce. These spectral imaging systems are important because they offer improved color reproduction quality not only for a standard observer under a particular illumination but for any other individual with normal color vision under another illumination. A means of browsing the archives is therefore needed. In this paper, the authors present a new spectral image browsing architecture. The architecture for browsing is as follows: (1) the spectral domain of the spectral image is reduced with the PCA transform, yielding the eigenvectors and the eigenimages; (2) the eigenimages are quantized to the original bit depth of the spectral image (e.g., if the spectral image is originally 8-bit, the eigenimages are quantized to 8 bits), and 32-bit floating-point numbers are used for the eigenvectors; (3) the first eigenimage is losslessly compressed by JPEG-LS, and the remaining eigenimages are lossy-compressed by the wavelet-based SPIHT algorithm. For experimental evaluation, PSNR was used as the measure of spectral accuracy, and ΔE was used to evaluate color reproducibility, with standard D65 as the light source. To test the proposed method, we used the FOREST and CORAL spectral image databases, containing 12 and 10 spectral images, respectively. The images were acquired in the range of 403-696 nm; the image size was 128×128, the number of bands was 40, and the resolution was 8 bits per sample. Our experiments show that the proposed compression method is suitable for browsing, i.e., for visual purposes.
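Steps (1)-(2) of the browsing architecture above can be sketched as follows. This is a minimal NumPy illustration (eigendecomposition PCA over the spectral axis, per-eigenimage min-max quantization), not the authors' implementation; the array layout and quantization range are assumptions.

```python
import numpy as np

def pca_reduce(cube, n_components=8):
    """Steps (1)-(2): PCA over the spectral axis of an (H, W, B) cube,
    keeping 32-bit eigenvectors and 8-bit quantized eigenimages."""
    h, w, b = cube.shape
    x = cube.reshape(-1, b).astype(np.float64)
    mean = x.mean(axis=0)
    xc = x - mean
    cov = (xc.T @ xc) / (xc.shape[0] - 1)
    evals, evecs = np.linalg.eigh(cov)                 # ascending eigenvalues
    order = np.argsort(evals)[::-1][:n_components]
    evecs = evecs[:, order].astype(np.float32)         # 32-bit eigenvectors
    eig = (xc @ evecs).reshape(h, w, n_components)     # eigenimages
    lo = eig.min(axis=(0, 1))
    hi = eig.max(axis=(0, 1))
    # quantize each eigenimage to the original 8-bit depth
    q = np.round(255 * (eig - lo) / (hi - lo + 1e-12)).astype(np.uint8)
    return evecs, q, mean, lo, hi
```

The quantized first eigenimage would then go to a lossless coder and the rest to a lossy coder, as in step (3).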
FFT-enhanced IHS transform method for fusing high-resolution satellite images
Ling, Y.; Ehlers, M.; Usery, E.L.; Madden, M.
2007-01-01
Existing image fusion techniques such as the intensity-hue-saturation (IHS) transform and principal components analysis (PCA) methods may not be optimal for fusing the new generation commercial high-resolution satellite images such as Ikonos and QuickBird. One problem is color distortion in the fused image, which causes visual changes as well as spectral differences between the original and fused images. In this paper, a fast Fourier transform (FFT)-enhanced IHS method is developed for fusing new generation high-resolution satellite images. This method combines a standard IHS transform with FFT filtering of both the panchromatic image and the intensity component of the original multispectral image. Ikonos and QuickBird data are used to assess the FFT-enhanced IHS transform method. Experimental results indicate that the FFT-enhanced IHS transform method may improve upon the standard IHS transform and the PCA methods in preserving spectral and spatial information. ?? 2006 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS).
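The combination of IHS substitution with Fourier-domain filtering can be sketched roughly as follows. The band-mean intensity and the ideal circular low-pass mask are simplifying assumptions for illustration; they do not reproduce the paper's exact filter design.

```python
import numpy as np

def fft_lowpass(img, cutoff):
    """Ideal low-pass filter in the Fourier domain (cutoff in cycles/image)."""
    f = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    mask = (yy ** 2 + xx ** 2) <= cutoff ** 2
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

def fft_ihs_fuse(ms, pan, cutoff=8):
    """ms: (H, W, 3) multispectral; pan: (H, W) co-registered panchromatic.
    Low-frequency content comes from the multispectral intensity,
    high-frequency detail from the panchromatic band."""
    intensity = ms.mean(axis=2)                          # simple I of IHS
    new_i = fft_lowpass(intensity, cutoff) + (pan - fft_lowpass(pan, cutoff))
    return ms + (new_i - intensity)[..., None]           # substitute intensity
```

Retaining the multispectral low frequencies is what limits the color distortion relative to plain IHS substitution.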
NASA Astrophysics Data System (ADS)
Kuzmiakova, Adele; Dillner, Ann M.; Takahama, Satoshi
2016-06-01
A growing body of research on statistical applications for characterization of atmospheric aerosol Fourier transform infrared (FT-IR) samples collected on polytetrafluoroethylene (PTFE) filters (e.g., Russell et al., 2011; Ruthenburg et al., 2014) and a rising interest in analyzing FT-IR samples collected by air quality monitoring networks call for an automated PTFE baseline correction solution. The existing polynomial technique (Takahama et al., 2013) is not scalable to a project with a large number of aerosol samples because it contains many parameters and requires expert intervention. Therefore, the question of how to develop an automated method for baseline correcting hundreds to thousands of ambient aerosol spectra given the variability in both environmental mixture composition and PTFE baselines remains. This study approaches the question by detailing the statistical protocol, which allows for the precise definition of analyte and background subregions, applies nonparametric smoothing splines to reproduce sample-specific PTFE variations, and integrates performance metrics from atmospheric aerosol and blank samples alike in the smoothing parameter selection. Referencing 794 atmospheric aerosol samples from seven Interagency Monitoring of PROtected Visual Environment (IMPROVE) sites collected during 2011, we start by identifying key FT-IR signal characteristics, such as non-negative absorbance or analyte segment transformation, to capture sample-specific transitions between background and analyte. While referring to qualitative properties of PTFE background, the goal of smoothing splines interpolation is to learn the baseline structure in the background region to predict the baseline structure in the analyte region. 
We then validate the model by comparing smoothing splines baseline-corrected spectra with uncorrected and polynomial baseline (PB)-corrected equivalents via three statistical applications: (1) clustering analysis, (2) functional group quantification, and (3) thermal optical reflectance (TOR) organic carbon (OC) and elemental carbon (EC) predictions. The discrepancy rate for a four-cluster solution is 10 %. For all functional groups but carboxylic COH the discrepancy is ≤ 10 %. Performance metrics obtained from TOR OC and EC predictions (R2 ≥ 0.94, bias ≤ 0.01 µg m-3, and error ≤ 0.04 µg m-3) are on a par with those obtained from uncorrected and PB-corrected spectra. The proposed protocol leads to visually and analytically similar estimates as those generated by the polynomial method. More importantly, the automated solution allows us and future users to evaluate its analytical reproducibility while minimizing reducible user bias. We anticipate the protocol will enable FT-IR researchers and data analysts to quickly and reliably analyze large amounts of data and connect them to a variety of available statistical learning methods to be applied to analyte absorbances isolated in atmospheric aerosol samples.
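The background-fit idea behind the protocol can be sketched with SciPy's smoothing spline: fit only the analyte-free background region, evaluate the spline across the whole spectrum, and subtract. The background mask and smoothing parameter here are placeholders for the protocol's tuned, metric-driven choices.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def spline_baseline_correct(wavenumber, absorbance, background_mask, s=0.0):
    """Fit a spline to background (analyte-free) points only, evaluate it
    over the whole spectrum, and subtract it as the baseline estimate."""
    spl = UnivariateSpline(wavenumber[background_mask],
                           absorbance[background_mask], s=s)
    return absorbance - spl(wavenumber)
```

In the actual protocol, `s` would be selected from performance metrics on aerosol and blank samples rather than fixed by hand.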
Joo, Hyun-Woo; Lee, Chang-Hwan; Rho, Jong-Seok; Jung, Hyun-Kyo
2003-08-01
In this paper, an inversion scheme for the piezoelectric constants of piezoelectric transformers is proposed. The impedance of piezoelectric transducers is calculated using a three-dimensional finite element method, and the validity of the calculation is confirmed experimentally. The effects of material coefficients on piezoelectric transformers are investigated numerically. Six material coefficient variables for piezoelectric transformers were selected, and a design sensitivity method was adopted as the inversion scheme. The validity of the proposed method was confirmed by step-up ratio calculations. The proposed method is applied to the analysis of a sample piezoelectric transformer, and its resonance characteristics are obtained by a numerically combined equivalent-circuit method.
Accelerometric gait analysis for use in hospital outpatients.
Auvinet, B; Chaleil, D; Barrey, E
1999-01-01
To provide clinicians with a quantitative human gait analysis tool suitable for routine use. We evaluated the reproducibility, sensitivity, and specificity of gait analysis based on measurements of acceleration at a point near the center of gravity of the body. Two accelerometers held over the middle of the low back by a semi-elastic belt were used to record craniocaudal and side-to-side accelerations at a frequency of 50 Hz. Subjects were asked to walk at their normal speed to the end of a straight 40-meter-long hospital corridor and back. A 20-second period of stabilized walking was used to calculate cycle frequency, stride symmetry, and stride regularity. Symmetry and regularity were each derived from an auto-correlation coefficient; to convert their distribution from nonnormal to normal, Fisher's Z transformation was applied to the auto-correlation coefficients for these two variables. Intraobserver reproducibility was evaluated by asking the same observer to test 16 controls on three separate occasions at two-day intervals and interobserver reproducibility by asking four different observers to each test four controls (Latin square). Specificity and sensitivity were determined by testing 139 controls and 63 patients. The 139 controls (70 women and 69 men) were divided into five age groups (third through seventh decades of life). The 63 patients had a noninflammatory musculoskeletal condition predominating on one side. ROC curves were used to determine the best cutoffs for separating normal from abnormal values. Neither intra- nor interobserver variability was significant (P > 0.05). Cycle frequency was significantly higher in female than in male controls (1.05 +/- 0.06 versus 0.98 +/- 0.05 cycles/s; P < 0.001). Neither symmetry nor regularity was influenced by gender in the controls; both variables were also unaffected by age, although nonsignificant decreases were found in the 61 to 70-year age group, which included only nine subjects.
In the ROC curve analysis, the area under the curve was high for all three variables (frequency, 0.81 +/- 0.04; symmetry, 0.85 +/- 0.03; and regularity, 0.88 +/- 0.03), establishing that there was a good compromise between sensitivity and specificity. Our gait analysis method offers satisfactory reproducibility and is sufficiently sensitive and specific to be used by clinicians in the quantitative evaluation of gait abnormalities.
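The autocorrelation-based symmetry and regularity indices described above can be sketched as follows. The lag search windows are illustrative assumptions, not the study's exact peak-picking rules.

```python
import numpy as np

def gait_symmetry_regularity(acc, fs=50):
    """Normalized autocorrelation of a craniocaudal acceleration trace:
    the peak near one step lag gives symmetry, the peak near one stride
    lag gives regularity; both are Fisher z-transformed. The lag search
    windows (0.3-0.7 s and 0.7-1.4 s) are illustrative assumptions."""
    a = acc - np.mean(acc)
    ac = np.correlate(a, a, mode='full')[len(a) - 1:]
    ac = ac / ac[0]                                    # normalized coefficients
    step = np.argmax(ac[int(0.3 * fs):int(0.7 * fs)]) + int(0.3 * fs)
    stride = np.argmax(ac[int(0.7 * fs):int(1.4 * fs)]) + int(0.7 * fs)
    return np.arctanh(ac[step]), np.arctanh(ac[stride])  # Fisher's Z
```

For a regular, symmetric gait both coefficients approach 1, so the arctanh (Fisher Z) transform spreads the upper range toward a normal distribution, as in the study.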
Reproducing Epidemiologic Research and Ensuring Transparency.
Coughlin, Steven S
2017-08-15
Measures for ensuring that epidemiologic studies are reproducible include making data sets and software available to other researchers so they can verify published findings, conduct alternative analyses of the data, and check for statistical errors or programming errors. Recent developments related to the reproducibility and transparency of epidemiologic studies include the creation of a global platform for sharing data from clinical trials and the anticipated future extension of the global platform to non-clinical trial data. Government agencies and departments such as the US Department of Veterans Affairs Cooperative Studies Program have also enhanced their data repositories and data sharing resources. The Institute of Medicine and the International Committee of Medical Journal Editors released guidance on sharing clinical trial data. The US National Institutes of Health has updated their data-sharing policies. In this issue of the Journal, Shepherd et al. (Am J Epidemiol. 2017;186:387-392) outline a pragmatic approach for reproducible research with sensitive data for studies for which data cannot be shared because of legal or ethical restrictions. Their proposed quasi-reproducible approach facilitates the dissemination of statistical methods and codes to independent researchers. Both reproducibility and quasi-reproducibility can increase transparency for critical evaluation, further dissemination of study methods, and expedite the exchange of ideas among researchers. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Parallelization of the Physical-Space Statistical Analysis System (PSAS)
NASA Technical Reports Server (NTRS)
Larson, J. W.; Guo, J.; Lyster, P. M.
1999-01-01
Atmospheric data assimilation is a method of combining observations with model forecasts to produce a more accurate description of the atmosphere than the observations or forecast alone can provide. Data assimilation plays an increasingly important role in the study of climate and atmospheric chemistry. The NASA Data Assimilation Office (DAO) has developed the Goddard Earth Observing System Data Assimilation System (GEOS DAS) to create assimilated datasets. The core computational components of the GEOS DAS include the GEOS General Circulation Model (GCM) and the Physical-space Statistical Analysis System (PSAS). The need for timely validation of scientific enhancements to the data assimilation system poses computational demands that are best met by distributed parallel software. PSAS is implemented in Fortran 90 using object-based design principles. The analysis portions of the code solve two equations. The first of these is the "innovation" equation, which is solved on the unstructured observation grid using a preconditioned conjugate gradient (CG) method. The "analysis" equation is a transformation from the observation grid back to a structured grid, and is solved by a direct matrix-vector multiplication. Use of a factored-operator formulation reduces the computational complexity of both the CG solver and the matrix-vector multiplication, rendering the matrix-vector multiplications as a successive product of operators on a vector. Sparsity is introduced to these operators by partitioning the observations using an icosahedral decomposition scheme. PSAS builds a large (approx. 128MB) run-time database of parameters used in the calculation of these operators. Implementing a message passing parallel computing paradigm into an existing yet developing computational system as complex as PSAS is nontrivial. One of the technical challenges is balancing the requirements for computational reproducibility with the need for high performance. 
The problem of computational reproducibility is well known in the parallel computing community. It is a requirement that the parallel code perform calculations in a fashion that will yield identical results on different configurations of processing elements on the same platform. In some cases this problem can be solved by sacrificing performance; meeting this requirement while still achieving high performance is very difficult. Topics to be discussed include: the current PSAS design and parallelization strategy; reproducibility issues; load balance versus database memory demands; and possible solutions to these problems.
Cervical vertebrae maturation method morphologic criteria: poor reproducibility.
Nestman, Trenton S; Marshall, Steven D; Qian, Fang; Holton, Nathan; Franciscus, Robert G; Southard, Thomas E
2011-08-01
The cervical vertebrae maturation (CVM) method has been advocated as a predictor of peak mandibular growth. A careful review of the literature showed potential methodologic errors that might influence the high reported reproducibility of the CVM method, and we recently established that the reproducibility of the CVM method was poor when these potential errors were eliminated. The purpose of this study was to further investigate the reproducibility of the individual vertebral patterns. In other words, the purpose was to determine which of the individual CVM vertebral patterns could be classified reliably and which could not. Ten practicing orthodontists, trained in the CVM method, evaluated the morphology of cervical vertebrae C2 through C4 from 30 cephalometric radiographs using questions based on the CVM method. The Fleiss kappa statistic was used to assess interobserver agreement when evaluating each cervical vertebrae morphology question for each subject. The Kendall coefficient of concordance was used to assess the level of interobserver agreement when determining a "derived CVM stage" for each subject. Interobserver agreement was high for assessment of the lower borders of C2, C3, and C4 that were either flat or curved in the CVM method, but interobserver agreement was low for assessment of the vertebral bodies of C3 and C4 when they were either trapezoidal, rectangular horizontal, square, or rectangular vertical; this led to the overall poor reproducibility of the CVM method. These findings were reflected in the Fleiss kappa statistic. Furthermore, nearly 30% of the time, individual morphologic criteria could not be combined to generate a final CVM stage because of incompatible responses to the 5 questions. Intraobserver agreement in this study was only 62%, on average, when the inconclusive stagings were excluded as disagreements. Intraobserver agreement was worse (44%) when the inconclusive stagings were included as disagreements. 
For the group of subjects that could be assigned a CVM stage, the level of interobserver agreement as measured by the Kendall coefficient of concordance was only 0.45, indicating moderate agreement. The weakness of the CVM method results, in part, from difficulty in classifying the vertebral bodies of C3 and C4 as trapezoidal, rectangular horizontal, square, or rectangular vertical. This led to the overall poor reproducibility of the CVM method and our inability to support its use as a strict clinical guideline for the timing of orthodontic treatment. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
A method of power analysis based on piecewise discrete Fourier transform
NASA Astrophysics Data System (ADS)
Xin, Miaomiao; Zhang, Yanchi; Xie, Da
2018-04-01
The paper analyzes existing feature extraction methods and the characteristics of the discrete Fourier transform and piecewise aggregation approximation. Combining the advantages of the two methods, a new piecewise discrete Fourier transform is proposed and used to analyze the lighting power of a large customer. The time-series feature maps of four different cases are compared across the original data, the discrete Fourier transform, piecewise aggregation approximation, and the piecewise discrete Fourier transform. The new method reflects both the overall trend of electricity change and its internal changes in electrical analysis.
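A minimal sketch of a piecewise DFT in the spirit described above: the series is split into equal segments, as in piecewise aggregation approximation, and only the first few DFT coefficients of each segment are kept. Segment and coefficient counts are illustrative parameters, not values from the paper.

```python
import numpy as np

def piecewise_dft(x, n_segments=4, n_coeffs=3):
    """Split the series into segments and keep the leading real-FFT
    coefficients of each, capturing per-segment trend and slow variation."""
    segs = np.array_split(np.asarray(x, dtype=float), n_segments)
    return [np.fft.rfft(s)[:n_coeffs] for s in segs]

def reconstruct(coeffs, seg_lens):
    """Approximate inverse: zero-pad the kept coefficients per segment."""
    out = []
    for c, n in zip(coeffs, seg_lens):
        full = np.zeros(n // 2 + 1, dtype=complex)
        full[:len(c)] = c
        out.append(np.fft.irfft(full, n))
    return np.concatenate(out)
```

Keeping only low-order coefficients per segment is what lets the representation track both the overall trend (segment means, the zeroth coefficients) and the slow within-segment variation.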
NASA Astrophysics Data System (ADS)
Yang, Zhou; Zhu, Yunpeng; Ren, Hongrui; Zhang, Yimin
2015-03-01
Reliability allocation of computer numerical control (CNC) lathes is very important in industry. Traditional allocation methods focus only on high-failure-rate components rather than moderate-failure-rate components, which is not applicable in some conditions. To solve the problem of reliability allocation for CNC lathes, a comprehensive reliability allocation method based on cubic transformed functions of failure modes and effects analysis (FMEA) is presented. Firstly, conventional reliability allocation methods are introduced. Then the limitations of directly combining the comprehensive allocation method with the exponentially transformed FMEA method are investigated. Subsequently, a cubic transformed function is established to overcome these limitations. Properties of the new transformed function are discussed by considering the failure severity and the failure occurrence, and designers can choose appropriate transform amplitudes according to their requirements. Finally, a CNC lathe and a spindle system are used as examples to verify the new allocation method. Seven criteria are considered to compare the results of the new method with those of traditional methods. The allocation results indicate that the new method is more flexible than traditional methods. By employing the new cubic transformed function, the method covers a wider range of problems in CNC reliability allocation without losing the advantages of traditional methods.
Agin, Patricia Poh; Edmonds, Susan H
2002-08-01
The goals of this study were (i) to demonstrate that existing and widely used sun protection factor (SPF) test methodologies can produce accurate and reproducible results for high SPF formulations and (ii) to provide data on the number of test-subjects needed, the variability of the data, and the appropriate exposure increments needed for testing high SPF formulations. Three high SPF formulations were tested, according to the Food and Drug Administration's (FDA) 1993 tentative final monograph (TFM) 'very water resistant' test method and/or the 1978 proposed monograph 'waterproof' test method, within one laboratory. A fourth high SPF formulation was tested at four independent SPF testing laboratories, using the 1978 waterproof SPF test method. All laboratories utilized xenon arc solar simulators. The data illustrate that the testing conducted within one laboratory, following either the 1978 proposed or the 1993 TFM SPF test method, was able to reproducibly determine the SPFs of the formulations tested, using either the statistical analysis method in the proposed monograph or the statistical method described in the TFM. When one formulation was tested at four different laboratories, the anticipated variation in the data owing to the equipment and other operational differences was minimized through the use of the statistical method described in the 1993 monograph. The data illustrate that either the 1978 proposed monograph SPF test method or the 1993 TFM SPF test method can provide accurate and reproducible results for high SPF formulations. Further, these results can be achieved with panels of 20-25 subjects with an acceptable level of variability. Utilization of the statistical controls from the 1993 sunscreen monograph can help to minimize lab-to-lab variability for well-formulated products.
A Transfer Voltage Simulation Method for Generator Step Up Transformers
NASA Astrophysics Data System (ADS)
Funabashi, Toshihisa; Sugimoto, Toshirou; Ueda, Toshiaki; Ametani, Akihiro
Measurements on 13 sets of generator step-up (GSU) transformers have shown that the transfer voltage of a GSU transformer involves one dominant oscillation frequency. The frequency can be estimated from the inductance and capacitance values of the GSU transformer's low-voltage side. This observation has led to a new method for simulating a GSU transformer transfer voltage. The method is based on the EMTP TRANSFORMER model, but stray capacitances are added. The leakage inductance and the magnetizing resistance are modified using approximate curves for their frequency characteristics determined from the measured results. The new method is validated by comparison with the measured results.
Computing Instantaneous Frequency by normalizing Hilbert Transform
NASA Technical Reports Server (NTRS)
Huang, Norden E. (Inventor)
2005-01-01
This invention presents the Normalized Amplitude Hilbert Transform (NAHT) and the Normalized Hilbert Transform (NHT), both of which are new methods for computing Instantaneous Frequency. The method is designed specifically to circumvent the limitations set by the Bedrosian and Nuttall theorems, and to provide a sharp local measure of error when the quadrature and the Hilbert Transform do not agree. The motivation for this method is that straightforward application of the Hilbert Transform, followed by taking the derivative of the phase angle as the Instantaneous Frequency (IF), leads to a common mistake that persists to this date. In order to make the Hilbert Transform method work, the data must obey certain restrictions.
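The normalization idea can be sketched as follows: divide the signal by a spline envelope through the maxima of its absolute value (repeated a few times) so the result has near-unit amplitude, then take the Hilbert phase derivative as the instantaneous frequency. The spline-envelope construction, iteration count, and neglect of end-point effects are simplifying assumptions, not the patented procedure.

```python
import numpy as np
from scipy.signal import argrelextrema, hilbert
from scipy.interpolate import CubicSpline

def normalized_instantaneous_frequency(x, fs, n_iter=3):
    """Normalize amplitude by a cubic-spline envelope through maxima of |x|,
    then return d(phase)/dt of the analytic signal in Hz."""
    y = np.asarray(x, dtype=float)
    n = np.arange(len(y))
    for _ in range(n_iter):
        idx = argrelextrema(np.abs(y), np.greater)[0]
        if len(idx) < 4:                   # too few extrema for a spline
            break
        env = CubicSpline(idx, np.abs(y)[idx])(n)
        y = y / np.maximum(env, 1e-12)     # guard against tiny envelopes
    phase = np.unwrap(np.angle(hilbert(y)))
    return np.gradient(phase) * fs / (2.0 * np.pi)
```

Normalizing first keeps the signal within the regime where the Bedrosian-type restrictions on the Hilbert transform are approximately satisfied.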
Computing Instantaneous Frequency by normalizing Hilbert Transform
Huang, Norden E.
2005-05-31
This invention presents the Normalized Amplitude Hilbert Transform (NAHT) and the Normalized Hilbert Transform (NHT), both of which are new methods for computing Instantaneous Frequency. The method is designed specifically to circumvent the limitations set by the Bedrosian and Nuttall theorems, and to provide a sharp local measure of error when the quadrature and the Hilbert Transform do not agree. The motivation for this method is that straightforward application of the Hilbert Transform, followed by taking the derivative of the phase angle as the Instantaneous Frequency (IF), leads to a common mistake that persists to this date. In order to make the Hilbert Transform method work, the data must obey certain restrictions.
Electrosprayed chitosan nanoparticles: facile and efficient approach for bacterial transformation
NASA Astrophysics Data System (ADS)
Abyadeh, Morteza; Sadroddiny, Esmaeil; Ebrahimi, Ammar; Esmaeili, Fariba; Landi, Farzaneh Saeedi; Amani, Amir
2017-12-01
A rapid and efficient procedure for DNA transformation is a key prerequisite for successful cloning and genomic studies. While there have been efforts to develop a facile method, the efficiencies obtained so far for alternative methods have been unsatisfactory (i.e., 10⁵-10⁶ CFU/μg plasmid) compared with the conventional method (up to 10⁸ CFU/μg plasmid). In this work, for the first time, we prepared chitosan/pDNA nanoparticles by an electrospray method to improve the transformation process. The electrospray method was used to produce chitosan/pDNA nanoparticles for investigating non-competent bacterial transformation efficiency; in addition, the effects of chitosan molecular weight, N/P ratio, and nanoparticle size on non-competent bacterial transformation efficiency were evaluated. The results showed that transformation efficiency increased with decreasing molecular weight, N/P ratio, and nanoparticle size. A transformation efficiency of 1.7 × 10⁸ CFU/μg plasmid was obtained with chitosan molecular weight, N/P ratio, and nanoparticle size values of 30 kDa, 1, and 125 nm, respectively. Chitosan/pDNA electrosprayed nanoparticles were thus produced, and the effects of molecular weight, N/P ratio, and nanoparticle size on transformation efficiency were evaluated. In total, we present a facile and rapid method for bacterial transformation with efficiency comparable to the common method.
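The CFU/μg figures quoted in abstracts like this one follow from a simple ratio. A hedged helper is sketched below; the plating-fraction parameter reflects typical protocols (a dilution is usually plated) and is not a value from the paper.

```python
def transformation_efficiency(colonies, micrograms_dna, fraction_plated=1.0):
    """CFU per microgram of plasmid: colonies counted, divided by the
    amount of DNA effectively represented on the plate. `fraction_plated`
    is the hypothetical fraction of the recovery culture actually plated."""
    return colonies / (micrograms_dna * fraction_plated)
```

For example, 170 colonies from 10⁻⁶ μg of plated plasmid would correspond to 1.7 × 10⁸ CFU/μg (illustrative numbers, not the paper's data).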
Linear shoaling of free-surface waves in multi-layer non-hydrostatic models
NASA Astrophysics Data System (ADS)
Bai, Yefei; Cheung, Kwok Fai
2018-01-01
The capability to describe shoaling over a sloping bottom is fundamental to the modeling of coastal wave transformation. The linear shoaling gradient provides a metric to measure this property in non-hydrostatic models with layer-integrated formulations. The governing equations in Boussinesq form facilitate derivation of the linear shoaling gradient, which is in the form of a [2P + 2, 2P] expansion of the water depth parameter kd, with P equal to 1 for a one-layer model and (4N - 4) for an N-layer model. The expansion reproduces the analytical solution from Airy wave theory at the shallow-water limit and maintains a reasonable approximation up to kd = 1.2 and 2 for the one- and two-layer models. Additional layers provide rapid and monotonic convergence of the shoaling gradient into deep water. Numerical experiments of wave propagation over a plane slope illustrate the manifestation of shoaling errors through the transformation processes from deep to shallow water. Even though they arise outside the zone of active wave transformation, shoaling errors from deep to intermediate water accumulate to produce an appreciable impact on the wave amplitude in shallow water.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Betz, B.; École Polytechnique Fédérale de Lausanne, NXMM Laboratory, IMX, CH-1015 Lausanne; Rauscher, P.
The performance and degree of efficiency of industrial transformers are directly influenced by the magnetic properties of high-permeability steel laminations (HPSLs). Industrial transformer cores are built of stacks of single HPSLs. While the insulating coating on each HPSL reduces eddy-current losses in the transformer core, the coating also induces favorable inter-granular tensile stresses that significantly influence the underlying magnetic domain structure. Here, we show that the neutron dark-field image can be used to analyze the influence of the coating on the volume and supplementary surface magnetic domain structures. To visualize the stress effect of the coating on the bulk domain formation, we used an uncoated HPSL and stepwise increased the applied external tensile stress up to 20 MPa. We imaged the domain configuration of the intermediate stress states and were able to reproduce the original domain structure of the coated state. Furthermore, we were able to visualize how the applied stresses lead to a refinement of the volume domain structure and the suppression and reoccurrence of supplementary domains.
Minimal gravity and Frobenius manifolds: bulk correlation on sphere and disk
NASA Astrophysics Data System (ADS)
Aleshkin, Konstantin; Belavin, Vladimir; Rim, Chaiho
2017-11-01
There are two alternative approaches to minimal gravity: the direct Liouville approach and matrix models. Recently there has been certain progress in the matrix model approach, growing out of the presence of a Frobenius manifold (FM) structure embedded in the theory. Previous studies were mainly focused on spherical topology. Essentially, it was shown that the action principle of the Douglas equation allows one to define the free energy and to compute the correlation numbers if the resonance transformations are properly incorporated. The FM structure allows one to find the explicit form of the resonance transformation as well as a closed expression for the partition function. In this paper we elaborate on the case of the gravitating disk. We focus on the bulk correlators and show that, in a similar way to the closed topology, the generating function can be formulated using the set of flat coordinates on the corresponding FM. Moreover, the resonance transformations, which follow from the spherical topology consideration, are exactly those needed to reproduce the FZZ result of the Liouville gravity approach.
NASA Astrophysics Data System (ADS)
Colla, V.; Desanctis, M.; Dimatteo, A.; Lovicu, G.; Valentini, R.
2011-09-01
The purpose of the present work is the implementation and validation of a model able to predict the microstructure changes and the mechanical properties of modern high-strength dual-phase steels after the continuous annealing process line (CAPL) and galvanizing (Galv) processes. Experimental continuous cooling transformation (CCT) diagrams for 13 differently alloyed dual-phase steels were measured by dilatometry from the intercritical range and were used to tune the parameters of the microstructural prediction module of the model. Mechanical properties and microstructural features were measured for more than 400 dual-phase steels simulating the CAPL and Galv industrial processes, and the results were used to construct the mechanical model that predicts mechanical properties from microstructural features, chemistry, and process parameters. The model was validated and proved its efficiency in reproducing the transformation kinetics and mechanical properties of dual-phase steels produced by typical industrial processes. Although it is limited to the dual-phase grades and chemical compositions explored, this model will constitute a useful tool for the steel industry.
Zhong, H.; Sun, B.; Warkentin, D.; Zhang, S.; Wu, R.; Wu, T.; Sticklen, M. B.
1996-01-01
We have developed a novel and reproducible system for recovery of fertile transgenic maize (Zea mays L.) plants. The transformation was performed using microprojectile bombardment of cultured shoot apices of maize with a plasmid carrying two linked genes, the Streptomyces hygroscopicus phosphinothricin acetyltransferase gene (bar) and the potato proteinase inhibitor II gene, either alone or in combination with another plasmid containing the 5′ region of the rice actin 1 gene fused to the Escherichia coli β-glucuronidase gene (gus). Bombarded shoot apices were subsequently multiplied and selected under 3 to 5 mg/L glufosinate ammonium. Co-transformation frequency was 100% (146/146) for linked genes and 80% (41/51) for unlinked genes. Co-expression frequency of the bar and gus genes was 57% (29/51). The co-integration, co-inheritance, and co-expression of bar, the potato proteinase inhibitor II gene, and gus in transgenic R0, R1, and R2 plants were confirmed. Localized expression of the actin 1-GUS protein in the R0 and R1 plants was extensively analyzed by histochemical and fluorometric assays. PMID:12226244
Geng, Haijiang; Li, Zhihui; Li, Jiabing; Lu, Tao; Yan, Fangrong
2015-01-01
BACKGROUND Personalized cancer treatments depend on the determination of a patient's genetic status according to known genetic profiles for which targeted treatments exist. Such genetic profiles must be scientifically validated before they are applied to the general patient population. Reproducibility of the findings that support such genetic profiles is a fundamental challenge in validation studies. The percentage of overlapping genes (POG) criterion and derivative methods produce unstable and misleading results. Furthermore, in a complex disease, comparisons between different tumor subtypes can produce high POG scores that do not capture the consistencies in the functions. RESULTS We focused on the quality rather than the quantity of the overlapping genes. We defined the rank value of each gene according to importance or quality by PageRank on the basis of a particular topological structure. Then, we used the p-value of the rank-sum of the overlapping genes (PRSOG) to evaluate the quality of reproducibility. Though the POG scores were low in different studies of the same disease, the PRSOG was statistically significant, which suggests that sets of differentially expressed genes might be highly reproducible. CONCLUSIONS Evaluations of eight datasets from breast cancer, lung cancer and four other disorders indicate that the quality-based PRSOG method performs better than a quantity-based method. Our analysis of the components of the sets of overlapping genes supports the utility of the PRSOG method. PMID:26556852
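The rank-sum idea described above can be sketched in a few lines. In this illustrative snippet, all gene names, quality scores, and the overlap set are invented; the published PRSOG derives its rank values from PageRank on a specific network topology, whereas here they are simply given:

```python
# Illustrative sketch of a PRSOG-style score (all gene names and rank
# values are hypothetical): instead of counting overlapping genes (POG),
# test whether the genes shared by two studies rank highly by a quality
# score such as PageRank, using a one-sided rank-sum test.
from scipy.stats import mannwhitneyu

# Hypothetical PageRank-like quality scores for genes in one study.
rank_value = {"TP53": 0.19, "BRCA1": 0.15, "MYC": 0.14, "EGFR": 0.12,
              "KRAS": 0.10, "GAPDH": 0.08, "ACTB": 0.07, "UBC": 0.06,
              "RPL13": 0.05, "HSPA8": 0.04}

overlap = {"TP53", "BRCA1", "MYC"}        # genes reported by both studies
in_scores = [v for g, v in rank_value.items() if g in overlap]
out_scores = [v for g, v in rank_value.items() if g not in overlap]

# One-sided test: do the overlapping genes have higher quality scores
# than the non-overlapping ones?
stat, p = mannwhitneyu(in_scores, out_scores, alternative="greater")
print(f"rank-sum p-value: {p:.3f}")
```

Even though only 3 of 10 genes overlap (a low POG), the overlapping genes occupy the top ranks, so the rank-sum p-value is small, which is the qualitative behavior the abstract reports.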
Ren, Jun; Lee, Haram; Yoo, Seung Min; Yu, Myeong-Sang; Park, Hansoo; Na, Dokyun
2017-04-01
DNA transformation, which delivers plasmid DNAs into bacterial cells, is fundamental in genetic manipulation to engineer and study bacteria. Transformation methods developed to date are optimized for specific bacterial species to achieve high efficiency. Thus, there is always a demand for simple and species-independent transformation methods. We herein describe the development of a chemico-physical transformation method that combines a rubidium chloride (RbCl)-based chemical method and a sepiolite-based physical method, and report its use for the simple and efficient delivery of DNA into various bacterial species. Using this method, the best transformation efficiency for Escherichia coli DH5α was 4.3×10⁶ CFU/μg of pUC19 plasmid, which is higher than or comparable to the transformation efficiencies reported to date. This method also allowed the introduction of plasmid DNAs into Bacillus subtilis (5.7×10³ CFU/μg of pSEVA3b67Rb), Bacillus megaterium (2.5×10³ CFU/μg of pSPAsp-hp), Lactococcus lactis subsp. lactis (1.0×10² CFU/μg of pTRKH3-ermGFP), and Lactococcus lactis subsp. cremoris (2.2×10² CFU/μg of pMSP3535VA). Remarkably, even when the conventional chemical and physical methods failed to generate transformed cells in Bacillus sp. and Enterococcus faecalis, E. malodoratus, and E. mundtii, our combined method showed a significant transformation efficiency (2.4×10⁴, 4.5×10², 2×10¹, and 0.5×10¹ CFU/μg of plasmid DNA, respectively). Based on our results, we anticipate that our simple and efficient transformation method should prove useful for introducing DNA into various bacterial species without complicated optimization of the parameters affecting DNA entry into the cell. Copyright © 2017. Published by Elsevier B.V.
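The CFU/μg figures quoted above follow from a standard calculation. A minimal sketch, in which the function name and the example numbers are assumptions chosen only to reproduce the order of magnitude of the E. coli DH5α figure:

```python
# Hypothetical helper for the CFU-per-microgram transformation-efficiency
# figures quoted in the abstract. Assumes colony counts have already been
# corrected for any serial dilution.
def transformation_efficiency(colonies, plated_fraction, dna_ug):
    """Return transformation efficiency in CFU per μg of plasmid DNA.

    colonies        -- colonies counted on the selection plate
    plated_fraction -- fraction of the recovery culture that was plated
    dna_ug          -- μg of plasmid used in the transformation
    """
    return colonies / plated_fraction / dna_ug

# e.g. 430 colonies after plating 1% of cells transformed with 0.01 μg DNA
eff = transformation_efficiency(430, 0.01, 0.01)
print(f"{eff:.1e} CFU/ug")  # 4.3e+06, the order of the DH5α figure
```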
Development of an Aerosol Surface Inoculation Method for Bacillus Spores
Lee, Sang Don; Ryan, Shawn P.; Snyder, Emily Gibb
2011-01-01
A method was developed to deposit Bacillus subtilis spores via aerosolization onto various surface materials for biological agent decontamination and detection studies. This new method uses an apparatus coupled with a metered dose inhaler to reproducibly deposit spores onto various surfaces. A metered dose inhaler was loaded with Bacillus subtilis spores, a surrogate for Bacillus anthracis. Five different material surfaces (aluminum, galvanized steel, wood, carpet, and painted wallboard paper) were tested using this spore deposition method. This aerosolization method deposited spores at a concentration of more than 10⁷ CFU per coupon (18-mm diameter) with less than a 50% coefficient of variation, showing that the aerosolization method developed in this study can deposit reproducible numbers of spores onto various surface coupons. Scanning electron microscopy was used to probe the spore deposition patterns on test coupons. The deposition patterns observed following aerosol impaction were compared to those of liquid inoculation. A physical difference in the spore deposition patterns was observed to result from the two different methods. The spore deposition method developed in this study will help prepare spore coupons via aerosolization quickly and reproducibly for benchtop decontamination and detection studies. PMID:21193670
Reproducible segmentation of white matter hyperintensities using a new statistical definition.
Damangir, Soheil; Westman, Eric; Simmons, Andrew; Vrenken, Hugo; Wahlund, Lars-Olof; Spulber, Gabriela
2017-06-01
We present a method based on a proposed statistical definition of white matter hyperintensities (WMH), which can work with any combination of conventional magnetic resonance (MR) sequences without depending on manually delineated samples. T1-weighted, T2-weighted, FLAIR, and PD sequences acquired at 1.5 Tesla from 119 subjects from the King's Health Partners Dementia Case Register (healthy controls, mild cognitive impairment, Alzheimer's disease) were used. The segmentation was performed using a proposed definition for WMH based on the one-tailed Kolmogorov-Smirnov test. The presented method was verified, given all possible combinations of input sequences, against manual segmentations, and a high similarity (Dice 0.85-0.91) was observed. Comparing segmentations with different input sequences to one another also yielded a high similarity (Dice 0.83-0.94) that exceeded intra-rater similarity (Dice 0.75-0.91). We compared the results with those of other available methods and showed that the segmentation based on the proposed definition has better accuracy and reproducibility in the test dataset used. Overall, the presented definition is shown to produce accurate results with higher reproducibility than manual delineation. This approach can be an alternative to other manual or automatic methods not only because of its accuracy, but also due to its good reproducibility.
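The two statistical ingredients named above, a Kolmogorov-Smirnov test separating candidate hyperintense voxels from normal-appearing tissue and the Dice index used to compare segmentations, can be sketched on simulated data. All intensities here are invented, and the two-sided SciPy KS test is shown in place of the paper's one-tailed formulation:

```python
# Toy sketch (simulated intensities, not the published algorithm):
# a two-sample KS test distinguishing candidate WMH voxels from normal
# white matter, plus the Dice similarity index used for validation.
import numpy as np
from scipy.stats import ks_2samp

def dice(a, b):
    """Dice similarity between two binary masks."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return 2.0 * (a & b).sum() / (a.sum() + b.sum())

rng = np.random.default_rng(0)
normal_wm = rng.normal(100.0, 10.0, 5000)  # normal white matter (FLAIR a.u.)
candidate = rng.normal(130.0, 10.0, 200)   # suspected hyperintense voxels

stat, p = ks_2samp(candidate, normal_wm)
print(f"KS p-value: {p:.2e}")              # tiny: distributions differ

mask_a = np.array([1, 1, 1, 0, 0, 0])      # segmentation A
mask_b = np.array([1, 1, 0, 0, 0, 0])      # segmentation B
print(f"Dice: {dice(mask_a, mask_b):.2f}")
```

A Dice value of 1 means perfect overlap; the 0.85-0.91 range reported above therefore indicates close agreement with manual delineation.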
Huang Foen Chung, J W N C; Bohnen, A M; Pel, J J M; Bosch, J L H R; Niesing, R; van Mastrigt, R
2004-01-01
To report on the applicability, reproducibility, and adverse events of the noninvasive condom catheter method in the first 730 subjects of a longitudinal survey of changes in urinary bladder contractility secondary to benign prostatic hyperplasia, in which 1300 men will be evaluated three times in 5 years using this method. Subjects were recruited by general practitioners, general publicity, and e-mail. Only those meeting the study criteria were entered in the study. If the free flow rate exceeded 5.4 mL/s, at least two consecutive condom pressure measurements were attempted using the condom catheter method. The condom pressure measured reflected the isovolumetric bladder pressure, a measure of urinary bladder contractility. The reproducibility of the method was quantified by a difference plot of the two maximal condom pressures measured in each subject. In 618 (94%) of 659 eligible participants, one condom pressure measurement was completed; two measurements were done in 555 (84%). The maximal condom pressure ranged from 28 to 228 cm H2O (overall mean 101, SD 34). A difference between the two pressures of less than +/-21 cm H2O was found in 80%. The mean difference was -1 cm H2O (SD 18), significantly different from 0. Some adverse events such as terminal self-limiting hematuria were encountered. The condom catheter method is very suitable for large-scale use. It has a success rate of 94% and a reproducibility comparable to that of invasive pressure flow studies.
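The difference-plot summary used above (mean difference, its SD, and the spread of paired repeat pressures) is a Bland-Altman-style computation. A minimal sketch with made-up pressure values; the function name and 1.96-SD limits of agreement are conventional choices, not taken from the study:

```python
# Bland-Altman-style summary of paired repeat measurements (values are
# illustrative, not from the study).
import numpy as np

def difference_stats(p1, p2):
    """Mean difference, SD of differences, and 95% limits of agreement."""
    d = np.asarray(p1, float) - np.asarray(p2, float)
    mean_d, sd_d = d.mean(), d.std(ddof=1)
    limits = (mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d)
    return mean_d, sd_d, limits

# Hypothetical paired maximal condom pressures (cm H2O) for 6 subjects.
first  = [95, 110, 82, 130, 101, 99]
second = [97, 105, 90, 128, 104, 96]
mean_d, sd_d, (lo, hi) = difference_stats(first, second)
print(f"mean diff {mean_d:.1f}, SD {sd_d:.1f}, LoA [{lo:.1f}, {hi:.1f}]")
```

In the study, this kind of summary gave a mean difference of -1 cm H2O (SD 18) between the two condom pressures, with 80% of pairs differing by less than ±21 cm H2O.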
Transformative, Mixed Methods Checklist for Psychological Research with Mexican Americans
ERIC Educational Resources Information Center
Canales, Genevieve
2013-01-01
This is a description of the creation of a research methods tool, the "Transformative, Mixed Methods Checklist for Psychological Research With Mexican Americans." For conducting literature reviews of and planning mixed methods studies with Mexican Americans, it contains evaluative criteria calling for transformative mixed methods, perspectives…
The Filtered Abel Transform and Its Application in Combustion Diagnostics
NASA Technical Reports Server (NTRS)
Simons, Stephen N. (Technical Monitor); Yuan, Zeng-Guang
2003-01-01
Many non-intrusive combustion diagnosis methods generate line-of-sight projections of a flame field. To reconstruct the spatial field of the measured properties, these projections need to be deconvoluted. When the spatial field is axisymmetric, commonly used deconvolution methods include the Abel transform, the onion peeling method, and the two-dimensional Fourier transform method and its derivatives such as the filtered back projection methods. This paper proposes a new approach for performing the Abel transform, which possesses the exactness of the Abel transform and the flexibility of incorporating various filters in the reconstruction process. The Abel transform is an exact method and the simplest among these commonly used methods. It is evinced in this paper that all exact reconstruction methods for axisymmetric distributions must be equivalent to the Abel transform because of its uniqueness and exactness. Detailed proof is presented to show that the two-dimensional Fourier method, when applied to axisymmetric cases, is identical to the Abel transform. Discrepancies among the various reconstruction methods stem from the different approximations made to perform numerical calculations. An equation relating the spectrum of a set of projection data to that of the corresponding spatial distribution is obtained, which shows that the spectrum of the projection is equal to the Abel transform of the spectrum of the corresponding spatial distribution. From the equation, if either the projection or the distribution is bandwidth limited, the other is also bandwidth limited, and both have the same bandwidth. If the two are not bandwidth limited, the Abel transform has a bias against low wave number components in most practical cases. This explains why the Abel transform and all exact deconvolution methods are sensitive to high wave number noise.
The filtered Abel transform is based on the fact that the Abel transform of filtered projection data is equal to an integral transform of the original projection data with the kernel function being the Abel transform of the filtering function. The kernel function is independent of the projection data and can be obtained separately when the filtering function is selected. Users can select the best filtering function for a particular set of experimental data. Once the kernel function is obtained, it can be applied repeatedly to a number of projection data sets (rows) from the same experiment. When an entire flame image that contains a large number of projection lines needs to be processed, the new approach significantly reduces computational effort in comparison with the conventional approach in which each projection data set is deconvoluted separately. Computer codes have been developed to perform the filtered Abel transform for an entire flame field. Measured soot volume fraction data of a jet diffusion flame are processed as an example.
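The core relationship above, that a line-of-sight projection of an axisymmetric field is its Abel transform, can be checked numerically. This is a sketch only (not the paper's filtered algorithm), using a Gaussian profile because its Abel transform is known in closed form, F(y) = √π·exp(-y²):

```python
# Numerical sanity check: the chord integral along the line of sight of
# an axisymmetric distribution equals its Abel transform. A Gaussian
# f(r) = exp(-r^2) is used because the result is known analytically.
import numpy as np

def trapezoid(y, x):
    """Composite trapezoidal rule (avoids version-specific NumPy names)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def abel_project(fun, y_values, s_max=6.0, n=4000):
    """F(y) = 2 * integral_0^inf f(sqrt(y^2 + s^2)) ds.

    Substituting s for the radial variable removes the 1/sqrt singularity
    of the textbook Abel integral, so plain quadrature suffices."""
    s = np.linspace(0.0, s_max, n)
    return np.array([2.0 * trapezoid(fun(np.sqrt(y**2 + s**2)), s)
                     for y in y_values])

gauss = lambda r: np.exp(-r**2)
y = np.array([0.0, 0.5, 1.0])
projection = abel_project(gauss, y)
exact = np.sqrt(np.pi) * np.exp(-y**2)  # analytic Abel transform
print(np.max(np.abs(projection - exact)))  # small discretization error
```

In the filtered variant described above, the same projection operation would be applied once to the chosen filtering function to produce the reusable kernel.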
Wang, Dongmei; Mu, Juan; Chen, Yan; ...
2017-03-01
The stress-induced phase transformation and micromechanical behavior of a CuZr-based alloy were investigated by in-situ neutron diffraction. The pseudoelastic behavior with a pronounced strain-hardening effect is observed. The retained martensite nuclei and the residual stress obtained from the 1st cycle reduce the stress threshold for the martensitic transformation. A critical stress level is required for the reverse martensitic transformation from martensite to the B2 phase. An increase of intensity for the B2 (110) plane in the 1st cycle is caused by twinning along the {112}<111> twinning system. The convoluted stress partitioning influenced by the elastic and transformation anisotropy, along with the newly formed martensite, determines the microstress partitioning of the studied CuZr-based alloy. The reversible martensitic transformation is responsible for the pseudoelasticity. The macro mechanical behavior of the pure B2 phase can be divided into 3 stages, which are mediated by the evolvement of the martensitic transformation. This manuscript has been authored by UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy. The United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes. The Department of Energy will provide public access to these results of federally sponsored research in accordance with the DOE Public Access Plan (http://energy.gov/downloads/doe-public-access-plan).
The KS Method in Light of Generalized Euler Parameters.
1980-01-01
motion for the restricted two-body problem is transformed via the Kustaanheimo-Stiefel (KS) transformation method into a dynamical equation in the... Kustaanheimo-Stiefel transformation method (KS) in the two-body problem. Many papers have appeared in which specific problems or applications have... TRANSFORMATION MATRIX: P. Kustaanheimo and E. Stiefel proposed a regularization method by introducing a 4 x 4 transformation matrix and four-component
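For reference, the 4 x 4 matrix mentioned in the snippet has a standard textbook form (quoted here from the general KS literature, not from this fragment): the KS map takes a four-component vector u to the position vector via

```latex
% Standard Kustaanheimo-Stiefel map: x = L(u) u, with |x| = |u|^2
\mathbf{x} = L(\mathbf{u})\,\mathbf{u}, \qquad
L(\mathbf{u}) =
\begin{pmatrix}
 u_1 & -u_2 & -u_3 &  u_4 \\
 u_2 &  u_1 & -u_4 & -u_3 \\
 u_3 &  u_4 &  u_1 &  u_2 \\
 u_4 & -u_3 &  u_2 & -u_1
\end{pmatrix},
\qquad \lvert\mathbf{x}\rvert = \lvert\mathbf{u}\rvert^{2}.
```

The fourth component of x vanishes identically, so the map embeds the three-dimensional two-body problem in the four-dimensional u-space, where the regularized equation of motion becomes that of a harmonic oscillator.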
Caillard, L; Sattayaporn, S; Lamic-Humblot, A-F; Casale, S; Campbell, P; Chabal, Y J; Pluchery, O
2015-02-13
Two types of highly ordered organic layers were prepared on silicon modified with an amine termination for binding gold nanoparticles (AuNPs). These two grafted organic monolayers (GOMs), consisting of alkyl chains with seven or 11 carbon atoms, were grafted on oxide-free Si(111) surfaces as tunnel barriers between the silicon electrode and the AuNPs. Three kinds of colloidal AuNPs were prepared by reducing HAuCl4 with three different reactants: citrate (Turkevich synthesis, diameter ∼16 nm), ascorbic acid (diameter ∼9 nm), or NaBH4 (Natan synthesis, diameter ∼7 nm). Scanning tunneling spectroscopy (STS) was performed in a UHV STM at 40 K, and Coulomb blockade behavior was observed. The reproducibility of the Coulomb behavior was analyzed as a function of several chemical and physical parameters: size, crystallinity of the AuNPs, influence of surrounding surfactant molecules, and quality of the GOM/Si interface (degree of oxidation after the full processing). Samples were characterized with scanning tunneling microscopy, STS, atomic force microscopy, Fourier transform infrared spectroscopy, x-ray photoelectron spectroscopy (XPS), and high resolution transmission electron microscopy. We show that the reproducibility in observing Coulomb behavior can be as high as ∼80% with the Natan synthesis of AuNPs and GOMs with short alkyl chains.
Can Atmospheric Reanalysis Data Sets Be Used to Reproduce Flooding Over Large Scales?
NASA Astrophysics Data System (ADS)
Andreadis, Konstantinos M.; Schumann, Guy J.-P.; Stampoulis, Dimitrios; Bates, Paul D.; Brakenridge, G. Robert; Kettner, Albert J.
2017-10-01
Floods are costly to global economies and can be exceptionally lethal. The ability to produce consistent flood hazard maps over large areas could provide a significant contribution to reducing such losses, as the lack of knowledge concerning flood risk is a major factor in the transformation of river floods into flood disasters. In order to accurately reproduce flooding in river channels and floodplains, high spatial resolution hydrodynamic models are needed. Despite being computationally expensive, recent advances have made their continental to global implementation feasible, although inputs for long-term simulations may require the use of reanalysis meteorological products, especially in data-poor regions. We employ a coupled hydrologic/hydrodynamic model cascade forced by the 20CRv2 reanalysis data set and evaluate its ability to reproduce flood inundation area and volume for Australia during the 1973-2012 period. Ensemble simulations using the reanalysis data were performed to account for uncertainty in the meteorology and compared with a validated benchmark simulation. Results show that the reanalysis ensemble captures the inundated areas and volumes relatively well, with correlations for the ensemble mean of 0.82 and 0.85 for area and volume, respectively, although the meteorological ensemble spread propagates into large uncertainty in the simulated flood characteristics.
ROCS: a Reproducibility Index and Confidence Score for Interaction Proteomics Studies
2012-01-01
Background Affinity-Purification Mass-Spectrometry (AP-MS) provides a powerful means of identifying protein complexes and interactions. Several important challenges exist in interpreting the results of AP-MS experiments. First, the reproducibility of AP-MS experimental replicates can be low, due both to technical variability and the dynamic nature of protein interactions in the cell. Second, the identification of true protein-protein interactions in AP-MS experiments is subject to inaccuracy due to high false negative and false positive rates. Several experimental approaches can be used to mitigate these drawbacks, including the use of replicated and control experiments and relative quantification to sensitively distinguish true interacting proteins from false ones. Methods To address the issues of reproducibility and accuracy of protein-protein interactions, we introduce a two-step method, called ROCS, which makes use of Indicator Prey Proteins to select reproducible AP-MS experiments, and of Confidence Scores to select specific protein-protein interactions. The Indicator Prey Proteins account for measures of protein identifiability as well as protein reproducibility, effectively allowing removal of outlier experiments that contribute noise and affect downstream inferences. The filtered set of experiments is then used in the Protein-Protein Interaction (PPI) scoring step. Prey protein scoring is done by computing a Confidence Score, which accounts for the probability of occurrence of prey proteins in the bait experiments relative to the control experiment, where the significance cutoff parameter is estimated by simultaneously controlling false positives and false negatives against metrics of false discovery rate and biological coherence, respectively. In summary, the ROCS method relies on automatic objective criteria for parameter estimation and error-controlled procedures.
Results We illustrate the performance of our method by applying it to five previously published AP-MS experiments, each containing well characterized protein interactions, allowing for systematic benchmarking of ROCS. We show that our method may be used on its own to make accurate identification of specific, biologically relevant protein-protein interactions, or in combination with other AP-MS scoring methods to significantly improve inferences. Conclusions Our method addresses important issues encountered in AP-MS datasets, making ROCS a very promising tool for this purpose, either on its own or in conjunction with other methods. We anticipate that our methodology may be used more generally in proteomics studies and databases, where experimental reproducibility issues arise. The method is implemented in the R language, and is available as an R package called “ROCS”, freely available from the CRAN repository http://cran.r-project.org/. PMID:22682516
Schmalreck, A F; Tränkle, P; Vanca, E; Blaschke-Hellmessen, R
1998-01-01
Fourier-transform infrared spectroscopy (FT-IR) of strain-specific traits has been demonstrated to be a suitable and efficient method for diagnostic and epidemiological determinations for the yeasts Candida albicans and Exophiala dermatitidis and the chlorophyll-less algae of the genus Prototheca. FT-IR rapidly and economically yields reproducible results based on the spectral differences of intact cells (IR fingerprints). Different genera, species, and sub-species, as well as different strains, can be recognized and grouped into different clusters and subclusters. The FT-IR analysis of Candida albicans isolates (n = 150) from 22 newborns-at-risk in an intensive care unit showed that 86% of the children were colonized with several (2-4) different strains in the oral cavities and faeces. Stationary cross-infections could definitely be determined. Exophiala dermatitidis isolates (n = 31), mostly isolated repeatedly within a period of 3 years from sputa of patients suffering from cystic fibrosis, could be characterized and grouped patient-specifically over the total sampling period. In 6 of 8 patients (75%), the individual strains remained the same and could be tracked over the three years. Cross-infections during stationary treatment could be clearly identified by FT-IR. The Prototheca isolates (n = 43) from livestock and the farm environment showed clearly distinguishable clusters differentiating the species P. wickerhamii, P. zopfii and P. stagnora. In addition, the biotypes of P. zopfii could be distinguished, especially the subclusters of variants II and III. It could be demonstrated that FT-IR is suitable for the routine identification and differentiation of yeasts and algae. However, in spite of the gain in knowledge from using FT-IR for the characterization of microorganisms, conventional phenotyping and/or genetic analysis of yeast or algae strains cannot be replaced completely. For a final taxonomic classification, a combination of conventional methods and FT-IR, together with more sophisticated molecular genetic procedures, is necessary.
General method for designing wave shape transformers.
Ma, Hua; Qu, Shaobo; Xu, Zhuo; Wang, Jiafu
2008-12-22
An effective method for designing wave shape transformers (WSTs) is investigated by adopting coordinate transformation theory. Following this method, devices that transform electromagnetic (EM) wave fronts from one style, of arbitrary shape and size, to another can be designed. To verify this method, three examples in 2D space are also presented. Compared with the methods proposed in other literature, this method offers a general procedure for designing WSTs, and is thus of great importance for the potential practical applications of such devices.
Reproducibility of abdominal fat assessment by ultrasound and computed tomography
Mauad, Fernando Marum; Chagas-Neto, Francisco Abaeté; Benedeti, Augusto César Garcia Saab; Nogueira-Barbosa, Marcello Henrique; Muglia, Valdair Francisco; Carneiro, Antonio Adilton Oliveira; Muller, Enrico Mattana; Elias Junior, Jorge
2017-01-01
Objective: To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat in correlation with anthropometric, clinical, and biochemical assessments. Materials and Methods: Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects, of whom 39 (38.6%) were men and 62 (61.4%) were women, with a mean age of 66.3 years (range, 60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Results: Intra-abdominal thickness was the variable for which the correlation with the areas of abdominal fat was strongest (i.e., the correlation coefficient was highest). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, with higher intraobserver and interobserver reliability than the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. Conclusion: In the assessment of abdominal fat, intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility. PMID:28670024
Ryland, S; Bishea, G; Brun-Conti, L; Eyring, M; Flanagan, B; Jergovich, T; MacDougall, D; Suzuki, E
2001-01-01
The 1990s saw the introduction of significantly new types of paint binder chemistries into the automotive finish coat market. Considering the pronounced changes in the binders that can now be found in automotive paints and their potential use in a wide variety of finishes worldwide, the Paint Subgroup of the Scientific Working Group for Materials (SWGMAT) initiated a validation study to investigate the ability of commonly accepted methods of forensic paint examination to differentiate between these newer types of paints. Nine automotive paint systems typical of original equipment applications were acquired from General Motors Corporation in 1992. They consisted of steel panels coated with typical electrocoat primers and/or primer surfacers followed by a black nonmetallic base coat and clear coat. The primary purpose of this study was to evaluate the discrimination power of common forensic techniques when applied to the newer generation original automotive finishes. The second purpose was to evaluate interlaboratory reproducibility of automotive paint spectra collected on a variety of Fourier transform infrared (FT-IR) spectrometers and accessories normally used for forensic paint examinations. The results demonstrate that infrared spectroscopy is an effective tool for discriminating between the major automotive paint manufacturers' formulation types which are currently used in original finishes. Furthermore, and equally important, the results illustrate that the mid-infrared spectra of these finishes are generally quite reproducible even when comparing data from different laboratories, commercial FT-IR instruments, and accessories in a "real world," mostly uncontrolled, environment.
Cervical vertebral maturation as a biologic indicator of skeletal maturity.
Santiago, Rodrigo César; de Miranda Costa, Luiz Felipe; Vitral, Robert Willer Farinazzo; Fraga, Marcelo Reis; Bolognese, Ana Maria; Maia, Lucianne Cople
2012-11-01
To identify and review the literature regarding the reliability of cervical vertebrae maturation (CVM) staging to predict the pubertal spurt. The selection criteria included cross-sectional and longitudinal descriptive studies in humans that evaluated qualitatively or quantitatively the accuracy and reproducibility of the CVM method on lateral cephalometric radiographs, as well as the correlation with a standard method established by hand-wrist radiographs. The searches retrieved 343 unique citations. Twenty-three studies met the inclusion criteria. Six articles had moderate to high scores, while 17 of 23 had low scores. Analysis also showed a moderate to high statistically significant correlation between CVM and hand-wrist maturation methods. There was a moderate to high reproducibility of the CVM method, and only one specific study investigated the accuracy of the CVM index in detecting peak pubertal growth. This systematic review has shown that the studies on CVM method for radiographic assessment of skeletal maturation stages suffer from serious methodological failures. Better-designed studies with adequate accuracy, reproducibility, and correlation analysis, including studies with appropriate sensitivity-specificity analysis, should be performed.
Comparison of three methods to assess individual skeletal maturity.
Pasciuti, Enzo; Franchi, Lorenzo; Baccetti, Tiziano; Milani, Silvano; Farronato, Giampietro
2013-09-01
The knowledge of facial growth and development is fundamental to determine the optimal timing for different treatment procedures in the growing patient. To analyze the reproducibility of three methods in assessing individual skeletal maturity, and to evaluate any degree of concordance among them. In all, 100 growing subjects were enrolled to test three methods: the hand-wrist (WRI), cervical vertebral maturation (CVM), and medial phalanges of the third finger (MP3) methods. Four operators determined the skeletal maturity of the subjects to evaluate the reproducibility of each method. After 30 days the operators repeated the analysis to assess the repeatability of each method. Finally, one operator examined all subjects' radiographs to detect any concordance among the three methods. The weighted kappa values for inter-operator variability were 0.94, 0.91, and 0.90 for the WRI, CVM, and MP3 methods, respectively. The weighted kappa values for intra-operator variability were 0.92, 0.91, and 0.92 for the WRI, CVM, and MP3 methods, respectively. The three methods revealed a high degree of repeatability and reproducibility. Complete agreement among the three methods was observed in 70% of the analyzed samples. The CVM method has the advantage of not necessitating an additional radiograph. The MP3 method is a simple and practical alternative as it requires only a standard dental x-ray device.
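The weighted kappa statistic reported above can be sketched in a few lines. The operator ratings here are invented, and linear disagreement weights are one common convention (the abstract does not state which weighting scheme was used):

```python
# Minimal linear-weighted Cohen's kappa of the kind used to quantify
# inter- and intra-operator agreement on maturation stages. The stage
# assignments below are hypothetical.
import numpy as np

def weighted_kappa(r1, r2, n_cat):
    """Cohen's kappa with linear disagreement weights w_ij = |i - j|."""
    obs = np.zeros((n_cat, n_cat))
    for a, b in zip(r1, r2):          # observed agreement matrix
        obs[a, b] += 1
    obs /= obs.sum()
    # Expected matrix from each rater's marginal stage frequencies.
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    w = np.abs(np.subtract.outer(np.arange(n_cat), np.arange(n_cat)))
    return 1.0 - (w * obs).sum() / (w * exp).sum()

# Hypothetical CVM stages (0-5) assigned by two operators to 10 subjects.
op1 = [0, 1, 1, 2, 3, 4, 4, 5, 2, 1]
op2 = [0, 1, 2, 2, 3, 4, 4, 5, 2, 1]
print(f"weighted kappa = {weighted_kappa(op1, op2, 6):.2f}")  # 0.94
```

Nine of ten stages agree exactly and one differs by a single stage, so the weighted kappa lands near the 0.90-0.94 range the study reports.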
Discrete Fourier transforms of nonuniformly spaced data
NASA Technical Reports Server (NTRS)
Swan, P. R.
1982-01-01
Time series or spatial series of measurements taken with nonuniform spacings have failed to yield fully to analysis using the Discrete Fourier Transform (DFT). This is due to the fact that the formal DFT is the convolution of the transform of the signal with the transform of the nonuniform spacings. Two original methods are presented for deconvolving such transforms for signals containing significant noise. The first method solves a set of linear equations relating the observed data to values defined at uniform grid points, and then obtains the desired transform as the DFT of the uniform interpolates. The second method solves a set of linear equations relating the real and imaginary components of the formal DFT directly to those of the desired transform. The results of numerical experiments with noisy data are presented in order to demonstrate the capabilities and limitations of the methods.
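The second method described above amounts to solving a linear system relating the nonuniform samples to transform values on a uniform frequency grid. A minimal least-squares sketch in NumPy on noise-free synthetic data (the grid size, sampling pattern, and test signal are illustrative choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
t = np.sort(rng.uniform(0.0, 1.0, N))   # nonuniform sample times in [0, 1)
y = np.sin(2 * np.pi * 5 * t)           # signal with a single 5-cycle component

# Design matrix: complex exponentials at uniform frequencies, evaluated
# at the actual (nonuniform) sample times.
freqs = np.arange(N // 2)
A = np.exp(2j * np.pi * freqs[None, :] * t[:, None])

# Least-squares solve for the transform values on the uniform grid.
coef, *_ = np.linalg.lstsq(A, y.astype(complex), rcond=None)
peak = freqs[np.argmax(np.abs(coef))]   # dominant frequency recovered
```

On noisy or sparsely sampled data the system can become ill-conditioned, which is why the paper probes the methods' limitations with numerical experiments.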
Category's analysis and operational project capacity method of transformation in design
NASA Astrophysics Data System (ADS)
Obednina, S. V.; Bystrova, T. Y.
2015-10-01
The method of transformation is attracting widespread interest in fields such as contemporary design. However, in design theory little attention has been paid to the categorical status of the term "transformation". This paper presents a conceptual analysis of transformation based on the theory of form employed in the influential essays of Aristotle and Thomas Aquinas. Transformation as a method of shaping in design is explored, and potential applications of the term in design are demonstrated.
Infrared micro-spectral imaging: distinction of tissue types in axillary lymph node histology
Bird, Benjamin; Miljkovic, Milos; Romeo, Melissa J; Smith, Jennifer; Stone, Nicholas; George, Michael W; Diem, Max
2008-01-01
Background: Histopathologic evaluation of surgical specimens is a well-established technique for disease identification, and has remained relatively unchanged since its clinical introduction. Although it is essential for clinical investigation, histopathologic identification of tissues remains a time-consuming and subjective technique, with unsatisfactory levels of inter- and intra-observer discrepancy. A novel approach for histological recognition is to use Fourier Transform Infrared (FT-IR) micro-spectroscopy. This non-destructive optical technique can provide a rapid measurement of sample biochemistry and identify variations that occur between healthy and diseased tissues. The advantage of this method is that it is objective and provides reproducible diagnosis, independent of fatigue, experience and inter-observer variability. Methods: We report a method for analysing excised lymph nodes that is based on spectral pathology. In spectral pathology, an unstained (fixed or snap frozen) tissue section is interrogated by a beam of infrared light that samples pixels of 25 μm × 25 μm in size. This beam is rastered over the sample, and up to 100,000 complete infrared spectra are acquired for a given tissue sample. These spectra are subsequently analysed by a diagnostic computer algorithm that is trained by correlating spectral and histopathological features. Results: We illustrate the ability of infrared micro-spectral imaging, coupled with completely unsupervised methods of multivariate statistical analysis, to accurately reproduce the histological architecture of axillary lymph nodes. By correlating spectral and histopathological features, a diagnostic algorithm was trained that allowed both accurate and rapid classification of benign and malignant tissues contained within different lymph nodes.
This approach was successfully applied to both deparaffinised and frozen tissues and indicates that both intra-operative and more conventional surgical specimens can be diagnosed by this technique. Conclusion: This paper provides strong evidence that automated diagnosis by means of infrared micro-spectral imaging is possible. Recent investigations in the authors' laboratory on lymph nodes have also revealed that cancers from different primary tumours produce distinctly different spectral signatures. Thus poorly differentiated and hard-to-determine cases of metastatic invasion, such as micrometastases, may additionally be identified by this technique. Finally, we differentiate benign and malignant tissues contained within axillary lymph nodes by completely automated methods of spectral analysis. PMID:18759967
Software Development in the Water Sciences: a view from the divide (Invited)
NASA Astrophysics Data System (ADS)
Miles, B.; Band, L. E.
2013-12-01
While training in statistical methods is an important part of many earth scientists' training, these scientists often learn the bulk of their software development skills in an ad hoc, just-in-time manner. Yet to carry out contemporary research scientists are spending more and more time developing software. Here I present perspectives - as an earth sciences graduate student with professional software engineering experience - on the challenges scientists face adopting software engineering practices, with an emphasis on areas of the science software development lifecycle that could benefit most from improved engineering. This work builds on experience gained as part of the NSF-funded Water Science Software Institute (WSSI) conceptualization award (NSF Award # 1216817). Throughout 2013, the WSSI team held a series of software scoping and development sprints with the goals of: (1) adding features to better model green infrastructure within the Regional Hydro-Ecological Simulation System (RHESSys); and (2) infusing test-driven agile software development practices into the processes employed by the RHESSys team. The goal of efforts such as the WSSI is to ensure that investments by current and future scientists in software engineering training will enable transformative science by improving both scientific reproducibility and researcher productivity. Experience with the WSSI indicates: (1) the potential for achieving this goal; and (2) while scientists are willing to adopt some software engineering practices, transformative science will require continued collaboration between domain scientists and cyberinfrastructure experts for the foreseeable future.
Moghaieb, Reda E A; Sharaf, Ahmed N; Soliman, Mohamed H; El-Arabi, Nagwa I; Momtaz, Osama A
2014-01-01
We present an efficient method for the production of transgenic salt-tolerant hexaploid wheat plants expressing the Arabidopsis AtNHX1 gene. Mature zygotic embryos were isolated from two hexaploid bread wheat (Triticum aestivum) cultivars (Gemmeiza 9 and Gemmeiza 10) and transformed with A. tumefaciens LBA4404 harboring the pBI-121 vector containing the AtNHX1 gene. Transgenic wheat lines expressing the gus intron construct were obtained and used as controls. The results confirmed that the npt-II gene was transmitted and expressed in the T2 generation following 3:1 Mendelian segregation. The data indicate that the AtNHX1 gene was integrated in a stable manner into the wheat genome and the corresponding transcripts were expressed. The transformation efficiency was 5.7 and 7.5% for cultivars Gemmeiza 10 and Gemmeiza 9, respectively. A greenhouse experiment was conducted to investigate the effect of the AtNHX1 gene on wheat salt tolerance. The transgenic wheat lines maintained a high growth rate under salt stress (350 mM NaCl) while the control plants could not. The results confirmed that the Na(+)/H(+) antiporter gene AtNHX1 increased salt tolerance by increasing Na(+) accumulation and maintaining the K(+)/Na(+) balance. Thus, the transgenic plants showed high tolerance to salt stress and can be considered a new genetic resource for breeding programs.
A new anisotropy index on trabecular bone radiographic images using the fast Fourier transform
Brunet-Imbault, Barbara; Lemineur, Gerald; Chappard, Christine; Harba, Rachid; Benhamou, Claude-Laurent
2005-01-01
Background: The degree of anisotropy (DA) on radiographs is related to bone structure; we present a new index to assess DA. Methods: In a region of interest from calcaneus radiographs, we applied a Fast Fourier Transform (FFT). All the FFT spectra involve horizontal and vertical components corresponding respectively to longitudinal and transversal trabeculae. By visual inspection, we measured the spreading angles, the Dispersion Longitudinal Index (DLI) and the Dispersion Transverse Index (DTI), and calculated DA = 180/(DLI+DTI). To test the reliability of DA assessment, we synthesized images simulating radiological projections of periodic structures with elements more or less disoriented. Results: We first tested synthetic images comprising a large variety of structures, from highly anisotropic to almost isotropic; DA ranged from 3.8 to 1.3, respectively. The analysis of the FFT spectra was performed by two observers; the Coefficients of Variation were 1.5% and 3.1% for intra- and inter-observer reproducibility, respectively. In 22 post-menopausal women with osteoporotic fractures and 44 age-matched controls, DA values were 1.87 ± 0.15 versus 1.72 ± 0.18 (p = 0.001). From the ROC analysis, the Areas Under Curve (AUC) were respectively 0.65, 0.62, 0.64, and 0.77 for lumbar spine, femoral neck, and total femoral BMD, and DA. Conclusion: The higher DA values in fracture cases suggest that the structure is more anisotropic in osteoporosis due to preferential deletion of trabeculae in some directions. PMID:15927072
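Given the two visually measured spreading angles, the index is a one-line computation: at the isotropic limit (DLI = DTI = 90°) it reaches its minimum of 1.0, and narrower spreads give larger DA. A trivial sketch with illustrative angle values:

```python
def degree_of_anisotropy(dli_deg, dti_deg):
    """DA = 180 / (DLI + DTI); narrower spectral spread -> higher anisotropy."""
    return 180.0 / (dli_deg + dti_deg)

da_isotropic = degree_of_anisotropy(90, 90)   # 1.0, fully isotropic spread
da_narrow = degree_of_anisotropy(50, 45)      # 180/95, about 1.89
```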
Application of Time-Frequency Domain Transform to Three-Dimensional Interpolation of Medical Images.
Lv, Shengqing; Chen, Yimin; Li, Zeyu; Lu, Jiahui; Gao, Mingke; Lu, Rongrong
2017-11-01
Three-dimensional (3D) interpolation of medical images is an important means of improving image quality in 3D reconstruction. In image processing, time-frequency domain transforms are efficient tools. In this article, several time-frequency domain transform methods are applied and compared for 3D interpolation, and a Sobel edge detection and 3D matching interpolation method based on the wavelet transform is proposed. The algorithm combines the wavelet transform, traditional matching interpolation, and Sobel edge detection, exploiting the characteristics of the wavelet transform and the Sobel operator to treat the sub-images of the wavelet decomposition separately. Sobel edge detection with 3D matching interpolation is applied to the low-frequency sub-images while ensuring the high frequencies remain undistorted. The target interpolated image is then obtained through wavelet reconstruction. We apply the method to 3D interpolation of real computed tomography (CT) images; compared with other interpolation methods, the proposed method is verified to be effective and superior.
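Of the pieces combined in the algorithm, the Sobel step is the most self-contained. A minimal NumPy sketch of Sobel gradients detecting a vertical step edge (the wavelet decomposition and 3D matching stages are beyond a short example):

```python
import numpy as np

# Sobel kernels for horizontal (x) and vertical (y) gradients
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
KY = KX.T

def correlate2d_valid(img, k):
    """2D correlation keeping only fully overlapping (valid) positions."""
    H, W = img.shape
    out = np.zeros((H - 2, W - 2))
    for i in range(H - 2):
        for j in range(W - 2):
            out[i, j] = (img[i:i + 3, j:j + 3] * k).sum()
    return out

img = np.zeros((5, 5))
img[:, 3:] = 1.0                        # vertical step edge
gx = correlate2d_valid(img, KX)
gy = correlate2d_valid(img, KY)
mag = np.hypot(gx, gy)                  # edge magnitude peaks at the step
```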
Shaban-Nejad, Arash; Haarslev, Volker
2015-01-01
The issue of ontology evolution and change management is inadequately addressed by available tools and algorithms, mostly due to the lack of suitable knowledge representation formalisms for dealing with temporal abstract notations and an overreliance on human factors. Moreover, most current approaches focus on changes within the internal structure of ontologies, while interactions with other existing ontologies have been widely neglected. In our research, after revealing and classifying some of the common alterations in a number of popular biomedical ontologies, we present a novel agent-based framework, Represent, Legitimate and Reproduce (RLR), to semi-automatically manage the evolution of bio-ontologies, with emphasis on the FungalWeb Ontology, with minimal human intervention. RLR assists and guides ontology engineers through the change management process in general and aids in tracking and representing the changes, particularly through the use of category theory and hierarchical graph transformation.
Impacts of curricular change: Implications from 8 years of data in introductory physics
NASA Astrophysics Data System (ADS)
Pollock, Steven J.; Finkelstein, Noah
2013-01-01
Introductory calculus-based physics classes at the University of Colorado Boulder were significantly transformed beginning in 2004. They now regularly include: interactive engagement using clickers in large lecture settings, Tutorials in Introductory Physics with use of undergraduate Learning Assistants in recitation sections, and a staffed help-room setting where students work on personalized CAPA homework. We compile and summarize conceptual (FMCE and BEMA) pre- and post-data from over 9,000 unique students after 16 semesters of both Physics 1 and 2. Within a single institution with stable pre-test scores, we reproduce results of Hake's 1998 study that demonstrate the positive impacts of interactive engagement on student performance. We link the degree of faculty's use of interactive engagement techniques and their experience levels to student outcomes, and argue for the role of such systematic data collection in sustained course and institutional transformations.
Cytoskeleton-centric protein transportation by exosomes transforms tumor-favorable macrophages.
Chen, Zhipeng; Yang, Lijuan; Cui, Yizhi; Zhou, Yanlong; Yin, Xingfeng; Guo, Jiahui; Zhang, Gong; Wang, Tong; He, Qing-Yu
2016-10-11
The exosome is a key initiator of the pre-metastatic niche in numerous cancers, where macrophages serve as primary inducers of the tumor microenvironment. However, the proteome that can be exosomally transported from cancer cells to macrophages has not been sufficiently characterized so far. Here, we used colorectal cancer (CRC) exosomes to educate tumor-favorable macrophages. With a SILAC-based mass spectrometry strategy, we successfully traced the proteome transported from CRC exosomes to macrophages. Such a proteome primarily focused on promoting cytoskeleton rearrangement, which was biologically validated with multiple cell lines. We reproduced the exosomal transportation of functional vimentin as a proof-of-concept example. In addition, we found that some CRC exosomes could be recognized by macrophages via Fc receptors. Therefore, we revealed the active and necessary role of exosomes secreted from CRC cells in transforming macrophages into a tumor-favorable state, with the cytoskeleton-centric proteins serving as the top functional unit.
General relativity as the effective theory of GL(4,R) spontaneous symmetry breaking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tomboulis, E. T.
2011-10-15
We assume a GL(4,R) space-time symmetry which is spontaneously broken to SO(3,1). We carry out the coset construction of the effective theory for the nonlinearly realized broken symmetry in terms of the Goldstone fields and matter fields transforming linearly under the unbroken Lorentz subgroup. We then identify functions of the Goldstone and matter fields that transform linearly also under the broken symmetry. Expressed in terms of these quantities the effective theory reproduces the vierbein formalism of general relativity with general coordinate invariance being automatically realized nonlinearly over GL(4,R). The coset construction makes no assumptions about any underlying theory that might be responsible for the assumed symmetry breaking. We give a brief discussion of the possibility of field theories with GL(4,R) rather than Lorentz space-time symmetry providing the underlying dynamics.
Gabbard, Carl; Lee, Jihye; Caçola, Priscila
2013-01-01
This study examined the role of visual working memory when transforming visual representations to motor representations in the context of motor imagery. Participants viewed randomized number sequences of three, four, and five digits, and then reproduced the sequence by finger tapping using motor imagery or actually executing the movements; movement duration was recorded. One group viewed the stimulus for three seconds and responded immediately, while the second group had a three-second view followed by a three-second blank screen delay before responding. As expected, delay group times were longer with each condition and digit load. Whereas correlations between imagined and executed actions (temporal congruency) were significant in a positive direction for both groups, interestingly, the delay group's values were significantly stronger. That outcome prompts speculation that delay influenced the congruency between motor representation and actual execution.
IIB supergravity and the E 6(6) covariant vector-tensor hierarchy
Ciceri, Franz; de Wit, Bernard; Varela, Oscar
2015-04-20
IIB supergravity is reformulated with a manifest local USp(8) invariance that makes the embedding of five-dimensional maximal supergravities transparent. In this formulation the ten-dimensional theory exhibits all the 27 one-form fields and 22 of the 27 two-form fields that are required by the vector-tensor hierarchy of the five-dimensional theory. The missing 5 two-form fields must transform in the same representation as a descendant of the ten-dimensional ‘dual graviton’. The invariant E 6(6) symmetric tensor that appears in the vector-tensor hierarchy is reproduced. Generalized vielbeine are derived from the supersymmetry transformations of the vector fields, as well as consistent expressions for the USp(8) covariant fermion fields. Implications are further discussed for the consistency of the truncation of IIB supergravity compactified on the five-sphere to maximal gauged supergravity in five space-time dimensions with an SO(6) gauge group.
Chen, Gang; Zhu, Jun-Yi; Zhang, Zhi-Ling; Zhang, Wei; Ren, Jian-Gang; Wu, Min; Hong, Zheng-Yuan; Lv, Cheng; Pang, Dai-Wen; Zhao, Yi-Fang
2015-01-12
Cell-derived microparticles (MPs) have been recently recognized as critical intercellular information conveyors. However, further understanding of their biological behavior and potential application has been hampered by the limitations of current labeling techniques. Herein, a universal donor-cell-assisted membrane biotinylation strategy was proposed for labeling MPs by skillfully utilizing the natural membrane phospholipid exchange of their donor cells. This innovative strategy conveniently led to specific, efficient, reproducible, and biocompatible quantum dot (QD) labeling of MPs, thereby reliably conferring valuable traceability on MPs. By further loading with small interference RNA, QD-labeled MPs that had inherent cell-targeting and biomolecule-conveying ability were successfully employed for combined bioimaging and tumor-targeted therapy. This study provides the first reliable and biofriendly strategy for transforming biogenic MPs into functionalized nanovectors.
Linnaeans outdoors: the transformative role of studying nature 'on the move' and outside.
Hodacs, Hanna
2011-06-01
Travelling is an activity closely associated with Carolus Linnaeus (1707-1778) and his circle of students. This article discusses the transformative role of studying nature outdoors (turning novices into naturalists) in eighteenth-century Sweden, using the little-known journeys of Carl Bäck (1760-1776), Sven Anders Hedin (1750-1821) and Johan Lindwall (1743-1796) as examples. On these journeys, through different parts of Sweden in the 1770s, the outdoors was used, simultaneously, as both a classroom and a space for exploration. The article argues that this multifunctional use of the landscape (common within the Linnaean tradition) encouraged a democratization of the consumption of scientific knowledge and also, to some degree, of its production. More generally, the study also addresses issues of how and why science and scientists travel by discussing how botanical knowledge was reproduced and extended 'on the move', and what got senior and junior students moving.
Reproducibility of a four-point clinical severity score for glabellar frown lines.
Honeck, P; Weiss, C; Sterry, W; Rzany, B
2003-08-01
Focal injections of botulinum toxin A are used successfully for the treatment of hyperkinetic facial wrinkles. Efficacy can be measured by several methods; however, so far none has been investigated for its reproducibility. Objectives: To investigate the reproducibility of a clinical 0-3 score for glabellar frown lines. In the first part of the study, a standardized photographic documentation of glabellar frown lines was produced. Based on the results of this phase, a consensus atlas of glabellar frown lines was developed and participants were trained using this atlas. In the main study, 50 standardized photographs were shown on two consecutive days to 28 dermatologists. The reproducibility of the score was investigated by conventional kappa statistics. In the main study, we found an unweighted kappa according to Fleiss of 0.62 for interobserver reproducibility. Intraobserver reproducibility showed an unweighted kappa according to Cohen of between 0.57 and 0.91 for each observer, and a weighted kappa according to Cicchetti and Allison of between 0.68 and 0.94. The clinical 0-3 score for glabellar frown lines shows good inter- and intraobserver reproducibility.
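Cohen's unweighted kappa, the intraobserver statistic quoted above, corrects raw agreement for the agreement expected by chance. A self-contained sketch on made-up 0-3 scores from two rating sessions:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Unweighted Cohen's kappa for two equal-length lists of ratings."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n   # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    # chance agreement from the marginal category frequencies
    p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2
    return (p_o - p_e) / (1 - p_e)

session1 = [0, 1, 2, 3, 0, 1]   # hypothetical glabellar scores, day 1
session2 = [0, 1, 2, 2, 0, 1]   # same observer, day 2
kappa = cohens_kappa(session1, session2)
```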
Reproducibility and Prognosis of Quantitative Features Extracted from CT Images12
Balagurunathan, Yoganand; Gu, Yuhua; Wang, Hua; Kumar, Virendra; Grove, Olya; Hawkins, Sam; Kim, Jongphil; Goldgof, Dmitry B; Hall, Lawrence O; Gatenby, Robert A; Gillies, Robert J
2014-01-01
We study the reproducibility of quantitative imaging features that are used to describe tumor shape, size, and texture from computed tomography (CT) scans of non-small cell lung cancer (NSCLC). CT images are dependent on various scanning factors. We focus on characterizing image features that are reproducible in the presence of variations due to patient factors and segmentation methods. Thirty-two NSCLC nonenhanced lung CT scans were obtained from the Reference Image Database to Evaluate Response data set. The tumors were segmented using both manual (radiologist expert) and ensemble (software-automated) methods. A set of features (219 three-dimensional and 110 two-dimensional) was computed, and quantitative image features were statistically filtered to identify a subset of reproducible and nonredundant features. The variability in the repeated experiment was measured by the test-retest concordance correlation coefficient (CCC_TreT). The natural range in the features, normalized to variance, was measured by the dynamic range (DR). In this study, there were 29 features across segmentation methods found with CCC_TreT and DR ≥ 0.9 and R²_Bet ≥ 0.95. These reproducible features were tested for predicting the radiologist prognostic score; some texture features (run-length and Laws kernels) had an area under the curve of 0.9. The representative features were tested for their prognostic capabilities using an independent NSCLC data set (59 lung adenocarcinomas), where one of the texture features, run-length gray-level nonuniformity, was statistically significant in separating the samples into survival groups (P ≤ .046). PMID:24772210
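Lin's concordance correlation coefficient behind the test-retest filter penalizes both poor correlation and systematic offset between the two scans. A short sketch on synthetic feature values:

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient (population variances)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (x.var() + y.var() + (mx - my) ** 2)

test = np.array([1.0, 2.0, 3.0, 4.0])      # feature value, scan 1
retest = np.array([1.1, 1.9, 3.2, 3.9])    # same feature, repeat scan
ccc = concordance_ccc(test, retest)        # close to 1 for a reproducible feature
```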
Joint groupwise registration and ADC estimation in the liver using a B-value weighted metric.
Sanz-Estébanez, Santiago; Rabanillo-Viloria, Iñaki; Royuela-Del-Val, Javier; Aja-Fernández, Santiago; Alberola-López, Carlos
2018-02-01
The purpose of this work is to develop a groupwise elastic multimodal registration algorithm for robust ADC estimation in the liver on multiple breath-hold diffusion weighted images. We introduce a joint formulation to simultaneously solve both the registration and the estimation problems. In order to avoid non-reliable transformations and undesirable noise amplification, we have included appropriate smoothness constraints for both problems. Our metric incorporates the ADC estimation residuals, which are inversely weighted according to the signal content in each diffusion weighted image. Results show that the joint formulation provides a statistically significant improvement in the accuracy of the ADC estimates. Reproducibility has also been measured on real data in terms of the distribution of ADC differences obtained from different b-value subsets. The proposed algorithm is able to effectively deal with both the presence of motion and the geometric distortions, increasing accuracy and reproducibility in diffusion parameter estimation.
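The signal-weighted residuals in the metric can be illustrated on a single voxel. With the monoexponential model S(b) = S0·exp(−b·ADC), a log-linear weighted least squares recovers the ADC, and weighting by signal down-weights the noisier high-b images. A sketch on noise-free synthetic data (the b-values and ADC value are illustrative, not from the paper):

```python
import numpy as np

b = np.array([0.0, 200.0, 400.0, 800.0])   # b-values in s/mm^2
true_adc = 1.2e-3                          # mm^2/s, a typical liver-range value
S = 100.0 * np.exp(-b * true_adc)          # monoexponential DW signal

# Log-linear model: ln S = ln S0 - b * ADC, with signal-proportional weights
X = np.vstack([np.ones_like(b), -b]).T
W = np.diag(S)                             # weights ~ signal content per image
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ np.log(S))
adc_hat = beta[1]                          # recovered ADC
```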
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Liyin; Wang, Zhen-guo, E-mail: wangzhenguo-wzg@163.com; Li, Qinglian
2015-09-07
Phase Doppler anemometry was applied to investigate the atomization processes of a kerosene jet injected into a Ma = 1.86 crossflow. Physical behaviors, such as breakup and coalescence, are reproduced through analysis of the spatial distribution of kerosene droplet sizes. It is concluded that the Sauter mean diameter distribution shape transforms from a “C” type into an “I” type as atomization develops. Simultaneously, the breakup of large droplets and the coalescence of small droplets can be observed throughout the whole atomization process.
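The Sauter mean diameter (D32) underlying the distribution-shape discussion is the standard ratio of total droplet volume to total surface area, D32 = Σd³/Σd². A one-function sketch:

```python
import numpy as np

def sauter_mean_diameter(diameters):
    """D32 = sum(d^3) / sum(d^2) over measured droplet diameters (same units)."""
    d = np.asarray(diameters, dtype=float)
    return (d ** 3).sum() / (d ** 2).sum()

# Larger droplets dominate D32 because of the cubic weighting
smd = sauter_mean_diameter([10.0, 20.0])   # (1000+8000)/(100+400) = 18.0
```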
NASA Astrophysics Data System (ADS)
Rab, George T.
1988-02-01
Three-dimensional human motion analysis has been used for complex kinematic description of abnormal gait in children with neuromuscular disease. Multiple skin markers estimate skeletal segment position, and a sorting and smoothing routine provides marker trajectories. The position and orientation of the moving skeleton in space are derived mathematically from the marker positions, and joint motions are calculated from the Eulerian transformation matrix between linked proximal and distal skeletal segments. Reproducibility has been excellent, and the technique has proven to be a useful adjunct to surgical planning.
New Equation for Prediction of Martensite Start Temperature in High Carbon Ferrous Alloys
NASA Astrophysics Data System (ADS)
Park, Jihye; Shim, Jae-Hyeok; Lee, Seok-Jae
2018-02-01
Since previous equations fail to predict the MS temperature of high carbon ferrous alloys, we propose the first equation for predicting the MS temperature of ferrous alloys containing > 2 wt pct C. The presence of carbides (Fe3C and Cr-rich M7C3) is thermodynamically considered to estimate the C concentration in austenite. In particular, equations individually specialized for lean and high Cr alloys reproduce experimental results very accurately. The chemical driving force for the martensitic transformation is quantitatively analyzed based on the calculation of the T0 temperature.
Transforming US Overseas Military Presence: Evidence and Options for DoD. Volume I: Main Report
2002-07-01
Conducted under contract DASW01 98 C 0067, Task BE-6-2046, for the Office of the Under Secretary of Defense for Personnel and Readiness. The task, entitled “Effects-Based Assessments of US Presence and Deployment
Development of Android apps for cognitive assessment of dementia and delirium.
Weir, Alexander J; Paterson, Craig A; Tieges, Zoe; MacLullich, Alasdair M; Parra-Rodriguez, Mario; Della Sala, Sergio; Logie, Robert H
2014-01-01
The next generation of medical technology applications for hand-held portable platforms will provide a core change in performance and sophistication, transforming the way health care professionals interact with patients. This advance is particularly apparent in the delivery of cognitive patient assessments, where smartphones and tablet computers are being used to assess complex neurological conditions to provide objective, accurate and reproducible test results. This paper reports on two such applications (apps) that have been developed to assist healthcare professionals with the detection and diagnosis of dementia and delirium.
NASA Astrophysics Data System (ADS)
ST Fleur, S.; Courboulex, F.; Bertrand, E.; Mercier De Lepinay, B. F.; Hough, S. E.; Boisson, D.; Momplaisir, R.
2017-12-01
To assess the possible impact of a future earthquake on the urban area of Port-au-Prince (Haiti), we have implemented an approach for simulating the complex ground motions produced by an earthquake. To this end, we integrate local site effects into the prediction of strong ground motions in Port-au-Prince using the complex transfer function method, which takes into account amplitude changes as well as phase changes. This technique is particularly suitable for basins where a conventional 1D numerical approach proves inadequate, as is the case in Port-au-Prince. We use the results of the Standard Spectral Ratio (SSR) approach of St Fleur et al. (2016) to estimate the amplitude of the response of each site relative to a nearby rock site. We then determine the phase difference between sites, interpreted as changes in the phase of the signal related to local site conditions, using records of the aftershocks of the 2010 earthquake. Finally, the accelerogram of the simulated earthquake is obtained via the inverse Fourier transform. The results of this study show that the strongest soil motions are expected in the neighborhoods of downtown Port-au-Prince and the adjacent hills. The simulation method was validated by comparison with recorded data: our simulated response spectra reproduce very well both the amplitude and the shape of the response spectra of recorded earthquakes. This new approach allowed us to reproduce the lengthening of the signal that can be generated by surface waves at certain stations in the city of Port-au-Prince. However, two points of vigilance must be considered: (1) a good signal-to-noise ratio is necessary to obtain a robust estimate of the site-reference phase shift (a ratio of at least 10); (2) unless the amplitude and phase changes are measured on strong motion records, this technique does not take non-linear effects into account.
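The core operation, multiplying the rock-site spectrum by a complex transfer function (amplitude and phase) and inverting back to the time domain, is compact in NumPy. In this sketch the transfer function is a made-up pure gain-plus-delay rather than a measured SSR/phase-difference result:

```python
import numpy as np

dt, n = 0.01, 256
rng = np.random.default_rng(1)
rock = rng.standard_normal(n)              # rock-site reference accelerogram

# Hypothetical complex transfer function: gain of 2 and a 0.05 s delay
freqs = np.fft.rfftfreq(n, d=dt)
H = 2.0 * np.exp(-2j * np.pi * freqs * 0.05)

# Simulated soil-site motion via the inverse Fourier transform
site = np.fft.irfft(np.fft.rfft(rock) * H, n=n)
```

Because the delay here is an integer number of samples, the result is exactly the reference record amplified and circularly shifted, which makes the sketch easy to check.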
NASA Astrophysics Data System (ADS)
Kemp, C.; Car, N. J.
2016-12-01
Geoscience Australia (GA) is a government agency that provides advice on the geology and geography of Australia. It is the custodian of many digital and physical datasets of national significance. For several years GA has been implementing an enterprise approach to provenance management. The goal is transparency and reproducibility for all of GA's information products, an objective supported at the highest levels and explicitly listed in its Science Principles. GA is currently finalising a set of enterprise tools to assist with provenance management and rolling out provenance reporting to different science areas. GA has adopted or developed: provenance storage systems; provenance collection code libraries (for use within automated systems); reporting interfaces (for manual use); and provenance representation capability within legacy catalogues. Using these tools within GA's science areas involves modelling the scenario first and then assessing whether the area has its data managed in such a way that links to data within provenance are resolvable in perpetuity. We don't just want to represent provenance (demonstrating transparency); we want to access data via provenance (allowing for reproducibility). A subtask of GA's current work is to link physical samples to information products (datasets, reports, papers) by uniquely and persistently identifying samples using International GeoSample Numbers, and then modelling automated and manual laboratory workflows and associated tasks, such as data delivery to corporate databases, using the W3C's PROV Data Model. We use PROV-DM throughout our modelling and systems. We are also moving to deliver all sample and digital dataset metadata across the agency in the Web Ontology Language (OWL) and to expose it via Linked Data methods, allowing Semantic Web querying of multiple systems so that provenance can be leveraged through a single method and query point.
Through the Science First Transformation Program, GA is undergoing a significant rethinking of its data architecture, curation and access to support its Digital Science capability, of which provenance management is one output.
Procedures for estimating confidence intervals for selected method performance parameters.
McClure, F D; Lee, J K
2001-01-01
Procedures for estimating confidence intervals (CIs) for the repeatability variance (σr²), the reproducibility variance (σR² = σL² + σr²), the laboratory component (σL²), and their corresponding standard deviations σr, σR, and σL, respectively, are presented. In addition, CIs for the ratio of the repeatability component to the reproducibility variance (σr²/σR²) and the ratio of the laboratory component to the reproducibility variance (σL²/σR²) are also presented.
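The repeatability CI above can be illustrated with a generic textbook sketch, not the authors' exact procedure: in a balanced one-way design, the pooled within-laboratory variance estimates σr², and a chi-square pivot gives its interval. The chi-square quantile here uses the Wilson-Hilferty approximation to stay within the standard library.

```python
import math
from statistics import NormalDist

def chi2_ppf(q, df):
    """Chi-square quantile via the Wilson-Hilferty approximation
    (adequate for the moderate degrees of freedom used here)."""
    z = NormalDist().inv_cdf(q)
    return df * (1.0 - 2.0 / (9.0 * df) + z * math.sqrt(2.0 / (9.0 * df))) ** 3

def repeatability_ci(data, alpha=0.05):
    """Chi-square CI for the repeatability variance sigma_r^2 in a
    balanced design: rows = laboratories, columns = replicate results.
    CI = (df*s2/chi2_ppf(1-a/2, df), df*s2/chi2_ppf(a/2, df))."""
    p, n = len(data), len(data[0])
    # pooled within-laboratory variance, p*(n-1) degrees of freedom
    s2_r = sum(
        sum((x - sum(row) / n) ** 2 for x in row) / (n - 1) for row in data
    ) / p
    df = p * (n - 1)
    lower = df * s2_r / chi2_ppf(1.0 - alpha / 2.0, df)
    upper = df * s2_r / chi2_ppf(alpha / 2.0, df)
    return s2_r, (lower, upper)
```

Taking square roots of the endpoints gives a corresponding interval for σr.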
Hong-Seng, Gan; Sayuti, Khairil Amir; Karim, Ahmad Helmy Abdul
2017-01-01
Existing knee cartilage segmentation methods have several reported technical drawbacks: graph cuts remains highly susceptible to image noise despite extended research interest; the active shape model is often constrained by the selection of training data; and shortest-path approaches demonstrate a shortcut problem in the presence of weak boundaries, which are common in medical images. The aim of this study was to investigate the capability of random walks as a knee cartilage segmentation method. Experts scribbled on knee cartilage images to initialize random walks segmentation. Reproducibility of the method was then assessed against manual segmentation using the Dice Similarity Index. The evaluation covered normal and diseased cartilage sections, divided into whole-cartilage and single-cartilage categories. A total of 15 normal images and 10 osteoarthritic images were included. The results showed that the random walks method demonstrated high reproducibility in both normal cartilage (observer 1: 0.83±0.028; observer 2: 0.82±0.026) and osteoarthritic cartilage (observer 1: 0.80±0.069; observer 2: 0.83±0.029). Moreover, results from the two experts were consistent with each other, suggesting that inter-observer variation is insignificant (normal: P=0.21; diseased: P=0.15). The proposed segmentation model overcomes technical problems reported for existing semi-automated techniques and produced highly reproducible results consistent with manual segmentation.
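The Dice Similarity Index used for the reproducibility assessment is straightforward to compute; a minimal sketch for binary segmentation masks:

```python
import numpy as np

def dice_similarity(mask_a, mask_b):
    """Dice Similarity Index between two binary segmentation masks:
    DSI = 2|A ∩ B| / (|A| + |B|), ranging from 0 (no overlap) to 1."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    total = a.sum() + b.sum()
    if total == 0:
        return 1.0  # two empty masks are treated here as identical
    return 2.0 * np.logical_and(a, b).sum() / total
```

Comparing an automated mask against a manual reference mask with this index is how agreement values such as 0.83±0.028 are obtained.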
Chaplais, Elodie; Greene, David; Hood, Anita; Telfer, Scott; du Toit, Verona; Singh-Grewal, Davinder; Burns, Joshua; Rome, Keith; Schiferl, Daniel J; Hendry, Gordon J
2014-07-19
Peripheral quantitative computed tomography (pQCT) is an established technology that allows measurement of the material properties of bone. Alterations to bone architecture are associated with an increased risk of fracture. Further pQCT research is necessary to identify regions of interest that are prone to fracture in people with chronic diseases. The second metatarsal is a common site for the development of insufficiency fractures, and as such the aim of this study was to assess the reproducibility of a novel pQCT scanning protocol for the second metatarsal. Eleven embalmed cadaveric leg specimens were scanned six times: three times with and three times without repositioning. Each foot was positioned on a custom-designed acrylic foot plate to permit unimpeded scans of the region of interest. Sixty-six scans were obtained at 15% (distal) and 50% (mid-shaft) of the second metatarsal. Voxel size and scan speed were reduced to 0.40 mm and 25 mm/s, respectively. The reference line was positioned at the most distal portion of the 2nd metatarsal. Repeated measurements of six key variables related to bone properties were subjected to reproducibility testing. Data were log-transformed, and reproducibility of the scans was assessed using intraclass correlation coefficients (ICC) and coefficients of variation (CV%). Reproducibility of the measurements without repositioning was estimated as: trabecular area (ICC 0.95; CV% 2.4), trabecular density (ICC 0.98; CV% 3.0), Strength Strain Index (SSI) - distal (ICC 0.99; CV% 5.6), cortical area (ICC 1.0; CV% 1.5), cortical density (ICC 0.99; CV% 0.1), SSI - mid-shaft (ICC 1.0; CV% 2.4). Reproducibility of the measurements after repositioning was estimated as: trabecular area (ICC 0.96; CV% 2.4), trabecular density (ICC 0.98; CV% 2.8), SSI - distal (ICC 1.0; CV% 3.5), cortical area (ICC 0.99; CV% 2.4), cortical density (ICC 0.98; CV% 0.8), SSI - mid-shaft (ICC 0.99; CV% 3.2).
The scanning protocol generated excellent reproducibility for key bone properties measured at the distal and mid-shaft regions of the 2nd metatarsal. This protocol extends the capabilities of pQCT to evaluate bone quality in people who may be at an increased risk of metatarsal insufficiency fractures.
A method for calibrating pH meters using standard solutions with low electrical conductivity
NASA Astrophysics Data System (ADS)
Rodionov, A. K.
2011-07-01
A procedure for obtaining standard solutions with low electrical conductivity that reproduce pH values in both the acid and alkaline regions is proposed. Estimates of the maximum possible error in reproducing the pH values of these solutions are obtained.
Illias, Hazlee Azil; Chai, Xin Rui; Abu Bakar, Ab Halim; Mokhlis, Hazlie
2015-01-01
It is important to predict incipient faults in transformer oil accurately so that maintenance can be performed correctly, reducing the cost of maintenance and minimising error. Dissolved gas analysis (DGA) has been widely used to predict incipient faults in power transformers. However, the existing DGA methods sometimes yield inaccurate predictions because each method is only suitable for certain conditions. Many previous works have reported the use of intelligence methods to predict transformer faults, but it is believed that their accuracy can still be improved. Since the combination of artificial neural network (ANN) and particle swarm optimisation (PSO) techniques had not been used in previously reported work, this work proposes combining ANN with various PSO techniques to predict transformer incipient faults. The advantages of PSO are simplicity and easy implementation. The effectiveness of the various PSO techniques in combination with ANN is validated by comparison with the results of the actual fault diagnosis, an existing diagnosis method and ANN alone. The results of the proposed methods were also compared with previously reported work to show their improvement. It was found that the proposed ANN-Evolutionary PSO method yields a higher percentage of correct identification of transformer fault type than the existing diagnosis method and previously reported works.
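The PSO component can be illustrated with a minimal standalone sketch (with hypothetical inertia and acceleration constants, and not the paper's ANN-PSO pipeline): each particle tracks its personal best position while the swarm shares a global best.

```python
import random

def pso_minimize(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0), seed=0):
    """Minimal particle swarm optimisation of f over a box.
    Velocity update: v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)."""
    rng = random.Random(seed)
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration constants (illustrative)
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = list(pbest[g]), pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] = min(hi, max(lo, xs[i][d] + vs[i][d]))
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = list(xs[i]), fx
                if fx < gbest_f:
                    gbest, gbest_f = list(xs[i]), fx
    return gbest, gbest_f
```

In an ANN-PSO hybrid of the kind described, f would be the network's classification error over the DGA training data and each particle a candidate weight vector.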
Alépée, N; Bessou-Touya, S; Cotovio, J; de Smedt, A; de Wever, B; Faller, C; Jones, P; Le Varlet, B; Marrec-Fairley, M; Pfannenbecker, U; Tailhardat, M; van Goethem, F; McNamee, P
2013-08-01
Cosmetics Europe, The Personal Care Association, known as Colipa before 2012, conducted a program of technology transfer and assessment of Within/Between Laboratory (WLV/BLV) reproducibility of the SkinEthic™ Reconstituted Human Corneal Epithelium (HCE) as one of two human reconstructed tissue eye irritation test methods. The SkinEthic™ HCE test method involves two exposure time treatment procedures - one for short time exposure (10 min - SE) and the other for long time exposure (60 min - LE) of tissues to test substance. This paper describes pre-validation studies of the SkinEthic™ HCE test method (SE and LE protocols) as well as the Eye Peptide Reactivity Assay (EPRA). In the SE WLV study, 30 substances were evaluated. A consistent outcome with respect to viability measurement across all runs was observed with all substances showing an SD of less than 18%. In the LE WLV study, 44 out of 45 substances were consistently classified. These data demonstrated a high level of reproducibility within laboratory for both the SE and LE treatment procedures. For the LE BLV, 19 out of 20 substances were consistently classified between the three laboratories, again demonstrating a high level of reproducibility between laboratories. The results for EPRA WLV and BLV studies demonstrated that all substances analysed were categorised similarly and that the method is reproducible. The SkinEthic™ HCE test method entered into the experimental phase of a formal ECVAM validation program in 2010. Copyright © 2013. Published by Elsevier Ltd.
Chen, Chih-Hao; Hsu, Chueh-Lin; Huang, Shih-Hao; Chen, Shih-Yuan; Hung, Yi-Lin; Chen, Hsiao-Rong; Wu, Yu-Chung
2015-01-01
Although genome-wide expression analysis has become a routine tool for gaining insight into molecular mechanisms, extraction of information remains a major challenge. It has been unclear why standard statistical methods, such as the t-test and ANOVA, often lead to low levels of reproducibility, how likely applying fold-change cutoffs to enhance reproducibility is to miss key signals, and how adversely using such methods has affected data interpretations. We broadly examined expression data to investigate the reproducibility problem and discovered that molecular heterogeneity, a biological property of genetically different samples, has been improperly handled by the statistical methods. Here we give a mathematical description of the discovery and report the development of a statistical method, named HTA, for better handling molecular heterogeneity. We broadly demonstrate the improved sensitivity and specificity of HTA over the conventional methods and show that using fold-change cutoffs has lost much information. We illustrate the especial usefulness of HTA for heterogeneous diseases, by applying it to existing data sets of schizophrenia, bipolar disorder and Parkinson’s disease, and show it can abundantly and reproducibly uncover disease signatures not previously detectable. Based on 156 biological data sets, we estimate that the methodological issue has affected over 96% of expression studies and that HTA can profoundly correct 86% of the affected data interpretations. The methodological advancement can better facilitate systems understandings of biological processes, render biological inferences that are more reliable than they have hitherto been and engender translational medical applications, such as identifying diagnostic biomarkers and drug prediction, which are more robust. PMID:25793610
Liu, Bailing; Zhang, Fumin; Qu, Xinghua; Shi, Xiaojia
2016-02-18
Coordinate transformation plays an indispensable role in industrial measurement, including photogrammetry, geodesy, laser 3-D measurement and robotics. The widely applied methods of coordinate transformation are generally based on solving equations over point clouds. Despite their high accuracy, this might result in no solution due to ill-conditioned matrices. In this paper, a novel coordinate transformation method is proposed, based not on equation solving but on geometric transformation. We construct characteristic lines to represent the coordinate systems. According to the spatial geometric relations, the characteristic lines are made to coincide through a series of rotations and translations. The transformation matrix can then be obtained using matrix transformation theory. Experiments were designed to compare the proposed method with other methods. The results show that the proposed method achieves the same high accuracy, but the operation is more convenient and flexible. A multi-sensor combined measurement system is also presented to improve the position accuracy of a robot through calibration of the robot kinematic parameters. Experimental verification shows that the position accuracy of the robot manipulator is improved by 45.8% with the proposed method and robot calibration.
Methods for transforming and expression screening of filamentous fungal cells with a DNA library
Teter, Sarah; Lamsa, Michael; Cherry, Joel; Ward, Connie
2015-06-02
The present invention relates to methods for expression screening of filamentous fungal transformants, comprising: (a) isolating single colony transformants of a DNA library introduced into E. coli; (b) preparing DNA from each of the single colony E. coli transformants; (c) introducing a sample of each of the DNA preparations of step (b) into separate suspensions of protoplasts of a filamentous fungus to obtain transformants thereof, wherein each transformant contains one or more copies of an individual polynucleotide from the DNA library; (d) growing the individual filamentous fungal transformants of step (c) on selective growth medium, thereby permitting growth of the filamentous fungal transformants, while suppressing growth of untransformed filamentous fungi; and (e) measuring activity or a property of each polypeptide encoded by the individual polynucleotides. The present invention also relates to isolated polynucleotides encoding polypeptides of interest obtained by such methods, to nucleic acid constructs, expression vectors, and recombinant host cells comprising the isolated polynucleotides, and to methods of producing the polypeptides encoded by the isolated polynucleotides.
Reproducibility of ECG-gated ultrasound diameter assessment of small abdominal aortic aneurysms.
Bredahl, K; Eldrup, N; Meyer, C; Eiberg, J E; Sillesen, H
2013-03-01
No standardised ultrasound procedure to obtain reliable growth estimates for abdominal aortic aneurysms (AAA) is currently available. We investigated the feasibility and reproducibility of a novel approach controlling for a combination of vessel wall delineation and cardiac cycle variation. Prospective comparative study. Consecutive patients (N = 27) with an AAA, attending their 6-month control as part of a medical treatment trial, were scanned twice by two ultrasound operators. Then, all ultrasound recordings were transferred to a core facility and analysed by a third person. The AAA diameter was determined in four different ways: from the leading edge of adventitia on the anterior wall to either the leading edge of the adventitia (method A) or leading edge of the intima (method B) on the posterior wall, with both measurements performed in systole and diastole. Inter-operator reproducibility was ± 3 mm for all methods applied. There was no difference in outcome between methods A and B; likewise, end-diastolic measurement did not improve reproducibility in preference to peak-systolic measurement. The use of a standardised ultrasound protocol including ECG-gating and subsequent off-line reading with minute calliper placement reduces variability. This may be of use in developing protocols to better detect even small AAA growth rates during clinical trials. Copyright © 2012 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
Assessment of four midcarpal radiologic determinations.
Cho, Mickey S; Battista, Vincent; Dubin, Norman H; Pirela-Cruz, Miguel
2006-03-01
Several radiologic measurement methods have been described for determining static carpal alignment of the wrist, including the scapholunate, radiolunate, and capitolunate angles. The triangulation method is an alternative radiologic measurement that we believe is easier to use and more reproducible and reliable than the above-mentioned methods. The purpose of this study was to assess the intraobserver reproducibility and interobserver reliability of the triangulation method and of the scapholunate, radiolunate, and capitolunate angles. Twenty orthopaedic residents and staff at varying levels of training made four radiologic measurements, the scapholunate, radiolunate and capitolunate angles as well as the triangulation method, on five different lateral, digitized radiographs of the wrist and forearm in neutral radioulnar deviation. Thirty days after the initial measurements, the participants repeated the four radiologic measurements using the same radiographs. The triangulation method had the best intra- and interobserver agreement of the four methods tested; this agreement was significantly better than for the capitolunate and radiolunate angles. The scapholunate angle had the next best intraobserver reproducibility and interobserver reliability. The triangulation method has the best overall observer agreement compared with the scapholunate, radiolunate, and capitolunate angles in determining static midcarpal alignment. No comment can be made on the validity of the measurements, since there is no radiographic gold standard for determining static carpal alignment.
Learning linear transformations between counting-based and prediction-based word embeddings
Hayashi, Kohei; Kawarabayashi, Ken-ichi
2017-01-01
Despite the growing interest in prediction-based word embedding learning methods, it remains unclear as to how the vector spaces learnt by the prediction-based methods differ from that of the counting-based methods, or whether one can be transformed into the other. To study the relationship between counting-based and prediction-based embeddings, we propose a method for learning a linear transformation between two given sets of word embeddings. Our proposal contributes to the word embedding learning research in three ways: (a) we propose an efficient method to learn a linear transformation between two sets of word embeddings, (b) using the transformation learnt in (a), we empirically show that it is possible to predict distributed word embeddings for novel unseen words, and (c) empirically it is possible to linearly transform counting-based embeddings to prediction-based embeddings, for frequent words, different POS categories, and varying degrees of ambiguities. PMID:28926629
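Contribution (a) above, learning a linear transformation between two sets of word embeddings, can be sketched as an ordinary least-squares problem; this generic estimator is an assumption of the sketch, not necessarily the paper's exact method:

```python
import numpy as np

def learn_linear_map(src, tgt):
    """Least-squares estimate of a matrix W such that src @ W ≈ tgt.

    src: (n_words, d_src) source embeddings (e.g. counting-based),
    tgt: (n_words, d_tgt) target embeddings (e.g. prediction-based),
    rows aligned on the same vocabulary."""
    w, *_ = np.linalg.lstsq(src, tgt, rcond=None)
    return w
```

Once W is learnt on shared vocabulary, `unseen_src @ w` predicts target-space embeddings for novel words, mirroring contribution (b).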
Gray, Allan; Wright, Alex; Jackson, Pete; Hale, Mike; Treanor, Darren
2015-03-01
Histochemical staining of tissue is a fundamental technique in tissue diagnosis and research, but it suffers from significant variability. Efforts to address this include laboratory quality controls and quality assurance schemes, but these rely on subjective interpretation of stain quality, are laborious and have low reproducibility. We aimed (1) to develop a method for histochemical stain quantification using whole slide imaging and image analysis and (2) to demonstrate its usefulness in measuring staining variation. A method to quantify the individual stain components of histochemical stains on virtual slides was developed. It was evaluated for repeatability and reproducibility, then applied to control sections of an appendix to quantify H&E staining (H/E intensities and H:E ratio) between automated staining machines and to measure differences between six regional diagnostic laboratories. The method was validated with <0.5% variation in H:E ratio measurement when using the same scanner for a batch of slides (ie, it was repeatable), but it was not highly reproducible between scanners or over time, where variation of 7% was found. Application of the method showed that H:E ratios between three staining machines varied from 0.69 to 0.93, and H:E ratio variation over time was observed. Interlaboratory comparison demonstrated differences in H:E ratio between regional laboratories from 0.57 to 0.89. A simple method using whole slide imaging can thus be used to quantify and compare histochemical staining, and could be deployed in routine quality assurance and quality control. Work is needed on whole slide imaging devices to improve reproducibility. Published by the BMJ Publishing Group Limited.
López-Ferrer, Daniel; Hixson, Kim K.; Smallwood, Heather; Squier, Thomas C.; Petritis, Konstantinos; Smith, Richard D.
2009-01-01
A new method that uses immobilized trypsin concomitant with ultrasonic irradiation results in ultra-rapid digestion and thorough 18O labeling for quantitative protein comparisons. The reproducible and highly efficient method provided effective digestions in <1 min with a minimized amount of enzyme required compared to traditional methods. This method was demonstrated for digestion of both simple and complex protein mixtures, including bovine serum albumin, a global proteome extract from the bacteria Shewanella oneidensis, and mouse plasma, as well as 18O labeling of such complex protein mixtures, which validated the application of this method for differential proteomic measurements. This approach is simple, reproducible, cost effective, rapid, and thus well-suited for automation. PMID:19555078
Reproducibility of biomarkers in induced sputum and in serum from chronic smokers.
Zuiker, Rob G J A; Kamerling, Ingrid M C; Morelli, Nicoletta; Calderon, Cesar; Boot, J Diderik; de Kam, Marieke; Diamant, Zuzana; Burggraaf, Jacobus; Cohen, Adam F
2015-08-01
Soluble inflammatory markers obtained from non-invasive airway sampling, such as induced sputum, may be useful biomarkers for targeted pharmaceutical interventions. However, before these soluble markers can be used as potential targets, their variability and reproducibility need to be established in distinct study populations. This study aimed to assess the reproducibility of biomarkers obtained from induced sputum and serum in chronic smokers and non-smokers. Sputum and serum samples were obtained from 16 healthy non-smokers and 16 asymptomatic chronic smokers (for both groups: 8M/8F, 30-52 years, FEV1 ≥80% pred.; ≥10 pack years for the smokers) on 2 separate visits 4-10 days apart. Soluble markers in serum and sputum were analysed by ELISA. The differences between smokers and non-smokers were analysed with a t-test, and variability was assessed on log-transformed data by a mixed model ANOVA. Analysable sputum samples could be obtained from all 32 subjects. In both study populations neutrophils and macrophages were the predominant cell types. Serum Pulmonary Surfactant Associated Protein D had favourable reproducibility criteria for the reliability ratio (0.99), the intra-subject coefficient of variation (11.2%) and the Bland-Altman limits of agreement. Furthermore, chronic smokers, compared to non-smokers, had significantly higher sputum concentrations of IL-8 (1094.6 pg/mL vs 460.8 pg/mL, p = 0.006), higher serum concentrations of Pulmonary Surfactant Associated Protein D (110.9 pg/mL vs 64.7 pg/mL, p = 0.019), and lower concentrations of Serum Amyloid A (1352.4 pg/mL vs 2297.5 pg/mL, p = 0.022). Serum Pulmonary Surfactant Associated Protein D proved to be a biomarker that fulfilled the criteria for reproducibility in both study groups. Copyright © 2015 Elsevier Ltd. All rights reserved.
On the validity of cosmological Fisher matrix forecasts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolz, Laura; Kilbinger, Martin; Weller, Jochen
2012-09-01
We present a comparison of Fisher matrix forecasts for cosmological probes with Monte Carlo Markov Chain (MCMC) posterior likelihood estimation methods. We analyse the performance of future Dark Energy Task Force (DETF) stage-III and stage-IV dark-energy surveys using supernovae, baryon acoustic oscillations and weak lensing as probes. We concentrate in particular on the dark-energy equation-of-state parameters w0 and wa. For purely geometrical probes, and especially when marginalising over wa, we find considerable disagreement between the two methods, since in this case the Fisher matrix cannot reproduce the highly non-elliptical shape of the likelihood function. More quantitatively, the Fisher method underestimates the marginalised errors for purely geometrical probes by 30%-70%. For cases including structure formation, such as weak lensing, we find that the posterior probability contours from the Fisher matrix estimation are in good agreement with the MCMC contours, with the forecasted errors changing only at the 5% level. We then explore non-linear transformations resulting in physically motivated parameters and investigate whether these parameterisations exhibit Gaussian behaviour. We conclude that for the purely geometrical probes and, more generally, in cases where it is not known whether the likelihood is close to Gaussian, the Fisher matrix is not the appropriate tool to produce reliable forecasts.
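For a model with independent Gaussian errors, the Fisher matrix and the marginalised errors it forecasts can be sketched generically (this is the standard textbook construction, not the DETF survey pipeline):

```python
import numpy as np

def fisher_matrix(jacobian, sigma):
    """Fisher matrix for a model mu(theta) with independent Gaussian errors:
    F_ab = sum_i (dmu_i/dtheta_a)(dmu_i/dtheta_b) / sigma_i^2.

    jacobian: (n_data, n_params) array of derivatives dmu_i/dtheta_a,
    sigma: (n_data,) array of per-point standard deviations."""
    j = np.asarray(jacobian, dtype=float)
    w = 1.0 / np.asarray(sigma, dtype=float) ** 2
    return j.T @ (w[:, None] * j)

def marginalized_errors(fisher):
    """Forecast 1-sigma marginalised errors: sqrt of the diagonal of F^{-1}."""
    return np.sqrt(np.diag(np.linalg.inv(fisher)))
```

These Gaussian-ellipse forecasts are exactly what the paper tests against MCMC contours; the discrepancy for geometrical probes arises where the true likelihood is far from this elliptical approximation.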
Application of the θ-method to a telegraphic model of fluid flow in a dual-porosity medium
NASA Astrophysics Data System (ADS)
González-Calderón, Alfredo; Vivas-Cruz, Luis X.; Herrera-Hernández, Erik César
2018-01-01
This work focuses mainly on the study of numerical solutions, which are obtained using the θ-method, of a generalized Warren and Root model that includes a second-order wave-like equation in its formulation. The solutions approximately describe the single-phase hydraulic head in fractures by considering the finite velocity of propagation by means of a Cattaneo-like equation. The corresponding discretized model is obtained by utilizing a non-uniform grid and a non-uniform time step. A simple relationship is proposed to give the time-step distribution. Convergence is analyzed by comparing results from explicit, fully implicit, and Crank-Nicolson schemes with exact solutions: a telegraphic model of fluid flow in a single-porosity reservoir with relaxation dynamics, the Warren and Root model, and our studied model, which is solved with the inverse Laplace transform. We find that the flux and the hydraulic head have spurious oscillations that most often appear in small-time solutions but are attenuated as the solution time progresses. Furthermore, we show that the finite difference method is unable to reproduce the exact flux at time zero. Obtaining results for oilfield production times, which are in the order of months in real units, is only feasible using parallel implicit schemes. In addition, we propose simple parallel algorithms for the memory flux and for the explicit scheme.
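The θ-method family compared above (explicit, fully implicit, Crank-Nicolson) can be illustrated on the simplest relaxation ODE du/dt = -λu, rather than the dual-porosity model itself; this toy problem is an assumption of the sketch:

```python
def theta_step(u, lam, dt, theta):
    """One step of the theta-method for du/dt = -lam*u:
    (u_new - u)/dt = -lam*((1 - theta)*u + theta*u_new).
    theta = 0 is explicit Euler, 1 fully implicit, 0.5 Crank-Nicolson."""
    return u * (1.0 - (1.0 - theta) * lam * dt) / (1.0 + theta * lam * dt)

def solve_decay(u0, lam, dt, n_steps, theta):
    """March n_steps of the theta-method from initial value u0."""
    u = u0
    for _ in range(n_steps):
        u = theta_step(u, lam, dt, theta)
    return u
```

The stability contrast the paper exploits is visible even here: the explicit scheme requires λ·dt < 2, while the fully implicit scheme damps the solution for any time step, which is why only implicit schemes are feasible at oilfield production times.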
Image encryption with chaotic map and Arnold transform in the gyrator transform domains
NASA Astrophysics Data System (ADS)
Sang, Jun; Luo, Hongling; Zhao, Jun; Alam, Mohammad S.; Cai, Bin
2017-05-01
An image encryption method combining a chaotic map and the Arnold transform in the gyrator transform domains was proposed. Firstly, the original secret image is XOR-ed with a random binary sequence generated by a logistic map. Then, the gyrator transform is performed. Finally, the amplitude and phase of the gyrator transform are permuted by the Arnold transform. The decryption procedure is the inverse of encryption. The secret keys used in the proposed method include the control parameter and the initial value of the logistic map, the rotation angle of the gyrator transform, and the transform number of the Arnold transform. Therefore, the key space is large, while the key data volume is small. Numerical simulation was conducted to demonstrate the effectiveness of the proposed method, and security was analysed in terms of the histogram of the encrypted image, sensitivity to the secret keys, decryption upon ciphertext loss, and resistance to the chosen-plaintext attack.
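The logistic-map XOR stage can be sketched on raw bytes; the byte-extraction rule below is an illustrative assumption (the paper does not specify it here), and chaotic stream ciphers of this kind are not cryptographically strong on their own:

```python
def logistic_keystream(n, r=3.99, x0=0.6):
    """Generate n pseudo-random bytes from the logistic map
    x_{k+1} = r * x_k * (1 - x_k); r and x0 act as the secret key."""
    x = x0
    out = bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)  # illustrative byte extraction
    return bytes(out)

def xor_with_keystream(data, r=3.99, x0=0.6):
    """XOR data with the logistic-map keystream; applying the function
    twice with the same key recovers the original data."""
    ks = logistic_keystream(len(data), r, x0)
    return bytes(a ^ b for a, b in zip(data, ks))
```

Because XOR is involutive, the same function serves for the decryption direction, matching the statement that decryption is the inverse of encryption.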
Comparison and evaluation on image fusion methods for GaoFen-1 imagery
NASA Astrophysics Data System (ADS)
Zhang, Ningyu; Zhao, Junqing; Zhang, Ling
2016-10-01
Currently, many research works focus on finding the best fusion method for satellite images from SPOT, QuickBird, Landsat and similar platforms, but only a few discuss GaoFen-1 imagery. This paper compares four fusion methods, the principal component analysis (PCA) transform, the Brovey transform, the hue-saturation-value (HSV) transform, and the Gram-Schmidt transform, from the perspective of preserving the spectral information of the original image. The experimental results showed that images fused by all four methods retain the high spatial resolution of the panchromatic band while keeping abundant spectral information. Through comparison and evaluation, the Brovey transform integrates information well, but its color fidelity is not the best. Brightness and color distortion are largest in the HSV-transformed image. The PCA transform performs well in color fidelity, but its clarity still needs improvement. The Gram-Schmidt transform works best in color fidelity; vegetation edges are most distinct, and the sharpness of its fused image is higher than that of PCA. Among the four, the Gram-Schmidt transform is the most appropriate for GaoFen-1 imagery across vegetation and non-vegetation areas. In brief, different fusion methods have different advantages in image quality and class extraction, and should be chosen according to the actual application and the image fusion algorithm.
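Of the four methods compared, the Brovey transform has the simplest closed form and can be sketched directly; this is the generic formula, not tuned to GaoFen-1 band responses:

```python
import numpy as np

def brovey_fusion(ms, pan, eps=1e-12):
    """Brovey transform pan-sharpening: each multispectral band is scaled
    by the ratio of the panchromatic band to the sum of the multispectral
    bands, i.e. fused_b = ms_b * pan / sum_b(ms_b).

    ms: (bands, H, W) multispectral array resampled to the pan grid,
    pan: (H, W) panchromatic array."""
    ms = np.asarray(ms, dtype=float)
    pan = np.asarray(pan, dtype=float)
    total = ms.sum(axis=0) + eps  # guard against division by zero
    return ms * (pan / total)[None, :, :]
```

The per-pixel ratio injects the pan band's spatial detail into every band, which is why Brovey fusion integrates information well but can distort color when the pan response differs from the band sum.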
Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi
2011-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, A2704-12. During the plant transformation, DNA fragments derived from the pUC19 plasmid were integrated into A2704-12, and the region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared in both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, so we instead constructed a plasmid using pBR322. The conversion factor (Cf), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined Cf values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDR), and the determined bias and RSDR values for the method were each less than 20%. These results suggest that the developed method is suitable for practical detection and quantification of A2704-12.
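The abstract does not spell out how Cf enters the calculation; a common formulation in event-specific GMO quantification, shown here as an assumption with hypothetical function names, divides the event-to-endogenous copy-number ratio by the conversion factor:

```python
def gmo_percent(event_copies, endogenous_copies, cf=0.98):
    """Estimate GM content (%) from real-time PCR copy numbers.
    cf is the experimentally determined conversion factor
    (0.98 for A2704-12 on both instruments in the study).
    Formula assumed: GM% = (event / endogenous) / cf * 100."""
    ratio = event_copies / endogenous_copies
    return ratio / cf * 100.0

# a sample whose event/endogenous copy ratio equals cf reads as 100% GM
assert abs(gmo_percent(98.0, 100.0) - 100.0) < 1e-9
```

Under this convention, Cf is the copy-number ratio measured in a 100% GM reference, so dividing by it converts an observed ratio into a percentage of GM material.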
After two previous investigations demonstrated that the Baffled Flask Test (BFT) was an effective and reproducible method for screening the effectiveness of dispersant products in the laboratory, the USEPA decided that before the new protocol could be considered for replacement of...
Does Melody Assist in the Reproduction of Novel Rhythm Patterns?
ERIC Educational Resources Information Center
Kinney, Daryl W.; Forsythe, Jere L.
2013-01-01
We examined music education majors' ability to reproduce rhythmic stimuli presented in melody and rhythm only conditions. Participants reproduced rhythms of two-measure music examples by immediately echo-performing through a method of their choosing (e.g., clapping, tapping, vocalizing). Forty examples were presented in melody and rhythm only…
In vivo studies provide reference data to evaluate alternative methods for predicting toxicity. However, the reproducibility and variance of effects observed across multiple in vivo studies is not well understood. The US EPA’s Toxicity Reference Database (ToxRefDB) stores d...
Efficacy Evaluation of Different Wavelet Feature Extraction Methods on Brain MRI Tumor Detection
NASA Astrophysics Data System (ADS)
Nabizadeh, Nooshin; John, Nigel; Kubat, Miroslav
2014-03-01
Automated Magnetic Resonance Imaging (MRI) brain tumor detection and segmentation is a challenging task. Among the available methods, feature-based methods are dominant. While many feature extraction techniques have been employed, it is still not clear which of them should be preferred. To help improve the situation, we present the results of a study evaluating the efficiency of different wavelet transform feature extraction methods in brain MRI abnormality detection. On T1-weighted brain images, the Discrete Wavelet Transform (DWT), Discrete Wavelet Packet Transform (DWPT), Dual-Tree Complex Wavelet Transform (DTCWT), and Complex Morlet Wavelet Transform (CMWT) methods are applied to construct the feature pool. Three classifiers, a Support Vector Machine (SVM), K-Nearest Neighbor, and a Sparse Representation-Based Classifier, are applied and compared for classifying the selected features. The results show that DTCWT and CMWT features classified with SVM yield the highest classification accuracy, demonstrating that wavelet transform features are informative in this application.
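The general pattern of wavelet feature extraction can be sketched with the simplest case, a single-level (unnormalized) 2-D Haar transform whose subband energies form a feature vector. This is only an illustration of the pipeline; the study itself uses DWT, DWPT, DTCWT and CMWT with richer feature sets:

```python
import numpy as np

def haar_dwt2(img):
    """One level of an (unnormalized) 2-D Haar wavelet transform,
    returning the approximation (LL) and detail (LH, HL, HH) subbands.
    img must have even height and width."""
    a = img[0::2, :] + img[1::2, :]      # row-pair sums
    d = img[0::2, :] - img[1::2, :]      # row-pair differences
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, LH, HL, HH

def wavelet_features(img):
    """Feature vector of mean subband energies, a common way to
    summarize wavelet coefficients before feeding a classifier."""
    return np.array([np.mean(b ** 2) for b in haar_dwt2(img)])

img = np.random.rand(64, 64)
feats = wavelet_features(img)
assert feats.shape == (4,)
```

In practice each image yields one such vector (often over several decomposition levels), and the vectors for all images are stacked into the feature pool that the SVM or other classifier is trained on.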
1989-08-01
Random variables for the conditional exponential distribution are generated using the inverse transform method: (1) generate U ~ U(0,1); (2) set s = -λ ln U. Random variables from the conditional Weibull distribution are likewise generated using the inverse transform method, and further variables are generated using a standard normal transformation together with the inverse transform method (Appendix: distributions supported by the model).
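The inverse transform method referred to above can be illustrated for the (unconditional) exponential and Weibull distributions, where the CDF inverts in closed form:

```python
import math
import random

def exponential_it(lam, rng=random):
    """Inverse transform sampling for Exp(rate=lam):
    F(x) = 1 - exp(-lam*x)  =>  x = -ln(U) / lam."""
    u = rng.random()
    return -math.log(u) / lam

def weibull_it(shape_k, scale_eta, rng=random):
    """Inverse transform sampling for Weibull(k, eta):
    F(x) = 1 - exp(-(x/eta)^k)  =>  x = eta * (-ln(U))**(1/k)."""
    u = rng.random()
    return scale_eta * (-math.log(u)) ** (1.0 / shape_k)

random.seed(0)
xs = [exponential_it(2.0) for _ in range(50000)]
# the sample mean should approach 1/lam = 0.5
assert abs(sum(xs) / len(xs) - 0.5) < 0.02
```

(Since 1 - U is also uniform on (0,1), solving F(x) = U or F(x) = 1 - U gives the same distribution; the code uses the shorter -ln(U) form.)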
NASA Astrophysics Data System (ADS)
Schäfer, M.; Groos, L.; Forbriger, T.; Bohlen, T.
2014-09-01
Full-waveform inversion (FWI) of shallow-seismic surface waves is able to reconstruct lateral variations of subsurface elastic properties. A line-source simulation of point-source data is required when applying algorithms of 2-D adjoint FWI to recorded shallow-seismic field data. The equivalent line-source response for point-source data can be obtained by convolving the waveforms with t^(-1/2) (t: traveltime), which produces a phase shift of π/4. Subsequently an amplitude correction must be applied. In this work we recommend scaling the seismograms with √(2 r v_ph) at small receiver offsets r, where v_ph is the phase velocity, and gradually shifting to applying a t^(-1/2) time-domain taper and scaling the waveforms with r√2 at larger receiver offsets. We call this the hybrid transformation, which is adapted for direct body and Rayleigh waves, and demonstrate its outstanding performance on a 2-D heterogeneous structure. The fit of the phases as well as the amplitudes for all shot locations and components (vertical and radial) is excellent with respect to the reference line-source data. An approach for 1-D media based on the Fourier-Bessel integral transformation generates strong artefacts for waves produced by 2-D structures. The theoretical background for both approaches is presented in a companion contribution. In the current contribution we study their performance when applied to waves propagating in a significantly 2-D-heterogeneous structure. We calculate synthetic seismograms for the 2-D structure for line sources as well as point sources. Line-source simulations obtained from the point-source seismograms through the different approaches are then compared to the corresponding line-source reference waveforms. Although derived by approximation, the hybrid transformation performs excellently except for explicitly back-scattered waves.
In reconstruction tests we further invert point-source synthetic seismograms with 2-D FWI for subsurface structure and evaluate its ability to reproduce the original structural model in comparison to the inversion of line-source synthetic data. Even when no explicit correction is applied to the point-source waveforms prior to inversion, only moderate artefacts appear in the results. However, the overall performance, in terms of model reproduction and the ability to reproduce the original data in a 3-D simulation, is best when the inverted waveforms are obtained by the hybrid transformation.
A Review of Transformer Aging and Control Strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gourisetti, Sri Nikhil Gup; Kirkham, Harold; Sivaraman, Deepak
Transformer aging is an important challenge in power systems. Distribution transformers themselves are minimally controllable, but smart meters provide excellent new insights into electrical loads, insights that can be used to understand and mitigate transformer aging. The nature of transformer loads is changing with the integration of distributed energy resources (DERs) and electric vehicles (EVs). This paper first reviews factors that influence the aging of distribution transformers, including root causes of transformer failure. Existing and proposed load control methods are then discussed. A distribution model is introduced to help evaluate potential control methods.
Effects of Simple Leaching of Crushed and Powdered Materials on High-precision Pb Isotope Analyses
NASA Astrophysics Data System (ADS)
Todd, E.; Stracke, A.
2013-12-01
We present new results of simple leaching experiments on the Pb isotope composition of USGS standard reference material powders and on ocean island basalt whole-rock splits and powders. Rock samples were leached with 6N HCl in two steps, first hot and then in an ultrasonic bath, and washed with ultrapure H2O before conventional sample digestion and chromatographic purification of Pb. Pb isotope compositions were determined by Tl-doped MC-ICP-MS. Intra- and inter-session analytical reproducibility of repeated analyses of both synthetic Pb solutions and Pb from single digests of chemically processed natural samples was generally < 100 ppm (2 S.D.). The comparison of leached and unleached samples shows that leaching reliably removes variable amounts of different contaminants for different starting materials. For repeated digests of a single sample, the leached samples reproduce better than the unleached ones, showing that leaching effectively removes heterogeneously distributed extraneous Pb. However, the reproducibility of repeated digests of variably contaminated natural samples is up to an order of magnitude worse than the analytical reproducibility of ca. 100 ppm. More complex leaching methods (e.g., Nobre Silva et al., 2009) yield Pb isotope ratios within error of and with similar reproducibility to our method, showing that the simple leaching method is reliable. The remaining Pb isotope heterogeneity of natural samples, which typically exceeds 100 ppm, is thus attributed to inherent isotopic sample heterogeneity. Tl-doped MC-ICP-MS Pb ratio determination is therefore a sufficiently precise method for Pb isotope analyses in natural rocks. More precise Pb double- or triple-spike methods (e.g., Galer, 1999; Thirlwall, 2000) may exploit their full potential only in cases where natural isotopic sample heterogeneity is demonstrably negligible. References: Galer, S., 1999, Chem. Geol. 157, 255-274. Nobre Silva, et al.
2009, Geochemistry Geophysics Geosystems 10, Q08012. Thirlwall, M.F., 2000, Chem. Geol. 163, 299-322.
Juárez, M; Polvillo, O; Contò, M; Ficco, A; Ballico, S; Failla, S
2008-05-09
Four different extraction-derivatization methods commonly used for fatty acid analysis in meat (the in situ or one-step method, the saponification method, the classic method, and a combination of classic extraction with saponification derivatization) were tested. The in situ method had low recovery and variation. The saponification method showed the best balance between recovery, precision, repeatability and reproducibility. The classic method had high recovery and acceptable variation values, except for the polyunsaturated fatty acids, which showed higher variation than with the former methods. The combination of extraction and methylation steps had high recovery values, but the precision, repeatability and reproducibility were not acceptable. Therefore, the saponification method would be more convenient for polyunsaturated fatty acid analysis, whereas the in situ method would be an alternative for fast analysis. However, the classic method would be the method of choice for the determination of the different lipid classes.
A simple and reliable multi-gene transformation method for switchgrass.
Ogawa, Yoichi; Shirakawa, Makoto; Koumoto, Yasuko; Honda, Masaho; Asami, Yuki; Kondo, Yasuhiro; Hara-Nishimura, Ikuko
2014-07-01
A simple and reliable Agrobacterium-mediated transformation method was developed for switchgrass. Using this method, many transgenic plants carrying multiple genes of interest could be produced without untransformed escapes. Switchgrass (Panicum virgatum L.) is a promising biomass crop for bioenergy. To obtain transgenic switchgrass plants carrying a multi-gene trait in a simple manner, an Agrobacterium-mediated transformation method was established by constructing a Gateway-based binary vector, optimizing transformation conditions and developing a novel selection method. A MultiRound Gateway-compatible destination binary vector carrying the bar selectable marker gene, pHKGB110, was constructed to introduce multiple genes of interest in a single transformation. Two reporter gene expression cassettes, GUSPlus and gfp, were constructed independently on two entry vectors and then introduced into a single T-DNA region of pHKGB110 via sequential LR reactions. Agrobacterium tumefaciens EHA101 carrying the resultant binary vector pHKGB112 and caryopsis-derived compact embryogenic calli were used for transformation experiments. Prolonged cocultivation for 7 days followed by cultivation on media containing meropenem improved transformation efficiency without overgrowth of Agrobacterium, which was, however, not inhibited by cefotaxime or Timentin. In addition, untransformed escape shoots were completely eliminated during the rooting stage by directly dipping the putatively transformed shoots into the herbicide Basta solution for a few seconds (designated the 'herbicide dipping method'). It was also demonstrated that more than 90% of the bar-positive transformants carried both reporters delivered from pHKGB112. This simple and reliable transformation method, which incorporates a new selection technique and the use of a MultiRound Gateway-based binary vector, would be suitable for producing a large number of transgenic lines carrying multiple genes.
Comparison of algorithms for computing the two-dimensional discrete Hartley transform
NASA Technical Reports Server (NTRS)
Reichenbach, Stephen E.; Burton, John C.; Miller, Keith W.
1989-01-01
Three methods have been described for computing the two-dimensional discrete Hartley transform. Two of these employ a separable transform; the third, the vector-radix algorithm, does not require separability. In-place computation of the vector-radix method is described. Operation counts and execution times indicate that the vector-radix method is fastest.
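For reference, the 1-D discrete Hartley transform uses the cas kernel, cas(t) = cos(t) + sin(t), and a row-column (separable) pass can be built from it. Note that this "cas-cas" separable transform differs from the true 2-D DHT by a correction term, which is one motivation for non-separable vector-radix algorithms; the sketch below shows only the separable building block:

```python
import numpy as np

def dht(x):
    """1-D discrete Hartley transform:
    H[k] = sum_n x[n] * cas(2*pi*n*k/N), cas(t) = cos(t) + sin(t)."""
    N = len(x)
    n = np.arange(N)
    arg = 2.0 * np.pi * np.outer(n, n) / N
    cas = np.cos(arg) + np.sin(arg)
    return cas @ x

def dht2_separable(img):
    """Row-column pass: 1-D DHT over every row, then every column.
    This separable 'cas-cas' transform is the building block of the
    two separable 2-D DHT methods (plus a correction term)."""
    rows = np.apply_along_axis(dht, 1, img)
    return np.apply_along_axis(dht, 0, rows)

x = np.random.rand(8)
# sanity check: the DHT equals Re(FFT) - Im(FFT) of the same sequence
F = np.fft.fft(x)
assert np.allclose(dht(x), F.real - F.imag)
```

The FFT identity in the check also explains why DHT algorithm costs track FFT costs: the transform carries the same information as the DFT of a real sequence, packed into a real-valued output.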
Feasibility of digital image colorimetry--application for water calcium hardness determination.
Lopez-Molinero, Angel; Tejedor Cubero, Valle; Domingo Irigoyen, Rosa; Sipiera Piazuelo, Daniel
2013-01-15
Interpretation and relevance of basic RGB colors in Digital Image-Based Colorimetry have been treated in this paper. The studies were carried out using the chromogenic model formed by the reaction between Ca(II) ions and glyoxal bis(2-hydroxyanil), which produces orange-red colored solutions in alkaline media. Individual basic color data (RGB) and also the total intensity of colors, I(tot), were the original variables treated by Factorial Analysis. The evaluation showed that the highest variance of the system and the highest analytical sensitivity were associated with the G color. However, after a Fourier transform study, the basic R color was recognized as an important feature in the information: it appeared as an intrinsic characteristic differentiated in terms of low frequency in the Fourier transform. The Principal Components Analysis study showed that the variance of the system could be mostly retained in the first principal component, but was dependent on all basic colors. The colored complex was also applied and validated as a Digital Image Colorimetric method for the determination of Ca(II) ions. RGB intensities were linearly correlated with Ca(II) in the range 0.2-2.0 mg L(-1). In the best conditions, using the green color, a simple and reliable method for Ca determination could be developed. Its detection limit was established (3s criterion) as 0.07 mg L(-1), and the reproducibility was below 6% for 1.0 mg L(-1) Ca. Other chromatic parameters were evaluated as dependent calibration variables; their representativeness, variance and sensitivity were discussed in order to select the best analytical variable. The potential of the procedure as a ready-to-use field method that can be applied 'in situ' with minimal experimental needs was demonstrated. Applications to the analysis of Ca in different real water samples were carried out.
Tap water from the city network, bottled mineral water, and natural river water were analyzed, and the results were compared and evaluated statistically. The validity was assessed against the alternative techniques of flame atomic absorption spectroscopy and titrimetry. Differences were observed, but they were consistent with the methods applied. Copyright © 2012 Elsevier B.V. All rights reserved.
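The calibration step of such a digital-image colorimetric method is a linear fit of channel intensity against concentration, inverted to read unknowns. The standards and intensity values below are hypothetical illustrations, not the paper's data; only the 0.2-2.0 mg/L range is taken from the study:

```python
import numpy as np

def fit_calibration(conc, green):
    """Least-squares line green = a*conc + b over the calibration
    range (0.2-2.0 mg/L Ca in the study); returns (slope, intercept)."""
    a, b = np.polyfit(conc, green, 1)
    return a, b

def predict_conc(green_value, a, b):
    """Invert the calibration line to estimate Ca(II) concentration."""
    return (green_value - b) / a

conc = np.array([0.2, 0.5, 1.0, 1.5, 2.0])             # mg/L standards (hypothetical)
green = np.array([210.0, 195.0, 170.0, 145.0, 120.0])  # mean G intensity (hypothetical)
a, b = fit_calibration(conc, green)
assert abs(predict_conc(170.0, a, b) - 1.0) < 1e-6
```

In practice the `green` values would be the mean G-channel intensities of photographed standard solutions, since the Factorial Analysis above identified G as the most sensitive channel.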
Conductance switching in Ag(2)S devices fabricated by in situ sulfurization.
Morales-Masis, M; van der Molen, S J; Fu, W T; Hesselberth, M B; van Ruitenbeek, J M
2009-03-04
We report a simple and reproducible method to fabricate switchable Ag(2)S devices. The alpha-Ag(2)S thin films are produced by a sulfurization process after silver deposition on a Si substrate. The structure and composition of the Ag(2)S are characterized using XRD and RBS. Our samples show semiconductor behaviour at low bias voltages, whereas they exhibit reproducible bipolar resistance switching at higher bias voltages. The transition between the two types of behaviour is observed as hysteresis in the I-V curves, indicating decomposition of the Ag(2)S, which increases the Ag(+) ion mobility. The as-fabricated Ag(2)S samples are good candidates for future solid-state memory devices, as they show reproducible memory-resistive properties and are fabricated by an accessible and reliable method.
Formation of Polar Stratospheric Clouds in the Atmosphere
NASA Astrophysics Data System (ADS)
Aloyan, Artash; Yermakov, Alex; Arutyunyan, Vardan; Larin, Igor
2014-05-01
A new mathematical model of the global transport of gaseous species and aerosols in the atmosphere and the formation of polar stratospheric clouds (PSCs) in both hemispheres was constructed. PSCs play a significant role in ozone chemistry since heterogeneous reactions proceed on their particle surfaces and in the bulk, affecting the gas composition of the atmosphere, specifically, the content of chlorine and nitrogen compounds, which are actively involved in the destruction of ozone. Stratospheric clouds are generated by co-condensation of water vapor and nitric acid on sulfate particles and in some cases during the freezing of supercooled water as well as when nitric acid vapors are dissolved in sulfate aerosol particles [1]. These clouds differ in their chemical composition and microphysics [2]. In this study, we propose new kinetic equations describing the variability of species in the gas and condensed phases to simulate the formation of PSCs. Most models for the formation of PSCs use constant background values of sulfate aerosols in the lower stratosphere. This approach is too simplistic since sulfate aerosols in the stratosphere are characterized by considerably nonuniform spatial and temporal variations. Two PSC types are considered: Type 1 refers to the formation of nitric acid trihydrate (NAT) and Type 2 refers to the formation of particles composed of different proportions of H2SO4/HNO3/H2O. Their formation is coupled with the spatial problem of sulfate aerosol generation in the upper troposphere and lower stratosphere incorporating the chemical and kinetic transformation processes (photochemistry, nucleation, condensation/evaporation, and coagulation) and using a non-equilibrium particle-size distribution [3]. In this formulation, the system of equations is closed and allows an adequate description of the PSC dynamics in the stratosphere. 
Using the model developed, numerical experiments were performed to reproduce the spatial and temporal variability of polar clouds in both hemispheres for the winter period. The numerical experiments were performed in the following sequence. In the first stage, we address the transport of multicomponent gaseous species, the formation of sulfate aerosols in the troposphere and lower stratosphere (spherical atmosphere), the chemical and kinetic transformations, and the biogenic and anthropogenic emissions of related chemical components [3]. This model makes it possible to reproduce the distribution of sulfate particles in the size range from 3 nm to 1 μm. Next, the base model was improved with a new module, based on the methods of thermodynamics, describing the dynamics of phase transitions between the gaseous and condensed phases that are typical for the different types of PSCs. Conclusions: • The model developed allows us to reproduce the size distribution of sulfate particles generated from precursor gases in the troposphere and stratosphere; • The numerical experiments show that the model adequately reproduces the spatial characteristics of PSC formation in the atmosphere. References: 1. Carslaw K.S., Peter T., Clegg S.L. Modeling the composition of liquid stratospheric clouds. Rev. Geophys. 35, 125, 1997. 2. Drdla, K., Schoeberl, M.R., and Browell, E.V. Microphysical modeling of the 1999-2000 Arctic winter. J. Geophys. Res., 2003, vol. 108, no. D5, p. 8312. 3. Aloyan, A.E., Yermakov, A.N., Arutyunyan, V.O. Sulfate aerosol formation in the troposphere and lower stratosphere, in Possibilities of Climate Stabilization by Using Novel Technologies, Moscow: Rosgidromet, 2012, pp. 75-98.
Reproducibility in Computational Neuroscience Models and Simulations
McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.
2016-01-01
Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845
NASA Astrophysics Data System (ADS)
Passeri, Alessandro; Mazzuca, Stefano; Del Bene, Veronica
2014-06-01
Clinical magnetic resonance spectroscopy imaging (MRSI) is a non-invasive functional technique, whose mathematical framework falls into the category of linear inverse problems. However, its use in medical diagnostics is hampered by two main problems, both linked to the Fourier-based technique usually implemented for spectra reconstruction: poor spatial resolution and severe blurring in the spatial localization of the reconstructed spectra. Moreover, the intrinsic ill-posedness of the MRSI problem might be worsened by (i) spatially dependent distortions of the static magnetic field (B0) distribution, as well as by (ii) inhomogeneity in the power deposition distribution of the radiofrequency magnetic field (B1). Among several alternative methods, slim (Spectral Localization by IMaging) and bslim (B0 compensated slim) are reconstruction algorithms in which a priori information concerning the spectroscopic target is introduced into the reconstruction kernel. Nonetheless, the influence of the B1 field, particularly when its operating wavelength is close to the size of the human organs being studied, continues to be disregarded. starslim (STAtic and Radiofrequency-compensated slim), an evolution of the slim and bslim methods, is therefore proposed, in which the transformation kernel also includes the B1 field inhomogeneity map, thus allowing almost complete 3D modelling of the MRSI problem. Moreover, an original method for the experimental determination of the B1 field inhomogeneity map specific to the target under evaluation is also included. The compensation capabilities of the proposed method have been tested and illustrated using synthetic raw data reproducing the human brain.
A Phase-Space Approach to Collisionless Stellar Systems Using a Particle Method
NASA Astrophysics Data System (ADS)
Hozumi, Shunsuke
1997-10-01
A particle method for reproducing the phase space of collisionless stellar systems is described. The key idea originates in Liouville's theorem, which states that the distribution function (DF) at time t can be derived from tracing necessary orbits back to t = 0. To make this procedure feasible, a self-consistent field (SCF) method for solving Poisson's equation is adopted to compute the orbits of arbitrary stars. As an example, for the violent relaxation of a uniform density sphere, the phase-space evolution generated by the current method is compared to that obtained with a phase-space method for integrating the collisionless Boltzmann equation, on the assumption of spherical symmetry. Excellent agreement is found between the two methods if an optimal basis set for the SCF technique is chosen. Since this reproduction method requires only the functional form of initial DFs and does not require any assumptions to be made about the symmetry of the system, success in reproducing the phase-space evolution implies that there would be no need of directly solving the collisionless Boltzmann equation in order to access phase space even for systems without any special symmetries. The effects of basis sets used in SCF simulations on the reproduced phase space are also discussed.
Reproducibility in Data-Scarce Environments
NASA Astrophysics Data System (ADS)
Darch, P. T.
2016-12-01
Among the usual requirements for reproducibility are large volumes of data and computationally intensive methods. Many fields within the earth sciences, however, do not meet these requirements. Data are scarce and data-intensive methods are not well established. How can science be reproducible under these conditions? What changes, both infrastructural and cultural, are needed to advance reproducibility? This paper presents findings from a long-term social scientific case study of an emergent and data-scarce field, the deep subseafloor biosphere. This field studies interactions between microbial communities living in the seafloor and the physical environments they inhabit. Factors such as these make reproducibility seem a distant goal for this community: - The relative newness of the field: serious study began in the late 1990s; - The highly multidisciplinary nature of the field: researchers come from a range of physical and life science backgrounds; - Data scarcity: domain researchers produce much of these data in their own onshore laboratories by analyzing cores from international ocean drilling expeditions, and allocation of cores is negotiated between researchers from many fields. These factors interact in multiple ways to inhibit reproducibility: - Incentive structures emphasize producing new data and new knowledge rather than reanalysing extant data; - Only a few steps of laboratory analyses can be reproduced (such as analysis of DNA sequences, but not extraction of DNA from cores) due to scarcity of cores; - Methodological heterogeneity is a consequence of multidisciplinarity, as researchers bring different techniques from diverse fields; - Few standards for data collection or analysis are available at this early stage of the field; - While datasets from multiple biological and physical phenomena can be integrated into a single workflow, curation tends to be divergent.
Each type of dataset may be subject to disparate policies and contributed to different databases. Our study demonstrates that data scarcity can be particularly acute in emerging scientific fields, and often results from resource scarcity more generally. Reproducibility tends to be a low priority among the many other scientific challenges these fields face.
Comparing transformation methods for DNA microarray data
Thygesen, Helene H; Zwinderman, Aeilko H
2004-01-01
Background When DNA microarray data are used for gene clustering, genotype/phenotype correlation studies, or tissue classification the signal intensities are usually transformed and normalized in several steps in order to improve comparability and signal/noise ratio. These steps may include subtraction of an estimated background signal, subtracting the reference signal, smoothing (to account for nonlinear measurement effects), and more. Different authors use different approaches, and it is generally not clear to users which method they should prefer. Results We used the ratio between biological variance and measurement variance (which is an F-like statistic) as a quality measure for transformation methods, and we demonstrate a method for maximizing that variance ratio on real data. We explore a number of transformations issues, including Box-Cox transformation, baseline shift, partial subtraction of the log-reference signal and smoothing. It appears that the optimal choice of parameters for the transformation methods depends on the data. Further, the behavior of the variance ratio, under the null hypothesis of zero biological variance, appears to depend on the choice of parameters. Conclusions The use of replicates in microarray experiments is important. Adjustment for the null-hypothesis behavior of the variance ratio is critical to the selection of transformation method. PMID:15202953
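The F-like quality measure described above can be sketched directly: for replicated measurements of one gene across many samples, compare the between-sample (biological) variance to the averaged within-sample (measurement) variance. The data layout and scale of the simulated values below are assumptions for illustration:

```python
import numpy as np

def variance_ratio(data):
    """F-like quality measure for a transformation: biological
    (between-sample) variance divided by measurement (between-replicate)
    variance.  data: (samples, replicates) array for one gene, already
    transformed (e.g. log or Box-Cox)."""
    sample_means = data.mean(axis=1)
    biological = sample_means.var(ddof=1)          # between samples
    measurement = data.var(axis=1, ddof=1).mean()  # within samples
    return biological / measurement

rng = np.random.default_rng(1)
signal = rng.normal(0.0, 2.0, size=(20, 1))   # biological effect per sample
noise = rng.normal(0.0, 0.1, size=(20, 3))    # replicate measurement noise
ratio = variance_ratio(signal + noise)
assert ratio > 1.0   # biological variance dominates measurement noise
```

Maximizing this ratio over transformation parameters (Box-Cox λ, baseline shift, etc.) is the paper's criterion for choosing among transformations, which is also why replicates are essential: without them the denominator cannot be estimated.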
Verification of the ideal magnetohydrodynamic response at rational surfaces in the VMEC code
Lazerson, Samuel A.; Loizu, Joaquim; Hirshman, Steven; ...
2016-01-13
The VMEC nonlinear ideal MHD equilibrium code [S. P. Hirshman and J. C. Whitson, Phys. Fluids 26, 3553 (1983)] is compared against analytic linear ideal MHD theory in a screw-pinch-like configuration. The focus of the analysis is to verify the ideal MHD response at magnetic surfaces whose rotational transform (ι) is resonant with spectral values of the perturbed boundary harmonics. A large-aspect-ratio, circular-cross-section, zero-beta equilibrium is considered. This equilibrium possesses a rational surface with safety factor q = 2 at a normalized flux value of 0.5. A small resonant boundary perturbation is introduced, exciting a response at the resonant rational surface. The code is found to capture the plasma response as predicted by a newly developed analytic theory that ensures the existence of nested flux surfaces by allowing for a jump in rotational transform (ι = 1/q). The VMEC code satisfactorily reproduces these theoretical results without the necessity of an explicit transform discontinuity (Δι) at the rational surface. It is found that the response across the rational surface depends upon both the radial grid resolution and the local shear (dι/dΦ, where ι is the rotational transform and Φ the enclosed toroidal flux). Calculations of an implicit Δι suggest that it does not arise from numerical artifacts (attributed to radial finite differences in VMEC) or from the existence conditions for flux surfaces predicted by linear theory (minimum values of Δι). Scans of the rotational transform profile indicate that for experimentally relevant levels of transform shear the response becomes increasingly localized. Furthermore, careful examination of a large experimental tokamak equilibrium with applied resonant fields indicates that this shielding response is present, suggesting the phenomenon is not limited to this verification exercise.
Tripathi, Jaindra N; Oduor, Richard O; Tripathi, Leena
2015-01-01
Banana (Musa spp.) is an important staple food as well as cash crop in tropical and subtropical countries. Various bacterial, fungal, and viral diseases and pests such as nematodes are major constraints in its production and are currently destabilizing banana production in sub-Saharan Africa. Genetic engineering is a complementary option for incorporating useful traits in banana, bypassing the long generation time, polyploidy, and sterility of most of the cultivated varieties. A robust transformation protocol for farmer-preferred varieties is crucial for banana genomics and improvement. A robust and reproducible system for genetic transformation of banana using embryogenic cell suspensions (ECS) has been developed in this study. Two different types of explants (immature male flowers and multiple buds) were tested for their ability to develop ECS in several varieties of banana locally grown in Africa. ECS of banana varieties "Cavendish Williams" and "Gros Michel" were developed using multiple buds, whereas ECS of "Sukali Ndiizi" was developed using immature male flowers. Regeneration efficiency of ECS was about 20,000-50,000 plantlets per ml of settled cell volume (SCV) depending on variety. ECS of three different varieties were transformed through Agrobacterium-mediated transformation using the gusA reporter gene, and 20-70 independent transgenic events per ml SCV of ECS were regenerated on selective medium. The presence and integration of the gusA gene in transgenic plants was confirmed by PCR, dot blot, and Southern blot analysis, and its expression by histochemical GUS assays. The robust transformation platform was successfully used to generate hundreds of transgenic lines with disease resistance. Such a platform will facilitate the transfer of technologies to national agricultural research systems (NARS) in Africa.
Genetic transformation protocols using zygotic embryos as explants: an overview.
Tahir, Muhammad; Waraich, Ejaz A; Stasolla, Claudio
2011-01-01
Genetic transformation of plants is an innovative research tool with practical significance for the development of new and improved genotypes or cultivars. However, stable introduction of genes of interest into nuclear genomes depends on several factors, such as the choice of target tissue, the method of DNA delivery into the target tissue, and the appropriate method to select the transformed plants. Mature or immature zygotic embryos have been a popular choice as explant or target tissue for genetic transformation in both angiosperms and gymnosperms. As a result, numerous protocols, optimized for various plant species in terms of transformation methods and selection procedures for transformed plants, have emerged in the literature. This article summarizes the recent advances in plant transformation using zygotic embryos as explants.
Soybean (Glycine max) transformation using mature cotyledonary node explants.
Olhoft, Paula M; Donovan, Christopher M; Somers, David A
2006-01-01
Agrobacterium tumefaciens-mediated transformation of soybeans has been steadily improved since its development in 1988. Soybean transformation is now possible in a range of genotypes from different maturity groups using different explants as sources of regenerable cells, various selectable marker genes and selective agents, and different A. tumefaciens strains. The cotyledonary-node method has been extensively investigated and, across a number of laboratories, yields on average greater than 1% transformation efficiency (one Southern-positive, independent event per 100 cotyledonary-node explants). Continued improvements in the cotyledonary-node method, concomitant with further increases in transformation efficiency, will promote broader adoption of this already productive transformation method for use in crop improvement and functional genomics research efforts.
Use of the Box-Cox Transformation in Detecting Changepoints in Daily Precipitation Data Series
NASA Astrophysics Data System (ADS)
Wang, X. L.; Chen, H.; Wu, Y.; Pu, Q.
2009-04-01
This study integrates a Box-Cox power transformation procedure into two statistical tests for detecting changepoints in Gaussian data series, to make the changepoint detection methods applicable to non-Gaussian data series, such as daily precipitation amounts. The detection power of the transformed methods in a common-trend two-phase regression setting is assessed by Monte Carlo simulations for data of a log-normal or Gamma distribution. The results show that the transformed methods have increased power of detection in comparison with the corresponding original (untransformed) methods. The transformed data approximate a Gaussian distribution much more closely. As an example of application, the new methods are applied to a series of daily precipitation amounts recorded at a station in Canada, showing satisfactory detection power.
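The idea of transforming before testing can be sketched as follows. This is a toy illustration only: a simple two-sample t-scan stands in for the paper's changepoint tests, and the Gamma parameters and shift location are made up.

```python
import numpy as np
from scipy import stats

def changepoint_t(series):
    """Return (best split index, max |t|) from a two-sample t-test scan.
    A toy stand-in for the Gaussian changepoint tests discussed above."""
    n = len(series)
    best_k, best_t = None, 0.0
    for k in range(5, n - 5):              # keep a few points in each segment
        t, _ = stats.ttest_ind(series[:k], series[k:], equal_var=True)
        if abs(t) > best_t:
            best_k, best_t = k, abs(t)
    return best_k, best_t

rng = np.random.default_rng(1)
# skewed, precipitation-like series with a scale shift at index 100
series = np.concatenate([rng.gamma(2.0, 1.0, 100), rng.gamma(2.0, 2.0, 100)])
transformed, lam = stats.boxcox(series)    # Box-Cox requires strictly positive data
k_raw, _ = changepoint_t(series)
k_bc, _ = changepoint_t(transformed)
print(k_raw, k_bc, lam)
```

Because the Gaussian assumption of the t-test holds better after the Box-Cox step, the scan on `transformed` is the statistically cleaner one, which is the paper's motivation.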
NASA Astrophysics Data System (ADS)
Ludanov, K. I.
The author proposes a new method for transforming solar radiation energy into electric power, as an alternative to photovoltaic conversion. Positive decisions from Ukrpatent have been obtained for the method and for the installation that implements it. The method comprises two phases: concentration of solar radiation by paraboloid mirrors to obtain high-grade heat in the solar receiver, followed by conversion of that heat into electric power via the thermal cycle "high-temperature electrolytic decomposition of steam into its components (H2 and O2) + electrochemical generation through recombination of H2 and O2 into water in a low-temperature fuel cell". The new method offers roughly a twofold advantage over photovoltaic conversion.
Analysis of temperature rise for piezoelectric transformer using finite-element method.
Joo, Hyun-Woo; Lee, Chang-Hwan; Rho, Jong-Seok; Jung, Hyun-Kyo
2006-08-01
The heat problem and temperature field of a piezoelectric transformer operated at steady state are analyzed. The resonance frequency of the transformer is calculated from impedance and electrical gain analysis using a finite-element method. The mechanical displacement and electric potential of the transformer at the calculated resonance frequency are used to compute the loss distribution of the transformer. The temperature distribution is then calculated from the obtained losses using a discretized heat-transfer equation. Properties of the piezoelectric material, which depend on the temperature field, are measured in order to recalculate the losses, the temperature distribution, and the new resonance characteristics of the transformer. An iterative method is adopted to recalculate the losses and resonance frequency as the material constants change with increasing temperature. The computed temperature distributions and new resonance characteristics of the transformer at steady-state temperature are verified by comparison with experimental results.
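The loss-temperature iteration can be caricatured as a scalar fixed-point loop. Everything below is a hypothetical stand-in: the real analysis solves full FEM field equations, whereas here the loss power, its temperature coefficient, and the thermal resistance are invented scalars chosen only to show the convergence pattern.

```python
def equilibrium_temperature(t_ambient=25.0, tol=1e-6, max_iter=100):
    """Fixed-point iteration: losses rise with temperature, temperature rises
    with losses, until a steady state is reached (all constants hypothetical)."""
    T = t_ambient
    for _ in range(max_iter):
        loss = 2.0 * (1 + 0.01 * (T - t_ambient))  # hypothetical loss power (W) vs T
        T_new = t_ambient + 8.0 * loss             # hypothetical thermal resistance 8 K/W
        if abs(T_new - T) < tol:
            return T_new
        T = T_new
    return T

print(round(equilibrium_temperature(), 1))
```

The same convergence logic, iterating until material constants, losses, and temperature are mutually consistent, is what the paper performs on the full finite-element model.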
Gómez Román, Victor Raúl; Vinner, Lasse; Grevstad, Berit; Hansen, Jesper Juhl; Wegmann, Frank; Spetz, Anna-Lena; Fomsgaard, Anders
2010-12-15
The New Zealand white rabbit model (Oryctolagus cuniculus) is widely used to test whether HIV vaccine candidates elicit systemic antibody responses; however, its use in mucosal immunology has not been fully exploited due to the difficulty in collecting mucosal specimens longitudinally and reproducibly. Here we describe feasible and non-feasible methods to collect vaginal and nasal specimens from nulliparous rabbits. Non-feasible methods were those resulting in poor reproducibility and considerable animal twitching during sampling, whereas feasible methods resulted in no animal twitching and potential for sampling reproducibility. Standard operating procedures (SOPs) were implemented to collect vaginal swabs yielding total IgA titres ranging from 12,500 to 312,500. Intranasal immunisation with a naked DNA vaccine encoding HIV gp140 elicited HIV envelope-specific IgA detectable in nasal but not in vaginal secretions. Our methods provide an alternative to reliably assess pre- and post-vaccination mucosal antibody titres longitudinally in rabbits as part of mucosal HIV vaccine immunogenicity studies.
Fast heap transform-based QR-decomposition of real and complex matrices: algorithms and codes
NASA Astrophysics Data System (ADS)
Grigoryan, Artyom M.
2015-03-01
In this paper, we describe a new perspective on the application of Givens rotations to the QR-decomposition problem, similar to the method of Householder transformations. We apply the concept of the discrete heap transform, or signal-induced unitary transforms, introduced by Grigoryan (2006) and used in signal and image processing. Both real and complex nonsingular matrices are considered, and examples of performing QR-decomposition of square matrices are given. The proposed method of QR-decomposition for complex matrices is novel, differs from the known method of complex Givens rotations, and is based on analytical equations for the heap transforms. Many examples illustrating the proposed heap-transform method of QR-decomposition are given, the algorithms are described in detail, and MATLAB-based codes are included.
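The underlying Givens-rotation mechanics, though not the heap-transform organization itself, which the paper derives analytically, can be sketched for real matrices as follows:

```python
import numpy as np

def givens_qr(A):
    """QR-decomposition via classical Givens rotations (real matrices only).
    Each rotation zeroes one subdiagonal entry of R, while Q accumulates
    the transposed rotations, so that A = Q @ R with Q orthogonal."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q, R = np.eye(m), A.copy()
    for j in range(n):
        for i in range(m - 1, j, -1):        # eliminate column j from the bottom up
            a, b = R[i - 1, j], R[i, j]
            r = np.hypot(a, b)
            if r == 0.0:
                continue                     # entry already zero
            c, s = a / r, b / r
            G = np.array([[c, s], [-s, c]])  # 2x2 rotation acting on rows i-1, i
            R[[i - 1, i], :] = G @ R[[i - 1, i], :]
            Q[:, [i - 1, i]] = Q[:, [i - 1, i]] @ G.T
    return Q, R

A = np.array([[4.0, 1.0], [2.0, 3.0], [0.0, 1.0]])
Q, R = givens_qr(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(3)))  # True True
```

The complex case requires unitary (rather than orthogonal) rotations, which is where the paper's analytical heap-transform equations diverge from this classical scheme.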
Zhang, Fang; Zhu, Jing; Song, Qiang; Yue, Weirui; Liu, Jingdan; Wang, Jian; Situ, Guohai; Huang, Huijie
2015-10-20
In general, Fourier transform lenses are treated as ideal in the design algorithms of diffractive optical elements (DOEs). However, the inherent aberrations of a real Fourier transform lens disturb the far-field pattern, and the difference between the generated pattern and the intended design degrades system performance. Therefore, a method is proposed for modifying the Fourier spectrum of the DOE, without introducing other optical elements, to reduce the aberration effect of the Fourier transform lens. By applying this method, beam-shaping performance is improved markedly for an optical system with a real Fourier transform lens. Experiments carried out with a commercial Fourier transform lens provide evidence for the method, which is capable of reducing system complexity as well as improving performance.
Sweet Potato [Ipomoea batatas (L.) Lam].
Song, Guo-qing; Yamaguchi, Ken-ichi
2006-01-01
Among the available transformation methods reported for sweet potato, Agrobacterium tumefaciens-mediated transformation is the most successful and desirable. Stem explants have been shown to be ideal for the transformation of sweet potato because of their ready availability as explants, the simple transformation process, and high-frequency regeneration via somatic embryogenesis. Using the two-step kanamycin-hygromycin selection method and the appropriate explant type (stem explants), the efficiency of transformation can be considerably improved in cv. Beniazuma. The high efficiency of transformation of stem explants suggests that the protocol described in this chapter warrants testing for routine stable transformation of diverse varieties of sweet potato.
A prototype stationary Fourier transform spectrometer for near-infrared absorption spectroscopy.
Li, Jinyang; Lu, Dan-feng; Qi, Zhi-mei
2015-09-01
A prototype stationary Fourier transform spectrometer (FTS) was constructed with a fiber-coupled lithium niobate (LiNbO3) waveguide Mach-Zehnder interferometer (MZI) for the purpose of rapid on-site spectroscopy of biological and chemical measurands. The MZI contains push-pull electrodes for electro-optic modulation, and its interferogram, a plot of intensity against voltage, was obtained by scanning the modulating voltage from -60 to +60 V in 50 ms. The power spectrum of the input signal was retrieved by Fourier transform processing of the interferogram combined with the wavelength dispersion of the half-wave voltage determined for the MZI used. The prototype FTS operates in the single-mode wavelength range from 1200 to 1700 nm and allows for reproducible spectroscopy. A linear concentration dependence of the absorbance at λmax = 1451 nm for water in ethanolic solution was obtained using the prototype FTS. Near-infrared spectroscopy of solid samples was also implemented, and the different spectra obtained with different materials evidenced the chemical recognition capability of the prototype FTS. To make this prototype FTS practically applicable, work on improving its spectral resolution by increasing the maximum optical path length difference is in progress.
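The retrieval step, recovering a spectrum by Fourier-transforming an interferogram, can be sketched with synthetic data. The line positions and the optical-path-difference range below are illustrative assumptions, loosely echoing the 1451 nm water band (≈ 6892 cm⁻¹) mentioned above.

```python
import numpy as np

# hypothetical two-line source, wavenumbers in cm^-1
sigmas = np.array([6892.0, 7800.0])
amps = np.array([1.0, 0.5])

# optical path difference samples (cm); max OPD sets the spectral resolution
opd = np.linspace(0, 0.05, 4096)
interferogram = sum(a * (1 + np.cos(2 * np.pi * s * opd)) for a, s in zip(amps, sigmas))

# recover the spectrum: FFT of the mean-subtracted interferogram
spectrum = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
wavenumber = np.fft.rfftfreq(len(opd), d=opd[1] - opd[0])  # cm^-1 axis
peak = wavenumber[np.argmax(spectrum)]
print(peak)  # strongest line recovered near 6892 cm^-1
```

The closing remark of the abstract maps directly onto this sketch: increasing the maximum OPD lengthens `opd` and therefore narrows the spectral bin spacing, i.e. improves resolution.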
Restaino, Odile Francesca; Marseglia, Mariacarmela; De Castro, Cristina; Diana, Paola; Forni, Pasquale; Parrilli, Michelangelo; De Rosa, Mario; Schiraldi, Chiara
2014-02-01
Streptomyces roseochromogenes is able to hydroxylate steroid compounds at different positions of their cycloalkane rings thanks to a cytochrome P-450 multi-enzyme complex. In this paper, the hydroxylation of hydrocortisone at the 16α position, performed by bacterial whole cells, was investigated in both shake-flask and fermentation conditions; the best settings for both cellular growth and the transformation reaction were studied by investigating the optimal medium composition, the kinetics of conversion, the most suitable substrate concentration, and the preferred addition timing. Using newly formulated malt extract- and yeast extract-based media, a 16α-hydroxyhydrocortisone concentration of 0.2 ± 0.01 g L(-1) was reached in shake flasks. Batch experiments in a 2-L fermentor established the reproducibility and robustness of the biotransformation, while a pulsed batch fermentation strategy allowed the production to increase up to 0.508 ± 0.01 g L(-1). By-product formation was investigated, and two previously unknown hydrocortisone derivatives formed during the bacterial transformation reaction, a C-20 hydroxy derivative and a C-21 N-acetamide derivative, were characterized by NMR analyses.
Bedrov, Dmitry; Hooper, Justin B; Smith, Grant D; Sewell, Thomas D
2009-07-21
Molecular dynamics (MD) simulations of uniaxial shock compression along the [100] and [001] directions in the alpha polymorph of hexahydro-1,3,5-trinitro-1,3,5-triazine (alpha-RDX) have been conducted over a wide range of shock pressures using the uniaxial constant stress Hugoniostat method [Ravelo et al., Phys. Rev. B 70, 014103 (2004)]. We demonstrate that the Hugoniostat method is suitable for studying shock compression in atomic-scale models of energetic materials without the necessity to consider the extremely large simulation cells required for an explicit shock wave simulation. Specifically, direct comparison of results obtained using the Hugoniostat approach to those reported by Thompson and co-workers [Phys. Rev. B 78, 014107 (2008)] based on large-scale MD simulations of shocks using the shock front absorbing boundary condition (SFABC) approach indicates that Hugoniostat simulations of systems containing several thousand molecules reproduced the salient features observed in the SFABC simulations involving roughly a quarter-million molecules, namely, nucleation and growth of nanoscale shear bands for shocks propagating along the [100] direction and the polymorphic alpha-gamma phase transition for shocks directed along the [001] direction. The Hugoniostat simulations yielded predictions of the Hugoniot elastic limit for the [100] shock direction consistent with SFABC simulation results.
Ionescu, Robert; Campbell, Brennan; Wu, Ryan; Aytan, Ece; Patalano, Andrew; Ruiz, Isaac; Howell, Stephen W; McDonald, Anthony E; Beechem, Thomas E; Mkhoyan, K Andre; Ozkan, Mihrimah; Ozkan, Cengiz S
2017-07-25
It is of paramount importance to improve the control over large-area growth of high-quality molybdenum disulfide (MoS2) and other types of 2D dichalcogenides. Such atomically thin materials have great potential for use in electronics, and are thought to make possible the first real applications of spintronics. Herein, a facile and reproducible method of producing wafer-scale atomically thin MoS2 layers has been developed using the incorporation of a chelating agent in a common organic solvent, dimethyl sulfoxide (DMSO). Previously, solution processing of a MoS2 precursor, ammonium tetrathiomolybdate ((NH4)2MoS4), and subsequent thermolysis was used to produce large-area MoS2 layers. Our work here shows that the use of ethylenediaminetetraacetic acid (EDTA) in DMSO exerts superior control over wafer coverage and film thickness, and the results demonstrate that the chelating action and dispersing effect of EDTA is critical in growing uniform films. Raman spectroscopy, photoluminescence (PL), X-ray photoelectron spectroscopy (XPS), Fourier transform infrared spectroscopy (FTIR), atomic force microscopy (AFM), and high-resolution scanning transmission electron microscopy (HR-STEM) indicate the formation of homogeneous few-layer MoS2 films at the wafer scale, resulting from the novel chelant-in-solution method.
A Bayesian method for detecting pairwise associations in compositional data
Ventz, Steffen; Huttenhower, Curtis
2017-01-01
Compositional data consist of vectors of proportions normalized to a constant sum from a basis of unobserved counts. The sum constraint makes inference on correlations between unconstrained features challenging due to the information loss from normalization. However, such correlations are of long-standing interest in fields including ecology. We propose a novel Bayesian framework (BAnOCC: Bayesian Analysis of Compositional Covariance) to estimate a sparse precision matrix through a LASSO prior. The resulting posterior, generated by MCMC sampling, allows uncertainty quantification of any function of the precision matrix, including the correlation matrix. We also use a first-order Taylor expansion to approximate the transformation from the unobserved counts to the composition in order to investigate what characteristics of the unobserved counts can make the correlations more or less difficult to infer. On simulated datasets, we show that BAnOCC infers the true network as well as previous methods while offering the advantage of posterior inference. Larger and more realistic simulated datasets further showed that BAnOCC performs well as measured by type I and type II error rates. Finally, we apply BAnOCC to a microbial ecology dataset from the Human Microbiome Project, which in addition to reproducing established ecological results revealed unique, competition-based roles for Proteobacteria in multiple distinct habitats. PMID:29140991
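The core difficulty, spurious correlation induced by the sum constraint, is easy to reproduce. This toy example only exhibits the problem that BAnOCC addresses; it does not implement the method, and all counts below are simulated.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
# unobserved counts: features 0 and 1 are independent and stable,
# feature 2 fluctuates strongly across samples
x0 = rng.normal(100, 2, n)
x1 = rng.normal(100, 2, n)
x2 = rng.lognormal(5, 1, n)
counts = np.column_stack([x0, x1, x2])
proportions = counts / counts.sum(axis=1, keepdims=True)   # the sum constraint

true_corr = np.corrcoef(x0, x1)[0, 1]                      # ~0 by construction
naive_corr = np.corrcoef(proportions[:, 0], proportions[:, 1])[0, 1]
print(true_corr, naive_corr)  # normalization induces a strong spurious correlation
```

Features 0 and 1 appear almost perfectly correlated in the proportions because both are divided by a total dominated by the fluctuating feature 2, which is exactly the information-loss problem that motivates model-based inference on the unobserved basis.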
Diagnosis of skin cancer by correlation and complexity analyses of damaged DNA
Namazi, Hamidreza; Kulish, Vladimir V.; Delaviz, Fatemeh; Delaviz, Ali
2015-01-01
Skin cancer is a common, low-grade cancerous (malignant) growth of the skin. It starts from cells that begin as normal skin cells and transform into cells with the potential to reproduce in an out-of-control manner. Cancer develops when DNA, the molecule in cells that encodes genetic information, becomes damaged and the body cannot repair the damage. A DNA walk of a genome represents how the frequency of each nucleotide of a pairing nucleotide couple changes locally. In this research, in order to diagnose skin cancer, DNA walk plots were first generated from the genomes of patients with skin cancer. The data so obtained were then checked for complexity by computing the fractal dimension. Furthermore, the Hurst exponent was employed to study the correlation of damaged DNA. Analysis of different samples showed that damaged DNA sequences exhibit a higher degree of complexity and less correlation than normal DNA sequences. This investigation confirms that the method can be used for diagnosis of skin cancer. The method discussed in this research is useful not only for diagnosis of skin cancer but can also be applied to diagnosis and growth analysis of different types of cancer. PMID:26497203
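A DNA walk and a rescaled-range Hurst estimate can be sketched as follows. This is a generic R/S implementation, not necessarily the authors' exact procedure, and the random sequence stands in for real genome data.

```python
import numpy as np

def dna_walk(seq):
    """Cumulative purine/pyrimidine walk: +1 for A/G, -1 for C/T."""
    steps = np.array([1 if b in "AG" else -1 for b in seq])
    return np.cumsum(steps)

def hurst_rs(walk, min_chunk=8):
    """Rescaled-range (R/S) estimate of the Hurst exponent of a walk's increments."""
    x = np.diff(walk)
    n = len(x)
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:
        chunks = x[: n - n % size].reshape(-1, size)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        R = dev.max(axis=1) - dev.min(axis=1)   # range of cumulative deviations
        S = chunks.std(axis=1)                  # chunk standard deviation
        valid = S > 0
        if valid.any():
            sizes.append(size)
            rs.append((R[valid] / S[valid]).mean())
        size *= 2
    # Hurst exponent = slope of log(R/S) vs log(chunk size)
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(3)
seq = "".join(rng.choice(list("ACGT"), size=4096))
H = hurst_rs(dna_walk(seq))
print(H)  # close to 0.5 for an uncorrelated random sequence
```

In the study's terms, damaged DNA showing "less correlation" corresponds to Hurst exponents closer to the uncorrelated value than those of normal sequences.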
NASA Astrophysics Data System (ADS)
Schneider, Peter; Sluse, Dominique
2013-11-01
The light travel time differences in strong gravitational lensing systems allow an independent determination of the Hubble constant. This method has been successfully applied to several lens systems. The formally most precise measurements are, however, in tension with the recent determination of H0 from the Planck satellite for a spatially flat six-parameter ΛCDM cosmology. We reconsider the uncertainties of the method concerning the mass profile of the lens galaxies and show that the formal precision relies on the assumption that the mass profile is a perfect power law. Simple analytical arguments and numerical experiments reveal that mass-sheet-like transformations yield significant freedom in choosing the mass profile, even when exquisite Einstein rings are observed. Furthermore, the characterization of the environment of the lens does not break this degeneracy, which is not physically linked to extrinsic convergence. We present an illustrative example in which the multiple imaging properties of a composite (baryons + dark matter) lens are extremely well reproduced by a power-law model having the same velocity dispersion, but with predictions for the Hubble constant that deviate by ~20%. Hence we conclude that the impact of degeneracies between parametrized models has been underestimated in current H0 measurements from lensing and needs to be carefully reconsidered.
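The mass-sheet transformation at the heart of this degeneracy can be written compactly in standard lensing notation (κ the convergence, β the source position, λ a free parameter):

```latex
% mass-sheet transformation: image positions and flux ratios are unchanged
\kappa_\lambda(\boldsymbol{\theta}) = \lambda\,\kappa(\boldsymbol{\theta}) + (1 - \lambda),
\qquad \boldsymbol{\beta} \;\to\; \lambda\,\boldsymbol{\beta}
% predicted time delays, and hence the inferred Hubble constant, rescale:
\Delta t \;\to\; \lambda\,\Delta t
\qquad\Longrightarrow\qquad
H_0^{\mathrm{inferred}} = \lambda\, H_0
```

Since all image positions and flux ratios are invariant under this transformation, only the time delays carry the λ dependence, which is why profile assumptions feed directly into the H0 error budget.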
[Comparison of in vitro model examinations with respect to drug release from suppositories].
Regdon, G; Vágó, I; Mándi, E; Regdon, G; Erós, I
2000-04-01
Nine lipophilic suppository bases with different physical-chemical parameters were examined. Buspirone hydrochloride, an anxiolytic drug with good water solubility, was used, partly as a model pharmacon, at a concentration of 10.0 mg/2.00 g. The rate and extent of in vitro drug release were monitored with static and dynamic methods. Kidney-dialysis membranes with various surface areas were used. The quantitative measurements were carried out spectrophotometrically, and the amount of the diffused drug was determined at lambda = 298 nm. Mean values were calculated from 5 parallel measurements each time. The percentage values of in vitro relative availability revealed that the results of the two static diffusion studies did not differ significantly (p < 0.05) and were almost independent of the size of the membrane surface. The results of the dynamic diffusion method were well reproducible but vehicle-dependent. The process of release was characterized by mathematical transformation of the release curves, with correlation coefficients describing the closeness of the relation. Two German vehicles, namely Witepsol H 15 with a medium hydroxyl value and Massa Estarinum 299, and a French vehicle, Suppocire AS2X, were found to be excellent for the formulation of suppositories containing buspirone hydrochloride.
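The kind of linearizing transformation mentioned can be sketched with made-up release data. First-order release kinetics are assumed here for illustration, and the time points and percentages below are invented, not the paper's measurements.

```python
import numpy as np

# hypothetical cumulative-release data (% released vs time, min)
t = np.array([15, 30, 45, 60, 90, 120.0])
released = np.array([22, 38, 50, 60, 73, 82.0])

# first-order model: log(100 - Q) is linear in t; the log is the
# "mathematical transformation" that linearizes the release curve
y = np.log(100 - released)
slope, intercept = np.polyfit(t, y, 1)
r = np.corrcoef(t, y)[0, 1]
print(slope, r)  # r close to -1 indicates a good first-order fit
```

Comparing such correlation coefficients across vehicles is one simple way to quantify how closely each base follows the assumed release model.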