A new method to identify the foot of continental slope based on an integrated profile analysis
NASA Astrophysics Data System (ADS)
Wu, Ziyin; Li, Jiabiao; Li, Shoujun; Shang, Jihong; Jin, Xiaobin
2017-06-01
A new method is proposed to automatically identify the foot of the continental slope (FOS) based on the integrated analysis of topographic profiles. Using the extremum points of the second derivative and the Douglas-Peucker (D-P) algorithm, it simplifies the topographic profiles and then calculates the second derivative of both the original profiles and the D-P profiles. Seven steps are proposed to simplify the original profiles. In addition, multiple identification criteria are proposed to determine the FOS points, including the gradient, water depth and second-derivative values of data points, as well as the concavity and convexity, continuity and segmentation of the topographic profiles. The method can comprehensively and intelligently analyze the topographic profiles and their derived slopes, second derivatives and D-P profiles, and on this basis it can characterize the essential properties of every data point in a profile. Furthermore, it removes the concave points of the curve and implements six FOS judgment criteria.
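A minimal sketch of two of the ingredients named above, assuming a depth-versus-distance profile as input: a Douglas-Peucker simplification followed by a candidate FOS picked at the maximum second derivative of the simplified profile. The function names, tolerance and synthetic profile are illustrative, not taken from the paper.

```python
import numpy as np

def douglas_peucker(points, tol):
    """Recursively simplify a polyline (N x 2 array) with the Douglas-Peucker algorithm."""
    start, end = points[0], points[-1]
    chord = end - start
    norm = np.hypot(chord[0], chord[1])
    # Perpendicular distance of every point from the start-end chord.
    dists = np.abs(chord[0] * (points[:, 1] - start[1])
                   - chord[1] * (points[:, 0] - start[0])) / (norm if norm > 0 else 1.0)
    idx = int(np.argmax(dists))
    if dists[idx] > tol:
        left = douglas_peucker(points[:idx + 1], tol)
        right = douglas_peucker(points[idx:], tol)
        return np.vstack([left[:-1], right])
    return np.vstack([start, end])

def candidate_fos(distance_km, depth_m, tol=50.0):
    """Candidate FOS: the point of maximum second derivative (maximum change of
    gradient from slope to rise) on the simplified profile -- one criterion only."""
    profile = np.column_stack([np.asarray(distance_km) * 1000.0, depth_m])  # metres
    simplified = douglas_peucker(profile, tol)
    d2 = np.gradient(np.gradient(simplified[:, 1], simplified[:, 0]), simplified[:, 0])
    return simplified[int(np.argmax(d2))]

# Synthetic shelf-slope-rise profile (depths negative downward).
x = np.linspace(0.0, 200.0, 400)                     # km offshore
z = -200.0 - 3800.0 / (1.0 + np.exp(-(x - 80.0) / 10.0))
print(candidate_fos(x, z))
```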
USDA-ARS's Scientific Manuscript database
Analysis of DNA methylation patterns relies increasingly on sequencing-based profiling methods. The four most frequently used sequencing-based technologies are the bisulfite-based methods MethylC-seq and reduced representation bisulfite sequencing (RRBS), and the enrichment-based techniques methylat...
Use of focused ultrasonication in activity-based profiling of deubiquitinating enzymes in tissue.
Nanduri, Bindu; Shack, Leslie A; Rai, Aswathy N; Epperson, William B; Baumgartner, Wes; Schmidt, Ty B; Edelmann, Mariola J
2016-12-15
To develop a reproducible tissue lysis method that retains enzyme function for activity-based protein profiling, we compared four different methods to obtain protein extracts from bovine lung tissue: focused ultrasonication, standard sonication, the mortar-and-pestle method, and homogenization combined with standard sonication. Focused ultrasonication and the mortar-and-pestle method were sufficiently effective for activity-based profiling of deubiquitinases in tissue, and focused ultrasonication also had the fastest processing time. We used the focused ultrasonicator for subsequent activity-based proteomic analysis of deubiquitinases to test the compatibility of this method with sample preparation for activity-based chemical proteomics. Copyright © 2016 Elsevier Inc. All rights reserved.
Brodsky, Leonid; Leontovich, Andrei; Shtutman, Michael; Feinstein, Elena
2004-01-01
Mathematical methods of analysis of microarray hybridizations deal with gene expression profiles as elementary units. However, some of these profiles do not reflect a biologically relevant transcriptional response, but rather stem from technical artifacts. Here, we describe two technically independent but rationally interconnected methods for identification of such artifactual profiles. Our diagnostics are based on detection of deviations from uniformity, which is assumed to be the main underlying principle of microarray design. Method 1 is based on detection of non-uniformity of the microarray distribution of printed genes that are clustered based on the similarity of their expression profiles. Method 2 is based on evaluation of the presence of gene-specific microarray spots within slide areas characterized by an abnormal concentration of low/high differential expression values, which we define as 'patterns of differentials'. Applying two novel algorithms, for nested clustering (method 1) and for pattern detection (method 2), we can make a dual estimation of the profile's quality for almost every printed gene. Genes with artifactual profiles detected by method 1 may then be removed from further analysis. Suspicious differential expression values detected by method 2 may be either removed or weighted according to the probabilities of the patterns that cover them, thus diminishing their influence on any further data analysis. PMID:14999086
Better Than Counting: Density Profiles from Force Sampling
NASA Astrophysics Data System (ADS)
de las Heras, Daniel; Schmidt, Matthias
2018-05-01
Calculating one-body density profiles in equilibrium via particle-based simulation methods involves counting events of particle occurrences at (histogram-resolved) space points. Here, we investigate an alternative method based on a histogram of the local force density. Via an exact sum rule, the density profile is obtained with a simple spatial integration. The method circumvents the inherent ideal gas fluctuations. We have tested the method in Monte Carlo, Brownian dynamics, and molecular dynamics simulations. The results carry a statistical uncertainty smaller than that of the standard counting method, thereby reducing the computation time.
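A rough post-processing sketch of the force-sampling idea, assuming per-particle positions and total forces have already been collected from a simulation in a 1D geometry: the force density is histogrammed and spatially integrated via the equilibrium sum rule, with the constant of integration fixed by the known mean density. Function and argument names are hypothetical.

```python
import numpy as np

def density_from_forces(positions, forces, n_samples, box_length, n_bins, kT=1.0):
    """Density profile from a histogram of the local force density (1D geometry).

    positions, forces : per-particle values pooled over all sampled configurations
    n_samples         : number of configurations, so the histograms become averages
    Uses the equilibrium sum rule kT * d(rho)/dx = F(x) and integrates it in space.
    """
    edges = np.linspace(0.0, box_length, n_bins + 1)
    dx = edges[1] - edges[0]
    centers = 0.5 * (edges[:-1] + edges[1:])

    # Average force density per bin (force per unit length per configuration).
    force_density = np.histogram(positions, bins=edges, weights=forces)[0] / (dx * n_samples)

    rho = np.cumsum(force_density) * dx / kT          # spatial integration of the sum rule
    mean_density = len(positions) / (n_samples * box_length)
    rho += mean_density - rho.mean()                  # fix the constant by normalization
    return centers, rho
```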
NASA Astrophysics Data System (ADS)
Costa-Surós, M.; Calbó, J.; González, J. A.; Long, C. N.
2013-06-01
The cloud vertical distribution, and especially the cloud base height, which is linked to cloud type, is an important characteristic for describing the impact of clouds in a changing climate. In this work several methods to estimate the cloud vertical structure (CVS) based on atmospheric sounding profiles are compared, considering the number and position of cloud layers, with a ground-based system which is taken as a reference: the Active Remote Sensing of Clouds (ARSCL). All methods establish some conditions on the relative humidity, and differ in the use of other variables, the thresholds applied, or the vertical resolution of the profile. In this study these methods are applied to 125 radiosonde profiles acquired at the ARM Southern Great Plains site during all seasons of the year 2009 and endorsed by GOES images, to confirm that the cloudiness conditions are homogeneous enough across their trajectory. The overall agreement for the methods ranges between 44% and 88%; four methods produce total agreements of around 85%. Further tests and improvements are applied to one of these methods. In addition, we attempt to make this method suitable for low-resolution vertical profiles, which could be useful in atmospheric modeling. The total agreement, even when using low-resolution profiles, can be improved to up to 91% if the thresholds for a moist layer to become a cloud layer are modified to minimize false negatives with the current data set, thus improving overall agreement.
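As a loose illustration of a relative-humidity-threshold CVS approach (the methods actually compared in the paper use different variables and thresholds), the sketch below turns a sounding into cloud-layer base/top pairs; all thresholds here are placeholders.

```python
import numpy as np

def cloud_layers_from_sounding(height_m, rh_percent, moist_thr=85.0, cloud_thr=95.0,
                               min_thickness=100.0):
    """Contiguous levels with RH above `moist_thr` form a moist layer; the layer is
    kept as a cloud layer if its maximum RH exceeds `cloud_thr` and it is at least
    `min_thickness` metres thick. Returns (base, top) pairs in metres."""
    height_m, rh_percent = np.asarray(height_m), np.asarray(rh_percent)
    layers, start = [], None
    for i, rh in enumerate(rh_percent):
        if rh >= moist_thr and start is None:
            start = i
        if (rh < moist_thr or i == len(rh_percent) - 1) and start is not None:
            end = i if rh < moist_thr else i + 1
            thick = height_m[end - 1] - height_m[start]
            if rh_percent[start:end].max() >= cloud_thr and thick >= min_thickness:
                layers.append((height_m[start], height_m[end - 1]))
            start = None
    return layers

# Hypothetical sounding with a single moist layer around 2.5 km.
z = np.arange(0.0, 10000.0, 250.0)
rh = 60.0 + 38.0 * np.exp(-((z - 2500.0) / 600.0) ** 2)
print(cloud_layers_from_sounding(z, rh))
```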
NASA Astrophysics Data System (ADS)
Costa-Surós, M.; Calbó, J.; González, J. A.; Long, C. N.
2014-08-01
The cloud vertical distribution and especially the cloud base height, which is linked to cloud type, are important characteristics in order to describe the impact of clouds on climate. In this work, several methods for estimating the cloud vertical structure (CVS) based on atmospheric sounding profiles are compared, considering the number and position of cloud layers, with a ground-based system that is taken as a reference: the Active Remote Sensing of Clouds (ARSCL). All methods establish some conditions on the relative humidity, and differ in the use of other variables, the thresholds applied, or the vertical resolution of the profile. In this study, these methods are applied to 193 radiosonde profiles acquired at the Atmospheric Radiation Measurement (ARM) Southern Great Plains site during all seasons of the year 2009 and endorsed by Geostationary Operational Environmental Satellite (GOES) images, to confirm that the cloudiness conditions are homogeneous enough across their trajectory. The perfect agreement (i.e., when the whole CVS is estimated correctly) for the methods ranges between 26 and 64%; the methods show additional approximate agreement (i.e., when at least one cloud layer is assessed correctly) from 15 to 41%. Further tests and improvements are applied to one of these methods. In addition, we attempt to make this method suitable for low-resolution vertical profiles, like those from the outputs of reanalysis methods or from the World Meteorological Organization's (WMO) Global Telecommunication System. The perfect agreement, even when using low-resolution profiles, can be improved by up to 67% (plus 25% of the approximate agreement) if the thresholds for a moist layer to become a cloud layer are modified to minimize false negatives with the current data set, thus improving overall agreement.
Retrieving the aerosol lidar ratio profile by combining ground- and space-based elastic lidars.
Feiyue, Mao; Wei, Gong; Yingying, Ma
2012-02-15
The aerosol lidar ratio is a key parameter for the retrieval of aerosol optical properties from elastic lidar, and it varies widely among aerosols with different chemical and physical properties. We propose a method for retrieving the aerosol lidar ratio profile by combining simultaneous ground- and space-based elastic lidars. The method was tested with a simulated case and a real case at a wavelength of 532 nm. The results demonstrate that our method is robust and can obtain accurate lidar ratio and extinction coefficient profiles. Our method can be useful for determining the local and global lidar ratio and for validating space-based lidar datasets.
A ranking method for the concurrent learning of compounds with various activity profiles.
Dörr, Alexander; Rosenbaum, Lars; Zell, Andreas
2015-01-01
In this study, we present an SVM-based ranking algorithm for the concurrent learning of compounds with different activity profiles and their varying prioritization. To this end, a specific labeling of each compound was elaborated in order to infer virtual screening models against multiple targets. We compared the method with several state-of-the-art SVM classification techniques that are capable of inferring multi-target screening models on three chemical data sets (cytochrome P450s, dehydrogenases, and a trypsin-like protease data set), each containing three different biological targets. The experiments show that ranking-based algorithms achieve increased performance for single- and multi-target virtual screening. Moreover, compared to other multi-target SVM methods, compounds that do not completely fulfill the desired activity profile are still ranked higher than decoys or compounds with an entirely undesired profile. SVM-based ranking methods constitute a valuable approach for virtual screening in multi-target drug design. The utilization of such methods is most helpful when dealing with compounds with various activity profiles and when finding many ligands with an already perfectly matching activity profile is not to be expected.
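A minimal sketch of one way to build an SVM-based ranker, using the standard pairwise-difference transformation with scikit-learn; the fingerprints and three-level labeling below are stand-ins, not the specific labeling elaborated in the study.

```python
import numpy as np
from itertools import combinations
from sklearn.svm import LinearSVC

def pairwise_transform(X, y):
    """Difference vectors for all compound pairs with different preference labels
    (higher y = activity profile closer to the desired one)."""
    Xp, yp = [], []
    for i, j in combinations(range(len(y)), 2):
        if y[i] == y[j]:
            continue
        Xp.append(X[i] - X[j])
        yp.append(np.sign(y[i] - y[j]))
    return np.asarray(Xp), np.asarray(yp)

# Stand-in compound fingerprints and preference labels:
# 2 = desired multi-target profile, 1 = partial match, 0 = decoy.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 16))
y = rng.integers(0, 3, size=60)

Xp, yp = pairwise_transform(X, y)
ranker = LinearSVC(C=1.0).fit(Xp, yp)
scores = X @ ranker.coef_.ravel()
print(np.argsort(-scores)[:10])       # compounds ranked by decreasing predicted priority
```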
Harris, R. Alan; Wang, Ting; Coarfa, Cristian; Nagarajan, Raman P.; Hong, Chibo; Downey, Sara L.; Johnson, Brett E.; Fouse, Shaun D.; Delaney, Allen; Zhao, Yongjun; Olshen, Adam; Ballinger, Tracy; Zhou, Xin; Forsberg, Kevin J.; Gu, Junchen; Echipare, Lorigail; O’Geen, Henriette; Lister, Ryan; Pelizzola, Mattia; Xi, Yuanxin; Epstein, Charles B.; Bernstein, Bradley E.; Hawkins, R. David; Ren, Bing; Chung, Wen-Yu; Gu, Hongcang; Bock, Christoph; Gnirke, Andreas; Zhang, Michael Q.; Haussler, David; Ecker, Joseph; Li, Wei; Farnham, Peggy J.; Waterland, Robert A.; Meissner, Alexander; Marra, Marco A.; Hirst, Martin; Milosavljevic, Aleksandar; Costello, Joseph F.
2010-01-01
Sequencing-based DNA methylation profiling methods are comprehensive and, as accuracy and affordability improve, will increasingly supplant microarrays for genome-scale analyses. Here, four sequencing-based methodologies were applied to biological replicates of human embryonic stem cells to compare their CpG coverage genome-wide and in transposons, resolution, cost, concordance and its relationship with CpG density and genomic context. The two bisulfite methods reached concordance of 82% for CpG methylation levels and 99% for non-CpG cytosine methylation levels. Using binary methylation calls, two enrichment methods were 99% concordant, while regions assessed by all four methods were 97% concordant. To achieve comprehensive methylome coverage while reducing cost, an approach integrating two complementary methods was examined. The integrative methylome profile along with histone methylation, RNA, and SNP profiles derived from the sequence reads allowed genome-wide assessment of allele-specific epigenetic states, identifying most known imprinted regions and new loci with monoallelic epigenetic marks and monoallelic expression. PMID:20852635
Zhu, Jie; Qin, Yufang; Liu, Taigang; Wang, Jun; Zheng, Xiaoqi
2013-01-01
Identification of gene-phenotype relationships is a fundamental challenge in human health. Based on the observation that genes causing the same or similar phenotypes tend to correlate with each other in the protein-protein interaction network, many network-based approaches have been proposed, built on different underlying models. A recent comparative study showed that diffusion-based methods achieve state-of-the-art predictive performance. In this paper, a new diffusion-based method is proposed to prioritize candidate disease genes. The diffusion profile of a disease is defined as the stationary distribution of candidate genes under a random walk with restart in which similarities between phenotypes are incorporated. Candidate disease genes are then prioritized by comparing their diffusion profiles with that of the disease. Finally, the effectiveness of our method is demonstrated through leave-one-out cross-validation against control genes from artificial linkage intervals and randomly chosen genes. A comparative study showed that our method achieves improved performance compared to some classical diffusion-based methods. To further illustrate our method, we used our algorithm to predict new causative genes for 16 multifactorial diseases, including prostate cancer and Alzheimer's disease, and the top predictions were in good agreement with literature reports. Our study indicates that integration of multiple information sources, especially phenotype similarity profile data, and the introduction of a global similarity measure between disease and gene diffusion profiles are helpful for prioritizing candidate disease genes. Programs and data are available upon request.
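A compact sketch of the diffusion-profile idea, assuming an unweighted PPI adjacency matrix with no isolated nodes: a random walk with restart yields the profile, and candidates are ranked by cosine similarity to the disease profile. The restart probability, the similarity measure and the helper names are illustrative choices, not the paper's exact formulation (which incorporates phenotype similarities into the restart vector).

```python
import numpy as np

def diffusion_profile(adj, restart_vec, restart=0.7, tol=1e-10, max_iter=10000):
    """Stationary distribution of a random walk with restart on a PPI network.
    Assumes every node has at least one edge (no zero columns)."""
    W = adj / adj.sum(axis=0, keepdims=True)              # column-normalized transitions
    p0 = restart_vec / restart_vec.sum()
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1.0 - restart) * (W @ p) + restart * p0
        if np.abs(p_next - p).sum() < tol:
            break
        p = p_next
    return p

def prioritize(adj, disease_seeds, candidates, restart=0.7):
    """Rank candidates by cosine similarity of their diffusion profiles to the disease profile."""
    n = adj.shape[0]
    unit = lambda idx: np.bincount(idx, minlength=n).astype(float)
    disease = diffusion_profile(adj, unit(disease_seeds), restart)
    def sim(g):
        gp = diffusion_profile(adj, unit([g]), restart)
        return float(gp @ disease / (np.linalg.norm(gp) * np.linalg.norm(disease)))
    return sorted(candidates, key=sim, reverse=True)

# Toy network: two loosely connected cliques; seed genes in the first clique.
adj = np.zeros((8, 8))
for a, b in [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 5), (4, 6), (5, 6), (6, 7)]:
    adj[a, b] = adj[b, a] = 1.0
print(prioritize(adj, disease_seeds=[0, 1], candidates=[3, 5, 7]))
```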
Effects of random tooth profile errors on the dynamic behaviors of planetary gears
NASA Astrophysics Data System (ADS)
Xun, Chao; Long, Xinhua; Hua, Hongxing
2018-02-01
In this paper, a nonlinear random model is built to describe the dynamics of planetary gear trains (PGTs), in which the time-varying mesh stiffness, tooth profile modification (TPM), tooth contact loss, and random tooth profile error are considered. A stochastic method based on the method of multiple scales (MMS) is extended to analyze the statistical properties of the dynamic performance of PGTs. With the proposed multiple-scales-based stochastic method, the distributions of the dynamic transmission errors (DTEs) are investigated, and the lower and upper bounds are determined based on the 3σ principle. The Monte Carlo method is employed to verify the proposed method. Results indicate that the proposed method can determine the distribution of the DTE of PGTs highly efficiently and provides a link between manufacturing precision and the dynamic response. In addition, the effects of tooth profile modification on the distributions of vibration amplitudes and on the probability of tooth contact loss for different manufacturing tooth profile errors are studied. The results show that the manufacturing precision affects the distribution of dynamic transmission errors dramatically and that appropriate TPMs help to decrease the nominal value and the deviation of the vibration amplitudes.
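For the verification step only, a toy Monte Carlo sketch of how a DTE distribution and its 3σ bounds could be estimated when the tooth profile error is a random input; the response function below is a placeholder surrogate, not the paper's multiple-scales PGT model, and all numbers are hypothetical.

```python
import numpy as np

def dte_response(profile_error_um, modification_um):
    """Placeholder surrogate for the PGT dynamic model: maps one realization of
    random tooth profile error (and a TPM amount) to a DTE amplitude."""
    return 2.0 + 0.8 * profile_error_um + 0.05 * profile_error_um ** 2 - 0.3 * modification_um

def monte_carlo_dte(sigma_error_um, modification_um, n=100_000, seed=1):
    rng = np.random.default_rng(seed)
    errors = rng.normal(0.0, sigma_error_um, n)        # random manufacturing profile errors
    dte = dte_response(errors, modification_um)
    mean, std = dte.mean(), dte.std()
    return mean, (mean - 3.0 * std, mean + 3.0 * std)  # 3-sigma lower/upper bounds

print(monte_carlo_dte(sigma_error_um=5.0, modification_um=10.0))
```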
NASA Astrophysics Data System (ADS)
Costa-Surós, M.; Calbó, J.; González, J. A.; Long, C. N.
2014-04-01
The cloud vertical distribution, and especially the cloud base height, which is linked to cloud type, is an important characteristic for describing the impact of clouds on climate. In this work several methods to estimate the cloud vertical structure (CVS) based on atmospheric sounding profiles are compared, considering the number and position of cloud layers, with a ground-based system which is taken as a reference: the Active Remote Sensing of Clouds (ARSCL). All methods establish some conditions on the relative humidity, and differ in the use of other variables, the thresholds applied, or the vertical resolution of the profile. In this study these methods are applied to 193 radiosonde profiles acquired at the ARM Southern Great Plains site during all seasons of the year 2009 and endorsed by GOES images, to confirm that the cloudiness conditions are homogeneous enough across their trajectory. The perfect agreement (i.e., when the whole CVS is correctly estimated) for the methods ranges between 26% and 64%; the methods show additional approximate agreement (i.e., when at least one cloud layer is correctly assessed) from 15% to 41%. Further tests and improvements are applied to one of these methods. In addition, we attempt to make this method suitable for low-resolution vertical profiles, like those from the outputs of reanalysis methods or from the WMO's Global Telecommunication System. The perfect agreement, even when using low-resolution profiles, can be improved to up to 67% (plus 25% of approximate agreement) if the thresholds for a moist layer to become a cloud layer are modified to minimize false negatives with the current data set, thus improving overall agreement.
Activity-based protein profiling for biochemical pathway discovery in cancer
Nomura, Daniel K.; Dix, Melissa M.; Cravatt, Benjamin F.
2011-01-01
Large-scale profiling methods have uncovered numerous gene and protein expression changes that correlate with tumorigenesis. However, determining the relevance of these expression changes and which biochemical pathways they affect has been hindered by our incomplete understanding of the proteome and its myriad functions and modes of regulation. Activity-based profiling platforms enable both the discovery of cancer-relevant enzymes and selective pharmacological probes to perturb and characterize these proteins in tumour cells. When integrated with other large-scale profiling methods, activity-based proteomics can provide insight into the metabolic and signalling pathways that support cancer pathogenesis and illuminate new strategies for disease diagnosis and treatment. PMID:20703252
Re-evaluating microglia expression profiles using RiboTag and cell isolation strategies.
Haimon, Zhana; Volaski, Alon; Orthgiess, Johannes; Boura-Halfon, Sigalit; Varol, Diana; Shemer, Anat; Yona, Simon; Zuckerman, Binyamin; David, Eyal; Chappell-Maor, Louise; Bechmann, Ingo; Gericke, Martin; Ulitsky, Igor; Jung, Steffen
2018-06-01
Transcriptome profiling is widely used to infer functional states of specific cell types, as well as their responses to stimuli, to define contributions to physiology and pathophysiology. Focusing on microglia, the brain's macrophages, we report here a side-by-side comparison of classical cell-sorting-based transcriptome sequencing and the 'RiboTag' method, which avoids cell retrieval from tissue context and yields translatome sequencing information. Conventional whole-cell microglial transcriptomes were found to be significantly tainted by artifacts introduced by tissue dissociation, cargo contamination and transcripts sequestered from ribosomes. Conversely, our data highlight the added value of RiboTag profiling for assessing the lineage accuracy of Cre recombinase expression in transgenic mice. Collectively, this study indicates method-based biases, reveals observer effects and establishes RiboTag-based translatome profiling as a valuable complement to standard sorting-based profiling strategies.
NASA Astrophysics Data System (ADS)
Labzovskii, Lev D.; Papayannis, Alexandros; Binietoglou, Ioannis; Banks, Robert F.; Baldasano, Jose M.; Toanca, Florica; Tzanis, Chris G.; Christodoulakis, John
2018-02-01
Accurate continuous measurement of relative humidity (RH) vertical profiles in the lower troposphere has become a significant scientific challenge. In recent years a synergy of various ground-based remote sensing instruments has been successfully used for RH vertical profiling, which has resulted in improved spatial resolution and, in some cases, improved measurement accuracy. Some studies have also suggested the use of high-resolution model simulations as input datasets for RH vertical profiling techniques. In this paper we apply two synergetic methods for RH profiling, based on the synergy of lidar with a microwave radiometer and with high-resolution atmospheric modeling. The two methods are employed for RH retrieval between 100 and 6000 m with increased spatial resolution, based on datasets from the HygrA-CD (Hygroscopic Aerosols to Cloud Droplets) campaign conducted in Athens, Greece from May to June 2014. RH profiles from the synergetic methods are then compared with those retrieved using single instruments or as simulated by high-resolution models. Our proposed technique for RH profiling improves the statistical agreement with reference radiosoundings by 27% when the lidar-radiometer approach is used (in comparison with radiometer measurements) and by 15% when the lidar-model approach is used (in comparison with WRF-model simulations). The mean uncertainty of RH due to temperature bias in RH profiling was ~4.34% for the lidar-radiometer method and ~1.22% for the lidar-model method. However, the maximum uncertainty in RH retrievals due to temperature bias showed that the lidar-model method is more reliable at heights greater than 2000 m. Overall, our results demonstrate the capability of both combined methods for daytime measurements at heights between 100 and 6000 m when lidar-radiometer or lidar-WRF combined datasets are available.
NASA Technical Reports Server (NTRS)
Fennelly, J. A.; Torr, D. G.; Richards, P. G.; Torr, M. R.; Sharp, W. E.
1991-01-01
This paper describes a technique for extracting thermospheric profiles of the atomic-oxygen density and temperature, using ground-based measurements of the O(+)(2D-2P) doublet at 7320 and 7330 A in the twilight airglow. In this method, a local photochemical model is used to calculate the 7320-A intensity; the method also utilizes an iterative inversion procedure based on the Levenberg-Marquardt method described by Press et al. (1986). The results demonstrate that, if the measurements are only limited by errors due to Poisson noise, the altitude profiles of neutral temperature and atomic oxygen concentration can be determined accurately using currently available spectrometers.
NASA Astrophysics Data System (ADS)
Ji, Hongzhu; Zhang, Yinchao; Chen, Siying; Chen, He; Guo, Pan
2018-06-01
An iterative method, based on a derived inverse relationship between the atmospheric backscatter coefficient and the aerosol lidar ratio, is proposed to invert the lidar ratio profile and the aerosol extinction coefficient. The feasibility of this method is investigated theoretically and experimentally. Simulation results show that the inversion accuracy of aerosol optical properties with the iterative method can be improved in the near-surface aerosol layer and in the optically thick layer. Experimentally, owing to the reduced insufficiency and incoherence errors, aerosol optical properties with higher accuracy can be obtained in the near-surface region and in the region of numerical derivative distortion. In addition, the particle component can be roughly distinguished based on the improved lidar ratio profile.
Deviation rectification for dynamic measurement of rail wear based on coordinate sets projection
NASA Astrophysics Data System (ADS)
Wang, Chao; Ma, Ziji; Li, Yanfu; Zeng, Jiuzhen; Jin, Tan; Liu, Hongli
2017-10-01
Dynamic measurement of rail wear using a laser imaging system suffers from random vibrations of the laser-based imaging sensor, which cause distorted rail profiles. In this paper, a simple and effective method for rectifying profile deviation is presented to address this issue. There are two main steps: profile recognition and distortion calibration. Using the constant camera and projector parameters, efficient recognition of measured profiles is achieved by analyzing the geometric difference between normal profiles and distorted ones. For a distorted profile, by constructing coordinate sets that project from it to the standard profile on three projecting primitives, including the rail head inner line, the rail waist curve and the rail jaw, iterative extrinsic camera parameter self-compensation is implemented. The distortion is calibrated by projecting the distorted profile onto the x-y plane of a measuring coordinate frame, which is parallel to the rail cross section, to eliminate the influence of random vibrations of the laser-based imaging sensor. In addition to evaluating the implementation with comprehensive experiments, we compare our method with other published work. The results show the effectiveness and superiority of our method for the dynamic measurement of rail wear.
Weernink, Marieke G M; Groothuis-Oudshoorn, Catharina G M; IJzerman, Maarten J; van Til, Janine A
2016-01-01
The objective of this study was to compare treatment profiles including both health outcomes and process characteristics in Parkinson disease using best-worst scaling (BWS), time trade-off (TTO), and visual analogue scales (VAS). From the model comprising seven attributes with three levels, six unique profiles were selected representing process-related factors and health outcomes in Parkinson disease. A Web-based survey (N = 613) was conducted in a general population to estimate process-related utilities using profile-based BWS (case 2), multiprofile-based BWS (case 3), TTO, and VAS. The rank order of the six profiles was compared, convergent validity among methods was assessed, and individual analysis focused on the differentiation between pairs of profiles with the methods used. The aggregated health-state utilities for the six treatment profiles were highly comparable for all methods and no rank reversals were identified. On the individual level, the convergent validity between all methods was strong; however, respondents differentiated less between the utilities of closely related treatment profiles with a VAS or TTO than with BWS. For TTO and VAS, this resulted in nonsignificant differences in mean utilities for closely related treatment profiles. This study suggests that all methods are equally able to measure process-related utility when the aim is to estimate the overall value of treatments. On an individual level, such as in shared decision making, BWS allows for better prioritization of treatment alternatives, especially if they are closely related. The decision-making problem and the need for explicit trade-offs between attributes should determine the choice of method. Copyright © 2016. Published by Elsevier Inc.
Evaluation of parameters of color profile models of LCD and LED screens
NASA Astrophysics Data System (ADS)
Zharinov, I. O.; Zharinov, O. O.
2017-12-01
The research addresses the problem of parametric identification of the color profile model of LCD (liquid crystal display) and LED (light emitting diode) screens. The color profile model of a screen is based on Grassmann's law of additive color mixture. Mathematically, the problem is to evaluate the unknown parameters (numerical coefficients) of the matrix transformation between different color spaces. Several methods of evaluating these screen profile coefficients have been developed. These methods are based either on the processing of colorimetric measurements or on the processing of technical documentation data.
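A minimal sketch of the measurement-based evaluation route under Grassmann additivity: the 3×3 matrix mapping linearized RGB drive values to XYZ is fitted by least squares from patch measurements. The patch values below are hypothetical, sRGB-like numbers chosen only to make the example self-consistent.

```python
import numpy as np

# Hypothetical colorimetric measurements: each row pairs the linearized (R, G, B)
# drive of a test patch with the (X, Y, Z) tristimulus values measured on screen.
rgb = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                [1, 1, 0], [0, 1, 1], [1, 1, 1]], dtype=float)
xyz = np.array([[41.2, 21.3, 1.9], [35.8, 71.5, 11.9], [18.0, 7.2, 95.0],
                [77.0, 92.8, 13.8], [53.8, 78.7, 106.9], [95.0, 100.0, 108.8]])

# Grassmann additivity makes the map linear: xyz = rgb @ M^T.
# Least-squares fit of the 3x3 profile matrix from the patch measurements.
A, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
M = A.T
print(np.round(M, 2))                   # columns are the XYZ of the R, G, B primaries
print(M @ np.array([0.5, 0.5, 0.5]))    # predicted XYZ of a mid-grey patch
```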
NASA Astrophysics Data System (ADS)
Barraclough, Brendan; Li, Jonathan G.; Lebron, Sharon; Fan, Qiyong; Liu, Chihray; Yan, Guanghua
2015-08-01
The ionization chamber volume averaging effect is a well-known issue without an elegant solution. The purpose of this study is to propose a novel convolution-based approach to address the volume averaging effect in model-based treatment planning systems (TPSs). Ionization chamber-measured beam profiles can be regarded as the convolution between the detector response function and the implicit real profiles. Existing approaches address the issue by trying to remove the volume averaging effect from the measurement. In contrast, our proposed method imports the measured profiles directly into the TPS and addresses the problem by reoptimizing pertinent parameters of the TPS beam model. In the iterative beam modeling process, the TPS-calculated beam profiles are convolved with the same detector response function. Beam model parameters responsible for the penumbra are optimized to drive the convolved profiles to match the measured profiles. Since the convolved and the measured profiles are subject to identical volume averaging effect, the calculated profiles match the real profiles when the optimization converges. The method was applied to reoptimize a CC13 beam model commissioned with profiles measured with a standard ionization chamber (Scanditronix Wellhofer, Bartlett, TN). The reoptimized beam model was validated by comparing the TPS-calculated profiles with diode-measured profiles. Its performance in intensity-modulated radiation therapy (IMRT) quality assurance (QA) for ten head-and-neck patients was compared with the CC13 beam model and a clinical beam model (manually optimized, clinically proven) using standard Gamma comparisons. The beam profiles calculated with the reoptimized beam model showed excellent agreement with diode measurement at all measured geometries. Performance of the reoptimized beam model was comparable with that of the clinical beam model in IMRT QA. The average passing rates using the reoptimized beam model increased substantially from 92.1% to 99.3% with 3%/3 mm and from 79.2% to 95.2% with 2%/2 mm when compared with the CC13 beam model. These results show the effectiveness of the proposed method. Less inter-user variability can be expected of the final beam model. It is also found that the method can be easily integrated into model-based TPS.
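A one-dimensional sketch of the convolve-and-reoptimize idea, assuming an error-function penumbra model and a Gaussian detector response of fixed width: the calculated profile is convolved with the same detector kernel as the measurement, and the penumbra parameter is optimized until the convolved profile matches the chamber data. The field size, chamber width and grid spacing are illustrative, not values from the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.optimize import minimize_scalar
from scipy.special import erf

x = np.arange(-60.0, 60.0, 0.5)           # off-axis distance (mm), 0.5 mm grid
field_edge, chamber_sigma_mm = 25.0, 3.0  # hypothetical 50 mm field, chamber-like response width

def calc_profile(penumbra_sigma):
    """TPS-like calculated profile: flat field with an error-function penumbra."""
    return 0.5 * (erf((field_edge - x) / (np.sqrt(2) * penumbra_sigma))
                  + erf((field_edge + x) / (np.sqrt(2) * penumbra_sigma)))

# "Measured" chamber profile = a sharper true profile blurred by the detector response.
measured = gaussian_filter1d(calc_profile(2.0), chamber_sigma_mm / 0.5)

def objective(penumbra_sigma):
    convolved = gaussian_filter1d(calc_profile(penumbra_sigma), chamber_sigma_mm / 0.5)
    return np.sum((convolved - measured) ** 2)

fit = minimize_scalar(objective, bounds=(0.5, 10.0), method="bounded")
print(fit.x)   # recovers ~2.0 mm: the model penumbra that reproduces the measurement
```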
Stanislawski, Jerzy; Kotulska, Malgorzata; Unold, Olgierd
2013-01-17
Amyloids are proteins capable of forming fibrils. Many of them underlie serious diseases, such as Alzheimer's disease. The number of amyloid-associated diseases is constantly increasing. Recent studies indicate that amyloidogenic properties can be associated with short segments of amino acids, which transform the structure when exposed. A few hundred such peptides have been found experimentally. Experimental testing of all possible amino acid combinations is currently not feasible. Instead, they can be predicted by computational methods. The 3D profile is a physicochemically based method that has generated the largest dataset, ZipperDB. However, it is computationally very demanding. Here, we show that dataset generation can be accelerated. Two methods to increase the classification efficiency of amyloidogenic candidates are presented and tested: simplified 3D profile generation and machine learning methods. We generated a new dataset of hexapeptides using a more economical 3D profile algorithm, which showed very good classification overlap with ZipperDB (93.5%). The new part of our dataset contains 1779 segments, with 204 classified as amyloidogenic. The dataset of 6-residue sequences with their binary classification, based on the energy of the segment, was used to train machine learning methods. A separate set of sequences from ZipperDB was used as a test set. The most effective methods were the Alternating Decision Tree and the Multilayer Perceptron. Both methods obtained an area under the ROC curve of 0.96, accuracy of 91%, true positive rate of ca. 78%, and true negative rate of 95%. A few other machine learning methods also achieved good performance. The computational time was reduced from 18-20 CPU-hours (full 3D profile) to 0.5 CPU-hours (simplified 3D profile) to seconds (machine learning). We showed that the simplified profile generation method does not introduce an error with respect to the original method, while increasing the computational efficiency. Our new dataset proved representative enough to use simple statistical methods for testing amyloidogenicity based only on six-letter sequences. Statistical machine learning methods such as the Alternating Decision Tree and the Multilayer Perceptron can replace the energy-based classifier, with the advantage of greatly reduced computational time and simplicity of analysis. Additionally, a decision tree provides a set of easily interpretable rules.
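A sketch of the machine-learning replacement step with scikit-learn, assuming hexapeptides are one-hot encoded; the sequences and labels below are random stand-ins (real labels would come from the energy-based 3D-profile classification), so the reported AUC is only a placeholder near 0.5.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

AA = "ACDEFGHIKLMNPQRSTVWY"

def encode(hexapeptide):
    """One-hot encoding of a 6-residue sequence (6 x 20 = 120 binary features)."""
    vec = np.zeros((6, len(AA)))
    for pos, aa in enumerate(hexapeptide):
        vec[pos, AA.index(aa)] = 1.0
    return vec.ravel()

# Random stand-ins so the example runs; real labels would be the energy-based
# amyloidogenic / non-amyloidogenic classification of each hexapeptide.
rng = np.random.default_rng(0)
seqs = ["".join(rng.choice(list(AA), 6)) for _ in range(2000)]
labels = rng.integers(0, 2, 2000)

X = np.array([encode(s) for s in seqs])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```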
NASA Astrophysics Data System (ADS)
Zhu, Lianqing; Chen, Yunfang; Chen, Qingshan; Meng, Hao
2011-05-01
According to the minimum zone condition, a method for evaluating the profile error of an Archimedes helicoid surface based on a Genetic Algorithm (GA) is proposed. The mathematical model of the surface is provided, and the unknown parameters in the surface equation are obtained through the least squares method. The principle of the GA is explained. Then, the profile error of the Archimedes helicoid surface is obtained through GA optimization. To validate the proposed method, the profile error of an Archimedes helicoid surface, namely an Archimedes cylindrical worm (ZA worm) surface, is evaluated. The results show that the proposed method is capable of correctly evaluating the profile error of Archimedes helicoid surfaces and satisfies the evaluation standard of the Minimum Zone Method. It can be applied to the measured profile-error data of complex surfaces obtained by coordinate measuring machines (CMMs).
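A sketch of the minimum zone criterion with an evolutionary optimizer (scipy's differential evolution standing in for the paper's GA), shown for a plane for brevity: for the Archimedes helicoid the same objective would use the signed deviation from the helicoid's parametric equation instead. The CMM points below are synthetic.

```python
import numpy as np
from scipy.optimize import differential_evolution

def signed_deviations(params, points):
    """Signed deviation of measured CMM points from a nominal surface.
    Here the nominal surface is a plane z = a*x + b*y + c; for the Archimedes
    helicoid the same idea applies with its parametric equation instead."""
    a, b, c = params
    return points[:, 2] - (a * points[:, 0] + b * points[:, 1] + c)

def minimum_zone_error(points):
    """Minimum zone criterion: choose surface parameters that minimize the
    width of the band (max - min deviation) enclosing all measured points."""
    width = lambda p: np.ptp(signed_deviations(p, points))
    result = differential_evolution(width, bounds=[(-1, 1), (-1, 1), (-5, 5)],
                                    seed=0, tol=1e-10)
    return result.fun, result.x

# Hypothetical CMM data: a tilted plane plus a small form error.
rng = np.random.default_rng(1)
xy = rng.uniform(-10, 10, size=(200, 2))
z = 0.05 * xy[:, 0] - 0.02 * xy[:, 1] + 1.0 + rng.uniform(-0.003, 0.003, 200)
print(minimum_zone_error(np.column_stack([xy, z])))   # zone width close to 0.006
```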
Kamoun, Choumouss; Payen, Thibaut; Hua-Van, Aurélie; Filée, Jonathan
2013-10-11
Insertion Sequences (ISs) and their non-autonomous derivatives (MITEs) are important components of prokaryotic genomes, inducing duplication, deletion, rearrangement or lateral gene transfer. Although ISs and MITEs are relatively simple and basic genetic elements, their detection remains a difficult task due to their remarkable sequence diversity. With the advent of high-throughput genome and metagenome sequencing technologies, the development of fast, reliable and sensitive methods for IS and MITE detection becomes an important challenge. So far, almost all studies dealing with prokaryotic transposons have used classical BLAST-based detection methods against reference libraries. Here we introduce alternative methods of detection, either taking advantage of the structural properties of the elements (de novo methods) or using an additional library-based method with profile HMM searches. In this study, we have developed three different workflows dedicated to IS and MITE detection: the first two use de novo methods detecting either repeated sequences or the presence of Inverted Repeats; the third uses 28 in-house transposase alignment profiles with HMM search methods. We compared the respective performances of each method using a reference dataset of 30 archaeal and 30 bacterial genomes, in addition to simulated and real metagenomes. Compared to a BLAST-based method using ISFinder as the library, the de novo methods significantly improve IS and MITE detection. For example, in the 30 archaeal genomes, we discovered 30 new elements (+20%) in addition to the 141 multi-copy elements already detected by the BLAST approach. Many of the new elements correspond to ISs belonging to unknown or highly divergent families. The total number of MITEs even doubled with the discovery of elements displaying very limited sequence similarities to their respective autonomous partners (mainly in the Inverted Repeats of the elements). Concerning metagenomes, with the exception of short-read data (<300 bp), for which both techniques seem equally limited, profile HMM searches considerably improve the detection of transposase-encoding genes (up to +50%) while generating a low level of false positives compared to BLAST-based methods. Compared to classical BLAST-based methods, the sensitivity of the de novo and profile HMM methods developed in this study allows a better and more reliable detection of transposons in prokaryotic genomes and metagenomes. We believe that future studies involving IS and MITE identification in genomic data should combine at least one de novo and one library-based method, with optimal results obtained by running the two de novo methods in addition to a library-based search. For metagenomic data, profile HMM searches should be favored; a BLAST-based step is only useful for the final annotation into groups and families.
Evolutionary profiles from the QR factorization of multiple sequence alignments
Sethi, Anurag; O'Donoghue, Patrick; Luthey-Schulten, Zaida
2005-01-01
We present an algorithm to generate complete evolutionary profiles that represent the topology of the molecular phylogenetic tree of the homologous group. The method, based on the multidimensional QR factorization of numerically encoded multiple sequence alignments, removes redundancy from the alignments and orders the protein sequences by increasing linear dependence, resulting in the identification of a minimal basis set of sequences that spans the evolutionary space of the homologous group of proteins. We observe a general trend that these smaller, more evolutionarily balanced profiles have comparable and, in many cases, better performance in database searches than conventional profiles containing hundreds of sequences, constructed in an iterative and computationally intensive procedure. For more diverse families or superfamilies, with sequence identity <30%, structural alignments, based purely on the geometry of the protein structures, provide better alignments than pure sequence-based methods. Merging the structure and sequence information allows the construction of accurate profiles for distantly related groups. These structure-based profiles outperformed other sequence-based methods for finding distant homologs and were used to identify a putative class II cysteinyl-tRNA synthetase (CysRS) in several archaea that eluded previous annotation studies. Phylogenetic analysis showed the putative class II CysRSs to be a monophyletic group and homology modeling revealed a constellation of active site residues similar to that in the known class I CysRS. PMID:15741270
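A toy sketch of the core linear-algebra step, assuming a simple binary encoding of the alignment: column-pivoted QR orders the sequences by decreasing linear independence, so an early prefix of the ordering can serve as the non-redundant basis set. The encoding and the toy alignment are illustrative, not the paper's exact scheme.

```python
import numpy as np
from scipy.linalg import qr

AA = "-ACDEFGHIKLMNPQRSTVWY"   # gap symbol plus 20 amino acids

def encode_alignment(sequences):
    """Binary-encode an MSA: one column per sequence, rows are (position, residue) indicators."""
    L = len(sequences[0])
    M = np.zeros((L * len(AA), len(sequences)))
    for j, seq in enumerate(sequences):
        for i, aa in enumerate(seq):
            M[i * len(AA) + AA.index(aa), j] = 1.0
    return M

def order_by_qr(sequences):
    """Column-pivoted QR orders sequences by decreasing linear independence;
    an early prefix of this ordering forms a minimal, non-redundant basis set."""
    M = encode_alignment(sequences)
    _, _, piv = qr(M, pivoting=True, mode="economic")
    return [sequences[j] for j in piv]

msa = ["MKV-LT", "MKV-LT", "MRVALT", "QKVALS"]    # toy alignment with one duplicate
print(order_by_qr(msa))   # the duplicate sequence is pushed to the end of the ordering
```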
NASA Astrophysics Data System (ADS)
Xie, Xian-Hua; Yu, Zu-Guo; Ma, Yuan-Lin; Han, Guo-Sheng; Anh, Vo
2017-09-01
There has been growing interest in the visualization of metagenomic data. The present study focuses on the visualization of metagenomic data using inter-nucleotide distance profiles. We first convert the fragment sequences into inter-nucleotide distance profiles. Then we analyze these profiles by principal component analysis. Finally, the principal components are used to obtain a 2-D scatter plot according to the source species of the fragments. We name our method the inter-nucleotide distance profile (INP) method. Our method is evaluated on three benchmark data sets used in previously published papers. Our results demonstrate that the INP method is an effective and efficient alternative for the visualization of metagenomic data.
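A sketch of the INP pipeline under one plausible formulation of the profile (normalized histograms of distances between successive occurrences of each base), followed by PCA down to two components; the fragment generator and all parameters are stand-ins, not the paper's exact definition.

```python
import numpy as np
from sklearn.decomposition import PCA

def inp_features(seq, max_dist=20):
    """Inter-nucleotide distance profile: for each base, a normalized histogram of
    distances between successive occurrences of that base (one common formulation)."""
    feats = []
    for base in "ACGT":
        idx = np.array([i for i, c in enumerate(seq) if c == base])
        dists = np.diff(idx) if idx.size > 1 else np.array([])
        hist = np.histogram(dists, bins=np.arange(1, max_dist + 2))[0].astype(float)
        feats.append(hist / hist.sum() if hist.sum() else hist)
    return np.concatenate(feats)

# Hypothetical metagenomic fragments from two "species" with different GC content.
rng = np.random.default_rng(0)
frag = lambda p_gc: "".join(rng.choice(list("ACGT"), 500,
                                       p=[(1 - p_gc) / 2, p_gc / 2, p_gc / 2, (1 - p_gc) / 2]))
fragments = [frag(0.35) for _ in range(30)] + [frag(0.65) for _ in range(30)]

X = np.array([inp_features(s) for s in fragments])
coords = PCA(n_components=2).fit_transform(X)      # 2-D scatter, one point per fragment
print(coords[:3], coords[-3:])
```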
Turbine blade profile design method based on Bezier curves
NASA Astrophysics Data System (ADS)
Alexeev, R. A.; Tishchenko, V. A.; Gribin, V. G.; Gavrilov, I. Yu.
2017-11-01
In this paper, a technique for two-dimensional parametric blade profile design is presented. Bezier curves are used to create the profile geometry. The main feature of the proposed method is an adaptive approach to fitting the curve to given geometric conditions. Calculation of the profile shape is performed by a multi-dimensional minimization method with a number of restrictions imposed on the blade geometry. The proposed method has been used to describe the parametric geometry of a known blade profile. The baseline geometry was then modified by varying some parameters of the blade. Numerical calculations of the obtained designs have been carried out. The results of the calculations show the efficiency of the chosen approach.
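A minimal sketch of the parametric fitting step, assuming the profile is represented by a single Bezier curve with pinned endpoints and fitted by unconstrained minimization; the paper's adaptive fitting with geometric restrictions is more elaborate, and the sample profile below is synthetic.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import comb

def bezier(control_points, t):
    """Evaluate a Bezier curve (Bernstein form) at parameter values t."""
    n = len(control_points) - 1
    basis = np.array([comb(n, i) * t**i * (1 - t)**(n - i) for i in range(n + 1)])
    return basis.T @ control_points          # shape (len(t), 2)

def fit_profile(target_xy, n_ctrl=6):
    """Fit Bezier control points to a sampled profile by least squares, with the
    curve endpoints pinned to the profile's leading and trailing points."""
    t = np.linspace(0.0, 1.0, len(target_xy))
    def cost(flat):
        ctrl = np.vstack([target_xy[0], flat.reshape(-1, 2), target_xy[-1]])
        return np.sum((bezier(ctrl, t) - target_xy) ** 2)
    x0 = np.linspace(target_xy[0], target_xy[-1], n_ctrl)[1:-1].ravel()
    res = minimize(cost, x0, method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-10})
    return np.vstack([target_xy[0], res.x.reshape(-1, 2), target_xy[-1]])

# Hypothetical suction-side profile to be reproduced parametrically.
x = np.linspace(0.0, 1.0, 50)
profile = np.column_stack([x, 0.12 * np.sin(np.pi * x) + 0.02 * x])
print(fit_profile(profile))
```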
Innovative design method of automobile profile based on Fourier descriptor
NASA Astrophysics Data System (ADS)
Gao, Shuyong; Fu, Chaoxing; Xia, Fan; Shen, Wei
2017-10-01
Aiming at innovation in the contours of the automobile side, this paper presents an innovative design method for the vehicle side profile based on Fourier descriptors. The design flow of the method is: pre-processing, coordinate extraction, standardization, discrete Fourier transform, Fourier descriptor simplification, descriptor exchange for innovation, and inverse Fourier transform to obtain the outline of the innovative design. Innovative concepts based on the exchange of descriptor 'genes' within the same vehicle species and between different species are presented, and the corresponding innovative contours are obtained separately. A three-dimensional model of a car is then built with reference to the profile curve obtained by exchanging genes between different species. The feasibility of the proposed method is verified from several aspects.
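A rough sketch of descriptor exchange between two outlines sampled at the same points, following the design flow above: the low-frequency Fourier descriptors of design A (overall proportions) are kept and the higher-frequency descriptors of design B (local detail) are spliced in before the inverse transform. The frequency cutoff and the toy outlines are illustrative only.

```python
import numpy as np

def exchange_design(profile_a, profile_b, cutoff=0.02):
    """'Descriptor exchange': keep the low-frequency Fourier descriptors of design A
    and splice in the higher-frequency descriptors of design B, then inverse-transform
    to obtain a new side-profile outline."""
    za = profile_a[:, 0] + 1j * profile_a[:, 1]
    zb = profile_b[:, 0] + 1j * profile_b[:, 1]
    Da, Db = np.fft.fft(za), np.fft.fft(zb)
    low = np.abs(np.fft.fftfreq(len(za))) <= cutoff      # boolean mask of low frequencies
    hybrid = np.where(low, Da, Db)
    z = np.fft.ifft(hybrid)
    return np.column_stack([z.real, z.imag])

# Hypothetical side-profile outlines of two vehicles, sampled at the same 256 points.
t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
sedan = np.column_stack([np.cos(t), 0.40 * np.sin(t)])
suv = np.column_stack([np.cos(t), 0.55 * np.sin(t) + 0.08 * np.sin(3 * t)])
new_profile = exchange_design(sedan, suv)
print(new_profile[:3])
```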
Macarthur, Roy; Feinberg, Max; Bertheau, Yves
2010-01-01
A method is presented for estimating the size of uncertainty associated with the measurement of products derived from genetically modified organisms (GMOs). The method is based on the uncertainty profile, which is an extension, for the estimation of uncertainty, of a recent graphical statistical tool called an accuracy profile that was developed for the validation of quantitative analytical methods. The application of uncertainty profiles as an aid to decision making and assessment of fitness for purpose is also presented. Results of the measurement of the quantity of GMOs in flour by PCR-based methods collected through a number of interlaboratory studies followed the log-normal distribution. Uncertainty profiles built using the results generally give an expected range for measurement results of 50-200% of reference concentrations for materials that contain at least 1% GMO. This range is consistent with European Network of GM Laboratories and the European Union (EU) Community Reference Laboratory validation criteria and can be used as a fitness for purpose criterion for measurement methods. The effect on the enforcement of EU labeling regulations is that, in general, an individual analytical result needs to be < 0.45% to demonstrate compliance, and > 1.8% to demonstrate noncompliance with a labeling threshold of 0.9%.
Kim, Jaehee; Ogden, Robert Todd; Kim, Haseong
2013-10-18
Time course gene expression experiments are an increasingly popular method for exploring biological processes. Temporal gene expression profiles provide an important characterization of gene function, as biological systems are both developmental and dynamic. With such data it is possible to study gene expression changes over time and thereby to detect differential genes. Much of the early work on analyzing time series expression data relied on methods developed originally for static data, so there is a need for improved methodology. Since time series expression is a temporal process, its unique features, such as autocorrelation between successive points, should be incorporated into the analysis. This work aims to identify genes that show different gene expression profiles across time. We propose a statistical procedure to discover gene groups with similar profiles using a nonparametric representation that accounts for the autocorrelation in the data. In particular, we first represent each profile in terms of a Fourier basis, and then we screen out genes that are not differentially expressed based on the Fourier coefficients. Finally, we cluster the remaining gene profiles using a model-based approach in the Fourier domain. We evaluate the screening results in terms of sensitivity, specificity, FDR and FNR, compare with Gaussian process regression screening in a simulation study, and illustrate the results by application to yeast cell-cycle microarray expression data with alpha-factor synchronization. The key elements of the proposed methodology are: (i) representation of gene profiles in the Fourier domain; (ii) automatic screening of genes based on the Fourier coefficients, taking into account autocorrelation in the data while controlling the false discovery rate (FDR); (iii) model-based clustering of the remaining gene profiles. Using this method, we identified a set of cell-cycle-regulated time-course yeast genes. The proposed method is general and can potentially be used to identify genes which share patterns or biological processes, helping to face the present and forthcoming challenges of data analysis in functional genomics.
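A condensed sketch of elements (i)-(iii), assuming an expression matrix with genes in rows and time points in columns: profiles are represented by a few Fourier coefficients, screened with a simple energy cutoff (the paper controls the FDR more carefully), and clustered with a Gaussian mixture in the Fourier domain. All data below are simulated placeholders.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fourier_coeffs(profiles, n_coef=4):
    """Represent each time-course profile by its first few Fourier coefficients
    (real and imaginary parts), which capture smooth temporal structure."""
    F = np.fft.rfft(profiles - profiles.mean(axis=1, keepdims=True), axis=1)[:, 1:n_coef + 1]
    return np.column_stack([F.real, F.imag])

def screen_and_cluster(profiles, energy_quantile=0.75, n_clusters=3):
    """Screen out genes whose low-frequency Fourier energy is small (flat profiles),
    then cluster the remaining genes with a Gaussian mixture in the Fourier domain."""
    coefs = fourier_coeffs(profiles)
    energy = np.sum(coefs ** 2, axis=1)
    keep = energy > np.quantile(energy, energy_quantile)
    labels = GaussianMixture(n_components=n_clusters, random_state=0).fit_predict(coefs[keep])
    return keep, labels

# Hypothetical expression matrix: 300 genes x 18 time points, 60 of them periodic.
rng = np.random.default_rng(0)
t = np.arange(18)
flat = rng.normal(0, 0.3, size=(240, 18))
periodic = np.sin(2 * np.pi * np.outer(rng.uniform(0.5, 2.0, 60), t) / 18) + rng.normal(0, 0.3, (60, 18))
keep, labels = screen_and_cluster(np.vstack([flat, periodic]))
print(keep.sum(), np.bincount(labels))
```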
Bandgap profiling in CIGS solar cells via valence electron energy-loss spectroscopy
NASA Astrophysics Data System (ADS)
Deitz, Julia I.; Karki, Shankar; Marsillac, Sylvain X.; Grassman, Tyler J.; McComb, David W.
2018-03-01
A robust, reproducible method for the extraction of relative bandgap trends from scanning transmission electron microscopy (STEM) based electron energy-loss spectroscopy (EELS) is described. The effectiveness of the approach is demonstrated by profiling the bandgap through a CuIn1-xGaxSe2 solar cell that possesses intentional Ga/(In + Ga) composition variation. The EELS-determined bandgap profile is compared to the nominal profile calculated from compositional data collected via STEM-based energy dispersive X-ray spectroscopy. The EELS based profile is found to closely track the calculated bandgap trends, with only a small, fixed offset difference. This method, which is particularly advantageous for relatively narrow bandgap materials and/or STEM systems with modest resolution capabilities (i.e., >100 meV), compromises absolute accuracy to provide a straightforward route for the correlation of local electronic structure trends with nanoscale chemical and physical structure/microstructure within semiconductor materials and devices.
A Method for Evaluation of Model-Generated Vertical Profiles of Meteorological Variables
2016-03-01
[Report excerpt: contents cover RAOB soundings and WRF output for profile generation, height-based profiles, pressure-based profiles, and comparisons. A figure caption describes sublayers (blue lines) with sublayer means (red triangles) and observations or WRF output (circles); Table 3 samples differences in listed variables derived from WRF and RAOB data.]
Sarilita, Erli; Rynn, Christopher; Mossey, Peter A; Black, Sue; Oscandar, Fahmi
2018-05-01
This study investigated nose profile morphology and its relationship to the skull in Scottish subadult and Indonesian adult populations, with the aim of improving the accuracy of forensic craniofacial reconstruction. Samples of 86 lateral head cephalograms from Dundee Dental School (mean age, 11.8 years) and 335 lateral head cephalograms from the Universitas Padjadjaran Dental Hospital, Bandung, Indonesia (mean age 24.2 years), were measured. The method of nose profile estimation based on skull morphology previously proposed by Rynn and colleagues in 2010 (FSMP 6:20-34) was tested in this study. Following this method, three nasal aperture-related craniometrics and six nose profile dimensions were measured from the cephalograms. To assess the accuracy of the method, six nose profile dimensions were estimated from the three craniometric parameters using the published method and then compared to the actual nose profile dimensions.In the Scottish subadult population, no sexual dimorphism was evident in the measured dimensions. In contrast, sexual dimorphism of the Indonesian adult population was evident in all craniometric and nose profile dimensions; notably, males exhibited statistically significant larger values than females. The published method by Rynn and colleagues (FSMP 6:20-34, 2010) performed better in the Scottish subadult population (mean difference of maximum, 2.35 mm) compared to the Indonesian adult population (mean difference of maximum, 5.42 mm in males and 4.89 mm in females).In addition, regression formulae were derived to estimate nose profile dimensions based on the craniometric measurements for the Indonesian adult population. The published method is not sufficiently accurate for use on the Indonesian population, so the derived method should be used. The accuracy of the published method by Rynn and colleagues (FSMP 6:20-34, 2010) was sufficiently reliable to be applied in Scottish subadult population.
Fujibuchi, Wataru; Anderson, John S. J.; Landsman, David
2001-01-01
Consensus pattern and matrix-based searches designed to predict cis-acting transcriptional regulatory sequences have historically been subject to large numbers of false positives. We sought to decrease false positives by incorporating expression profile data into a consensus pattern-based search method. We have systematically analyzed the expression phenotypes of over 6000 yeast genes, across 121 expression profile experiments, and correlated them with the distribution of 14 known regulatory elements over sequences upstream of the genes. Our method is based on a metric we term probabilistic element assessment (PEA), which is a ranking of potential sites based on sequence similarity in the upstream regions of genes with similar expression phenotypes. For eight of the 14 known elements that we examined, our method had a much higher selectivity than a naïve consensus pattern search. Based on our analysis, we have developed a web-based tool called PROSPECT, which allows consensus pattern-based searching of gene clusters obtained from microarray data. PMID:11574681
Use of a genetic algorithm to improve the rail profile on Stockholm underground
NASA Astrophysics Data System (ADS)
Persson, Ingemar; Nilsson, Rickard; Bik, Ulf; Lundgren, Magnus; Iwnicki, Simon
2010-12-01
In this paper, a genetic algorithm optimisation method has been used to develop an improved rail profile for Stockholm underground. An inverted penalty index based on a number of key performance parameters was generated as a fitness function and vehicle dynamics simulations were carried out with the multibody simulation package Gensys. The effectiveness of each profile produced by the genetic algorithm was assessed using the roulette wheel method. The method has been applied to the rail profile on the Stockholm underground, where problems with rolling contact fatigue on wheels and rails are currently managed by grinding. From a starting point of the original BV50 and the UIC60 rail profiles, an optimised rail profile with some shoulder relief has been produced. The optimised profile seems similar to measured rail profiles on the Stockholm underground network and although initial grinding is required, maintenance of the profile will probably not require further grinding.
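A toy genetic algorithm with roulette wheel selection and an inverted penalty index as the fitness function, to illustrate the optimisation loop only; the penalty function below is a stand-in for the Gensys vehicle-dynamics simulations, and all parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def penalty_index(profile_params):
    """Placeholder for the vehicle-dynamics simulation: combines hypothetical key
    performance indicators into a single penalty for a candidate rail profile."""
    target = np.array([0.2, -0.1, 0.05, 0.3])          # stand-in "good shoulder relief" shape
    return 1.0 + np.sum((profile_params - target) ** 2)

def roulette_select(population, fitness):
    """Roulette wheel selection: sampling probability proportional to fitness."""
    p = fitness / fitness.sum()
    idx = rng.choice(len(population), size=len(population), p=p)
    return population[idx]

def optimise_rail_profile(n_params=4, pop_size=40, generations=80, sigma=0.05):
    pop = rng.normal(0.0, 0.5, size=(pop_size, n_params))   # perturbations of a start profile
    for _ in range(generations):
        fitness = np.array([1.0 / penalty_index(ind) for ind in pop])   # inverted penalty index
        pop = roulette_select(pop, fitness)
        # One-point crossover on consecutive pairs, then Gaussian mutation.
        for i in range(0, pop_size - 1, 2):
            cut = rng.integers(1, n_params)
            pop[i, cut:], pop[i + 1, cut:] = pop[i + 1, cut:].copy(), pop[i, cut:].copy()
        pop += rng.normal(0.0, sigma, pop.shape)
    best = pop[np.argmax([1.0 / penalty_index(ind) for ind in pop])]
    return best

print(optimise_rail_profile())   # tends toward the stand-in target parameters
```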
Laser focal profiler based on forward scattering of a nanoparticle
NASA Astrophysics Data System (ADS)
Ota, Taisuke
2018-03-01
A laser focal intensity profiling method based on the forward scattering from a nanoparticle is demonstrated for in situ measurements using a laser focusing system with six microscope objective lenses with numerical apertures ranging from 0.15 to 1.4. The measured profiles exhibited Airy disc patterns, although the rings showed some imperfections due to aberrations and misalignment of the test system. A dipole radiation model revealed that the artefact of this method was much smaller than the influence of the deterioration of the experimental system; a condition under which no artefact appears was predicted based on proper selection of measurement angles.
Exploring student learning profiles in algebra-based studio physics: A person-centered approach
NASA Astrophysics Data System (ADS)
Pond, Jarrad W. T.; Chini, Jacquelyn J.
2017-06-01
In this study, we explore the strategic self-regulatory and motivational characteristics of students in studio-mode physics courses at three universities with varying student populations and varying levels of success in their studio-mode courses. We survey students using questions compiled from several existing questionnaires designed to measure students' study strategies, attitudes toward and motivations for learning physics, organization of scientific knowledge, experiences outside the classroom, and demographics. Using a person-centered approach, we utilize cluster analysis methods to group students into learning profiles based on their individual responses to better understand the strategies and motives of algebra-based studio physics students. Previous studies have identified five distinct learning profiles across several student populations using similar methods. We present results from first-semester and second-semester studio-mode introductory physics courses across three universities. We identify these five distinct learning profiles found in previous studies to be present within our population of introductory physics students. In addition, we investigate interactions between these learning profiles and student demographics. We find significant interactions between a student's learning profile and their experience with high school physics, major, gender, grade expectation, and institution. Ultimately, we aim to use this method of analysis to take the characteristics of students into account in the investigation of successful strategies for using studio methods of physics instruction within and across institutions.
HMMBinder: DNA-Binding Protein Prediction Using HMM Profile Based Features.
Zaman, Rianon; Chowdhury, Shahana Yasmin; Rashid, Mahmood A; Sharma, Alok; Dehzangi, Abdollah; Shatabda, Swakkhar
2017-01-01
DNA-binding proteins often play important roles in various processes within the cell. Over the last decade, a wide range of classification algorithms and feature extraction techniques have been used to solve this problem. In this paper, we propose a novel DNA-binding protein prediction method called HMMBinder. HMMBinder uses monogram and bigram features extracted from the HMM profiles of the protein sequences. To the best of our knowledge, this is the first application of HMM profile based features to the DNA-binding protein prediction problem. We applied Support Vector Machines (SVM) as the classification technique in HMMBinder. Our method was tested on standard benchmark datasets. We show experimentally that our method outperforms the state-of-the-art methods found in the literature.
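A sketch of monogram/bigram feature extraction from an L×20 HMM profile matrix followed by an SVM, assuming per-position emission probabilities are available; the random profiles and labels below are placeholders that only make the example runnable, not real HMMER-derived profiles.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def monogram_bigram(hmm_profile):
    """Monogram and bigram features from an L x 20 HMM profile matrix
    (per-position amino-acid emission probabilities)."""
    mono = hmm_profile.mean(axis=0)                                   # 20 features
    bi = np.einsum("ij,ik->jk", hmm_profile[:-1], hmm_profile[1:])    # 20 x 20 consecutive-position products
    return np.concatenate([mono, (bi / (len(hmm_profile) - 1)).ravel()])

# Hypothetical dataset: profiles of random lengths with stand-in binding labels.
rng = np.random.default_rng(0)
def random_profile():
    L = rng.integers(50, 300)
    p = rng.random((L, 20))
    return p / p.sum(axis=1, keepdims=True)

X = np.array([monogram_bigram(random_profile()) for _ in range(200)])
y = rng.integers(0, 2, 200)          # 1 = DNA-binding, 0 = non-binding (placeholders)

clf = SVC(kernel="rbf", C=1.0, gamma="scale")
print(cross_val_score(clf, X, y, cv=5).mean())
```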
ERIC Educational Resources Information Center
Borsuk, Ellen R.; Watkins, Marley W.; Canivez, Gary L.
2006-01-01
Although often applied in practice, clinically based cognitive subtest profile analysis has failed to achieve empirical support. Nonlinear multivariate subtest profile analysis may have benefits over clinically based techniques, but the psychometric properties of these methods must be studied prior to their implementation and interpretation. The…
The mathematical and computer modeling of the worm tool shaping
NASA Astrophysics Data System (ADS)
Panchuk, K. L.; Lyashkov, A. A.; Ayusheev, T. V.
2017-06-01
Traditionally, mathematical profiling of the worm tool is carried out by the first Olivier method, known from the theory of gearing, which requires obtaining an intermediate surface of the generating rack. This complicates the profiling process and its realization by means of computer 3D modeling. The purpose of this work is to improve the mathematical model of profiling and to realize it using 3D modeling methods. The research problems are: obtaining a mathematical model of profiling that excludes the generating rack; realization of the obtained model by means of wireframe and surface modeling; and development and testing of a solid modeling technology for solving the profiling problem. The kinematic method for studying mutually enveloping surfaces is adopted as the basis. Computer research is carried out in a CAD system using 3D modeling methods. We have developed a mathematical model of worm tool profiling; wireframe, surface and solid models of the shaping of the mutually enveloping surfaces of the part and the tool have been obtained. The proposed mathematical models and 3D modeling technologies provide tools for theoretical and experimental profiling of the worm tool. The results of the research can be used in the design of metal-cutting tools.
Characterizing Task-Based OpenMP Programs
Muddukrishna, Ananya; Jonsson, Peter A.; Brorsson, Mats
2015-01-01
Programmers struggle to understand performance of task-based OpenMP programs since profiling tools only report thread-based performance. Performance tuning also requires task-based performance in order to balance per-task memory hierarchy utilization against exposed task parallelism. We provide a cost-effective method to extract detailed task-based performance information from OpenMP programs. We demonstrate the utility of our method by quickly diagnosing performance problems and characterizing exposed task parallelism and per-task instruction profiles of benchmarks in the widely-used Barcelona OpenMP Tasks Suite. Programmers can tune performance faster and understand performance tradeoffs more effectively than existing tools by using our method to characterize task-based performance. PMID:25860023
Zhang, Ao; Tian, Suyan
2018-05-01
Pathway-based feature selection algorithms, which utilize biological information contained in pathways to guide which features/genes should be selected, have evolved quickly and become widespread in the field of bioinformatics. Based on how the pathway information is incorporated, we classify pathway-based feature selection algorithms into three major categories: penalty, stepwise forward, and weighting. Compared to the first two categories, the weighting methods have been underutilized even though they are usually the simplest ones. In this article, we constructed three different connectivity-information-based weights for each gene and then conducted feature selection upon the resulting weighted gene expression profiles. Using both simulations and a real-world application, we have demonstrated that when the data-driven connectivity information constructed from the data of the specific disease under study is considered, the resulting weighted gene expression profiles slightly outperform the original expression profiles. In summary, a big challenge faced by the weighting method is how to estimate pathway knowledge-based weights more accurately and precisely. Only when this issue is successfully resolved will wide utilization of the weighting methods become possible. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
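A minimal sketch of the weighting idea follows, assuming the connectivity-based weight for a gene is simply its normalized degree in a pathway network and that feature selection is a plain two-sample t-test on the weighted profiles; the paper's three weight constructions are not reproduced, and all data below are synthetic.

```python
# Hypothetical sketch of the "weighting" idea: scale each gene's expression profile by a
# connectivity-derived weight before feature selection. The weight definition (normalized
# network degree) and the t-test selection step are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_genes, n_samples = 200, 30
expr = rng.normal(size=(n_genes, n_samples))            # genes x samples
labels = np.array([0] * 15 + [1] * 15)                  # two phenotype groups
degree = rng.integers(1, 50, size=n_genes)              # pathway connectivity (degree)

weights = degree / degree.max()                         # data-driven weights in (0, 1]
weighted = expr * weights[:, None]                      # weighted expression profiles

# Rank genes by a two-sample t-test on the weighted profiles.
t, p = stats.ttest_ind(weighted[:, labels == 0], weighted[:, labels == 1], axis=1)
selected = np.argsort(p)[:20]                           # top 20 candidate genes
print(selected[:5], p[selected[:5]])
```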
Global Profiling of Various Metabolites in Platycodon grandiflorum by UPLC-QTOF/MS.
Lee, Jae Won; Ji, Seung-Heon; Kim, Geum-Soog; Song, Kyung-Sik; Um, Yurry; Kim, Ok Tae; Lee, Yi; Hong, Chang Pyo; Shin, Dong-Ho; Kim, Chang-Kug; Lee, Seung-Eun; Ahn, Young-Sup; Lee, Dae-Young
2015-11-09
In this study, a method of metabolite profiling based on UPLC-QTOF/MS was developed to analyze Platycodon grandiflorum. In the optimal UPLC, various metabolites, including major platycosides, were separated well in 15 min. The metabolite extraction protocols were also optimized by selecting a solvent for use in the study, the ratio of solvent to sample and sonication time. This method was used to profile two different parts of P. grandiflorum, i.e., the roots of P. grandiflorum (PR) and the stems and leaves of P. grandiflorum (PS), in the positive and negative ion modes. As a result, PR and PS showed qualitatively and quantitatively different metabolite profiles. Furthermore, their metabolite compositions differed according to individual plant samples. These results indicate that the UPLC-QTOF/MS-based profiling method is a good tool to analyze various metabolites in P. grandiflorum. This metabolomics approach can also be applied to evaluate the overall quality of P. grandiflorum, as well as to discriminate the cultivars for the medicinal plant industry.
Han, Wenhua; Shen, Xiaohui; Xu, Jun; Wang, Ping; Tian, Guiyun; Wu, Zhengyang
2014-01-01
Magnetic flux leakage (MFL) inspection is one of the most important and sensitive nondestructive testing approaches. For online MFL inspection of a long-range railway track or oil pipeline, a fast and effective defect profile estimating method based on a multi-power affine projection algorithm (MAPA) is proposed, where the depth of a sampling point is related with not only the MFL signals before it, but also the ones after it, and all of the sampling points related to one point appear as serials or multi-power. Defect profile estimation has two steps: regulating a weight vector in an MAPA filter and estimating a defect profile with the MAPA filter. Both simulation and experimental data are used to test the performance of the proposed method. The results demonstrate that the proposed method exhibits high speed while maintaining the estimated profiles clearly close to the desired ones in a noisy environment, thereby meeting the demand of accurate online inspection. PMID:25192314
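The following sketch shows a standard affine projection algorithm (APA) filter update, of which the paper's multi-power variant (MAPA) is an extension; it is only a generic illustration of how a weight vector is regulated from windowed signal data, with synthetic data standing in for MFL measurements.

```python
# Minimal sketch of a standard affine projection algorithm (APA) adaptive filter; the
# paper's multi-power variant (MAPA) builds on this idea, so this is a generic
# illustration only, not the published estimator.
import numpy as np

def apa_filter(x, d, taps=8, order=4, mu=0.5, delta=1e-3):
    """Adapt weights w so that the filtered input tracks the desired signal d."""
    w = np.zeros(taps)
    y = np.zeros(len(d))
    for n in range(taps + order, len(d)):
        # Stack the `order` most recent input windows into the data matrix A.
        A = np.array([x[n - k - taps + 1:n - k + 1][::-1] for k in range(order)])
        e = d[n - order + 1:n + 1][::-1] - A @ w              # a priori errors
        w = w + mu * A.T @ np.linalg.solve(A @ A.T + delta * np.eye(order), e)
        y[n] = w @ x[n - taps + 1:n + 1][::-1]
    return w, y

rng = np.random.default_rng(2)
x = rng.normal(size=2000)                                     # stand-in for MFL samples
h_true = np.array([0.6, -0.3, 0.2, 0.1, 0.0, 0.0, 0.0, 0.0])  # unknown 8-tap system
d = np.convolve(x, h_true)[:len(x)] + 0.01 * rng.normal(size=len(x))
w, y = apa_filter(x, d)
print(np.round(w, 2))                                         # should approach h_true
```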
NASA Astrophysics Data System (ADS)
Offret, J.-P.; Lebedinsky, J.; Navello, L.; Pina, V.; Serio, B.; Bailly, Y.; Hervé, P.
2015-05-01
Temperature data play an important role in the combustion chamber since temperature determines both the efficiency and the pollutant emission rate of engines. The air pollution problem concerns the emission of gases such as CO, CO2, NO, NO2 and SO2, as well as aerosols, soot and volatile organic compounds. Flame combustion occurs in hostile environments where temperature and concentration profiles are often not easy to measure. In this study, an optical method for measuring temperature and CO2 concentration profiles, suitable for combustion analysis, is presented and discussed. The proposed optical metrology method presents numerous advantages when compared to intrusive methods. The experimental setup comprises a passive radiative emission measurement method combined with an active laser measurement method. The passive method is based on the use of gas emission spectroscopy. The experimental spectrometer device is coupled with an active method, which is used to investigate and correct complex flame profiles. This method, similar to a LIDAR (Light Detection And Ranging) device, is based on the measurement of Rayleigh scattering of a short laser pulse recorded using a high-speed streak camera. The whole experimental system of this new method is presented. Results obtained on a small-scale turbojet are shown and discussed in order to illustrate the potential delivered by the method. Both temperature and concentration profiles of the gas jet are presented and discussed.
Accelerating Information Retrieval from Profile Hidden Markov Model Databases.
Tamimi, Ahmad; Ashhab, Yaqoub; Tamimi, Hashem
2016-01-01
Profile Hidden Markov Model (Profile-HMM) is an efficient statistical approach to represent protein families. Currently, several databases maintain valuable protein sequence information as profile-HMMs. There is an increasing interest to improve the efficiency of searching Profile-HMM databases to detect sequence-profile or profile-profile homology. However, most efforts to enhance searching efficiency have been focusing on improving the alignment algorithms. Although the performance of these algorithms is fairly acceptable, the growing size of these databases, as well as the increasing demand for using batch query searching approach, are strong motivations that call for further enhancement of information retrieval from profile-HMM databases. This work presents a heuristic method to accelerate the current profile-HMM homology searching approaches. The method works by cluster-based remodeling of the database to reduce the search space, rather than focusing on the alignment algorithms. Using different clustering techniques, 4284 TIGRFAMs profiles were clustered based on their similarities. A representative for each cluster was assigned. To enhance sensitivity, we proposed an extended step that allows overlapping among clusters. A validation benchmark of 6000 randomly selected protein sequences was used to query the clustered profiles. To evaluate the efficiency of our approach, speed and recall values were measured and compared with the sequential search approach. Using hierarchical, k-means, and connected component clustering techniques followed by the extended overlapping step, we obtained an average reduction in time of 41%, and an average recall of 96%. Our results demonstrate that representation of profile-HMMs using a clustering-based approach can significantly accelerate data retrieval from profile-HMM databases.
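A hedged sketch of the cluster-based remodeling idea, assuming profile-HMMs have already been embedded as fixed-length numeric vectors (random stand-ins here): k-means groups the profiles, the member closest to each centroid acts as the representative, and a query is compared to representatives before scanning only the closest cluster(s).

```python
# Hedged sketch of cluster-based search-space reduction: profiles are embedded as numeric
# vectors (stand-ins here), clustered with k-means, and a query is compared first to
# cluster representatives, then only to members of the closest cluster(s).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
profiles = rng.normal(size=(500, 64))          # vectorized profile-HMMs (stand-in)
k = 20
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(profiles)

# Representative of each cluster: the member closest to the centroid.
reps = np.array([
    np.where(km.labels_ == c)[0][
        np.argmin(np.linalg.norm(profiles[km.labels_ == c] - km.cluster_centers_[c], axis=1))
    ]
    for c in range(k)
])

def search(query, n_clusters=2):
    """Compare query to representatives, then scan only the n closest clusters."""
    d_rep = np.linalg.norm(profiles[reps] - query, axis=1)
    candidate_clusters = np.argsort(d_rep)[:n_clusters]     # >1 cluster mimics "overlap"
    members = np.where(np.isin(km.labels_, candidate_clusters))[0]
    return members[np.argmin(np.linalg.norm(profiles[members] - query, axis=1))]

query = profiles[123] + 0.01 * rng.normal(size=64)
print(search(query))   # expected to recover index 123 in this toy setting
```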
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ha, Gwanghui; Cho, Moo-Hyun; Conde, Manoel
Emittance exchange (EEX) based longitudinal current profile shaping is one of the most promising current profile shaping techniques. This method can generate high-quality arbitrary current profiles under ideal conditions. The double dog-leg EEX beam line was recently installed at the Argonne Wakefield Accelerator (AWA) to explore the shaping capability and confirm the quality of this method. To demonstrate arbitrary current profile generation, several different transverse masks are applied to generate different final current profiles. The phase space slopes and the charge of the incoming beam are varied to observe and suppress the aberrations on the ideal profile. We present current profile shaping results, aberrations on the shaped profile, and their suppression.
2013-01-01
Background Time course gene expression experiments are an increasingly popular method for exploring biological processes. Temporal gene expression profiles provide an important characterization of gene function, as biological systems are both developmental and dynamic. With such data it is possible to study gene expression changes over time and thereby to detect differential genes. Much of the early work on analyzing time series expression data relied on methods developed originally for static data and thus there is a need for improved methodology. Since time series expression is a temporal process, its unique features such as autocorrelation between successive points should be incorporated into the analysis. Results This work aims to identify genes that show different gene expression profiles across time. We propose a statistical procedure to discover gene groups with similar profiles using a nonparametric representation that accounts for the autocorrelation in the data. In particular, we first represent each profile in terms of a Fourier basis, and then we screen out genes that are not differentially expressed based on the Fourier coefficients. Finally, we cluster the remaining gene profiles using a model-based approach in the Fourier domain. We evaluate the screening results in terms of sensitivity, specificity, FDR and FNR, compare with the Gaussian process regression screening in a simulation study and illustrate the results by application to yeast cell-cycle microarray expression data with alpha-factor synchronization. The key elements of the proposed methodology: (i) representation of gene profiles in the Fourier domain; (ii) automatic screening of genes based on the Fourier coefficients and taking into account autocorrelation in the data, while controlling the false discovery rate (FDR); (iii) model-based clustering of the remaining gene profiles. Conclusions Using this method, we identified a set of cell-cycle-regulated time-course yeast genes. The proposed method is general and can be potentially used to identify genes which have the same patterns or biological processes, and help facing the present and forthcoming challenges of data analysis in functional genomics. PMID:24134721
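A compact sketch of the three key elements listed above, run on synthetic time courses: a Fourier representation of each profile, screening on the low-order Fourier coefficients, and model-based clustering of the retained genes. The screening threshold here is ad hoc; the actual procedure controls the FDR and accounts for autocorrelation.

```python
# Illustrative sketch of the pipeline: Fourier representation of each temporal profile,
# coefficient-based screening, and model-based clustering of the retained profiles.
# The quantile threshold is an assumption; the paper's FDR control is not reproduced.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
t = np.linspace(0, 2 * np.pi, 18)                            # 18 time points
flat = rng.normal(0, 0.3, size=(300, 18))                    # non-differential genes
periodic = np.sin(t) + rng.normal(0, 0.3, size=(100, 18))    # cell-cycle-like genes
X = np.vstack([flat, periodic])

coefs = np.fft.rfft(X, axis=1)                               # Fourier representation
energy = np.abs(coefs[:, 1:4]).sum(axis=1)                   # energy in low non-DC harmonics
keep = energy > np.quantile(energy, 0.75)                    # screen out "flat" profiles

feats = np.column_stack([coefs[keep, 1:4].real, coefs[keep, 1:4].imag])
gmm = GaussianMixture(n_components=2, random_state=0).fit(feats)  # model-based clustering
print(keep.sum(), np.bincount(gmm.predict(feats)))
```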
Exploring personalized searches using tag-based user profiles and resource profiles in folksonomy.
Cai, Yi; Li, Qing; Xie, Haoran; Min, Huaqin
2014-10-01
With the increase in resource-sharing websites such as YouTube and Flickr, many shared resources have arisen on the Web. Personalized searches have become more important and challenging since users demand higher retrieval quality. To achieve this goal, personalized searches need to take users' personalized profiles and information needs into consideration. Collaborative tagging (also known as folksonomy) systems allow users to annotate resources with their own tags, which provides a simple but powerful way for organizing, retrieving and sharing different types of social resources. In this article, we examine the limitations of previous tag-based personalized searches. To handle these limitations, we propose a new method to model user profiles and resource profiles in collaborative tagging systems. We use a normalized term frequency to indicate the preference degree of a user on a tag. A novel search method using such profiles of users and resources is proposed to facilitate the desired personalization in resource searches. In our framework, instead of the keyword matching or similarity measurement used in previous works, the relevance measurement between a resource and a user query (termed the query relevance) is treated as a fuzzy satisfaction problem of a user's query requirements. We implement a prototype system called the Folksonomy-based Multimedia Retrieval System (FMRS). Experiments using the FMRS data set and the MovieLens data set show that our proposed method outperforms baseline methods. Copyright © 2014 Elsevier Ltd. All rights reserved.
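A tiny sketch of the profile construction, assuming the normalized term frequency is each tag's count divided by the user's (or resource's) maximum tag count, and using a simple minimum over required tags as a crude stand-in for the fuzzy query-satisfaction measure.

```python
# Hedged sketch: user and resource profiles as normalized tag frequencies, with query
# relevance taken as the minimum satisfaction over required tags (a crude stand-in for
# the paper's fuzzy query-satisfaction measure).
from collections import Counter

def tag_profile(annotations):
    """Normalized term frequency: count of each tag divided by the maximum count."""
    counts = Counter(annotations)
    top = max(counts.values())
    return {tag: c / top for tag, c in counts.items()}

user = tag_profile(["jazz", "jazz", "live", "piano", "jazz", "live"])
resource = tag_profile(["jazz", "piano", "concert", "piano"])

def relevance(query_tags, profile):
    return min(profile.get(t, 0.0) for t in query_tags)

print(user)                                    # e.g. {'jazz': 1.0, 'live': 0.67, 'piano': 0.33}
print(relevance(["jazz", "piano"], resource))  # 0.5: both required tags partially satisfied
```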
Mass Spectrometry Based Ultrasensitive DNA Methylation Profiling Using Target Fragmentation Assay.
Lin, Xiang-Cheng; Zhang, Ting; Liu, Lan; Tang, Hao; Yu, Ru-Qin; Jiang, Jian-Hui
2016-01-19
Efficient tools for profiling DNA methylation in specific genes are essential for epigenetics and clinical diagnostics. Current DNA methylation profiling techniques have been limited by inconvenient implementation, requirements for specific reagents, and inferior accuracy in quantifying methylation degree. We develop a novel mass spectrometry method, target fragmentation assay (TFA), which enables profiling of methylation in specific sequences. This method combines selective capture of the DNA target from restricted cleavage of genomic DNA using magnetic separation with MS detection of the nonenzymatic hydrolysates of the target DNA. This method is shown to be highly sensitive, with a detection limit as low as 0.056 amol, allowing direct profiling of methylation using genomic DNA without preamplification. Moreover, this method offers a unique advantage in accurately determining DNA methylation level. The clinical applicability was demonstrated by DNA methylation analysis using prostate tissue samples, implying the potential of this method as a useful tool for DNA methylation profiling in early detection of related diseases.
Atmospheric turbulence profiling with unknown power spectral density
NASA Astrophysics Data System (ADS)
Helin, Tapio; Kindermann, Stefan; Lehtonen, Jonatan; Ramlau, Ronny
2018-04-01
Adaptive optics (AO) is a technology in modern ground-based optical telescopes to compensate for the wavefront distortions caused by atmospheric turbulence. One method that allows information about the atmosphere to be retrieved from telescope data is the so-called SLODAR approach, where the atmospheric turbulence profile is estimated based on correlation data of Shack-Hartmann wavefront measurements. This approach relies on a layered Kolmogorov turbulence model. In this article, we propose a novel extension of the SLODAR concept by including a general non-Kolmogorov turbulence layer close to the ground with an unknown power spectral density. We prove that the joint estimation problem of the turbulence profile above ground simultaneously with the unknown power spectral density at the ground is ill-posed and propose three numerical reconstruction methods. We demonstrate by numerical simulations that our methods lead to substantial improvements in the turbulence profile reconstruction compared to the standard SLODAR-type approach. Also, our methods can accurately locate local perturbations in non-Kolmogorov power spectral densities.
Retrieving cloudy atmosphere parameters from RPG-HATPRO radiometer data
NASA Astrophysics Data System (ADS)
Kostsov, V. S.
2015-03-01
An algorithm for simultaneously determining both tropospheric temperature and humidity profiles and cloud liquid water content from ground-based measurements of microwave radiation is presented. A special feature of this algorithm is that it combines different types of measurements and different a priori information on the sought parameters. The features of its use in processing RPG-HATPRO radiometer data obtained in the course of atmospheric remote sensing experiments carried out by specialists from the Faculty of Physics of St. Petersburg State University are discussed. The results of a comparison of both temperature and humidity profiles obtained using a ground-based microwave remote sensing method with those obtained from radiosonde data are analyzed. It is shown that this combined algorithm is comparable (in accuracy) to the classical method of statistical regularization in determining temperature profiles; however, this algorithm demonstrates better accuracy (when compared to the method of statistical regularization) in determining humidity profiles.
iPcc: a novel feature extraction method for accurate disease class discovery and prediction
Ren, Xianwen; Wang, Yong; Zhang, Xiang-Sun; Jin, Qi
2013-01-01
Gene expression profiling has gradually become a routine procedure for disease diagnosis and classification. In the past decade, many computational methods have been proposed, resulting in great improvements on various levels, including feature selection and algorithms for classification and clustering. In this study, we present iPcc, a novel method from the feature extraction perspective to further propel gene expression profiling technologies from bench to bedside. We define ‘correlation feature space’ for samples based on the gene expression profiles by iterative employment of Pearson’s correlation coefficient. Numerical experiments on both simulated and real gene expression data sets demonstrate that iPcc can greatly highlight the latent patterns underlying noisy gene expression data and thus greatly improve the robustness and accuracy of the algorithms currently available for disease diagnosis and classification based on gene expression profiles. PMID:23761440
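The core iteration can be sketched in a few lines: the sample-by-gene matrix is repeatedly replaced by the sample-by-sample Pearson correlation matrix, so that each sample ends up described by its correlations with all others. The stopping rule and the downstream clustering or classification steps are omitted.

```python
# Minimal sketch of the core iPcc idea as described above: iteratively map samples into a
# "correlation feature space" by replacing the feature matrix with the sample-by-sample
# Pearson correlation matrix. Convergence criteria and downstream analysis are omitted.
import numpy as np

def ipcc(X, n_iter=3):
    """X: samples x genes. Returns a samples x samples correlation feature matrix."""
    F = X.copy()
    for _ in range(n_iter):
        F = np.corrcoef(F)        # Pearson correlation between samples in the current space
    return F

rng = np.random.default_rng(5)
group_a = rng.normal(0.0, 1.0, size=(10, 500))
group_b = rng.normal(0.8, 1.0, size=(10, 500))
X = np.vstack([group_a, group_b])
F = ipcc(X)
print(F.shape)                    # (20, 20): each sample described by its correlations
```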
Surface Profile and Stress Field Evaluation using Digital Gradient Sensing Method
Miao, C.; Sundaram, B. M.; Huang, L.; ...
2016-08-09
Shape and surface topography evaluation from measured orthogonal slope/gradient data is of considerable engineering significance since many full-field optical sensors and interferometers readily output accurate data of that kind. This has applications ranging from metrology of optical and electronic elements (lenses, silicon wafers, thin film coatings), surface profile estimation, wave front and shape reconstruction, to name a few. In this context, a new methodology for surface profile and stress field determination based on a recently introduced non-contact, full-field optical method called digital gradient sensing (DGS) capable of measuring small angular deflections of light rays coupled with a robust finite-difference-based least-squares integration (HFLI) scheme in the Southwell configuration is advanced here. The method is demonstrated by evaluating (a) surface profiles of mechanically warped silicon wafers and (b) stress gradients near growing cracks in planar phase objects.
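A simplified least-squares integration of orthogonal slope maps is sketched below, in the spirit of a Southwell-type formulation rather than the exact HFLI scheme: finite differences of the unknown surface are matched to averaged measured slopes and solved as a sparse least-squares problem, recovering the surface up to an additive constant.

```python
# Simplified least-squares slope integration (Southwell-style averaging, not the exact
# HFLI scheme): differences of the unknown surface z are matched to averaged slopes.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import lsqr

def integrate_gradients(gx, gy, dx=1.0, dy=1.0):
    ny, nx = gx.shape
    idx = lambda i, j: i * nx + j
    A = lil_matrix((ny * (nx - 1) + (ny - 1) * nx, ny * nx))
    b = np.zeros(A.shape[0])
    r = 0
    for i in range(ny):                       # x-direction difference equations
        for j in range(nx - 1):
            A[r, idx(i, j + 1)], A[r, idx(i, j)] = 1.0, -1.0
            b[r] = dx * 0.5 * (gx[i, j] + gx[i, j + 1])
            r += 1
    for i in range(ny - 1):                   # y-direction difference equations
        for j in range(nx):
            A[r, idx(i + 1, j)], A[r, idx(i, j)] = 1.0, -1.0
            b[r] = dy * 0.5 * (gy[i, j] + gy[i + 1, j])
            r += 1
    z = lsqr(A.tocsr(), b)[0].reshape(ny, nx)
    return z - z.mean()                       # the surface is defined up to a constant

# Toy check on z = x^2 + y^2, whose analytic slopes are 2x and 2y.
x, y = np.meshgrid(np.linspace(-1, 1, 20), np.linspace(-1, 1, 20))
z = integrate_gradients(2 * x, 2 * y, dx=x[0, 1] - x[0, 0], dy=y[1, 0] - y[0, 0])
print(np.abs(z - (x**2 + y**2 - (x**2 + y**2).mean())).max())   # small residual
```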
Fourier Descriptor Analysis and Unification of Voice Range Profile Contours: Method and Applications
ERIC Educational Resources Information Center
Pabon, Peter; Ternstrom, Sten; Lamarche, Anick
2011-01-01
Purpose: To describe a method for unified description, statistical modeling, and comparison of voice range profile (VRP) contours, even from diverse sources. Method: A morphologic modeling technique, which is based on Fourier descriptors (FDs), is applied to the VRP contour. The technique, which essentially involves resampling of the curve of the…
Amatore, Christian; Oleinick, Alexander; Klymenko, Oleksiy V; Svir, Irina
2005-08-12
Herein, we propose a method for reconstructing any plausible macroscopic hydrodynamic flow profile occurring locally within a rectangular microfluidic channel. The method is based on experimental currents measured at single or double microband electrodes embedded in one channel wall. A perfectly adequate quasiconformal mapping of spatial coordinates introduced in our previous work [Electrochem. Commun. 2004, 6, 1123] and an exponentially expanding time grid, initially proposed [J. Electroanal. Chem. 2003, 557, 75] in conjunction with the solution of the corresponding variational problem approached by the Ritz method are used for the numerical reconstruction of flow profiles. Herein, the concept of the method is presented and developed theoretically and its validity is tested on the basis of the use of pseudoexperimental currents emulated by simulation of the diffusion-convection problem in a channel flow cell, to which a random Gaussian current noise is added. The flow profiles reconstructed by our method compare successfully with those introduced a priori into the simulations, even when these include significant distortions compared with either classical Poiseuille or electro-osmotic flows.
NASA Astrophysics Data System (ADS)
Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.
2016-11-01
The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, has as its goal the determination of the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error, as a component of the total error. The generation modelling allows highlighting the potential errors of the generating tool, in order to correct its profile before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of "relative generating trajectories". The analytical foundation is presented, together with applications to known models of rack-gear type tools used on Maag gear-cutting machines.
de Groot, Reinoud; Lüthi, Joel; Lindsay, Helen; Holtackers, René; Pelkmans, Lucas
2018-01-23
High-content imaging using automated microscopy and computer vision allows multivariate profiling of single-cell phenotypes. Here, we present methods for the application of the CRISPR-Cas9 system in large-scale, image-based, gene perturbation experiments. We show that CRISPR-Cas9-mediated gene perturbation can be achieved in human tissue culture cells in a timeframe that is compatible with image-based phenotyping. We developed a pipeline to construct a large-scale arrayed library of 2,281 sequence-verified CRISPR-Cas9 targeting plasmids and profiled this library for genes affecting cellular morphology and the subcellular localization of components of the nuclear pore complex (NPC). We conceived a machine-learning method that harnesses genetic heterogeneity to score gene perturbations and identify phenotypically perturbed cells for in-depth characterization of gene perturbation effects. This approach enables genome-scale image-based multivariate gene perturbation profiling using CRISPR-Cas9. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.
Sun, Mingqian; Liu, Jianxun; Lin, Chengren; Miao, Lan; Lin, Li
2014-01-01
Since alkaloids are the major active constituents of Rhizoma corydalis (RC), a convenient and accurate analytical method is needed for their identification and characterization. Here we report a method to profile the alkaloids in RC based on liquid chromatography-tandem quadrupole time-of-flight mass spectrometry (LC–Q-TOF-MS/MS). A total of 16 alkaloids belonging to four different classes were identified by comparison with authentic standards. The fragmentation pathway of each class of alkaloid was clarified and their differences were elucidated. Furthermore, based on an analysis of fragmentation pathways and alkaloid profiling, a rapid and accurate method for the identification of unknown alkaloids in RC is proposed. The method could also be useful for the quality control of RC. PMID:26579385
Shi, Lifang; Du, Chunlei; Dong, Xiaochun; Deng, Qiling; Luo, Xiangang
2007-12-01
An aperiodic mask design method for fabricating a microlens array with an aspherical profile is proposed. The nonlinear relationship between exposure doses and the lens profile is considered, and the selection criteria for the quantization interval and the fabrication range of the method are given. The mask function of a quadrangle microlens array with a hyperboloid profile used in the infrared was constructed by using this method. The microlens array can be effectively fabricated in a single exposure process using the mask. Reactive ion etching was carried out to transfer the structure into the germanium substrate. The measurement results indicate that the roughness is less than 10 nm (pv), and the profile error is less than 40 nm (rms).
Villeneuve-Faure, C; Boudou, L; Makasheva, K; Teyssedre, G
2017-12-15
To understand the physical phenomena occurring at metal/dielectric interfaces, determination of the charge density profile at the nanoscale is crucial. To deal with this issue, charges were injected by applying a DC voltage on lateral Al-electrodes embedded in a SiNx thin dielectric layer. The surface potential induced by the injected charges was probed by Kelvin probe force microscopy (KPFM). It was found that the KPFM frequency mode is the better adapted method to probe the charge profile accurately. To extract the charge density profile from the surface potential, two numerical approaches based on the solution to Poisson's equation for electrostatics were investigated: the second derivative model method, already reported in the literature, and a new 2D method based on the finite element method (FEM). Results highlight that the FEM is more robust to noise or artifacts in the case of a non-flat initial surface potential. Moreover, according to the theoretical study the FEM appears to be a good candidate for determining charge density in dielectric films with thicknesses in the range from 10 nm to 10 μm. By applying this method, the charge density profile was determined at the nanoscale, highlighting that the charge cloud remains close to the interface.
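A one-dimensional illustration of the second derivative model mentioned above is sketched below, assuming the charge density can be approximated as rho(x) ≈ -ε0·εr·d²V/dx² from the measured surface-potential profile; the permittivity value and the synthetic potential are assumptions, and the 2D FEM approach is not reproduced.

```python
# One-dimensional illustration of the "second derivative model": the charge density is
# estimated as rho(x) ~ -eps0*eps_r * d^2V/dx^2 from a KPFM-like potential profile.
# The permittivity and the synthetic potential below are illustrative assumptions.
import numpy as np

eps0 = 8.854e-12           # vacuum permittivity (F/m)
eps_r = 7.0                # assumed relative permittivity of the SiNx layer

x = np.linspace(-2e-6, 2e-6, 401)                        # lateral position (m)
V = 1.5 * np.exp(-(x / 0.5e-6) ** 2)                     # synthetic surface potential (V)

d2V = np.gradient(np.gradient(V, x), x)                  # numerical second derivative
rho = -eps0 * eps_r * d2V                                # charge density estimate (C/m^3)
print(f"peak |rho| = {np.abs(rho).max():.3e} C/m^3 at x = {x[np.abs(rho).argmax()]*1e6:.2f} um")
```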
Bansback, Nick; Sizto, Sonia; Guh, Daphne; Anis, Aslam H
2012-10-01
Numerous websites offer direct-to-consumer (DTC) genetic testing, yet it is unknown how individuals will react to genetic risk profiles online. The objective of this study was to determine the feasibility of using a web-based survey and conjoint methods to elicit individuals' interpretations of genetic risk profiles by their anticipated worry/anxiousness and health-seeking behaviors. A web-based survey was developed using conjoint methods. Each survey presented 12 hypothetical genetic risk profiles describing genetic test results for four diseases. Test results were characterized by the type of disease (eight diseases), individual risk (five levels), and research confidence (three levels). After each profile, four questions were asked regarding anticipated worry and health-seeking behaviors. Probabilities of response outcomes based on attribute levels were estimated from logistic regression models, adjusting for covariates. Overall, 319 participants (69%) completed 3828 unique genetic risk profiles. Across all profiles, most participants anticipated making doctor's appointments (63%), lifestyle changes (57%), and accessing screening (57%); 40% anticipated feeling more worried and anxious. Higher levels of disease risk were significantly associated with affirmative responses. Conjoint methods may be used to elicit reactions to genetic information online. Preliminary results suggest that genetic information may increase worry/anxiousness and health-seeking behaviors among consumers of DTC tests. Further research is planned to determine the appropriateness of these affects and behaviors.
NASA Astrophysics Data System (ADS)
Geloni, G.; Saldin, E. L.; Schneidmiller, E. A.; Yurkov, M. V.
2004-08-01
An effective and practical technique based on the detection of the coherent synchrotron radiation (CSR) spectrum can be used to characterize the profile function of ultra-short bunches. The CSR spectrum measurement has an important limitation: no spectral phase information is available, and the complete profile function cannot be obtained in general. In this paper we propose to use a constrained deconvolution method for bunch profile reconstruction based on a priori known information about the formation of the electron bunch. Application of the method is illustrated with a practically important example of a bunch formed in a single bunch-compressor. Downstream of the bunch compressor the bunch charge distribution is strongly non-Gaussian, with a narrow leading peak and a long tail. The longitudinal bunch distribution is derived by measuring the bunch tail constant with a streak camera and by using a priori available information about the profile function.
NASA Technical Reports Server (NTRS)
Huang, Norden E. (Inventor)
2001-01-01
A computer implemented method of processing two-dimensional physical signals includes five basic components and the associated presentation techniques of the results. The first component decomposes the two-dimensional signal into one-dimensional profiles. The second component is a computer implemented Empirical Mode Decomposition that extracts a collection of Intrinsic Mode Functions (IMF's) from each profile based on local extrema and/or curvature extrema. The decomposition is based on the direct extraction of the energy associated with various intrinsic time scales in the profiles. In the third component, the IMF's of each profile are then subjected to a Hilbert Transform. The fourth component collates the Hilbert transformed IMF's of the profiles to form a two-dimensional Hilbert Spectrum. A fifth component manipulates the IMF's by, for example, filtering the two-dimensional signal by reconstructing the two-dimensional signal from selected IMF(s).
NASA Astrophysics Data System (ADS)
Acharya, S.; Mylavarapu, R.; Jawitz, J. W.
2012-12-01
In shallow unconfined aquifers, the water table usually shows a distinct diurnal fluctuation pattern corresponding to the twenty-four hour solar radiation cycle. This diurnal water table fluctuation (DWTF) signal can be used to estimate the groundwater evapotranspiration (ETg) by vegetation, a method known as the White [1932] method. Water table fluctuations in shallow phreatic aquifers are controlled by two distinct storage parameters, the drainable porosity (or specific yield) and the fillable porosity. Yet, it is implicitly assumed in most studies that these two parameters are equal, unless the hysteresis effect is considered. The White-based method available in the literature is also based on a single drainable porosity parameter to estimate ETg. In this study, we present a modification of the White-based method to estimate ETg from DWTF using separate drainable (λd) and fillable (λf) porosity parameters. Separate analytical expressions based on successive steady state moisture profiles are used to estimate λd and λf, instead of the commonly employed hydrostatic moisture profile approach. The modified method is then applied to estimate ETg using the DWTF data observed in a field in northeast Florida and the results are compared with ET estimates from the standard Penman-Monteith equation. It is found that the modified method results in significantly better estimates of ETg than the previously available method that used only a single, hydrostatic-moisture-profile-based λd. Furthermore, the modified method can also be used to estimate ETg even during rainfall events, where it produced significantly better estimates of ETg than the single-λd-parameter method.
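One plausible two-parameter form of a White-type water balance is sketched below; the exact partition between λd and λf is an assumption (night-time recovery inflow scaled by the fillable porosity, net daily decline scaled by the drainable porosity), chosen so that it reduces to the classic White (1932) formula when the two parameters are equal. It is not necessarily the paper's exact set of equations.

```python
# One plausible two-parameter White-type water balance (an assumption, not necessarily
# the paper's equations): groundwater inflow, estimated from the night-time recovery
# rate r, is scaled by the fillable porosity, and the net daily water-table decline by
# the drainable porosity. With lam_d == lam_f this reduces to White (1932):
# ETg = Sy * (24*r + ds).
def etg_modified_white(r, ds, lam_d, lam_f):
    """r: night-time recovery rate (m/h); ds: net water-table decline over 24 h (m, >0 if falling)."""
    return lam_f * 24.0 * r + lam_d * ds     # groundwater evapotranspiration (m/day)

# Example: 0.5 mm/h recovery, 6 mm net daily decline, lam_d = 0.10, lam_f = 0.06.
print(etg_modified_white(r=0.0005, ds=0.006, lam_d=0.10, lam_f=0.06))   # ~0.00132 m/day
```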
NASA Astrophysics Data System (ADS)
Cantero, Francisco; Castro-Orgaz, Oscar; Garcia-Marín, Amanda; Ayuso, José Luis; Dey, Subhasish
2015-10-01
Is the energy equation for gradually-varied flow the best approximation for free surface profile computations in river flows? Determination of flood inundation in rivers and natural waterways is based on the hydraulic computation of flow profiles. This is usually done using energy-based gradually-varied flow models, such as HEC-RAS, which adopt a vertical division method for discharge prediction in compound channel sections. However, this discharge prediction method is not sufficiently accurate in light of the advances made over the last three decades. This paper first presents a study of the impact of discharge prediction on gradually-varied flow computations by comparing thirteen different methods for compound channels, where both the energy and momentum equations are applied. The discharge, velocity distribution coefficients, specific energy, momentum and flow profiles are determined. After the study of gradually-varied flow predictions, a new theory is developed to produce higher-order energy and momentum equations for rapidly-varied flow in compound channels. These generalized equations make it possible to describe the flow profiles with more generality than gradually-varied flow computations. As an outcome, the gradually-varied flow results provide realistic conclusions for computations of flow in compound channels, showing that momentum-based models are in general more accurate; whereas the new theory developed for rapidly-varied flow opens a new research direction, so far not investigated in flows through compound channels.
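For reference, a minimal energy-based gradually-varied-flow computation for a simple rectangular channel (not a compound section) is sketched below: the backwater equation dy/dx = (S0 - Sf)/(1 - Fr²) is integrated upstream from a control depth, with Manning's equation supplying the friction slope. All channel parameters are illustrative assumptions.

```python
# Minimal sketch of an energy-based gradually-varied-flow computation for a simple
# rectangular channel: integrate dy/dx = (S0 - Sf)/(1 - Fr^2) upstream from a known
# control depth, using Manning's equation for the friction slope Sf.
import numpy as np

g, n = 9.81, 0.03             # gravity (m/s^2), Manning roughness (assumed)
Q, B, S0 = 20.0, 10.0, 0.001  # discharge (m^3/s), width (m), bed slope (assumed)

def dydx(y):
    A = B * y                                     # flow area
    P = B + 2 * y                                 # wetted perimeter
    V = Q / A
    Sf = (n * V) ** 2 / (A / P) ** (4.0 / 3.0)    # Manning friction slope
    Fr2 = V ** 2 / (g * y)                        # Froude number squared (rectangular)
    return (S0 - Sf) / (1.0 - Fr2)

# March upstream (negative dx) from a downstream control depth of 3.0 m (M1 backwater).
dx, y = -10.0, 3.0
profile = [y]
for _ in range(200):
    y = y + dydx(y) * dx
    profile.append(y)
print(round(profile[-1], 3))   # depth ~2 km upstream, decreasing toward normal depth
```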
A simple method to incorporate water vapor absorption in the 15 microns remote temperature sounding
NASA Technical Reports Server (NTRS)
Dallu, G.; Prabhakara, C.; Conhath, B. J.
1975-01-01
The water vapor absorption in the 15 micron CO2 band, which can affect the remotely sensed temperatures near the surface, is estimated with the help of an empirical method. This method is based on the differential absorption properties of water vapor in the 11-13 micron window region and does not require a detailed knowledge of the water vapor profile. With this approach, Nimbus 4 IRIS radiance measurements are inverted to obtain temperature profiles. These calculated profiles agree with radiosonde data to within about 2 C.
Exploring neighborhoods in the metagenome universe.
Aßhauer, Kathrin P; Klingenberg, Heiner; Lingner, Thomas; Meinicke, Peter
2014-07-14
The variety of metagenomes in current databases provides a rapidly growing source of information for comparative studies. However, the quantity and quality of supplementary metadata is still lagging behind. It is therefore important to be able to identify related metagenomes by means of the available sequence data alone. We have studied efficient sequence-based methods for large-scale identification of similar metagenomes within a database retrieval context. In a broad comparison of different profiling methods we found that vector-based distance measures are well suited for the detection of metagenomic neighbors. Our evaluation on more than 1700 publicly available metagenomes indicates that for a query metagenome from a particular habitat, on average nine out of ten nearest neighbors represent the same habitat category, independent of the utilized profiling method or distance measure. While for well-defined labels a neighborhood accuracy of 100% can be achieved, in general the neighbor detection is severely affected by a natural overlap of manually annotated categories. In addition, we present results of a novel visualization method that is able to reflect the similarity of metagenomes in a 2D scatter plot. The visualization method shows a similarly high accuracy in the reduced space as compared with the high-dimensional profile space. Our study suggests that for inspection of metagenome neighborhoods the profiling methods and distance measures can be chosen to provide a convenient interpretation of results in terms of the underlying features. Furthermore, supplementary metadata of metagenome samples in the future needs to comply with readily available ontologies for fine-grained and standardized annotation. To make profile-based k-nearest-neighbor search and the 2D-visualization of the metagenome universe available to the research community, we included the proposed methods in our CoMet-Universe server for comparative metagenome analysis.
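A hedged sketch of the neighborhood evaluation: metagenomes are represented by normalized profile vectors (synthetic stand-ins here), and a leave-one-out nearest-neighbor search with cosine distance checks how often the closest metagenome shares the habitat label.

```python
# Hedged sketch of the neighborhood idea: metagenomes are represented by normalized
# profiles, and for each sample we check whether its nearest neighbor (cosine distance,
# leave-one-out) shares the same habitat label. All data below are synthetic.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(7)
habitats = np.repeat([0, 1, 2], 30)                      # three habitat categories
centers = rng.random((3, 100))
profiles = centers[habitats] + 0.05 * rng.random((90, 100))
profiles /= profiles.sum(axis=1, keepdims=True)          # relative abundances

D = cdist(profiles, profiles, metric="cosine")
np.fill_diagonal(D, np.inf)                              # leave-one-out
nearest = D.argmin(axis=1)
accuracy = (habitats[nearest] == habitats).mean()
print(f"nearest-neighbor habitat agreement: {accuracy:.2f}")
```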
Efficient data assimilation algorithm for bathymetry application
NASA Astrophysics Data System (ADS)
Ghorbanidehno, H.; Lee, J. H.; Farthing, M.; Hesser, T.; Kitanidis, P. K.; Darve, E. F.
2017-12-01
Information on the evolving state of the nearshore zone bathymetry is crucial to shoreline management, recreational safety, and naval operations. The high cost and complex logistics of using ship-based surveys for bathymetry estimation have encouraged the use of remote sensing techniques. Data assimilation methods combine the remote sensing data and nearshore hydrodynamic models to estimate the unknown bathymetry and the corresponding uncertainties. In particular, several recent efforts have combined Kalman filter-based techniques such as ensemble-based Kalman filters with indirect video-based observations to address the bathymetry inversion problem. However, these methods often suffer from ensemble collapse and uncertainty underestimation. Here, the Compressed State Kalman Filter (CSKF) method is used to estimate the bathymetry based on observed wave celerity. In order to demonstrate the accuracy and robustness of the CSKF method, we consider twin tests with synthetic observations of wave celerity, while the bathymetry profiles are chosen based on surveys taken by the U.S. Army Corps of Engineers Field Research Facility (FRF) in Duck, NC. The first test case is a bathymetry estimation problem for a spatially smooth and temporally constant bathymetry profile. The second test case is a bathymetry estimation problem for a bathymetry evolving in time from a smooth to a non-smooth profile. For both problems, we compare the results of CSKF with those obtained by the local ensemble transform Kalman filter (LETKF), which is a popular ensemble-based Kalman filter method.
NASA Technical Reports Server (NTRS)
Padavala, Satyasrinivas; Palazzolo, Alan B.; Vallely, Pat; Ryan, Steve
1994-01-01
An improved dynamic analysis for liquid annular seals with arbitrary profile based on a method, first proposed by Nelson and Nguyen, is presented. An improved first order solution that incorporates a continuous interpolation of perturbed quantities in the circumferential direction, is presented. The original method uses an approximation scheme for circumferential gradients, based on Fast Fourier Transforms (FFT). A simpler scheme based on cubic splines is found to be computationally more efficient with better convergence at higher eccentricities. A new approach of computing dynamic coefficients based on external specified load is introduced. This improved analysis is extended to account for arbitrarily varying seal profile in both axial and circumferential directions. An example case of an elliptical seal with varying degrees of axial curvature is analyzed. A case study based on actual operating clearances of an interstage seal of the Space Shuttle Main Engine High Pressure Oxygen Turbopump is presented.
Peterson, Leif E
2002-01-01
CLUSFAVOR (CLUSter and Factor Analysis with Varimax Orthogonal Rotation) 5.0 is a Windows-based computer program for hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles. CLUSFAVOR 5.0 standardizes input data; sorts data according to gene-specific coefficient of variation, standard deviation, average and total expression, and Shannon entropy; performs hierarchical cluster analysis using nearest-neighbor, unweighted pair-group method using arithmetic averages (UPGMA), or furthest-neighbor joining methods, and Euclidean, correlation, or jack-knife distances; and performs principal-component analysis. PMID:12184816
Christopher, David; Adams, Wallace P; Lee, Douglas S; Morgan, Beth; Pan, Ziqing; Singh, Gur Jai Pal; Tsong, Yi; Lyapustina, Svetlana
2007-01-19
The purpose of this article is to present the thought process, methods, and interim results of a PQRI Working Group, which was charged with evaluating the chi-square ratio test as a potential method for determining in vitro equivalence of aerodynamic particle size distribution (APSD) profiles obtained from cascade impactor measurements. Because this test was designed with the intention of being used as a tool in regulatory review of drug applications, the capability of the test to detect differences in APSD profiles correctly and consistently was evaluated in a systematic way across a designed space of possible profiles. To establish a "base line," properties of the test in the simplest case of pairs of identical profiles were studied. Next, the test's performance was studied with pairs of profiles, where some difference was simulated in a systematic way on a single deposition site using realistic product profiles. The results obtained in these studies, which are presented in detail here, suggest that the chi-square ratio test in itself is not sufficient to determine equivalence of particle size distributions. This article, therefore, introduces the proposal to combine the chi-square ratio test with a test for impactor-sized mass based on Population Bioequivalence and describes methods for evaluating discrimination capabilities of the combined test. The approaches and results described in this article elucidate some of the capabilities and limitations of the original chi-square ratio test and provide rationale for development of additional tests capable of comparing APSD profiles of pharmaceutical aerosols.
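The following sketch illustrates a chi-square-type comparison of deposition profiles in the spirit of the test discussed above; the distance definition, the reference-vs-reference normalization, and the synthetic profiles are all assumptions, and the exact PQRI statistic and its decision thresholds are not reproduced.

```python
# Illustrative chi-square-type comparison of two cascade-impactor deposition profiles
# (normalized to fractions per site). The ratio of mean test-vs-reference distance to the
# median reference-vs-reference distance mimics the spirit of the chi-square ratio test;
# it is not the exact PQRI statistic.
import numpy as np
from itertools import combinations

def chisq_dist(p, q):
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    return np.sum((p - q) ** 2 / np.where(m > 0, m, 1.0))

rng = np.random.default_rng(8)
ref_batches = np.abs(rng.normal([10, 25, 30, 20, 10, 5], 1.0, size=(12, 6)))  # reference lots
test = np.abs(rng.normal([10, 22, 33, 20, 10, 5], 1.0))                       # test lot

d_rr = [chisq_dist(a, b) for a, b in combinations(ref_batches, 2)]
d_tr = np.mean([chisq_dist(test, r) for r in ref_batches])
print(f"chi-square ratio (test/ref vs ref/ref): {d_tr / np.median(d_rr):.2f}")
```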
Fractal Clustering and Knowledge-driven Validation Assessment for Gene Expression Profiling.
Wang, Lu-Yong; Balasubramanian, Ammaiappan; Chakraborty, Amit; Comaniciu, Dorin
2005-01-01
DNA microarray experiments generate a substantial amount of information about global gene expression. Gene expression profiles can be represented as points in multi-dimensional space. It is essential to identify relevant groups of genes in biomedical research. Clustering is helpful for pattern recognition in gene expression profiles. A number of clustering techniques have been introduced. However, these traditional methods mainly utilize shape-based assumptions or some distance metric to cluster the points in multi-dimensional linear Euclidean space. Their results show poor consistency with the functional annotation of genes in previous validation studies. From a different perspective, we propose a fractal clustering method to cluster genes using the intrinsic (fractal) dimension from modern geometry. This method clusters points in such a way that points in the same cluster are more self-affine among themselves than to the points in other clusters. We assess this method using an annotation-based validation assessment for gene clusters. It shows that this method is superior to other traditional methods in identifying functionally related gene groups.
DNA melting profiles from a matrix method.
Poland, Douglas
2004-02-05
In this article we give a new method for the calculation of DNA melting profiles. Based on the matrix formulation of the DNA partition function, the method relies for its efficiency on the fact that the required matrices are very sparse, essentially reducing matrix multiplication to vector multiplication and thus making the computer time required to treat a DNA molecule containing N base pairs proportional to N(2). A key ingredient in the method is the result that multiplication by the inverse matrix can also be reduced to vector multiplication. The task of calculating the melting profile for the entire genome is further reduced by treating regions of the molecule between helix-plateaus, thus breaking the molecule up into independent parts that can each be treated individually. The method is easily modified to incorporate changes in the assignment of statistical weights to the different structural features of DNA. We illustrate the method using the genome of Haemophilus influenzae. Copyright 2003 Wiley Periodicals, Inc.
A taxonomy of adolescent health: development of the adolescent health profile-types.
Riley, A W; Green, B F; Forrest, C B; Starfield, B; Kang, M; Ensminger, M E
1998-08-01
The aim of this study was to develop a taxonomy of health profile-types that describe adolescents' patterns of health as self-reported on a health status questionnaire. The intent was to be able to assign individuals to mutually exclusive and exhaustive groups that characterize the important aspects of their health and need for health services. Cluster analytic empirical methods and clinically based conceptual methods were used to identify patterns of health in samples of adolescents from schools and from clinics that serve adolescents with chronic conditions and acute illnesses. Individuals with similar patterns of scores across multiple domains were assigned to the same profile-type. Results from the empirical and conceptually based methods were integrated to produce a practical system for assigning youths to profile-types. Four domains of health (Satisfaction, Discomfort, Risks and Resilience) were used to group individuals into 13 distinct profile-types. The profile-types were characterized primarily by the number of domains in which health is poor, identifying the unique combinations of problems that characterize different subgroups of adolescents. This method of reporting the information available on health status surveys is potentially a more informative way of identifying and classifying the health needs of subgroups in the population than is available from global scores or multiple scale scores. The reliability and validity of this taxonomy of health profile-types for the purposes of planning and evaluating health services must be demonstrated. That is the purpose of the accompanying study.
Method for measuring radial impurity emission profiles using correlations of line integrated signals
NASA Astrophysics Data System (ADS)
Kuldkepp, M.; Brunsell, P. R.; Drake, J.; Menmuir, S.; Rachlew, E.
2006-04-01
A method of determining radial impurity emission profiles is outlined. The method uses correlations between line integrated signals and is based on the assumption of cylindrically symmetric fluctuations. Measurements at the reversed field pinch EXTRAP T2R show that, in both raw and analyzed data, emission from impurities expected to be close to the edge is clearly different from that of impurities expected to be more central. Best fitting of experimental data to simulated correlation coefficients yields emission profiles that are remarkably close to emission profiles determined using more conventional techniques. The radial extension of the fluctuations is small enough for the method to be used, and bandpass filtered signals indicate that fluctuations below 10 kHz are cylindrically symmetric. The novel method is not sensitive to vessel window attenuation or wall reflections and can therefore complement the standard methods in the impurity emission reconstruction procedure.
Using distances between Top-n-gram and residue pairs for protein remote homology detection.
Liu, Bin; Xu, Jinghao; Zou, Quan; Xu, Ruifeng; Wang, Xiaolong; Chen, Qingcai
2014-01-01
Protein remote homology detection is one of the central problems in bioinformatics, which is important for both basic research and practical application. Currently, discriminative methods based on Support Vector Machines (SVMs) achieve the state-of-the-art performance. Exploring feature vectors incorporating the position information of amino acids or other protein building blocks is a key step to improve the performance of the SVM-based methods. Two new methods for protein remote homology detection are proposed, called SVM-DR and SVM-DT. SVM-DR is a sequence-based method, in which the feature vector representation for a protein is based on the distances between residue pairs. SVM-DT is a profile-based method, which considers the distances between Top-n-gram pairs. Top-n-gram can be viewed as a profile-based building block of proteins, which is calculated from the frequency profiles. These two methods are position-dependent approaches incorporating the sequence-order information of protein sequences. Various experiments were conducted on a benchmark dataset containing 54 families and 23 superfamilies. Experimental results showed that these two new methods are very promising. Compared with the position-independent methods, the performance improvement is obvious. Furthermore, the proposed methods can also provide useful insights for studying the features of protein families. The better performance of the proposed methods demonstrates that the position-dependent approaches are efficient for protein remote homology detection. Another advantage of our methods arises from the explicit feature space representation, which can be used to analyze the characteristic features of protein families. The source code of SVM-DT and SVM-DR is available at http://bioinformatics.hitsz.edu.cn/DistanceSVM/index.jsp.
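A hedged sketch of a sequence-based distance-pair representation in the spirit of SVM-DR follows: for every ordered residue pair and every separation up to a cap, count how often the pair occurs at that distance, then normalize. The profile-based SVM-DT variant would use Top-n-gram building blocks derived from frequency profiles instead of raw residues; the cap value and normalization below are assumptions.

```python
# Hedged sketch of a distance-pair feature vector in the spirit of SVM-DR: counts of
# ordered residue pairs (a, b) separated by each distance d up to max_d, normalized to
# frequencies. The profile-based Top-n-gram variant (SVM-DT) is not reproduced here.
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"
AA_IDX = {a: i for i, a in enumerate(AA)}

def distance_pair_features(seq, max_d=3):
    feats = np.zeros((max_d, 20, 20))
    for d in range(1, max_d + 1):
        for i in range(len(seq) - d):
            a, b = AA_IDX.get(seq[i]), AA_IDX.get(seq[i + d])
            if a is not None and b is not None:
                feats[d - 1, a, b] += 1
    total = feats.sum()
    return feats.ravel() / total if total else feats.ravel()   # 20*20*max_d vector

x = distance_pair_features("MKVLAAGLLALACS")
print(x.shape, x.sum())     # (1200,) 1.0
```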
New phenolic components and chromatographic profiles of green and fermented teas
USDA-ARS?s Scientific Manuscript database
A standardized profiling method based on liquid chromatography with diode array and electrospray ionization/mass spectrometric detection (LC-DAD-ESI/MS) was applied to establish the phenolic profiles of 41 green teas and 25 fermented teas. More than 80 phenolic compounds were either identified that ...
NASA Astrophysics Data System (ADS)
Kavungal, Vishnu; Farrell, Gerald; Wu, Qiang; Kumar Mallik, Arun; Semenova, Yuliya
2018-03-01
This paper experimentally demonstrates a method for geometrical profiling of asymmetries in fabricated thin microfiber tapers with waist diameters ranging from ∼10 to ∼50 μm with submicron accuracy. The method is based on the analysis of whispering gallery mode resonances excited in cylindrical fiber resonators as a result of evanescent coupling of light propagating through the fiber taper. The submicron accuracy of the proposed method has been verified by SEM studies. The method can be applied as a quality control tool in fabrication of microfiber based devices and sensors or for fine-tuning of microfiber fabrication set-ups.
Nehela, Yasser; Hijaz, Faraj; Elzaawely, Abdelnaser A; El-Zahaby, Hassan M; Killiny, Nabil
2016-07-20
Phytohormones mainly affect plant development and trigger varied responses to biotic and abiotic stresses. The sensitivity of methods used to profile phytohormones is a vital factor that affects the results. We used an improved GC-MS-based method in the selective ion-monitoring (SIM) mode to study the phytohormone profiling in citrus tissues. One extraction solvent mixture and two derivatization reagents were used, methyl chloroformate (MCF) and N-Methyl-N-(trimethylsilyl) trifluoroacetamide (MSTFA). The method showed a low limit of detection and low limit of quantification with high extraction recovery percentage and reproducibility. Overall, we detected 13 phytohormones belonging to six different groups. Auxins, SAs, tJA, and ABA were detected after derivatization with MCF while cytokinins and GAs were detected after derivatization with MSTFA. Cytokinins, SAs, and gibberellins were found in all tissues while auxins and tJA were observed only in the leaves. ABA was found in leaves and roots, but not in root tips. The method we used is efficient, precise, and appropriate to study citrus phytohormonal profiles to understand their crosstalk and responses to environmental and biological stresses. Copyright © 2016 Elsevier GmbH. All rights reserved.
Method and apparatus for an optical function generator for seamless tiled displays
NASA Technical Reports Server (NTRS)
Johnson, Michael (Inventor); Chen, Chung-Jen (Inventor)
2004-01-01
Producing seamless tiled images from multiple displays includes measuring a luminance profile of each of the displays, computing a desired luminance profile for each of the displays, and determining a spatial gradient profile of each of the displays based on the measured luminance profile and the computed desired luminance profile. The determined spatial gradient profile is applied to a spatial filter to be inserted into each of the displays to produce the seamless tiled display image.
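One plausible, deliberately simplified reading of the correction step is sketched below: dividing the desired luminance by the measured luminance gives a per-pixel attenuation map for each display, whose spatial gradient can then be inspected before realizing it as a physical spatial filter. This is not the patented algorithm, only an illustration of the relationship between the measured, desired and correction profiles; the example luminance values are invented.

```python
# Illustrative only: per-pixel attenuation that would pull each display's
# measured luminance toward the common desired profile, clipped to [0, 1]
# because a passive filter can only attenuate light.
import numpy as np

def correction_filter(measured, desired, eps=1e-6):
    gain = desired / np.maximum(measured, eps)
    return np.clip(gain, 0.0, 1.0)

measured = np.array([[220., 235., 230.], [225., 240., 232.]])   # cd/m^2, invented
desired = np.full_like(measured, 220.0)
flt = correction_filter(measured, desired)
gradient = np.gradient(flt)        # spatial gradient of the correction profile
print(flt.round(3))
```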
PanFP: pangenome-based functional profiles for microbial communities.
Jun, Se-Ran; Robeson, Michael S; Hauser, Loren J; Schadt, Christopher W; Gorin, Andrey A
2015-09-26
For decades there has been increasing interest in understanding the relationships between microbial communities and ecosystem functions. Current DNA sequencing technologies allow for the exploration of microbial communities in two principal ways: targeted rRNA gene surveys and shotgun metagenomics. For large study designs, it is often still prohibitively expensive to sequence metagenomes at both the breadth and depth necessary to statistically capture the true functional diversity of a community. Although rRNA gene surveys provide no direct evidence of function, they do provide a reasonable estimation of microbial diversity, while being a very cost-effective way to screen samples of interest for later shotgun metagenomic analyses. However, there is a great deal of 16S rRNA gene survey data currently available from diverse environments, and thus a need for tools to infer the functional composition of environmental samples based on 16S rRNA gene survey data. We present a computational method called pangenome-based functional profiles (PanFP), which infers functional profiles of microbial communities from 16S rRNA gene survey data for Bacteria and Archaea. PanFP is based on pangenome reconstruction of a 16S rRNA gene operational taxonomic unit (OTU) from known genes and genomes pooled from the OTU's taxonomic lineage. From this lineage, we derive an OTU functional profile by weighting a pangenome's functional profile with the OTU's abundance observed in a given sample. We validated our method by comparing PanFP to the functional profiles obtained from the direct shotgun metagenomic measurement of 65 diverse communities via Spearman correlation coefficients. These correlations improved with increasing sequencing depth, within the range of 0.8-0.9 for the most deeply sequenced Human Microbiome Project mock community samples. PanFP is very similar in performance to another recently released tool, PICRUSt, for almost all of the survey data analysed here. However, our method is unique in that any OTU-building method can be used, as opposed to being limited to closed-reference OTU picking strategies against specific reference sequence databases. We developed an automated computational method, which derives an inferred functional profile based on the 16S rRNA gene surveys of microbial communities. The inferred functional profile provides a cost-effective way to study complex ecosystems through predicted comparative functional metagenomes and metadata analysis. All PanFP source code and additional documentation are freely available online at GitHub ( https://github.com/srjun/PanFP ).
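The core weighting step described above can be illustrated in a few lines: each OTU's pangenome functional profile is scaled by the OTU's observed abundance, and the weighted profiles are summed into a community-level functional profile. The sketch below uses invented OTU names and KEGG-style function identifiers and is not the PanFP code itself.

```python
# Minimal sketch of a PanFP-style weighting step: pangenome functional profiles
# (function -> gene count) are scaled by OTU abundances and summed over OTUs.
from collections import defaultdict

def community_functional_profile(otu_abundance, pangenome_profiles):
    profile = defaultdict(float)
    for otu, abundance in otu_abundance.items():
        for function, count in pangenome_profiles.get(otu, {}).items():
            profile[function] += abundance * count
    return dict(profile)

otu_abundance = {"OTU_1": 120, "OTU_2": 30}                 # invented counts
pangenome_profiles = {
    "OTU_1": {"K00001": 2, "K00002": 1},
    "OTU_2": {"K00002": 3},
}
print(community_functional_profile(otu_abundance, pangenome_profiles))
```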
NASA Astrophysics Data System (ADS)
Chen, Min; Zhang, Yu
2017-04-01
A wind profiler network with a total of 65 profiling radars was operated by the MOC/CMA in China until July 2015. In this study, a quality control procedure is constructed to incorporate the profiler data from the wind-profiling network into the local data assimilation and forecasting system (BJRUC). The procedure applies a blacklisting check that removes stations with gross errors and an outlier check that rejects data with large deviations from the background. Instead of the bi-weighting method, which has been commonly implemented in outlier elimination for one-dimensional scalar observations, an outlier elimination method is developed based on the iterated reweighted minimum covariance determinant (IRMCD) for multivariate observations such as wind profiler data. A quality control experiment is performed separately for subsets containing profiler data tagged, in parallel, with and without rain flags at 00 UTC and 12 UTC from 20 June to 30 September 2015. From the results, we find that with the quality control, the frequency distributions of the differences between the observations and the model background become more Gaussian-like and meet the Gaussian-error assumption required for data assimilation. Further assessment of each quality control step reveals that the stations rejected by blacklisting have poor data quality, and that the IRMCD rejects outliers in a robust and physically reasonable manner.
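As a simplified stand-in for the IRMCD outlier check, the sketch below fits a single (non-iterated) Minimum Covariance Determinant estimate to two-dimensional observation-minus-background wind departures and rejects points whose robust Mahalanobis distance exceeds a chi-square cutoff; the real procedure adds iterated reweighting and operates on the actual profiler innovations. The synthetic data and the cutoff are illustrative only.

```python
# Robust-distance outlier rejection on (u, v) O-B departures using scikit-learn's
# MCD estimator; this is only a simplified illustration of the IRMCD idea.
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
innovations = rng.normal(0.0, 1.5, size=(200, 2))      # well-behaved departures
innovations[:5] += 12.0                                 # a few gross errors

mcd = MinCovDet(random_state=0).fit(innovations)
d2 = mcd.mahalanobis(innovations)                       # squared robust distances
cutoff = chi2.ppf(0.975, df=innovations.shape[1])
keep = d2 <= cutoff
print(f"rejected {np.count_nonzero(~keep)} of {len(keep)} data points")
```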
NASA Technical Reports Server (NTRS)
Green, Steven M.; Den Braven, Wim; Williams, David H.
1991-01-01
The profile negotiation process (PNP) concept as applied to the management of arrival traffic within the extended terminal area is presented, focusing on functional issues from the ground-based perspective. The PNP is an interactive process between an aircraft and air traffic control (ATC) which combines airborne and ground-based automation capabilities to determine conflict-free trajectories that are as close to an aircraft's preference as possible. Preliminary results from a real-time simulation study show that the controller teams are able to consistently and effectively negotiate conflict-free vertical profiles with 4D-equipped aircraft. The ability of the airborne 4D flight management system to adapt to ATC specified 4D trajectory constraints is found to be a requirement for successful execution of the PNP. It is recommended that the conventional method of cost index iteration for obtaining the minimum fuel 4D trajectory be supplemented by a method which constrains the profile speeds to those desired by ATC.
Structure of Profiled Crystals Based on Solid Solutions of Bi2Te3 and Their X-Ray Diagnostics
NASA Astrophysics Data System (ADS)
Voronin, A. I.; Bublik, V. T.; Tabachkova, N. Yu.; Belov, Yu. M.
2011-05-01
In this work, we used x-ray structural diagnostic data to reveal the formation of structural regularities in profiled polycrystalline ingots based on Bi and Sb chalcogenide solid solutions. In Bi2Te3 lattice crystals, the solid phase grows such that the cleavage surfaces are perpendicular to the crystallization front. The crystallization singularity determines the nature of the growth texture. Because texture is an important factor determining the anisotropy of properties, which in turn determines the suitability of an ingot for production of modules and the possibility of figure of merit improvement, its diagnostics is an important issue for technology testing. Examples of texture analysis using the method of straight pole figure (SPF) construction for profiled crystals are provided. The structure of the surface layers in the profiled ingots was studied after electroerosion cutting. In addition, the method of estimation of the disturbed layer depth based on the nature of texture changes was used.
Kent, Angela D.; Smith, Dan J.; Benson, Barbara J.; Triplett, Eric W.
2003-01-01
Culture-independent DNA fingerprints are commonly used to assess the diversity of a microbial community. However, relating species composition to community profiles produced by community fingerprint methods is not straightforward. Terminal restriction fragment length polymorphism (T-RFLP) is a community fingerprint method in which phylogenetic assignments may be inferred from the terminal restriction fragment (T-RF) sizes through the use of web-based resources that predict T-RF sizes for known bacteria. The process quickly becomes computationally intensive due to the need to analyze profiles produced by multiple restriction digests and the complexity of profiles generated by natural microbial communities. A web-based tool is described here that rapidly generates phylogenetic assignments from submitted community T-RFLP profiles based on a database of fragments produced by known 16S rRNA gene sequences. Users have the option of submitting a customized database generated from unpublished sequences or from a gene other than the 16S rRNA gene. This phylogenetic assignment tool allows users to employ T-RFLP to simultaneously analyze microbial community diversity and species composition. An analysis of the variability of bacterial species composition throughout the water column in a humic lake was carried out to demonstrate the functionality of the phylogenetic assignment tool. This method was validated by comparing the results generated by this program with results from a 16S rRNA gene clone library. PMID:14602639
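A toy version of the matching step behind such a phylogenetic assignment tool is sketched below: observed terminal restriction fragment sizes from several digests are compared against a database of predicted fragment sizes within a size tolerance, and taxa consistent with all digests are reported. The database entries and tolerance are invented; the published web tool is considerably more elaborate.

```python
# Illustrative T-RF matching: a taxon is reported only if every digest's observed
# fragment size agrees with the predicted size within the tolerance.
def assign_trf(observed, database, tol=1.0):
    """observed: {enzyme: fragment size in bp};
    database: {taxon: {enzyme: predicted size in bp}}."""
    hits = []
    for taxon, predicted in database.items():
        if all(abs(observed[e] - predicted.get(e, 1e9)) <= tol for e in observed):
            hits.append(taxon)
    return hits

database = {                                     # invented example entries
    "Polynucleobacter sp.": {"HhaI": 204.0, "MspI": 489.0},
    "Actinobacterium acI":  {"HhaI": 372.0, "MspI": 160.0},
}
print(assign_trf({"HhaI": 204.5, "MspI": 488.8}, database))
```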
Callahan, Damien L; De Souza, David; Bacic, Antony; Roessner, Ute
2009-07-01
Highly polar metabolites, such as sugars and most amino acids, are not retained by conventional RP LC columns. Without sufficient retention, low-concentration compounds are not detected due to ion suppression, and structural isomers are not resolved. In contrast, hydrophilic interaction chromatography (HILIC) and aqueous normal phase chromatography (ANP) retain compounds based on their hydrophilicity and therefore provide a means of separating highly polar compounds. Here, an ANP method based on the diamond hydride stationary phase is presented for profiling biological small molecules by LC. A rapid separation system based upon a fast gradient that delivers reproducible chromatography is presented. Approximately 1000 compounds were reproducibly detected in human urine samples and clear differences between these samples were identified. This chromatography was also applied to xylem fluid from soyabean (Glycine max) plants, in which 400 compounds were detected. This method greatly increases the metabolite coverage over RP-only metabolite profiling in biological samples. We show that both forms of chromatography are necessary for untargeted comprehensive metabolite profiling and that the diamond hydride stationary phase provides a good option for polar metabolite analysis.
Evaluation of Rock Joint Coefficients
NASA Astrophysics Data System (ADS)
Audy, Ondřej; Ficker, Tomáš
2017-10-01
A computer method for the evaluation of rock joint coefficients is described and several applications are presented. The method is based on two absolute numerical indicators that are formed by means of the Fourier replicas of rock joint profiles. The first indicator quantifies the vertical depth of profiles and the second indicator classifies the wavy character of profiles. The absolute indicators have replaced the formerly used relative indicators, which showed artificial behavior in some cases. This contribution focuses on practical computations testing the functionality of the newly introduced indicators.
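The indicators below are not the published formulas; they only illustrate, under stated simplifications, how the Fourier content of a digitized joint profile can be condensed into a depth-like number (overall relief) and a waviness-like number (fraction of spectral power in the lowest harmonics).

```python
# Hedged illustration of Fourier-based roughness indicators for a joint profile.
import numpy as np

def fourier_indicators(heights, low_freq_bins=5):
    y = np.asarray(heights, dtype=float)
    y = y - y.mean()
    spectrum = np.abs(np.fft.rfft(y)) ** 2
    depth_indicator = np.sqrt(y.var())                 # overall relief
    waviness = spectrum[1:low_freq_bins + 1].sum() / max(spectrum[1:].sum(), 1e-12)
    return depth_indicator, waviness                   # waviness in [0, 1]

x = np.linspace(0, 10, 512)
profile = 2.0 * np.sin(0.8 * x) + 0.2 * np.random.default_rng(1).normal(size=x.size)
print(fourier_indicators(profile))
```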
Pathogen profiling for disease management and surveillance.
Sintchenko, Vitali; Iredell, Jonathan R; Gilbert, Gwendolyn L
2007-06-01
The usefulness of rapid pathogen genotyping is widely recognized, but its effective interpretation and application requires integration into clinical and public health decision-making. How can pathogen genotyping data best be translated to inform disease management and surveillance? Pathogen profiling integrates microbial genomics data into communicable disease control by consolidating phenotypic identity-based methods with DNA microarrays, proteomics, metabolomics and sequence-based typing. Sharing data on pathogen profiles should facilitate our understanding of transmission patterns and the dynamics of epidemics.
Performance Assessment of Kernel Density Clustering for Gene Expression Profile Data
Zeng, Beiyan; Chen, Yiping P.; Smith, Oscar H.
2003-01-01
Kernel density smoothing techniques have been used in classification or supervised learning of gene expression profile (GEP) data, but their applications to clustering or unsupervised learning of those data have not been explored and assessed. Here we report a kernel density clustering method for analysing GEP data and compare its performance with the three most widely used clustering methods: hierarchical clustering, K-means clustering, and multivariate mixture model-based clustering. Using several methods to measure agreement, between-cluster isolation, and within-cluster coherence, such as the Adjusted Rand Index, the Pseudo F test, the r2 test, and the profile plot, we have assessed the effectiveness of kernel density clustering for recovering clusters, and its robustness against noise when clustering both simulated and real GEP data. Our results show that the kernel density clustering method has excellent performance in recovering clusters from simulated data and in grouping large real expression profile data sets into compact and well-isolated clusters, and that it is the most robust clustering method for analysing noisy expression profile data compared to the other three methods assessed. PMID:18629292
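Mean shift is one readily available kernel-density-based clustering algorithm and is used below purely to illustrate the idea of grouping expression profiles around density modes; it is not necessarily the kernel density clustering method assessed in the study, and the synthetic profiles are invented.

```python
# Kernel-density-mode clustering of toy expression profiles with mean shift.
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

rng = np.random.default_rng(42)
profiles = np.vstack([
    rng.normal(0.0, 0.3, size=(50, 6)),    # cluster of flat profiles
    rng.normal(2.0, 0.3, size=(50, 6)),    # cluster of up-regulated profiles
])
bandwidth = estimate_bandwidth(profiles, quantile=0.2, random_state=42)
labels = MeanShift(bandwidth=bandwidth).fit_predict(profiles)
print(np.bincount(labels))
```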
Research and development of LANDSAT-based crop inventory techniques
NASA Technical Reports Server (NTRS)
Horvath, R.; Cicone, R. C.; Malila, W. A. (Principal Investigator)
1982-01-01
A wide spectrum of technology pertaining to the inventory of crops using LANDSAT without in situ training data is addressed. Methods considered include Bayesian based through-the-season methods, estimation technology based on analytical profile fitting methods, and expert-based computer aided methods. Although the research was conducted using U.S. data, the adaptation of the technology to the Southern Hemisphere, especially Argentina was considered.
Concurrent profiling of polar metabolites and lipids in human plasma using HILIC-FTMS
NASA Astrophysics Data System (ADS)
Cai, Xiaoming; Li, Ruibin
2016-11-01
Blood plasma is the most widely used sample matrix for metabolite profiling studies, which aim to achieve global metabolite profiling and biomarker discovery. However, most current studies on plasma metabolite profiling have focused on either the polar metabolites or the lipids. In this study, a comprehensive analysis approach based on HILIC-FTMS was developed to concurrently examine polar metabolites and lipids. The HILIC-FTMS method was developed using mixed standards of polar metabolites and lipids, whose separation efficiency is better in HILIC mode than in C5 and C18 reversed-phase (RP) chromatography. This method exhibits good reproducibility in retention times (CVs < 3.43%) and high mass accuracy (<3.5 ppm). In addition, we found that a MeOH/ACN/acetone (1:1:1, v/v/v) extraction cocktail achieved efficient recovery of the desired extracts from plasma samples. We further integrated the MeOH/ACN/acetone extraction with the HILIC-FTMS method for metabolite profiling and smoking-related biomarker discovery in human plasma samples. Heavy smokers could be successfully distinguished from non-smokers by univariate and multivariate statistical analysis of the profiling data, and 62 smoking-related biomarkers were found. These results indicate that our concurrent analysis approach could potentially be used for clinical biomarker discovery, metabolite-based diagnosis, etc.
2013-01-01
Background Microbial ecologists often employ methods from classical community ecology to analyze microbial community diversity. However, these methods have limitations because microbial communities differ from macro-organismal communities in key ways. This study sought to quantify microbial diversity using methods that are better suited for data spanning multiple domains of life and dimensions of diversity. Diversity profiles are one novel, promising way to analyze microbial datasets. Diversity profiles encompass many other indices, provide effective numbers of diversity (mathematical generalizations of previous indices that better convey the magnitude of differences in diversity), and can incorporate taxa similarity information. To explore whether these profiles change interpretations of microbial datasets, diversity profiles were calculated for four microbial datasets from different environments spanning all domains of life as well as viruses. Both similarity-based profiles that incorporated phylogenetic relatedness and naïve (not similarity-based) profiles were calculated. Simulated datasets were used to examine the robustness of diversity profiles to varying phylogenetic topology and community composition. Results Diversity profiles provided insights into microbial datasets that were not detectable with classical univariate diversity metrics. For all datasets analyzed, there were key distinctions between calculations that incorporated phylogenetic diversity as a measure of taxa similarity and naïve calculations. The profiles also provided information about the effects of rare species on diversity calculations. Additionally, diversity profiles were used to examine thousands of simulated microbial communities, showing that similarity-based and naïve diversity profiles only agreed approximately 50% of the time in their classification of which sample was most diverse. This is a strong argument for incorporating similarity information and calculating diversity with a range of emphases on rare and abundant species when quantifying microbial community diversity. Conclusions For many datasets, diversity profiles provided a different view of microbial community diversity compared to analyses that did not take into account taxa similarity information, effective diversity, or multiple diversity metrics. These findings are a valuable contribution to data analysis methodology in microbial ecology. PMID:24238386
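A naive (non-similarity-based) diversity profile can be written down directly as a set of Hill numbers evaluated over a range of orders q, so that rare taxa dominate at small q and abundant taxa at large q; the similarity-based profiles discussed above additionally weight taxa by phylogenetic relatedness, which is omitted in this sketch, and the abundance vector is invented.

```python
# Hill-number diversity profile: effective number of taxa at several orders q.
import numpy as np

def hill_number(abundances, q):
    p = np.asarray(abundances, float)
    p = p[p > 0] / p.sum()
    if np.isclose(q, 1.0):
        return np.exp(-np.sum(p * np.log(p)))          # exp(Shannon entropy)
    return np.sum(p ** q) ** (1.0 / (1.0 - q))

community = [50, 30, 10, 5, 3, 1, 1]                   # invented taxon counts
for q in (0, 0.5, 1, 2, 4):
    print(q, round(hill_number(community, q), 2))
```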
Method of LSD profile asymmetry for estimating the center of mass velocities of pulsating stars
NASA Astrophysics Data System (ADS)
Britavskiy, Nikolay; Pancino, Elena; Romano, Donatella; Tsymbal, Vadim
2015-08-01
We present radial velocity analysis for 20 solar neighborhood RR Lyrae and 3 Population II Cepheids. High-resolution spectra were observed with either TNG/SARG or VLT/UVES over varying phases. To estimate the center of mass (barycentric) velocities of the program stars, we utilized two independent methods. First, the 'classic' method was employed, which is based on RR Lyrae radial velocity curve templates. Second, we provide the new method that used absorption line profile asymmetry to determine both the pulsation and the barycentric velocities even with a low number of high-resolution spectra and in cases where the phase of the observations is uncertain. This new method is based on a Least Squares Deconvolution (LSD) of the line profiles in order to analyze line asymmetry that occurs in the spectra of pulsating stars. By applying this method to our sample stars we attain accurate measurements (± 1 km/s) of the pulsation component of the radial velocity. This results in determination of the barycentric velocity to within 5 km/s even with a low number of high-resolution spectra. A detailed investigation of LSD profile asymmetry shows the variable nature of the projection factor at different pulsation phases, which should be taken into account in the detailed spectroscopic analysis of pulsating stars.
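A schematic of the least squares deconvolution step is given below: the residual spectrum is modelled as a known line mask (line positions and weights) convolved with a single common profile, which is recovered by linear least squares. This toy example on a synthetic spectrum only illustrates the principle; the authors' pipeline, including the asymmetry analysis and velocity calibration, is not reproduced, and all numbers are invented.

```python
# Toy least squares deconvolution: recover a common line profile Z such that
# spectrum ~= M @ Z, where M encodes the line mask (positions and weights).
import numpy as np

def lsd_profile(velocity_grid, spectrum, line_velocities, line_weights, profile_len=41):
    n_pix = spectrum.size
    M = np.zeros((n_pix, profile_len))
    half = profile_len // 2
    dv = velocity_grid[1] - velocity_grid[0]
    for v0, w in zip(line_velocities, line_weights):
        centre = int(round((v0 - velocity_grid[0]) / dv))
        for k in range(profile_len):
            idx = centre + k - half
            if 0 <= idx < n_pix:
                M[idx, k] += w
    Z, *_ = np.linalg.lstsq(M, spectrum, rcond=None)
    return Z                      # common profile on profile_len velocity bins

# Synthetic spectrum: three lines sharing one Gaussian absorption profile.
v = np.arange(-200.0, 200.0, 2.0)
true = -0.3 * np.exp(-0.5 * ((np.arange(41) - 20) / 4.0) ** 2)
lines_v, lines_w = [-120.0, -30.0, 60.0], [1.0, 0.6, 0.8]
spec = np.zeros_like(v)
for v0, w in zip(lines_v, lines_w):
    c = int(round((v0 - v[0]) / 2.0))
    spec[c - 20:c + 21] += w * true
Z = lsd_profile(v, spec, lines_v, lines_w)
print(round(float(Z.min()), 3))   # approximately the true profile depth (-0.3)
```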
Method of LSD profile asymmetry for estimating the center of mass velocities of pulsating stars
NASA Astrophysics Data System (ADS)
Britavskiy, N.; Pancino, E.; Tsymbal, V.; Romano, D.; Cacciari, C.; Clementini, C.
2016-05-01
We present radial velocity analysis for 20 solar neighborhood RR Lyrae and 3 Population II Cepheids. High-resolution spectra were observed with either TNG/SARG or VLT/UVES over varying phases. To estimate the center of mass (barycentric) velocities of the program stars, we utilized two independent methods. First, the 'classic' method was employed, which is based on RR Lyrae radial velocity curve templates. Second, we provide the new method that used absorption line profile asymmetry to determine both the pulsation and the barycentric velocities even with a low number of high-resolution spectra and in cases where the phase of the observations is uncertain. This new method is based on a least squares deconvolution (LSD) of the line profiles in order to analyze line asymmetry that occurs in the spectra of pulsating stars. By applying this method to our sample stars we attain accurate measurements (±2 km s^-1) of the pulsation component of the radial velocity. This results in determination of the barycentric velocity to within 5 km s^-1 even with a low number of high-resolution spectra. A detailed investigation of LSD profile asymmetry shows the variable nature of the projection factor at different pulsation phases, which should be taken into account in the detailed spectroscopic analysis of pulsating stars.
Caumes, Géraldine; Borrel, Alexandre; Abi Hussein, Hiba; Camproux, Anne-Claude; Regad, Leslie
2017-09-01
Small molecules interact with their protein targets on surface cavities known as binding pockets. Pocket-based approaches are very useful in all phases of drug design. Their first step is estimating the binding pocket based on protein structure. The available pocket-estimation methods produce different pockets for the same target. The aim of this work is to investigate the effects of different pocket-estimation methods on the results of pocket-based approaches. We focused on the effect of three pocket-estimation methods on a pocket-ligand (PL) classification. This pocket-based approach is useful for understanding the correspondence between the pocket and ligand spaces and for developing pharmacological profiling models. We found that pocket-estimation methods yield different binding pockets in terms of boundaries and properties. These differences are responsible for the variation in the PL classification results, which can have an impact on the detected correspondence between pocket and ligand profiles. Thus, we highlight the importance of the choice of pocket-estimation method in pocket-based approaches. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
System and method for knowledge based matching of users in a network
Verspoor, Cornelia Maria [Santa Fe, NM; Sims, Benjamin Hayden [Los Alamos, NM; Ambrosiano, John Joseph [Los Alamos, NM; Cleland, Timothy James [Los Alamos, NM
2011-04-26
A knowledge-based system and methods for matchmaking and social network extension are disclosed. The system is configured to allow users to specify knowledge profiles, which are collections of concepts, selected from an underlying knowledge model, that indicate a certain topic or area of interest. The system utilizes the knowledge model as the semantic space within which to compare similarities in user interests. The knowledge model is hierarchical, so that indications of interest in specific concepts automatically imply interest in more general concepts. Similarity measures between profiles may then be calculated based on suitable distance formulas within this space.
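A toy sketch of the hierarchical-matching idea follows: interest in a specific concept is expanded to include all of its ancestors in the concept hierarchy, and two expanded profiles are then compared with a simple set-overlap measure. The taxonomy, concept names and the use of a Jaccard score are all illustrative assumptions, not the patented distance formulas.

```python
# Hierarchical profile matching: specific concepts imply their ancestors, and
# profiles are compared by Jaccard overlap of the expanded concept sets.
PARENT = {                                    # invented concept hierarchy
    "bayesian_networks": "machine_learning",
    "machine_learning": "computer_science",
    "epidemiology": "public_health",
}

def expand(concepts):
    closed = set()
    for c in concepts:
        while c is not None:
            closed.add(c)
            c = PARENT.get(c)
    return closed

def profile_similarity(profile_a, profile_b):
    a, b = expand(profile_a), expand(profile_b)
    return len(a & b) / len(a | b)

print(profile_similarity({"bayesian_networks"}, {"machine_learning"}))
```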
Reprogramming Methods Do Not Affect Gene Expression Profile of Human Induced Pluripotent Stem Cells.
Trevisan, Marta; Desole, Giovanna; Costanzi, Giulia; Lavezzo, Enrico; Palù, Giorgio; Barzon, Luisa
2017-01-20
Induced pluripotent stem cells (iPSCs) are pluripotent cells derived from adult somatic cells. After the pioneering work by Yamanaka, who first generated iPSCs by retroviral transduction of four reprogramming factors, several alternative methods to obtain iPSCs have been developed in order to increase the yield and safety of the process. However, the question remains open as to whether the different reprogramming methods can influence the pluripotency features of the derived lines. In this study, three different strategies, based on retroviral vectors, episomal vectors, and Sendai virus vectors, were applied to derive iPSCs from human fibroblasts. The reprogramming efficiency of the methods based on episomal and Sendai virus vectors was higher than that of the retroviral vector-based approach. All human iPSC clones derived with the different methods showed the typical features of pluripotent stem cells, including the expression of alkaline phosphatase and stemness marker genes, and could give rise to the three germ layer derivatives upon embryoid body assay. Microarray analysis confirmed the presence of typical stem cell gene expression profiles in all iPSC clones and did not identify any significant difference among reprogramming methods. In conclusion, the use of different reprogramming methods is equivalent and does not affect the gene expression profile of the derived human iPSCs.
Liu, Jing; Bredie, Wender L P; Sherman, Emma; Harbertson, James F; Heymann, Hildegarde
2018-04-01
Rapid sensory methods have been developed as alternatives to traditional sensory descriptive analysis methods. Among them, Free-Choice Profiling (FCP) and Flash Profile (FP) are two that have been known for many years. The objectives of this work were to compare the rating-based FCP and the ranking-based FP method; to evaluate the impact of adding adjustments to the FP approach; and to investigate the influence of the number of assessors on the outcome of the modified FP. To achieve these aims, a conventional descriptive analysis (DA), FCP, FP and a modified version of FP were carried out. Red wines made from grapes of different maturity and with different ethanol concentrations were used for sensory testing. This study showed that DA provided more detailed and accurate information on the products, through a quantitative measure of the intensity of sensory attributes, than FCP and FP. However, the panel hours for conducting DA were higher than those for the rapid methods, and FP was even able to separate the samples to a higher degree than DA. When comparing FCP and FP, this study showed that the ranking-based FP provided a clearer separation of samples than the rating-based FCP, but the latter was an easier task for most assessors. When assessors were restricted in their use of attributes in FP, the sample space became clearer and the ranking task was simplified. The FP protocol with restricted attribute sets seems to be a promising approach for efficient screening of sensory properties in wine. When the number of assessors for conducting the modified FP was increased from 10 to 20, the outcome tended to be slightly more stable; however, one should consider the degree of panel training when deciding the optimal number of assessors for conducting FP. Copyright © 2018 Elsevier Ltd. All rights reserved.
Raman lidar water vapor profiling over Warsaw, Poland
NASA Astrophysics Data System (ADS)
Stachlewska, Iwona S.; Costa-Surós, Montserrat; Althausen, Dietrich
2017-09-01
Water vapor mixing ratio and relative humidity profiles were derived from the multi-wavelength Raman PollyXT lidar at the EARLINET site in Warsaw, using the Rayleigh molecular extinction calculation based on atmospheric temperature and pressure from three different sources: i) the standard atmosphere US 62, ii) the Global Data Assimilation System (GDAS) model output, and iii) the WMO 12374 radiosoundings launched at Legionowo. With each method, 136 midnight relative humidity profiles were obtained for lidar observations from July 2013 to August 2015. Comparisons of these profiles favored the latter method (iii), but also indicated that the other two data sources could replace it if necessary. Such use was demonstrated for an automated retrieval of water vapor mixing ratio from dusk until dawn on 19/20 March 2015, a case study related to an advection of biomass-burning aerosol from forest fires over Ukraine. Additionally, an algorithm that applies thresholds to the radiosounding relative humidity profiles to estimate macro-physical cloud vertical structure was used for the first time on the Raman lidar relative humidity profiles. The results, based on a subset of 66 profiles, indicate that below 6 km cloud bases/tops can be successfully obtained in 53% and 76% of cases from lidar and radiosounding profiles, respectively. Finally, the contribution of the lidar-derived mean relative humidity to cloudy conditions within the range of 0.8 to 6.2 km, in comparison to clear-sky conditions, was estimated.
Assigning protein functions by comparative genome analysis protein phylogenetic profiles
Pellegrini, Matteo; Marcotte, Edward M.; Thompson, Michael J.; Eisenberg, David; Grothe, Robert; Yeates, Todd O.
2003-05-13
A computational method, system, and computer program are provided for inferring functional links from genome sequences. One method is based on the observation that some pairs of proteins A' and B' have homologs in another organism fused into a single protein chain AB. A trans-genome comparison of sequences can reveal these AB sequences, which are Rosetta Stone sequences because they decipher an interaction between A' and B'. Another method compares the genomic sequences of two or more organisms to create a phylogenetic profile for each protein, indicating its presence or absence across all the genomes. The profile provides information regarding functional links between different families of proteins. In yet another method, a combination of the above two methods is used to predict functional links.
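The phylogenetic profile idea in the second method can be made concrete with a minimal example: each protein family is encoded as a presence/absence vector across genomes, and families with nearly identical vectors become candidates for a functional link. The genomes, families and Hamming-distance criterion below are illustrative only.

```python
# Minimal phylogenetic profiling: presence/absence vectors across genomes and a
# Hamming distance used to flag candidate functional links.
import numpy as np

genomes = ["E.coli", "B.subtilis", "M.jannaschii", "S.cerevisiae", "H.sapiens"]
profiles = {                                         # invented example profiles
    "flagellar_motor": np.array([1, 1, 0, 0, 0]),
    "chemotaxis_CheY": np.array([1, 1, 0, 0, 0]),
    "histone_H3":      np.array([0, 0, 1, 1, 1]),
}

def profile_distance(a, b):
    return int(np.sum(a != b))          # number of genomes where profiles differ

pairs = [("flagellar_motor", "chemotaxis_CheY"), ("flagellar_motor", "histone_H3")]
for x, y in pairs:
    print(x, y, profile_distance(profiles[x], profiles[y]))
```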
A proof of the DBRF-MEGN method, an algorithm for deducing minimum equivalent gene networks
2011-01-01
Background: We previously developed the DBRF-MEGN (difference-based regulation finding-minimum equivalent gene network) method, which deduces the most parsimonious signed directed graphs (SDGs) consistent with expression profiles of single-gene deletion mutants. However, until the present study, we have not presented the details of the method's algorithm or a proof of the algorithm. Results: We describe in detail the algorithm of the DBRF-MEGN method and prove that the algorithm deduces all of the exact solutions of the most parsimonious SDGs consistent with expression profiles of gene deletion mutants. Conclusions: The DBRF-MEGN method provides all of the exact solutions of the most parsimonious SDGs consistent with expression profiles of gene deletion mutants. PMID:21699737
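A hypothetical illustration of the underlying difference-based regulation finding idea is sketched below: if deleting gene A significantly changes the expression of gene B, a signed edge from A to B is inferred, with the sign taken from the direction of the change. Reducing such edges to the minimum equivalent network is the harder step addressed by the proof and is not reproduced here; gene names, levels and the threshold are invented.

```python
# Signed edges inferred from deletion-mutant expression profiles (toy example).
def inferred_edges(wildtype, deletion_profiles, threshold=1.0):
    edges = []
    for deleted_gene, profile in deletion_profiles.items():
        for gene, level in profile.items():
            if gene == deleted_gene:
                continue
            diff = level - wildtype[gene]
            if abs(diff) >= threshold:
                # target rises when regulator is removed -> repression ("-"),
                # target falls -> activation ("+")
                sign = "-" if diff > 0 else "+"
                edges.append((deleted_gene, gene, sign))
    return edges

wildtype = {"A": 5.0, "B": 3.0, "C": 4.0}
deletions = {"A": {"A": 0.0, "B": 1.0, "C": 4.1}}
print(inferred_edges(wildtype, deletions))
```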
DOE Office of Scientific and Technical Information (OSTI.GOV)
Won, Yoo Jai; Ki, Hyungson
A novel picosecond-laser pulsed laser deposition method has been developed for fabricating functionally graded films with pre-designed gradient profiles. Theoretically, the developed method is capable of precisely fabricating films with any thicknesses and any gradient profiles by controlling the laser beam powers for the two different targets based on the film composition profiles. As an implementation example, we have successfully constructed functionally graded diamond-like carbon films with six different gradient profiles: linear, quadratic, cubic, square root, cubic root, and sinusoidal. Energy dispersive X-ray spectroscopy is employed for investigating the chemical composition along the thickness of the film, and the deposition profile and thickness errors are found to be less than 3% and 1.04%, respectively. To the best of the authors' knowledge, this is the first method for fabricating films with designed gradient profiles and has huge potential in many areas of coatings and films, including multifunctional optical films. We believe that this method is not only limited to the example considered in this study, but also can be applied to all material combinations as long as they can be deposited using the pulsed laser deposition technique.
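The abstract does not give the power-to-composition calibration, so the sketch below only illustrates the pre-designed composition profiles f(z) listed above and a naive proportional split of a fixed total laser power between the two targets; the total power, layer count and the proportional assumption are purely illustrative.

```python
# Pre-designed composition profiles along the normalised thickness z and a
# naive proportional split of a total laser power between two targets.
import numpy as np

PROFILES = {
    "linear":     lambda z: z,
    "quadratic":  lambda z: z ** 2,
    "cubic":      lambda z: z ** 3,
    "sqrt":       lambda z: np.sqrt(z),
    "cbrt":       lambda z: np.cbrt(z),
    "sinusoidal": lambda z: 0.5 * (1.0 - np.cos(np.pi * z)),
}

def power_schedule(profile_name, total_power_w=5.0, n_layers=50):
    z = np.linspace(0.0, 1.0, n_layers)
    f = PROFILES[profile_name](z)                 # target-2 fraction in [0, 1]
    return np.column_stack([(1.0 - f) * total_power_w, f * total_power_w])

schedule = power_schedule("sinusoidal")
print(schedule[:3].round(2), schedule[-1].round(2))
```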
[The application of spectral geological profile in the alteration mapping].
Li, Qing-Ting; Lin, Qi-Zhong; Zhang, Bing; Lu, Lin-Lin
2012-07-01
Geological sections can help validate and interpret the alteration information extracted from remote sensing images. In this paper, the concept of the spectral geological profile was introduced, based on the principle of the geological section and on methods of spectral information extraction. A spectral profile enables the storage and visualization of spectra along a geological profile, but the spectral geological profile includes additional information beyond that of a spectral profile. The main objective of the spectral geological profile is to obtain the distribution of alteration types and mineral contents along the profile, extracted from spectra measured by a field spectrometer, especially the spatial distribution and mode of alteration associations. The technical method and workflow of alteration information extraction were studied for the spectral geological profile. The spectral geological profile was set up using ground reflectance spectra, and the alteration information was extracted from the remote sensing image with the help of a typical spectral geological profile. Finally, the significance and effect of the spectral geological profile were discussed.
A New Maximum Likelihood Approach for Free Energy Profile Construction from Molecular Simulations
Lee, Tai-Sung; Radak, Brian K.; Pabis, Anna; York, Darrin M.
2013-01-01
A novel variational method for construction of free energy profiles from molecular simulation data is presented. The variational free energy profile (VFEP) method uses the maximum likelihood principle applied to the global free energy profile based on the entire set of simulation data (e.g., from multiple biased simulations) that spans the free energy surface. The new method addresses common obstacles in two major problems usually observed in traditional methods for estimating free energy surfaces: the need for overlap in the re-weighting procedure and the problem of data representation. Test cases demonstrate that VFEP outperforms other methods in terms of the amount and sparsity of the data needed to construct the overall free energy profiles. For typical chemical reactions, only ~5 windows and ~20-35 independent data points per window are sufficient to obtain an overall qualitatively correct free energy profile with sampling errors an order of magnitude smaller than the free energy barrier. The proposed approach thus provides a feasible mechanism to quickly construct the global free energy profile and identify free energy barriers and basins in free energy simulations via a robust, variational procedure that determines an analytic representation of the free energy profile without the requirement of numerically unstable histograms or binning procedures. It can serve as a new framework for biased simulations and is suitable to be used together with other methods to tackle the free energy estimation problem. PMID:23457427
On the construction of a skill-based wheelchair navigation profile.
Urdiales, Cristina; Pérez, Eduardo Javier; Peinado, Gloria; Fdez-Carmona, Manuel; Peula, Jose M; Annicchiarico, Roberta; Sandoval, Francisco; Caltagirone, Carlo
2013-11-01
Assisted wheelchair navigation is of key importance for persons with severe disabilities. The problem has been solved in different ways, usually based on the shared control paradigm. This paradigm consists of giving the user more or less control on a need basis. Naturally, these approaches require personalization: each wheelchair user has different skills and needs and it is hard to know a priori from diagnosis how much assistance must be provided. Furthermore, since there is no such thing as an average user, sometimes it is difficult to quantify the benefits of these systems. This paper proposes a new method to extract a prototype user profile using real traces based on more than 70 volunteers presenting different physical and cognitive skills. These traces are clustered to determine the average behavior that can be expected from a wheelchair user in order to cope with significant situations. Processed traces provide a prototype user model for comparison purposes, plus a simple method to obtain without supervision a skill-based navigation profile for any user while he/she is driving. This profile is useful for benchmarking but also to determine the situations in which a given user might require more assistance after evaluating how well he/she compares to the benchmark. Profile-based shared control has been successfully tested by 18 volunteers affected by left or right brain stroke at Fondazione Santa Lucia, in Rome, Italy.
Pires, Nuno M M; Tao Dong; Berntzen, Lasse; Lonningdal, Torill
2017-07-01
This work focuses on the development of a sophisticated technique based on STR typing to unequivocally verify the authenticity of urine samples before they are sent to laboratories. STR profiling was conducted with the CSF1PO, TPOX, TH01 Multiplex System coupled with a smartphone-based detection method. The promising capability of the method to identify distinct STR profiles from the urine of different persons opens the possibility of conducting sample authenticity tests. On-site STR profiling could be realized with the self-contained autonomous device with an integrated PCR microchip presented here.
Cher, Chen-Yong; Coteus, Paul W; Gara, Alan; Kursun, Eren; Paulsen, David P; Schuelke, Brian A; Sheets, II, John E; Tian, Shurong
2013-10-01
A processor-implemented method for determining aging of a processing unit in a processor, the method comprising: calculating an effective aging profile for the processing unit, wherein the effective aging profile quantifies the effects of aging on the processing unit; combining the effective aging profile with process variation data, actual workload data and operating conditions data for the processing unit; and determining aging through an aging sensor of the processing unit using the effective aging profile, the process variation data, the actual workload data, architectural characteristics and redundancy data, and the operating conditions data for the processing unit.
Logo recognition in video by line profile classification
NASA Astrophysics Data System (ADS)
den Hollander, Richard J. M.; Hanjalic, Alan
2003-12-01
We present an extension to earlier work on recognizing logos in video stills. The logo instances considered here are rigid planar objects observed at a distance in the scene, so the possible perspective transformation can be approximated by an affine transformation. For this reason we can classify the logos by matching (invariant) line profiles. We enhance our previous method by considering multiple line profiles instead of a single profile of the logo. The positions of the lines are based on maxima in the Hough transform space of the segmented logo foreground image. Experiments are performed on MPEG1 sport video sequences to show the performance of the proposed method.
Wireless autonomous device data transmission
NASA Technical Reports Server (NTRS)
Sammel, Jr., David W. (Inventor); Mickle, Marlin H. (Inventor); Cain, James T. (Inventor); Mi, Minhong (Inventor)
2013-01-01
A method of communicating information from a wireless autonomous device (WAD) to a base station. The WAD has a data element having a predetermined profile having a total number of sequenced possible data element combinations. The method includes receiving at the WAD an RF profile transmitted by the base station that includes a triggering portion having a number of pulses, wherein the number is at least equal to the total number of possible data element combinations. The method further includes keeping a count of received pulses and wirelessly transmitting a piece of data, preferably one bit, to the base station when the count reaches a value equal to the stored data element's particular number in the sequence. Finally, the method includes receiving the piece of data at the base station and using the receipt thereof to determine which of the possible data element combinations the stored data element is.
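A toy simulation of the scheme makes the counting logic explicit: the base station's triggering portion delivers at least as many pulses as there are possible data element combinations, the device counts them, and it replies with a single bit when the count equals its stored element's position in the agreed sequence, from which the base station recovers the value. Everything below is an illustration, not the patented implementation.

```python
# Toy pulse-count transfer: the reply pulse index encodes the stored value.
def simulate_transfer(stored_value, n_combinations):
    reply_at = None
    for count in range(1, n_combinations + 1):   # triggering pulses, one by one
        if count == stored_value:
            reply_at = count                     # device answers on this pulse
    return reply_at

# 4-bit data element -> 16 possible combinations; the value 11 is recovered
# from the pulse index at which the single-bit reply arrived.
n = 16
recovered = simulate_transfer(stored_value=11, n_combinations=n)
print(recovered == 11)
```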
Data Transfer Advisor with Transport Profiling Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Liu, Qiang; Yun, Daqing
The network infrastructures have been rapidly upgraded in many high-performance networks (HPNs). However, such infrastructure investment has not led to corresponding performance improvement in big data transfer, especially at the application layer, largely due to the complexity of optimizing transport control on end hosts. We design and implement ProbData, a PRofiling Optimization Based DAta Transfer Advisor, to help users determine the most effective data transfer method with the most appropriate control parameter values to achieve the best data transfer performance. ProbData employs a profiling optimization based approach to exploit the optimal operational zone of various data transfer methods in support of big data transfer in extreme scale scientific applications. We present a theoretical framework of the optimized profiling approach employed in ProbData as well as its detailed design and implementation. The advising procedure and performance benefits of ProbData are illustrated and evaluated by proof-of-concept experiments in real-life networks.
Web-Based Analysis for Student-Generated Complex Genetic Profiles
ERIC Educational Resources Information Center
Kass, David H.; LaRoe, Robert
2007-01-01
A simple, rapid method for generating complex genetic profiles using Alu-based markers was recently developed for students primarily at the undergraduate level to learn more about forensics and paternity analysis. On the basis of the Cold Spring Harbor Allele Server, which provides an excellent tool for analyzing a single Alu variant, we present a…
NASA Astrophysics Data System (ADS)
Costa Surós, Montserrat; Stachlewska, Iwona S.
2016-04-01
A long-term study assessing ground-based remote Raman lidar against in-situ radiosounding has been conducted with the aim of improving knowledge of the vertical profile of water content through the atmosphere, and thus of the conditions for cloud formation processes. Water vapor mixing ratio (WVMR) and relative humidity (RH) profiles were retrieved from the ADR Lidar (PollyXT-type, EARLINET site in Warsaw). So far, more than 100 nighttime profiles, averaged over 1 h around midnight, from July 2013 to December 2015 have been investigated. Data were evaluated with molecular extinctions calculated using two approximations: the US62 standard atmosphere and the radiosounding launched in Legionowo (12374). The calibration factor CH2O for the lidar retrievals was obtained for each profile using both the regression method and the profile method, to determine the best calibration factor approximation to be used in the final WVMR and RH calculation. Thus, statistically representative results for comparisons between lidar WVMR median profiles, obtained by calibrating with radiosounding profiles and with atmospheric synthetic profiles, all with the best calibration factor, will be presented. Finally, in order to constrain the conditions of cloud formation as a function of the RH profile, the COS14 algorithm, capable of deriving cloud bases and tops by applying thresholds to the RH profiles, was applied to find the cloud vertical structure (CVS). The algorithm was previously applied to radiosounding profiles at the SGP ARM site and tested against the CVS obtained from the Active Remote Sensing of Clouds (ARSCL) data. Similarly, it was applied to lidar measurements at the Warsaw site.
ERIC Educational Resources Information Center
Costa, Rochelle Rocha; Pilla, Carmen; Buttelli, Adriana Cristine Koch; Barreto, Michelle Flores; Vieiro, Priscila Azevedo; Alberton, Cristine Lima; Bracht, Cláudia Gomes; Kruel, Luiz Fernando Martins
2018-01-01
Purpose: This study aimed to investigate the effects of water-based aerobic training on the lipid profile and lipoprotein lipase (LPL) levels in premenopausal women with dyslipidemia. Method: Forty women were randomly assigned to: aquatic training (WA; n = 20) or a control group (CG; n = 20). The WA group underwent 12 weeks of water-based interval…
Metabolic Pathway Assignment of Plant Genes based on Phylogenetic Profiling–A Feasibility Study
Weißenborn, Sandra; Walther, Dirk
2017-01-01
Despite the many experimental and computational approaches developed, functional gene annotation remains challenging. With the rapidly growing number of sequenced genomes, the concept of phylogenetic profiling, which predicts functional links between genes that share a common co-occurrence pattern across different genomes, has gained renewed attention as it promises to annotate gene functions based on presence/absence calls alone. We applied phylogenetic profiling to the problem of metabolic pathway assignments of plant genes with a particular focus on secondary metabolism pathways. We determined phylogenetic profiles for 40,960 metabolic pathway enzyme genes with assigned EC numbers from 24 plant species based on sequence and pathway annotation data from KEGG and Ensembl Plants. For gene sequence family assignments, needed to determine the presence or absence of particular gene functions in the given plant species, we included data for all 39 species available in the Ensembl Plants database and established gene families based on pairwise sequence identities and annotation information. Aside from performing profiling comparisons, we used machine learning approaches to predict pathway associations from phylogenetic profiles alone. Selected metabolic pathways were indeed found to be composed of gene families of greater than expected phylogenetic profile similarity. This was particularly evident for primary metabolism pathways, whereas for secondary pathways, both the available annotation in different species and the abstraction of functional association via distinct pathways proved limiting. While phylogenetic profile similarity was generally not found to correlate with gene co-expression, direct physical interactions of proteins were reflected by a significantly increased profile similarity, suggesting an application of phylogenetic profiling methods as a filtering step in the identification of protein-protein interactions. This feasibility study highlights the potential of, and the challenges associated with, phylogenetic profiling methods for the detection of functional relationships between genes, as well as the need to enlarge the set of plant genes with proven secondary-metabolism involvement and the limitations of distinct pathways as abstractions of relationships between genes. PMID:29163570
The Galactic Isotropic γ-ray Background and Implications for Dark Matter
NASA Astrophysics Data System (ADS)
Campbell, Sheldon S.; Kwa, Anna; Kaplinghat, Manoj
2018-06-01
We present an analysis of the radial angular profile of the galacto-isotropic (GI) γ-ray flux: the statistically uniform flux in angular annuli centred on the Galactic centre. Two different approaches are used to measure the GI flux profile in 85 months of Fermi-LAT data: the BDS statistical method, which identifies spatial correlations, and a new Poisson ordered-pixel method, which identifies non-Poisson contributions. Both methods produce similar GI flux profiles. The GI flux profile is well-described by an existing model of bremsstrahlung, π0 production, inverse Compton scattering, and the isotropic background. Discrepancies with data in our full-sky model are not present in the GI component, and are therefore due to mis-modelling of the non-GI emission. Dark matter annihilation constraints based solely on the observed GI profile are close to the thermal WIMP cross section below 100 GeV, for fixed models of the dark matter density profile and astrophysical γ-ray foregrounds.
A flexible motif search technique based on generalized profiles.
Bucher, P; Karplus, K; Moeri, N; Hofmann, K
1996-03-01
A flexible motif search technique is presented which has two major components: (1) a generalized profile syntax serving as a motif definition language; and (2) a motif search method specifically adapted to the problem of finding multiple instances of a motif in the same sequence. The new profile structure, which is the core of the generalized profile syntax, combines the functions of a variety of motif descriptors implemented in other methods, including regular expression-like patterns, weight matrices, previously used profiles, and certain types of hidden Markov models (HMMs). The relationship between generalized profiles and other biomolecular motif descriptors is analyzed in detail, with special attention to HMMs. Generalized profiles are shown to be equivalent to a particular class of HMMs, and conversion procedures in both directions are given. The conversion procedures provide an interpretation for local alignment in the framework of stochastic models, allowing for clear, simple significance tests. A mathematical statement of the motif search problem defines the new method exactly without linking it to a specific algorithmic solution. Part of the definition includes a new definition of disjointness of alignments.
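To make the "profile as motif descriptor" idea concrete, the sketch below scores a sequence with a plain position weight matrix, which is the simplest special case of the descriptors unified by generalized profiles (no gap states or position-specific penalties); the motif counts and background are invented.

```python
# Position weight matrix scoring: build log-odds scores from motif counts and
# report the best-scoring window in a query sequence.
import math

motif_counts = [                       # toy 3-position DNA motif, counts per base
    {"A": 8, "C": 1, "G": 1, "T": 0},
    {"A": 0, "C": 0, "G": 9, "T": 1},
    {"A": 1, "C": 8, "G": 0, "T": 1},
]
background = {b: 0.25 for b in "ACGT"}

def pssm(counts, pseudo=1.0):
    matrix = []
    for col in counts:
        total = sum(col.values()) + 4 * pseudo
        matrix.append({b: math.log2((col[b] + pseudo) / total / background[b])
                       for b in "ACGT"})
    return matrix

def best_hit(sequence, matrix):
    width = len(matrix)
    scores = [(sum(matrix[j][sequence[i + j]] for j in range(width)), i)
              for i in range(len(sequence) - width + 1)]
    return max(scores)                 # (best score, start position)

print(best_hit("TTAGCATTAGA", pssm(motif_counts)))
```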
Application of Sub-Bottom Profiler to Study Riverbed Structure and Sediment Density
NASA Astrophysics Data System (ADS)
Rui, Wang; Changzheng, Li; Xiaofei, Yan
2018-03-01
In this paper, we present a study of riverbed structure and in-situ sediment density testing using a sub-bottom profiler. Compared with traditional direct observation methods, the sub-bottom profiler method, based on sonar technology, is non-contact, low-disturbance and highly efficient. We surveyed several sections in the Sanmenxia and Xiaolangdi reservoirs, located on the main channel of the lower reaches of the Yellow River. The collected data show a detailed layered structure of the riverbed sediment, which is believed to be caused by sedimentary processes in different periods. Furthermore, we analyse the reflection coefficient of the water-sediment interface and invert the sediment density from the raw wave record. The inversion method is based on the effective density fluid model and the Kozeny-Carman formula. The comparison of the inversion results with sample tests shows that the in-situ test is reliable and usable.
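The published inversion relies on the effective density fluid model and the Kozeny-Carman formula; the much simpler impedance relation below only illustrates how a normal-incidence reflection coefficient at the water-sediment interface constrains sediment density once a sediment sound speed is assumed. The numerical values are illustrative.

```python
# Normal-incidence reflection at the water-sediment interface:
# R = (Z2 - Z1) / (Z2 + Z1) with Z = rho * c, solved for the sediment density.
def sediment_density(reflection_coeff, rho_water=1000.0, c_water=1480.0, c_sediment=1600.0):
    z_water = rho_water * c_water
    z_sed = z_water * (1.0 + reflection_coeff) / (1.0 - reflection_coeff)
    return z_sed / c_sediment          # kg/m^3, given an assumed sediment sound speed

print(round(sediment_density(0.30), 1))   # roughly 1.7 t/m^3 for R = 0.3
```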
Chang, Yuwei; Zhao, Chunxia; Wu, Zeming; Zhou, Jia; Zhao, Sumin; Lu, Xin; Xu, Guowang
2012-08-01
In this work, a chip-based nano-HPLC coupled MS (HPLC-chip/MS) method with a simple sample preparation procedure was developed for the flavonoid profiling of soybean. The analytical properties of the method, including the linearity (R2, 0.992-0.995), reproducibility (RSD, 1.50-7.66%), intraday precision (RSD, 1.41-5.14%) and interday precision (RSD, 2.76-16.90%), were satisfactory. Compared with the conventional HPLC/MS method, a faster extraction and analysis procedure was applied and more flavonoids were detected in a single run. Additionally, 13 flavonoids in soybean seed were identified for the first time. The method was then applied to the profiling of six varieties of soybean sown at the same location. A clear discrimination was observed among the different cultivars; three isoflavones, accounting for nearly 80% of the total flavonoid content, were found to be increased in the spring soybeans compared with the summer cultivars. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Rosenow, Matthew; Xiao, Nick; Spetzler, David
2018-01-01
Extracellular vesicle (EV)-based liquid biopsies have recently been proposed as a readily obtainable biological substrate for both profiling and diagnostic purposes. Development of a fast and reliable preparation protocol to enrich such small particles could accelerate the discovery of informative, disease-related biomarkers. Though multiple EV enrichment protocols are available, in terms of efficiency, reproducibility and simplicity, precipitation-based methods are most amenable to studies with large numbers of subjects. However, the selectivity of the precipitation becomes critical. Here, we present a simple plasma EV enrichment protocol based on a pluronic block copolymer. The enriched plasma EVs could be verified on multiple platforms. Our results showed that the particles enriched from plasma by the copolymer were EV-sized vesicles with membrane structure; proteomic profiling showed that EV-related proteins were significantly enriched, while highly abundant plasma proteins were significantly reduced in comparison to other precipitation-based enrichment methods. Next-generation sequencing confirmed the existence of various RNA species that have been observed in EVs in previous studies. Small RNA sequencing showed enriched species compared to the corresponding plasma. Moreover, plasma EVs enriched from 20 advanced breast cancer patients and 20 age-matched non-cancer controls were profiled by semi-quantitative mass spectrometry. Protein features were further screened against EV proteomic profiles generated from four breast cancer cell lines, and then selected in cross-validation models. A total of 60 protein features that contributed strongly to model prediction were identified. Interestingly, a large portion of these features were associated with breast cancer aggressiveness, metastasis and invasion, consistent with the advanced clinical stage of the patients. In summary, we have developed a plasma EV enrichment method with improved precipitation selectivity that might be suitable for larger-scale discovery studies. PMID:29696079
López-Cortés, Rubén; Formigo, Jacobo; Reboiro-Jato, Miguel; Fdez-Riverola, Florentino; Blanco, Francisco J; Lodeiro, Carlos; Oliveira, Elisabete; Capelo, J L; Santos, H M
2016-04-01
The aim of this work is to develop a nanoparticle-based methodology to identify diagnostic biomarkers for knee osteoarthritis (KOA) through matrix-assisted laser desorption/ionization time-of-flight mass spectrometry profiling. Urine samples used for this study were obtained from KOA patients (42 patients), patients with prostheses (58 patients), and controls (36 individuals) with no history of joint disease. Gold-nanoparticle MALDI-based urine profiling was optimized and then applied to the 136 individuals. The Jaccard index and 10 different classifiers applied to the MALDI MS datasets were used to identify potential biomarkers. Then, the specificity and sensitivity of the method were evaluated. The presence of ten m/z signals as potential biomarkers in the healthy versus non-healthy comparison suggests that patients (KOA and prosthesis) are distinguishable from the healthy volunteers through profiling. The automatic diagnostic study confirmed these preliminary conclusions. The sensitivity and specificity of the urine profiling criteria reported here, achieved by the C4.5 classifier, are 97% and 69%, respectively. Thus, the utility of the method proposed in this work as an additional fast, inexpensive and robust test for KOA diagnosis is confirmed. When the proposed method is compared with those used in common practice, its sensitivity is the highest, corresponding to a low false-negative rate for diagnosing KOA patients in the population studied. Specificity is lower, but within the range accepted for diagnostic purposes. Copyright © 2016. Published by Elsevier B.V.
Global gray-level thresholding based on object size.
Ranefall, Petter; Wählby, Carolina
2016-04-01
In this article, we propose a fast and robust global gray-level thresholding method based on object size, where the selection of threshold level is based on recall and maximum precision with regard to objects within a given size interval. The method relies on the component tree representation, which can be computed in quasi-linear time. Feature-based segmentation is especially suitable for biomedical microscopy applications where objects often vary in number, but have limited variation in size. We show that for real images of cell nuclei and synthetic data sets mimicking fluorescent spots the proposed method is more robust than all standard global thresholding methods available for microscopy applications in ImageJ and CellProfiler. The proposed method, provided as ImageJ and CellProfiler plugins, is simple to use and the only required input is an interval of the expected object sizes. © 2016 International Society for Advancement of Cytometry.
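To illustrate the size-based selection criterion described above, the following sketch scans candidate gray levels and keeps the one that yields the largest fraction of connected components falling inside a user-given size interval. It is not the authors' component-tree implementation; the brute-force scan, scipy labeling and the min_size/max_size defaults are illustrative assumptions.

```python
# Illustrative sketch (not the component-tree algorithm of the abstract): pick a global
# threshold maximizing a precision-like fraction of objects within a given size interval.
import numpy as np
from scipy import ndimage

def size_based_threshold(image, min_size=50, max_size=500, n_levels=64):
    """Scan candidate gray levels and keep the one yielding the most in-range objects."""
    levels = np.linspace(image.min(), image.max(), n_levels)
    best_t, best_score = levels[0], -1.0
    for t in levels:
        labels, n = ndimage.label(image > t)
        if n == 0:
            continue
        sizes = np.bincount(labels.ravel())[1:]           # object areas, background excluded
        in_range = np.sum((sizes >= min_size) & (sizes <= max_size))
        score = in_range / n                               # precision-like fraction of in-range objects
        if score > best_score:
            best_t, best_score = t, score
    return best_t
```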
Yun, Sungdae; Kyriakos, Walid E; Chung, Jun-Young; Han, Yeji; Yoo, Seung-Schik; Park, Hyunwook
2007-03-01
To develop a novel approach for calculating the accurate sensitivity profiles of phased-array coils, resulting in correction of nonuniform intensity in parallel MRI. The proposed intensity-correction method estimates the accurate sensitivity profile of each channel of the phased-array coil. The sensitivity profile is estimated by fitting a nonlinear curve to every projection view through the imaged object. The nonlinear curve-fitting efficiently obtains the low-frequency sensitivity profile by eliminating the high-frequency image contents. Filtered back-projection (FBP) is then used to compute the estimates of the sensitivity profile of each channel. The method was applied to both phantom and brain images acquired from the phased-array coil. Intensity-corrected images from the proposed method had more uniform intensity than those obtained by the commonly used sum-of-squares (SOS) approach. With the use of the proposed correction method, the intensity variation was reduced to 6.1% from 13.1% of the SOS. When the proposed approach was applied to the computation of the sensitivity maps during sensitivity encoding (SENSE) reconstruction, it outperformed the SOS approach in terms of the reconstructed image uniformity. The proposed method is more effective at correcting the intensity nonuniformity of phased-array surface-coil images than the conventional SOS method. In addition, the method was shown to be resilient to noise and was successfully applied for image reconstruction in parallel imaging.
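As a point of reference for the comparison above, the sketch below shows the conventional sum-of-squares (SOS) combination and a simple intensity correction. A Gaussian low-pass estimate stands in for the paper's projection-wise nonlinear curve fitting followed by filtered back-projection, so this is only a minimal illustration of the normalization idea, not the proposed method.

```python
# Minimal sketch of the baseline sum-of-squares (SOS) combination and a simple intensity
# correction; the Gaussian smoothing is a crude stand-in for the paper's sensitivity estimate.
import numpy as np
from scipy.ndimage import gaussian_filter

def sos_combine(coil_images):
    """coil_images: array of shape (n_coils, ny, nx), complex or real channel images."""
    return np.sqrt(np.sum(np.abs(coil_images) ** 2, axis=0))

def intensity_correct(coil_images, sigma=20.0, eps=1e-6):
    sos = sos_combine(coil_images)
    sensitivity = gaussian_filter(sos, sigma)   # low-frequency sensitivity surrogate
    return sos / (sensitivity + eps)
```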
Fonseca, Fernando G A; Esmerino, Erick A; Filho, Elson R Tavares; Ferraz, Juliana P; da Cruz, Adriano G; Bolini, Helena M A
2016-05-01
Rapid sensory profiling methods have gained ground in the sensory evaluation field. Techniques using direct analysis of the terms generated by consumers are considered easy to perform, without specific training requirements, thus improving knowledge about consumer perceptions of various products. This study aimed to determine the sensory profile of different commercial samples of chocolate ice cream, labeled as conventional and light or diet, using the "comment analysis" and "pivot profile" methods, based on consumers' perceptions. In the comment analysis task, consumers responded to 2 separate open questions describing the sensory attributes they liked or disliked in each sample. In the pivot profile method, samples were served in pairs (consisting of a coded sample and the pivot), and consumers indicated the attributes of higher and lower intensity in the target sample compared with the pivot. Both methods were able to characterize the different chocolate ice cream samples using consumer perception, with high correlation and configurational similarity (regression vector coefficient = 0.917) between them. However, it is worth emphasizing that comment analysis is performed intuitively by consumers, whereas the pivot profile method showed high analytical and discriminative power even when using consumers, proving to be a promising technique for routine application when classical descriptive methods cannot be used. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
PrePhyloPro: phylogenetic profile-based prediction of whole proteome linkages
Niu, Yulong; Liu, Chengcheng; Moghimyfiroozabad, Shayan; Yang, Yi
2017-01-01
Direct and indirect functional links between proteins, as well as their interactions as part of larger protein complexes or common signaling pathways, may be predicted by analyzing the correlation of their evolutionary patterns. Based on phylogenetic profiling, here we present a highly scalable and time-efficient computational framework for predicting linkages within the whole human proteome. We have validated this method through analysis of 3,697 human pathways and molecular complexes and a comparison of our results with the prediction outcomes of previously published co-occurrence model-based and normalization methods. Here we also introduce PrePhyloPro, a web-based software that uses our method for accurately predicting proteome-wide linkages. We present data on interactions of human mitochondrial proteins, verifying the performance of this software. PrePhyloPro is freely available at http://prephylopro.org/phyloprofile/. PMID:28875072
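The sketch below illustrates only the general idea behind phylogenetic profiling, namely that proteins whose presence/absence patterns across genomes correlate strongly are predicted to be functionally linked; it is not the PrePhyloPro algorithm, and the correlation threshold is an arbitrary illustrative value.

```python
# Toy illustration of profile-similarity linkage prediction (not the PrePhyloPro method).
import numpy as np

def profile_linkages(profiles, threshold=0.8):
    """profiles: (n_proteins, n_genomes) 0/1 presence/absence matrix; returns linked pairs."""
    corr = np.corrcoef(profiles)
    pairs = []
    for i in range(corr.shape[0]):
        for j in range(i + 1, corr.shape[0]):
            if corr[i, j] >= threshold:
                pairs.append((i, j, corr[i, j]))
    return pairs
```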
Nowakiewicz, Aneta; Ziółkowska, Grażyna; Zięba, Przemysław; Gnat, Sebastian; Trościańczyk, Aleksandra; Adaszek, Łukasz
2017-01-01
The aim of this study was to characterize multidrug-resistant E. faecalis strains from pigs of local origin and to analyse the relationship between resistance and genotypic and proteomic profiles by amplification of DNA fragments surrounding rare restriction sites (ADSRRS-fingerprinting) and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). From the total pool of Enterococcus spp. isolated from 90 pigs, we selected 36 multidrug-resistant E. faecalis strains, which represented three different phenotypic resistance profiles. Phenotypic resistance to tetracycline, macrolides, phenicols, and lincomycin and high-level resistance to aminoglycosides were confirmed by the occurrence of at least one corresponding resistance gene in each strain. Based on the analysis of the genotypic and phenotypic resistance of the strains tested, five distinct resistance profiles were generated. As a complement to this analysis, profiles of virulence genes were determined, and these profiles corresponded to the phenotypic resistance profiles. The demonstration of resistance to a wide panel of antimicrobials by the strains tested in this study indicates the need for typing to determine the spread of resistance also at the local level. It seems that in the case of E. faecalis, the type and scope of resistance strongly determine the genotypic pattern obtained with the ADSRRS-fingerprinting method. The ADSRRS-fingerprinting analysis showed consistency of the genetic profiles with the resistance profiles, while analysis of the data obtained with the MALDI-TOF MS method did not directly reproduce this clustering pattern. Our observations were confirmed by statistical analysis (Simpson's index of diversity, Rand and Wallace coefficients). Even though the MALDI-TOF MS method showed slightly higher discriminatory power than ADSRRS-fingerprinting, only the latter method allowed reproduction of the clustering pattern of isolates based on phenotypic resistance and analysis of resistance and virulence genes (Wallace coefficient 1.0). This feature seems to be the most useful for epidemiological purposes and short-term analysis. PMID:28135327
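For readers unfamiliar with the discriminatory-power statistic cited above, the sketch below computes Simpson's index of diversity in the form commonly used to compare typing methods; the example counts are invented and the study's Rand and Wallace coefficients are not reproduced here.

```python
# Simpson's index of diversity for a typing scheme: D = 1 - sum n_i(n_i-1) / (N(N-1)).
from collections import Counter

def simpsons_diversity(type_assignments):
    """type_assignments: list of type labels, one per isolate (needs at least 2 isolates)."""
    n = len(type_assignments)
    counts = Counter(type_assignments).values()
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Example with invented group sizes: 36 isolates split into types of 20, 10 and 6.
print(simpsons_diversity(["A"] * 20 + ["B"] * 10 + ["C"] * 6))
```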
Yang, Xiao-Huan; Cheng, Xiao-Lan; Qin, Bing; Cai, Zhuo-Ya; Cai, Xiong; Liu, Shao; Wang, Qi; Qin, Yong
2016-05-30
The Kang-Jing (KJ) formula is a compound preparation made from 12 kinds of herbs. So far, four different methods (M1-M4) have been documented for KJ preparation, but the influence of the preparation method on the holistic quality of KJ has remained unknown. In this study, a strategy was proposed to investigate the influence of the different preparation methods on the holistic quality of KJ using ultra-high performance liquid chromatography coupled with quadrupole/time-of-flight mass spectrometry (UHPLC-QTOF-MS/MS) based chemical profiling. A total of 101 compounds, mainly belonging to flavonoids, tanshinones, monoterpene glycosides, triterpenoid saponins, alkaloids, phenolic acids and volatile oils, were identified. Among these compounds, glaucine was detected only in M3/M4 samples, while two dehydrocorydaline isomers were detected only in M2/M3/M4 samples. Tetrahydrocolumbamine, ethylic lithospermic acid, salvianolic acid E and rosmarinic acid were detected only in M1/M3/M4 samples. In the subsequent quantitative analysis, 12 major compounds were determined by UHPLC-MS/MS. The proposed method was validated with respect to linearity, accuracy, precision and recovery. It was found that the contents of the marker compounds varied significantly among samples prepared by the different methods. These results demonstrate that the preparation method does significantly affect the holistic quality of KJ, and that the UHPLC-QTOF-MS/MS based chemical profiling approach is efficient and reliable for comprehensive quality evaluation of KJ. Collectively, this study provides chemical evidence revealing the material basis of KJ and establishes a simple and accurate chemical profiling method for its quality control. Copyright © 2016 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kroniger, K; Herzog, M; Landry, G
2015-06-15
Purpose: We describe and demonstrate a fast analytical tool for prompt-gamma emission prediction based on filter functions applied on the depth dose profile. We present the implementation in a treatment planning system (TPS) of the same algorithm for positron emitter distributions. Methods: The prediction of the desired observable is based on the convolution of filter functions with the depth dose profile. For both prompt-gammas and positron emitters, the results of Monte Carlo simulations (MC) are compared with those of the analytical tool. For prompt-gamma emission from inelastic proton-induced reactions, homogeneous and inhomogeneous phantoms alongside with patient data are used as irradiation targets of mono-energetic proton pencil beams. The accuracy of the tool is assessed in terms of the shape of the analytically calculated depth profiles and their absolute yields, compared to MC. For the positron emitters, the method is implemented in a research RayStation TPS and compared to MC predictions. Digital phantoms and patient data are used and positron emitter spatial density distributions are analyzed. Results: Calculated prompt-gamma profiles agree with MC within 3 % in terms of absolute yield and reproduce the correct shape. Based on an arbitrary reference material and by means of 6 filter functions (one per chemical element), profiles in any other material composed of those elements can be predicted. The TPS implemented algorithm is accurate enough to enable, via the analytically calculated positron emitters profiles, detection of range differences between the TPS and MC with errors of the order of 1–2 mm. Conclusion: The proposed analytical method predicts prompt-gamma and positron emitter profiles which generally agree with the distributions obtained by a full MC. The implementation of the tool in a TPS shows that reliable profiles can be obtained directly from the dose calculated by the TPS, without the need of full MC simulation.
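A minimal sketch of the filtering step described above follows: the emission profile is approximated as a weighted sum of per-element filter functions convolved with the depth dose profile. The Gaussian filter shapes, weights and toy dose curve are placeholders, not the filters or materials of the abstract.

```python
# Conceptual sketch of filter-function convolution with a depth dose profile; all filter
# shapes, elemental weights and the toy Bragg-peak-like dose below are illustrative only.
import numpy as np

def predict_profile(depth_dose, filters, weights):
    """depth_dose: 1D array; filters: list of 1D kernels (one per element); weights: fractions."""
    out = np.zeros_like(depth_dose, dtype=float)
    for kernel, w in zip(filters, weights):
        out += w * np.convolve(depth_dose, kernel, mode="same")
    return out

z = np.linspace(0, 150, 301)                       # depth in mm
dose = np.exp(-((z - 100) ** 2) / (2 * 5 ** 2))    # toy Bragg-peak-like dose
gauss = lambda mu, s: np.exp(-((np.arange(-30, 31) - mu) ** 2) / (2 * s ** 2))
profile = predict_profile(dose, [gauss(0, 8), gauss(-5, 12)], [0.7, 0.3])
```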
Jimenez, Connie R; Piersma, Sander; Pham, Thang V
2007-12-01
Proteomics aims to create a link between genomic information, biological function and disease through global studies of protein expression, modification and protein-protein interactions. Recent advances in key proteomics tools, such as mass spectrometry (MS) and (bio)informatics, provide tremendous opportunities for biomarker-related clinical applications. In this review, we focus on two complementary MS-based approaches with high potential for the discovery of biomarker patterns and low-abundance candidate biomarkers in biofluids: high-throughput matrix-assisted laser desorption/ionization time-of-flight mass spectrometry-based methods for peptidome profiling, and label-free liquid chromatography-based methods coupled to MS for in-depth profiling of biofluids with a focus on subproteomes, including the low-molecular-weight proteome, carrier-bound proteome and N-linked glycoproteome. The two approaches differ in their aims, throughput and sensitivity. We discuss recent progress and challenges in the analysis of plasma/serum and proximal fluids using these strategies and highlight the potential of liquid chromatography-MS-based proteomics of cancer cell and tumor secretomes for the discovery of candidate blood-based biomarkers. Strategies for candidate validation are also described.
Wan, Cen; Lees, Jonathan G; Minneci, Federico; Orengo, Christine A; Jones, David T
2017-10-01
Accurate gene or protein function prediction is a key challenge in the post-genome era. Most current methods perform well on molecular function prediction, but struggle to provide useful annotations relating to biological process functions due to the limited power of sequence-based features in that functional domain. In this work, we systematically evaluate the predictive power of temporal transcription expression profiles for protein function prediction in Drosophila melanogaster. Our results show significantly better performance on predicting protein function when transcription expression profile-based features are integrated with sequence-derived features, compared with the sequence-derived features alone. We also observe that the combination of expression-based and sequence-based features leads to further improvement of accuracy on predicting all three domains of gene function. Based on the optimal feature combinations, we then propose a novel multi-classifier-based function prediction method for Drosophila melanogaster proteins, FFPred-fly+. Interpreting our machine learning models also allows us to identify some of the underlying links between biological processes and developmental stages of Drosophila melanogaster.
NASA Astrophysics Data System (ADS)
Roman, Bart I.; Guedes, Rita C.; Stevens, Christian V.; García-Sosa, Alfonso T.
2018-05-01
In multitarget drug design, it is critical to identify active and inactive compounds against a variety of targets and antitargets. Multitarget strategies thus test the limits of available technology, be that in screening large databases of compounds versus a large number of targets, or in using in silico methods for understanding and reliably predicting these pharmacological outcomes. In this paper, we have evaluated the potential of several in silico approaches to predict the target, antitarget and physicochemical profile of (S)-blebbistatin, the best-known myosin II ATPase inhibitor, and a series of analogs thereof. Standard and augmented structure-based design techniques could not recover the observed activity profiles. A ligand-based method using molecular fingerprints was, however, able to select actives for myosin II inhibition. Using further ligand- and structure-based methods, we also evaluated toxicity through androgen receptor binding, affinity for an array of antitargets and the ADME profile (including assay-interfering compounds) of the series. In conclusion, in the search for (S)-blebbistatin analogs, the dissimilarity distance of molecular fingerprints to known actives and the computed antitarget and physicochemical profile of the molecules can be used for compound design for molecules with potential as tools for modulating myosin II and motility-related diseases.
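The fingerprint dissimilarity criterion mentioned above can be illustrated with a generic Tanimoto (Jaccard) similarity on bit-vector fingerprints; the fingerprint type and bit length used in the study are not assumed here, and this is only a sketch of the underlying quantity.

```python
# Generic Tanimoto similarity between bit-vector molecular fingerprints and the derived
# dissimilarity distance to the nearest known active (fingerprint type not assumed).
import numpy as np

def tanimoto(fp_a, fp_b):
    """fp_a, fp_b: equal-length 0/1 numpy arrays."""
    both = np.sum((fp_a == 1) & (fp_b == 1))
    either = np.sum((fp_a == 1) | (fp_b == 1))
    return both / either if either else 0.0

def dissimilarity_to_actives(fp, active_fps):
    """Distance of a candidate to its nearest known active: 1 - max Tanimoto similarity."""
    return 1.0 - max(tanimoto(fp, a) for a in active_fps)
```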
Airline Passenger Profiling Based on Fuzzy Deep Machine Learning.
Zheng, Yu-Jun; Sheng, Wei-Guo; Sun, Xing-Ming; Chen, Sheng-Yong
2017-12-01
Passenger profiling plays a vital part in commercial aviation security, but classical methods become very inefficient in handling the rapidly increasing amounts of electronic records. This paper proposes a deep learning approach to passenger profiling. The center of our approach is a Pythagorean fuzzy deep Boltzmann machine (PFDBM), whose parameters are expressed by Pythagorean fuzzy numbers such that each neuron can learn how a feature affects the production of the correct output from both the positive and negative sides. We propose a hybrid algorithm combining a gradient-based method and an evolutionary algorithm for training the PFDBM. Based on the novel learning model, we develop a deep neural network (DNN) for classifying normal passengers and potential attackers, and further develop an integrated DNN for identifying group attackers whose individual features are insufficient to reveal the abnormality. Experiments on data sets from Air China show that our approach provides much higher learning ability and classification accuracy than existing profilers. It is expected that the fuzzy deep learning approach can be adapted for a variety of complex pattern analysis tasks.
Quantitative analysis of fracture surface by roughness and fractal method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, X.W.; Tian, J.F.; Kang, Y.
1995-09-01
In recent years there has been extensive research and great development in Quantitative Fractography, which acts as an integral part of fractographic analysis. A prominent technique for studying the fracture surface is based on fracture profile generation, and the major means of characterizing the profile quantitatively are roughness and fractal methods. In this way, quantitative indexes such as the roughness parameters R_L for the profile and R_S for the surface, and the fractal dimensions D_L for the profile and D_S for the surface, can be measured. Given the relationships between these indexes and the mechanical properties of materials, it is possible to achieve the goal of protecting materials from fracture. But, as the case stands, the theory and experimental technology of quantitative fractography are still imperfect and remain to be studied further. Recently, Gokhale and Underwood et al. have proposed an assumption-free method for estimating the surface roughness by vertically sectioning the fracture surface with sections at an angle of 120 deg to each other, which can be expressed as R_S = ⟨R_L · Ψ⟩, i.e. the average of R_L · Ψ over the sections, where Ψ is the profile structure factor. This method is based on classical stereological principles and was verified with the aid of computer simulations for some ruled surfaces. The results are considered to be applicable to fracture surfaces of arbitrary complexity and anisotropy. In order to extend the application of this method in quantitative fractography, the authors studied roughness and fractal methods based on this approach by performing quantitative measurements on some typical low-temperature impact fractures.
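The sketch below computes the profile roughness parameter R_L (true profile length divided by projected length) for a digitized vertical-section profile and the corresponding surface estimate R_S = ⟨R_L · Ψ⟩, with the profile structure factor Ψ supplied by the user; the function names and the assumption of evenly sampled, monotonically increasing x are illustrative.

```python
# Profile roughness R_L and the sectioning-based surface estimate R_S = mean(R_L * Psi).
import numpy as np

def profile_roughness(x, y):
    """x, y: coordinates of a digitized vertical-section profile (x increasing)."""
    arc = np.sum(np.hypot(np.diff(x), np.diff(y)))   # true profile length
    return arc / (x[-1] - x[0])                      # divided by projected length

def surface_roughness(profiles, psi):
    """profiles: list of (x, y) arrays from sections at 120 deg to each other."""
    return np.mean([profile_roughness(x, y) * psi for x, y in profiles])
```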
Liu, Bin; Wang, Xiaolong; Lin, Lei; Dong, Qiwen; Wang, Xuan
2008-12-01
Protein remote homology detection and fold recognition are central problems in bioinformatics. Currently, discriminative methods based on support vector machine (SVM) are the most effective and accurate methods for solving these problems. A key step to improve the performance of the SVM-based methods is to find a suitable representation of protein sequences. In this paper, a novel building block of proteins called Top-n-grams is presented, which contains the evolutionary information extracted from the protein sequence frequency profiles. The protein sequence frequency profiles are calculated from the multiple sequence alignments outputted by PSI-BLAST and converted into Top-n-grams. The protein sequences are transformed into fixed-dimension feature vectors by the occurrence times of each Top-n-gram. The training vectors are evaluated by SVM to train classifiers which are then used to classify the test protein sequences. We demonstrate that the prediction performance of remote homology detection and fold recognition can be improved by combining Top-n-grams and latent semantic analysis (LSA), which is an efficient feature extraction technique from natural language processing. When tested on superfamily and fold benchmarks, the method combining Top-n-grams and LSA gives significantly better results compared to related methods. The method based on Top-n-grams significantly outperforms the methods based on many other building blocks including N-grams, patterns, motifs and binary profiles. Therefore, Top-n-gram is a good building block of the protein sequences and can be widely used in many tasks of the computational biology, such as the sequence alignment, the prediction of domain boundary, the designation of knowledge-based potentials and the prediction of protein binding sites.
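To make the Top-n-gram idea concrete, the following sketch turns a position-specific frequency profile (one row per sequence position, 20 amino acid frequencies per row) into tokens built from the n most frequent residues at each position and counts their occurrences as a bag-of-tokens feature vector. It illustrates the idea only and does not reproduce the paper's exact construction or its LSA step.

```python
# Illustrative Top-n-gram feature construction from a PSI-BLAST-style frequency profile.
import numpy as np
from collections import Counter

AA = "ACDEFGHIKLMNPQRSTVWY"

def top_n_grams(freq_profile, n=2):
    """freq_profile: (n_positions, 20) array of residue frequencies per position."""
    tokens = []
    for row in freq_profile:
        top = np.argsort(row)[::-1][:n]           # indices of the n most frequent residues
        tokens.append("".join(AA[i] for i in sorted(top)))
    return Counter(tokens)                        # occurrence counts -> feature vector
```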
Bernard R. Parresol; Charles E. Thomas
1996-01-01
In the wood utilization industry, both stem profile and biomass are important quantities. The two have traditionally been estimated separately. The introduction of a density-integral method allows for coincident estimation of stem profile and biomass, based on the calculus of mass theory, and provides an alternative to weight-ratio methodology. In the initial...
Bidargaddi, Niranjan P; Chetty, Madhu; Kamruzzaman, Joarder
2008-06-01
Profile hidden Markov models (HMMs) based on classical HMMs have been widely applied for protein sequence identification. The formulation of the forward and backward variables in profile HMMs is made under the statistical independence assumption of probability theory. We propose a fuzzy profile HMM to overcome the limitations of that assumption and to achieve an improved alignment for protein sequences belonging to a given family. The proposed model fuzzifies the forward and backward variables by incorporating Sugeno fuzzy measures and Choquet integrals, thus further extending the generalized HMM. Based on the fuzzified forward and backward variables, we propose a fuzzy Baum-Welch parameter estimation algorithm for profiles. The strong correlations and the sequence preferences involved in protein structures make this fuzzy-architecture-based model a suitable candidate for building profiles of a given family, since fuzzy sets can handle uncertainties better than classical methods.
NASA Astrophysics Data System (ADS)
Denisenko, P. F.; Maltseva, O. A.; Sotsky, V. V.
2018-03-01
A method is presented for correcting the daytime vertical profiles of electron plasma frequency in the lower ionosphere obtained from the International Reference Ionosphere (IRI) model in accordance with measured virtual heights and the absorption of radio signals (method A1) reflected from the bottom of the E-region at vertical sounding (VS). The method is based on replacing the IRI model profile by an approximating analytical dependence whose parameters are determined from the VS data and partially from the IRI model. The method is tested against the results of four joint ground-based and rocket experiments carried out in the 1970s at midlatitudes of the European part of Russia during launches of high-altitude geophysical rockets of the Vertical series. It is shown that taking into account both the virtual reflection heights and the absorption makes it possible to obtain electron density distributions that show the best agreement with the rocket measurements over most height ranges in the D- and E-regions. In addition, the obtained distributions account more adequately than the IRI model for the contributions of the D- and E-regions to the absorption of signals reflected above these regions.
Phylo_dCor: distance correlation as a novel metric for phylogenetic profiling.
Sferra, Gabriella; Fratini, Federica; Ponzi, Marta; Pizzi, Elisabetta
2017-09-05
Elaboration of powerful methods to predict functional and/or physical protein-protein interactions from genome sequence is one of the main tasks in the post-genomic era. Phylogenetic profiling allows the prediction of protein-protein interactions at a whole-genome level in both Prokaryotes and Eukaryotes, and for this reason it is considered one of the most promising methods. Here, we propose an improvement of phylogenetic profiling that enables the handling of large genomic datasets and the inference of global protein-protein interactions. This method uses the distance correlation as a new measure of phylogenetic profile similarity. We constructed robust reference sets and developed Phylo-dCor, a parallelized version of the algorithm for calculating the distance correlation that makes it applicable to large genomic data. Using Saccharomyces cerevisiae and Escherichia coli genome datasets, we showed that Phylo-dCor outperforms previously described phylogenetic profiling methods based on the mutual information and Pearson's correlation as measures of profile similarity. In this work, we constructed and assessed robust reference sets and propose the distance correlation as a measure for comparing phylogenetic profiles. To make it applicable to large genomic data, we developed Phylo-dCor, a parallelized version of the algorithm for calculating the distance correlation. Two R scripts that can be run on a wide range of machines are available upon request.
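For reference, the statistic itself can be computed compactly as below: pairwise distance matrices of the two profiles are double-centered and combined into the sample distance covariance and variances. This is a sketch of the distance correlation formula, not the parallelized Phylo-dCor implementation.

```python
# Compact sample distance correlation between two profiles (vectors of equal length).
import numpy as np

def distance_correlation(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    def centered(v):
        d = np.abs(v[:, None] - v[None, :])                      # pairwise distance matrix
        return d - d.mean(0) - d.mean(1)[:, None] + d.mean()     # double centering
    A, B = centered(x), centered(y)
    dcov2 = (A * B).mean()                                       # squared distance covariance
    dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y)) if dvar_x * dvar_y > 0 else 0.0
```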
Profile-Based LC-MS Data Alignment—A Bayesian Approach
Tsai, Tsung-Heng; Tadesse, Mahlet G.; Wang, Yue; Ressom, Habtom W.
2014-01-01
A Bayesian alignment model (BAM) is proposed for alignment of liquid chromatography-mass spectrometry (LC-MS) data. BAM belongs to the category of profile-based approaches, which are composed of two major components: a prototype function and a set of mapping functions. Appropriate estimation of these functions is crucial for good alignment results. BAM uses Markov chain Monte Carlo (MCMC) methods to draw inference on the model parameters and improves on existing MCMC-based alignment methods through 1) the implementation of an efficient MCMC sampler and 2) an adaptive selection of knots. A block Metropolis-Hastings algorithm that mitigates the problem of the MCMC sampler getting stuck at local modes of the posterior distribution is used for the update of the mapping function coefficients. In addition, a stochastic search variable selection (SSVS) methodology is used to determine the number and positions of knots. We applied BAM to a simulated data set, an LC-MS proteomic data set, and two LC-MS metabolomic data sets, and compared its performance with the Bayesian hierarchical curve registration (BHCR) model, the dynamic time-warping (DTW) model, and the continuous profile model (CPM). The advantage of applying appropriate profile-based retention time correction prior to performing a feature-based approach is also demonstrated through the metabolomic data sets. PMID:23929872
Determination of boundaries between ranges of high and low gradient of beam profile.
Wendykier, Jacek; Bieniasiewicz, Marcin; Grządziel, Aleksandra; Jedynak, Tadeusz; Kośniewski, Wiktor; Reudelsdorf, Marta; Wendykier, Piotr
2016-01-01
This work addresses the problem of treatment planning system commissioning by introducing a new method for determining the boundaries between the high- and low-gradient regions of a beam profile. The commissioning of a treatment planning system is a very important task in radiation therapy. One of the main goals of this task is to compare two field profiles: measured and calculated. Applying points at 80% and 120% of the nominal field size can lead to incorrect determination of the boundaries, especially for small field sizes. The method, which is based on the beam profile gradient, allows proper assignment of the boundaries between high- and low-gradient regions even for small fields. TRS 430 recommendations for commissioning were used. The described method allows a separation between high and low gradient because it directly uses the value of the gradient of the profile. For small fields, the boundaries determined by the new method allow commissioning of a treatment planning system according to TRS 430, whereas the point at 80% of the nominal field size is already in the high-gradient region. The method of determining the boundaries using the beam profile gradient can be extremely helpful during the commissioning of a treatment planning system for Intensity Modulated Radiation Therapy or for other techniques that require very small field sizes.
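A minimal sketch of the gradient-based split is shown below: the profile gradient is computed numerically and points whose gradient magnitude exceeds a chosen fraction of the maximum are flagged as high-gradient. The cut-off expressed as a fraction of the maximum is an illustrative assumption, not the specific criterion of the paper.

```python
# Sketch: split a measured beam profile into high- and low-gradient regions.
import numpy as np

def gradient_boundaries(position, dose, frac=0.25):
    """position, dose: 1D arrays of the profile; returns a boolean mask of high-gradient points."""
    g = np.abs(np.gradient(dose, position))
    return g >= frac * g.max()
```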
Malinowski, Douglas P
2007-05-01
In recent years, the application of genomic and proteomic technologies to the problem of breast cancer prognosis and the prediction of therapy response have begun to yield encouraging results. Independent studies employing transcriptional profiling of primary breast cancer specimens using DNA microarrays have identified gene expression profiles that correlate with clinical outcome in primary breast biopsy specimens. Recent advances in microarray technology have demonstrated reproducibility, making clinical applications more achievable. In this regard, one such DNA microarray device based upon a 70-gene expression signature was recently cleared by the US FDA for application to breast cancer prognosis. These DNA microarrays often employ at least 70 gene targets for transcriptional profiling and prognostic assessment in breast cancer. The use of PCR-based methods utilizing a small subset of genes has recently demonstrated the ability to predict the clinical outcome in early-stage breast cancer. Furthermore, protein-based immunohistochemistry methods have progressed from using gene clusters and gene expression profiling to smaller subsets of expressed proteins to predict prognosis in early-stage breast cancer. Beyond prognostic applications, DNA microarray-based transcriptional profiling has demonstrated the ability to predict response to chemotherapy in early-stage breast cancer patients. In this review, recent advances in the use of multiple markers for prognosis of disease recurrence in early-stage breast cancer and the prediction of therapy response will be discussed.
Method and apparatus for creating time-optimal commands for linear systems
NASA Technical Reports Server (NTRS)
Seering, Warren P. (Inventor); Tuttle, Timothy D. (Inventor)
2004-01-01
A system for and method of determining an input command profile for substantially any dynamic system that can be modeled as a linear system, the input command profile for transitioning an output of the dynamic system from one state to another state. The present invention involves identifying characteristics of the dynamic system, selecting a command profile which defines an input to the dynamic system based on the identified characteristics, wherein the command profile comprises one or more pulses which rise and fall at switch times, imposing a plurality of constraints on the dynamic system, at least one of the constraints being defined in terms of the switch times, and determining the switch times for the input to the dynamic system based on the command profile and the plurality of constraints. The characteristics may be related to poles and zeros of the dynamic system, and the plurality of constraints may include a dynamics cancellation constraint which specifies that the input moves the dynamic system from a first state to a second state such that the dynamic system remains substantially at the second state.
Automated Detection of Knickpoints and Knickzones Across Transient Landscapes
NASA Astrophysics Data System (ADS)
Gailleton, B.; Mudd, S. M.; Clubb, F. J.
2017-12-01
Mountainous regions are ubiquitously dissected by river channels, which transmit climate and tectonic signals to the rest of the landscape by adjusting their long profiles. Fluvial response to allogenic forcing is often expressed through the upstream propagation of steepened reaches, referred to as knickpoints or knickzones. The identification and analysis of these steepened reaches has numerous applications in geomorphology, such as modelling long-term landscape evolution, understanding controls on fluvial incision, and constraining tectonic uplift histories. Traditionally, the identification of knickpoints or knickzones from fluvial profiles requires manual selection or calibration. This process is both time-consuming and subjective, as different workers may select different steepened reaches within the profile. We propose an objective, statistically-based method to systematically pick knickpoints/knickzones on a landscape scale using an outlier-detection algorithm. Our method integrates river profiles normalised by drainage area (Chi, using the approach of Perron and Royden, 2013), then separates the chi-elevation plots into a series of transient segments using the method of Mudd et al. (2014). This method allows the systematic detection of knickpoints across a DEM, regardless of size, using a high-performance algorithm implemented in the open-source Edinburgh Land Surface Dynamics Topographic Tools (LSDTopoTools) software package. After initial knickpoint identification, outliers are selected using several sorting and binning methods based on the Median Absolute Deviation, to avoid the influence of sample size. We test our method on a series of DEMs and grid resolutions, and show that our method consistently identifies accurate knickpoint locations across each landscape tested.
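The median-absolute-deviation flagging used as the statistical step above can be sketched as follows; the chi transformation and segmentation of the profiles are not reproduced, and the threshold k and the consistency constant are conventional choices rather than the study's calibrated values.

```python
# MAD-based outlier flagging, e.g. of anomalously steep segment-to-segment changes.
import numpy as np

def mad_outliers(values, k=3.0):
    values = np.asarray(values, float)
    med = np.median(values)
    mad = np.median(np.abs(values - med))
    if mad == 0:
        return np.zeros_like(values, dtype=bool)
    modified_z = 0.6745 * (values - med) / mad    # standard consistency constant for normal data
    return np.abs(modified_z) > k
```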
NASA Astrophysics Data System (ADS)
Popa, CL; Popa, V.
2016-11-01
This paper proposes a profiling method for the tool that generates the helical groove of the male rotor, a screw compressor component. The method is based on a complementary theorem of enveloping surfaces, the "Substitute Family Circles Method". The specific theorem of the substitute family of circles has been applied using the facilities of the AUTOCAD graphic design environment. The frontal view of the male rotor has been determined from the known transverse profile of the female rotor using this theorem. The three-dimensional model of the rotor makes it possible to apply the same theorem, leading to the surface of revolution enveloping the helical surface. An application is also presented to determine, numerically and graphically, the axial profile of the disk cutter, following the proposed algorithm.
Prediction of the acoustic pressure above periodically uneven facings in industrial workplaces
NASA Astrophysics Data System (ADS)
Ducourneau, J.; Bos, L.; Planeau, V.; Faiz, Adil; Skali Lami, Salah; Nejade, A.
2010-05-01
The aim of this work is to predict the sound pressure in front of wall facings that have periodic sound-scattering surface profiles. The method involves investigating plane-wave reflections randomly incident upon an uneven surface. The waveguide approach is well suited to the geometries usually encountered in industrial workplaces. This method simplifies the profile geometry by using elementary rectangular volumes; the acoustic field in the profile interstices can then be expressed as the superposition of waveguide modes. In past work, the walls considered were of infinite dimensions and had a periodic surface profile in only one direction. We therefore generalise this approach by extending its applicability to "double-periodic" wall facings. Free-field measurements have been taken, and the observed agreement between numerical and experimental results supports the validity of the waveguide method.
Evaluation of retrieval methods of daytime convective boundary layer height based on lidar data
NASA Astrophysics Data System (ADS)
Li, Hong; Yang, Yi; Hu, Xiao-Ming; Huang, Zhongwei; Wang, Guoyin; Zhang, Beidou; Zhang, Tiejun
2017-04-01
The atmospheric boundary layer height is a basic parameter in describing the structure of the lower atmosphere. Because of their high temporal resolution, ground-based lidar data are widely used to determine the daytime convective boundary layer height (CBLH), but the currently available retrieval methods have their advantages and drawbacks. In this paper, four methods of retrieving the CBLH (i.e., the gradient method, the idealized backscatter method, and two forms of the wavelet covariance transform method) from lidar normalized relative backscatter are evaluated, using two artificial cases (an idealized profile and a case similar to a real profile), to test their stability and accuracy. The results show that the gradient method is suitable for high signal-to-noise ratio conditions. The idealized backscatter method is less sensitive to the first estimate of the CBLH; however, it is computationally expensive. The results obtained from the two forms of the wavelet covariance transform method are influenced by the selection of the initial input value of the wavelet amplitude. Further sensitivity analysis using real profiles under different orders of magnitude of background counts shows that when different initial input values are set, the idealized backscatter method always obtains a consistent CBLH, whereas for the two wavelet methods different CBLHs are obtained as the wavelet amplitude increases when noise is significant. Finally, the CBLHs measured by the three lidar-based methods are evaluated against those derived from L-band soundings; the boundary layer heights from the two instruments agree within ±200 m in most situations.
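One of the retrieval families compared above, the wavelet covariance transform, can be sketched with a Haar wavelet: W(a, b) = (1/a) ∫ f(z) h((z - b)/a) dz with h = +1 below the center b and -1 above it, and the CBLH taken at the translation b that maximizes W. The dilation value below is illustrative, and the amplitude-threshold refinements discussed in the abstract are omitted.

```python
# Sketch of the Haar wavelet covariance transform applied to a lidar backscatter profile.
import numpy as np

def haar_wct(z, backscatter, a=200.0):
    """z: uniformly spaced altitudes; backscatter: profile values; a: dilation (same units as z)."""
    dz = z[1] - z[0]
    w = np.zeros_like(backscatter, dtype=float)
    for i, b in enumerate(z):
        h = np.where((z >= b - a / 2) & (z < b), 1.0,
            np.where((z >= b) & (z <= b + a / 2), -1.0, 0.0))
        w[i] = np.sum(backscatter * h) * dz / a
    return w

def cblh_from_wct(z, backscatter, a=200.0):
    return z[np.argmax(haar_wct(z, backscatter, a))]
```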
NASA Astrophysics Data System (ADS)
Schneider, M.; Hase, F.; Blumenstock, T.
2006-10-01
We propose an innovative approach for analysing ground-based FTIR spectra which allows us to detect variabilities of lower and middle/upper tropospheric HDO/H2O ratios. We show that the proposed method is superior to common approaches. We estimate that lower tropospheric HDO/H2O ratios can be detected with a noise to signal ratio of 15% and middle/upper tropospheric ratios with a noise to signal ratio of 50%. The method requires the inversion to be performed on a logarithmic scale and to introduce an inter-species constraint. While common methods calculate the isotope ratio posterior to an independent, optimal estimation of the HDO and H2O profile, the proposed approach is an optimal estimator for the ratio itself. We apply the innovative approach to spectra measured continuously during 15 months and present, for the first time, an annual cycle of tropospheric HDO/H2O ratio profiles as detected by ground-based measurements. Outliers in the detected middle/upper tropospheric ratios are interpreted by backward trajectories.
Wang, Hanghang; Muehlbauer, Michael J.; O’Neal, Sara K.; Newgard, Christopher B.; Hauser, Elizabeth R.; Shah, Svati H.
2017-01-01
The field of metabolomics as applied to human disease and health is rapidly expanding. In recent efforts of metabolomics research, greater emphasis has been placed on quality control and method validation. In this study, we report an experience with quality control and a practical application of method validation. Specifically, we sought to identify and modify steps in gas chromatography-mass spectrometry (GC-MS)-based, non-targeted metabolomic profiling of human plasma that could influence metabolite identification and quantification. Our experimental design included two studies: (1) a limiting-dilution study, which investigated the effects of dilution on analyte identification and quantification; and (2) a concentration-specific study, which compared the optimal plasma extract volume established in the first study with the volume used in the current institutional protocol. We confirmed that contaminants, concentration, repeatability and intermediate precision are major factors influencing metabolite identification and quantification. In addition, we established methods for improved metabolite identification and quantification, which were summarized to provide recommendations for experimental design of GC-MS-based non-targeted profiling of human plasma. PMID:28841195
Optical characterization of high speed microscanners based on static slit profiling method
NASA Astrophysics Data System (ADS)
Alaa Elhady, A.; Sabry, Yasser M.; Khalil, Diaa
2017-01-01
Optical characterization of high-speed microscanners is a challenging task that usually requires special high-speed, extremely expensive camera systems. This paper presents a novel, simple method to characterize the scanned beam spot profile and size in high-speed optical scanners under operation. It allows measuring the beam profile and the spot sizes at different scanning angles. The method is analyzed theoretically and applied experimentally to the characterization of a microelectromechanical systems (MEMS) scanner operating at 2.6 kHz. The variation of the spot size versus the scanning angle, up to ±15°, is extracted, and the dynamic bending curvature effect of the micromirror is predicted.
A simple and inexpensive method for muddy shore profiling
NASA Astrophysics Data System (ADS)
Chowdhury, Sayedur Rahman; Hossain, M. Shahadat; Sharifuzzaman, S. M.
2014-11-01
There are several well-established methods for obtaining beach profiles, and more accurate and precise high-tech methods are emerging. Traditional low-cost methods requiring minimal user skill or training are still popular among professionals, scientists, and coastal zone management practitioners. Simple methods are being developed with a primary focus on sand and gravel beaches. This paper describes a simple, low-cost, manual field method for measuring beach profiles that is particularly suitable for muddy shores. The equipment is a type of flexible U-tube manometer that uses liquid columns in vertical tubes to measure differences in elevation; the supporting frame is constructed from wooden poles with base disks, which hold measuring scales and a PVC tube. The structure was trialed on a mudflat characterized by a 20-40-cm-thick surface layer of silt and clay, located on Kutubdia Island, Bangladesh. The study results are discussed with notes on the method's applicability, advantages and limitations, and several optional modifications for different scenarios for routine profiling of muddy shores. The equipment can be used by one person or two people, and the accuracy of the method is comparable to that of other methods. The equipment can also be used on sandy or gravel beaches.
Modal analysis of circular Bragg fibers with arbitrary index profiles
NASA Astrophysics Data System (ADS)
Horikis, Theodoros P.; Kath, William L.
2006-12-01
A finite-difference approach based upon the immersed interface method is used to analyze the mode structure of Bragg fibers with arbitrary index profiles. The method allows general propagation constants and eigenmodes to be calculated to a high degree of accuracy, while computation times are kept to a minimum by exploiting sparse matrix algebra. The method is well suited to handle complicated structures comprised of a large number of thin layers with high-index contrast and simultaneously determines multiple eigenmodes without modification.
FIBER AND INTEGRATED OPTICS: New method for determination of the parameters of a channel waveguide
NASA Astrophysics Data System (ADS)
Galechyan, M. G.; Dianov, Evgenii M.; Lyndin, N. M.; Sychugov, V. A.; Tishchenko, A. V.; Usievich, B. A.
1992-02-01
A new method for the determination of the parameters of channel integrated optical waveguides is proposed. This method is based on measuring the spectral transmission of a system comprising the investigated waveguide and single-mode fiber waveguides, which are brought into contact with the channel waveguide. The results are reported of an investigation of two channel waveguides formed in glass by a variety of methods and characterized by different refractive index profiles. The proposed method is found to be suitable for determination of the parameters of the refractive index profile of the investigated channel waveguides.
Aerosol Extinction Profile Mapping with Lognormal Distribution Based on MPL Data
NASA Astrophysics Data System (ADS)
Lin, T. H.; Lee, T. T.; Chang, K. E.; Lien, W. H.; Liu, G. R.; Liu, C. Y.
2017-12-01
This study addresses the mapping of the aerosol vertical distribution with a mathematical function. Given the similarity in distribution pattern, the lognormal distribution is examined for mapping the aerosol extinction profile based on in situ MPL (Micro Pulse LiDAR) measurements. The variables of the lognormal distribution are the log mean (μ) and the log standard deviation (σ), which are correlated with the aerosol optical depth (AOD) and the planetary boundary layer height (PBLH) associated with the altitude of the extinction peak (Mode) defined in this study. Based on 10 years of MPL data with a single peak, the mapping results showed that the mean errors of the Mode and σ retrievals are 16.1% and 25.3%, respectively. The mean error of the σ retrieval can be reduced to 16.5% for cases with a larger distance between the PBLH and the Mode. The proposed method is further applied to the MODIS AOD product to map the extinction profile for the retrieval of PM2.5 from satellite observations. The results indicate good agreement between retrievals and ground measurements when aerosols below 525 meters are well mixed. The feasibility of applying the proposed method to satellite remote sensing is also suggested by the case study. Keywords: Aerosol extinction profile, Lognormal distribution, MPL, Planetary boundary layer height (PBLH), Aerosol optical depth (AOD), Mode
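A minimal sketch of the profile shape described above follows: a lognormal-shaped extinction profile scaled so that its vertical integral equals the column AOD, with μ and σ as the two free parameters. The example altitudes and parameter values are illustrative, not the retrieved values of the study.

```python
# Lognormal-shaped extinction profile scaled to a given column AOD (illustrative values).
import numpy as np

def lognormal_extinction_profile(z, aod, mu, sigma):
    """z: altitudes in km (z > 0); returns extinction coefficient per km."""
    shape = np.exp(-(np.log(z) - mu) ** 2 / (2 * sigma ** 2)) / (z * sigma * np.sqrt(2 * np.pi))
    return aod * shape / np.trapz(shape, z)   # normalize numerically, then scale by AOD

z = np.linspace(0.05, 5.0, 200)
ext = lognormal_extinction_profile(z, aod=0.3, mu=np.log(0.8), sigma=0.5)
```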
Xu, Jin-Di; Mao, Qian; Shen, Hong; Zhu, Ling-Ying; Li, Song-Lin; Yan, Ru
2013-08-23
Qiong-Yu-Gao (QYG), consisting of Rehmanniae Radix (RR), Poriae (PO) and Ginseng Radix (GR), is a commonly used tonic traditional complex herbal medicine (CHM). So far, three different methods have been documented for the preparation of QYG, i.e. method 1 (M1): mixing powders of GR and PO with a decoction of RR; method 2 (M2): combining the decoction of RR and PO with the decoction of GR; and method 3 (M3): decocting the mixture of RR, GR and PO. In the present study, an ultra-high performance liquid chromatography coupled with photo-diode array and quadrupole/time-of-flight mass spectrometry (UHPLC-PDA-QTOF-MS/MS) based chemical profiling approach was developed to investigate the influence of the three preparation methods on the holistic quality of QYG. All detected peaks were unambiguously identified by comparing UV spectra, accurate mass data/characteristic mass fragments and retention times with those of reference compounds, and/or tentatively assigned by matching empirical molecular formulae with those of known compounds, and/or by elucidating quasi-molecular ions and fragment ions with reference to information available in the literature. A total of 103 components, mainly belonging to ginsenosides, phenethyl alcohol glycosides, iridoid glycosides and triterpenoid acids, were identified, of which 5 degraded ginsenosides were putatively determined to be newly generated during the preparation of the QYG samples. Triterpenoid acids and malonyl-ginsenosides were detected only in M1 samples, while degraded ginsenosides were detectable only in M2/M3 samples. The possible reasons for the differences among the chemical profiles of QYG samples prepared with the three methods are also discussed. It can be concluded that the preparation method does significantly affect the holistic quality of QYG; the influence of the altered chemical profiles on the bioactivity of QYG needs further investigation. The present study demonstrates that a UHPLC-PDA-QTOF-MS/MS based chemical profiling approach is efficient and reliable for evaluating the holistic quality of traditional CHM. Copyright © 2013 Elsevier B.V. All rights reserved.
Gene expression inference with deep learning.
Chen, Yifei; Li, Yi; Narayan, Rajiv; Subramanian, Aravind; Xie, Xiaohui
2016-06-15
Large-scale gene expression profiling has been widely used to characterize cellular states in response to various disease conditions, genetic perturbations, etc. Although the cost of whole-genome expression profiles has been dropping steadily, generating a compendium of expression profiling over thousands of samples is still very expensive. Recognizing that gene expressions are often highly correlated, researchers from the NIH LINCS program have developed a cost-effective strategy of profiling only ∼1000 carefully selected landmark genes and relying on computational methods to infer the expression of remaining target genes. However, the computational approach adopted by the LINCS program is currently based on linear regression (LR), limiting its accuracy since it does not capture complex nonlinear relationship between expressions of genes. We present a deep learning method (abbreviated as D-GEX) to infer the expression of target genes from the expression of landmark genes. We used the microarray-based Gene Expression Omnibus dataset, consisting of 111K expression profiles, to train our model and compare its performance to those from other methods. In terms of mean absolute error averaged across all genes, deep learning significantly outperforms LR with 15.33% relative improvement. A gene-wise comparative analysis shows that deep learning achieves lower error than LR in 99.97% of the target genes. We also tested the performance of our learned model on an independent RNA-Seq-based GTEx dataset, which consists of 2921 expression profiles. Deep learning still outperforms LR with 6.57% relative improvement, and achieves lower error in 81.31% of the target genes. D-GEX is available at https://github.com/uci-cbcl/D-GEX. Contact: xhx@ics.uci.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
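A minimal illustration of the inference task described above is given below: predicting target-gene expression from landmark genes, comparing linear regression with a small feed-forward network. Toy random data stand in for the GEO/GTEx profiles, the dimensions are reduced, and this is not the D-GEX code.

```python
# Toy comparison of linear regression vs. a small neural network for landmark-to-target
# expression inference; data, dimensions and hyperparameters are illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 100))                          # samples x "landmark genes"
W = rng.normal(size=(100, 20))
Y = np.tanh(X @ W) + 0.1 * rng.normal(size=(500, 20))    # nonlinear "target gene" responses

lr = LinearRegression().fit(X[:400], Y[:400])
nn = MLPRegressor(hidden_layer_sizes=(200,), max_iter=500, random_state=0).fit(X[:400], Y[:400])
print("LR  MAE:", np.mean(np.abs(lr.predict(X[400:]) - Y[400:])))
print("MLP MAE:", np.mean(np.abs(nn.predict(X[400:]) - Y[400:])))
```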
[A new method for safety monitoring of natural dietary supplements--quality profile].
Wang, Juan; Wang, Li-Ping; Yang, Da-Jin; Chen, Bo
2008-07-01
A new method for the safety monitoring of natural dietary supplements, the quality profile, is proposed. It would convert passive monitoring for adulteration with synthetic drugs into active monitoring and help guarantee the safety of natural dietary supplements. Preliminary research on the quality profile was carried out by high performance liquid chromatography (HPLC) and mass spectrometry (MS). HPLC was employed to analyze the chemical constituent profiles of natural dietary supplements. The separation was completed on a C18 column with acetonitrile and water (0.05% H3PO4) as the mobile phase; the detection wavelength was 223 nm. Based on HPLC, the stability of the quality profile was studied, and abnormal compounds in the quality profile were analyzed after the addition of phenolphthalein, sibutramine, rosiglitazone, glibenclamide and gliclazide. For MS, the detector worked with ESI+, capillary voltage: 3.5 kV, cone voltage: 30 V, extractor voltage: 4 V, RF lens voltage: 0.5 V, source temperature: 105 degrees C, desolvation temperature: 300 degrees C, desolvation gas flow rate: 260 L/h, cone gas flow rate: 50 L/h, full scan mass spectra: m/z 100-600. The abnormal compound in the quality profile was analyzed after the addition of N-mono-desmethyl sibutramine. The quality profile based on HPLC had good stability (similarity > 0.877). The addition of phenolphthalein, sibutramine, rosiglitazone, glibenclamide and gliclazide to natural dietary supplements could be detected by HPLC, and the addition of N-mono-desmethyl sibutramine could be detected by MS. The quality profile can thus be used to monitor adulteration of natural dietary supplements and to prevent the addition of synthetic drugs after product approval.
NASA Astrophysics Data System (ADS)
Laoufi, Fatiha; Belbachir, Ahmed-Hafid; Benabadji, Noureddine; Zanoun, Abdelouahab
2011-10-01
We have mapped the region of Oran, Algeria, using multispectral remote sensing at different resolutions. For the identification of objects on the ground from their spectral signatures, two methods were applied to images from SPOT, LANDSAT, IRS-1C and ASTER. The first, called the Base Rule method (BR method), relies on a set of rules that must be met by each pixel in the different reflectance-calibrated bands before it is assigned to a given class. The construction of these rules is based on the spectral profiles of the prevalent classes in the scene studied. The second, called the Spectral Angle Mapper method (SAM method), is based on the direct calculation of the spectral angle between the target vector representing the spectral profile of the desired class and the pixel vector whose components are the digital counts in the different bands of the reflectance-calibrated image. This method was implemented using the PCSATWIN software developed by our own laboratory, LAAR. After assembling a library of spectral signatures from multiple sources, a detailed study of the principles and physical processes that can influence the spectral signature was conducted. The final goal is to establish the range of variation of the spectral profile of a well-defined class and therefore to obtain precise bases for the spectral rules. From the results obtained, we find that supervised classification by the BR method derived from spectral signatures reduces the uncertainty associated with identifying objects, significantly enhancing the percentage of correct classification for well-separated classes.
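The spectral angle at the heart of the SAM method can be sketched as below: the angle between a pixel's spectrum and each class reference spectrum is computed, and the pixel is assigned to the class with the smallest angle. Threshold handling and the BR rule construction are omitted, and the dictionary-based interface is an illustrative choice.

```python
# Spectral Angle Mapper core: angle between a pixel vector and class reference spectra.
import numpy as np

def spectral_angle(pixel, reference):
    cosang = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cosang, -1.0, 1.0))

def sam_classify(pixel, references):
    """references: dict mapping class name -> reference spectrum (same band count as pixel)."""
    return min(references, key=lambda c: spectral_angle(pixel, references[c]))
```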
Kayano, Mitsunori; Matsui, Hidetoshi; Yamaguchi, Rui; Imoto, Seiya; Miyano, Satoru
2016-04-01
High-throughput time course expression profiles have become available in the last decade owing to developments in measurement techniques and devices. Functional data analysis, which treats smoothed curves instead of the originally observed discrete data, is effective for time course expression profiles in terms of dimension reduction, robustness, and applicability to data measured at a small number of irregularly spaced time points. However, statistical methods for the differential analysis of time course expression profiles have not been well established. We propose a functional logistic model based on elastic net regularization (F-Logistic) to identify genes with dynamic alterations in case/control studies. We employ a mixed model as a smoothing method to obtain functional data; F-Logistic is then applied to time course profiles measured at a small number of irregularly spaced time points. We evaluate the performance of F-Logistic in comparison with another functional data approach, the functional ANOVA test (F-ANOVA), by applying both methods to real and synthetic time course data sets. The real data sets consist of time course gene expression profiles for the long-term effects of recombinant interferon β on disease progression in multiple sclerosis. F-Logistic distinguishes dynamic alterations, which cannot be found by competing approaches such as F-ANOVA, in case/control studies based on time course expression profiles. F-Logistic is effective for time-dependent biomarker detection, diagnosis, and therapy. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
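The modelling idea can be sketched as follows: represent each time course by a small set of smooth-basis coefficients and fit an elastic-net-penalized logistic regression of case/control status on those coefficients. A low-order polynomial fit is used here as a crude stand-in for the mixed-model smoother, so this is only an illustration of the approach, not F-Logistic itself.

```python
# Sketch: smooth each time course into a few coefficients, then elastic-net logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

def curve_features(times, values, degree=3):
    """Coefficients of a low-order polynomial fitted to one time course."""
    return np.polyfit(times, values, degree)

def fit_f_logistic(time_points, curves, labels):
    """time_points, curves: per-subject time and value arrays; labels: 0/1 case-control status."""
    X = np.array([curve_features(t, y) for t, y in zip(time_points, curves)])
    clf = LogisticRegression(penalty="elasticnet", solver="saga", l1_ratio=0.5, max_iter=5000)
    return clf.fit(X, labels)
```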
NASA Technical Reports Server (NTRS)
Massman, William
1987-01-01
A semianalytical method for describing the mean wind profile and shear stress within plant canopies and for estimating the roughness length and the displacement height is presented. This method incorporates density and vertical structure of the canopy and includes simple parameterizations of the roughness sublayer and shelter factor. Some of the wind profiles examined are consistent with first-order closure techniques while others are consistent with second-order closure techniques. Some profiles show a shearless region near the base of the canopy; however, none displays a secondary maximum there. Comparing several different analytical expressions for the canopy wind profile against observations suggests that one particular type of profile (an Airy function which is associated with the triangular foliage surface area density distribution) is superior to the others. Because of the numerical simplicity of the methods outlined, it is suggested that they may be profitably used in large-scale models of plant-atmosphere exchanges.
Analyzing gene expression time-courses based on multi-resolution shape mixture model.
Li, Ying; He, Ye; Zhang, Yu
2016-11-01
Biological processes are dynamic molecular processes that unfold over time. Time course gene expression experiments provide opportunities to explore patterns of gene expression change over time and to understand the dynamic behavior of gene expression, which is crucial for studying the development and progression of biological systems and disease. Analysis of gene expression time-course profiles has not been fully exploited so far and remains a challenging problem. We propose a novel shape-based mixture model clustering method for gene expression time-course profiles to explore significant gene groups. Based on multi-resolution fractal features and a mixture clustering model, we propose a multi-resolution shape mixture model algorithm. The multi-resolution fractal features are computed by wavelet decomposition, which captures patterns of change over time in gene expression at different resolutions. The proposed multi-resolution shape mixture model algorithm is a probabilistic framework that offers a more natural and robust way of clustering time-course gene expression. We assessed the performance of the proposed algorithm on yeast time-course gene expression profiles in comparison with several popular clustering methods for gene expression profiles. The gene groups identified by the different methods were evaluated by enrichment analysis of biological pathways and of known protein-protein interactions supported by experimental evidence. The gene groups identified by our algorithm show stronger biological significance. In summary, a novel multi-resolution shape mixture model algorithm based on multi-resolution fractal features is proposed. The model provides new horizons and an alternative tool for the visualization and analysis of time-course gene expression profiles. The R and Matlab programs are available upon request. Copyright © 2016 Elsevier Inc. All rights reserved.
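A rough sketch of the general workflow (not the authors' code; the wavelet family, decomposition level, and toy data are assumptions): decompose each time-course profile with a discrete wavelet transform, build a per-gene feature vector from the coefficient energies at each resolution, and cluster the features with a probabilistic mixture model.

```python
import numpy as np
import pywt
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
profiles = rng.normal(size=(100, 32))     # toy matrix: 100 genes x 32 time points

def multiresolution_features(profile, wavelet="db2", level=3):
    """Energy of the wavelet coefficients at each resolution level."""
    coeffs = pywt.wavedec(profile, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

features = np.vstack([multiresolution_features(p) for p in profiles])

# Probabilistic clustering of the multi-resolution features
gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
labels = gmm.fit_predict(features)
print(np.bincount(labels))
```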
A Targeted Quantitative Proteomics Strategy for Global Kinome Profiling of Cancer Cells and Tissues*
Xiao, Yongsheng; Guo, Lei; Wang, Yinsheng
2014-01-01
Kinases are among the most intensively pursued enzyme superfamilies as targets for anti-cancer drugs. Large data sets on inhibitor potency and selectivity for more than 400 human kinases became available recently, offering the opportunity to rationally design novel kinase-based anti-cancer therapies. However, the expression levels and activities of kinases are highly heterogeneous among different types of cancer and even among different stages of the same cancer. The lack of an effective strategy for profiling the global kinome hampers the development of kinase-targeted cancer chemotherapy. Here, we introduce a novel global kinome profiling method, based on our recently developed isotope-coded ATP-affinity probe and a targeted proteomic method using multiple-reaction monitoring (MRM), for simultaneously assessing the expression of more than 300 kinases in human cells and tissues. This MRM-based assay displayed much better sensitivity, reproducibility, and accuracy than the discovery-based shotgun proteomic method. Approximately 250 kinases could be routinely detected in the lysate of a single cell line. Additionally, the incorporation of iRT into the MRM kinome library rendered our MRM kinome assay easily transferable across different instrument platforms and laboratories. We further employed this approach to profile kinase expression in two melanoma cell lines, which revealed substantial kinome reprogramming during cancer progression and demonstrated an excellent correlation between the anti-proliferative effects of kinase inhibitors and the expression levels of their target kinases. Therefore, this facile and accurate kinome profiling assay, together with the kinome-inhibitor interaction map, could provide invaluable knowledge to predict the effectiveness of kinase inhibitor drugs and offer the opportunity for individualized cancer chemotherapy. PMID:24520089
NASA Astrophysics Data System (ADS)
Samanta, Swagata; Dey, Pradip Kumar; Banerji, Pallab; Ganguly, Pranabendu
2017-01-01
A study of the validity of the effective-index based matrix method (EIMM) for fabricated SU-8 channel waveguides is reported. The design method is extremely fast compared with other existing numerical techniques such as BPM and FDTD. In EIMM, the effective index method is applied in the depth direction of the waveguide and the resulting lateral index profile is analyzed by a transfer matrix method. With EIMM one can compute the propagation constants and mode profiles of each guided mode for any waveguide dimensions. The technique may also be used to design single-mode waveguides. SU-8 waveguide fabrication was carried out by a continuous-wave direct laser writing process at a wavelength of 375 nm. The measured propagation losses of these wire waveguides with air and PDMS as superstrates were 0.51 dB/mm and 0.3 dB/mm, respectively. The number of guided modes, obtained theoretically as well as experimentally, was much larger for the air-clad waveguide than for the PDMS-clad waveguide. We were able to excite the isolated fundamental mode of the latter by precise fiber positioning, and its mode image was recorded. The mode profiles, mode indices, and refractive index profiles extracted from this mode image of the fundamental mode matched the theoretical predictions remarkably well.
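To illustrate the kind of calculation performed in the effective index step, the sketch below solves the dispersion relation of a generic symmetric slab for the TE-mode effective indices. This is not the authors' EIMM code, and the refractive indices, thickness, and wavelength are hypothetical values chosen only for illustration.

```python
import numpy as np
from scipy.optimize import brentq

def te_dispersion(n_eff, n_core, n_clad, thickness_um, wavelength_um, mode_order):
    """Zero of this function gives the TE-mode effective index of a symmetric slab."""
    k0 = 2 * np.pi / wavelength_um
    kappa = k0 * np.sqrt(n_core**2 - n_eff**2)   # transverse wavenumber in the core
    gamma = k0 * np.sqrt(n_eff**2 - n_clad**2)   # decay constant in the cladding
    return kappa * thickness_um - mode_order * np.pi - 2 * np.arctan(gamma / kappa)

def slab_modes(n_core, n_clad, thickness_um, wavelength_um, max_modes=10):
    modes = []
    for m in range(max_modes):
        lo, hi = n_clad + 1e-9, n_core - 1e-9
        f = lambda n: te_dispersion(n, n_core, n_clad, thickness_um, wavelength_um, m)
        if f(lo) * f(hi) < 0:                    # a root exists between n_clad and n_core
            modes.append(brentq(f, lo, hi))
    return modes

# Hypothetical SU-8-like core (n ~ 1.58) with PDMS-like cladding (n ~ 1.41), 2 um thick, 1.55 um light
print(slab_modes(1.58, 1.41, 2.0, 1.55))
```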
Bachim, Brent L; Gaylord, Thomas K
2005-01-20
A new technique, microinterferometric optical phase tomography, is introduced for use in measuring small, asymmetric refractive-index differences in the profiles of optical fibers and fiber devices. The method combines microscopy-based fringe-field interferometry with parallel projection-based computed tomography to characterize fiber index profiles. The theory relating interference measurements to the projection set required for tomographic reconstruction is given, and discrete numerical simulations are presented for three test index profiles that establish the technique's ability to characterize fiber with small, asymmetric index differences. An experimental measurement configuration and specific interferometry and tomography practices employed in the technique are discussed.
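As a generic illustration of the parallel-projection tomography step (this is the standard filtered back-projection workflow on a synthetic phantom, not the paper's microinterferometric pipeline, and the phantom values are fabricated):

```python
import numpy as np
from skimage.transform import radon, iradon

# Hypothetical refractive-index difference phantom (values are illustrative only)
phantom = np.zeros((128, 128))
phantom[40:80, 50:90] = 1e-3

angles = np.linspace(0.0, 180.0, 90, endpoint=False)
sinogram = radon(phantom, theta=angles)            # parallel projections (stand-in for phase data)
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")
print(np.abs(reconstruction - phantom).mean())     # average reconstruction error
```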
NASA Astrophysics Data System (ADS)
Deng, Hui; Chen, Genyu; He, Jie; Zhou, Cong; Du, Han; Wang, Yanyi
2016-06-01
In this study, an online, efficient and precision laser profiling approach that is based on a single-layer deep-cutting intermittent feeding method is described. The effects of the laser cutting depth and the track-overlap ratio of the laser cutting on the efficiency, precision and quality of laser profiling were investigated. Experiments on the online profiling of bronze-bonded diamond grinding wheels were performed using a pulsed fiber laser. The results demonstrate that an increase in the laser cutting depth caused an increase in the material removal efficiency during the laser profiling process. However, the maximum laser profiling efficiency was only achieved when the laser cutting depth was equivalent to the initial surface contour error of the grinding wheel. In addition, the selection of relatively high track-overlap ratios of laser cutting for the profiling of grinding wheels was beneficial with respect to the increase in the precision of laser profiling, whereas the efficiency and quality of the laser profiling were not affected by the change in the track-overlap ratio. After optimized process parameters were employed for online laser profiling, the circular run-out error and the parallelism error of the grinding wheel surface decreased from 83.1 μm and 324.6 μm to 11.3 μm and 3.5 μm, respectively. The surface contour precision of the grinding wheel significantly improved. The highest surface contour precision for grinding wheels of the same type that can be theoretically achieved after laser profiling is completely dependent on the peak power density of the laser. The higher the laser peak power density is, the higher the surface contour precision of the grinding wheel after profiling.
Lee, Jae Won; Ji, Seung-Heon; Lee, Young-Seob; Choi, Doo Jin; Choi, Bo-Ram; Kim, Geum-Soog; Baek, Nam-In; Lee, Dae Young
2017-01-01
(1) Background: Panax ginseng root is one of the most important herbal products, and the profiling of ginsenosides is critical for the quality control of ginseng roots at different ages in the herbal markets. Furthermore, interest in assessing the contents as well as the localization of biological compounds has been growing. The objective of this study is to carry out the mass spectrometry (MS)-based profiling and imaging of ginsenosides to assess ginseng roots at different ages; (2) Methods: Optimal ultra performance liquid chromatography coupled to quadrupole time of flight/MS (UPLC-QTOF/MS) was used to profile various ginsenosides from P. ginseng roots. Matrix-assisted laser desorption ionization (MALDI)-time of flight (TOF)/MS-based imaging was also optimized to visualize ginsenosides in ginseng roots; (3) Results: UPLC-QTOF/MS was used to profile 30 ginsenosides with high mass accuracy, with an in-house library constructed for the fast and exact identification of ginsenosides. Using this method, the levels of 14 ginsenosides were assessed in P. ginseng roots cultivated for 4, 5, and 6 years. The optimal MALDI-imaging MS (IMS) was also applied to visualize the 14 ginsenosides in ginseng roots. As a result, the MSI cross sections showed the localization of 4 ginsenoside ions ([M + K]+) in P. ginseng roots at different ages; (4) Conclusions: The contents and localization of various ginsenosides differ depending on the cultivation years of P. ginseng roots. Furthermore, this study demonstrated the utility of MS-based profiling and imaging of ginsenosides for the quality control of ginseng roots. PMID:28538661
NASA Astrophysics Data System (ADS)
Zubiaga, A.; García, J. A.; Plazaola, F.; Tuomisto, F.; Zúñiga-Pérez, J.; Muñoz-Sanjosé, V.
2007-05-01
We present a method, based on positron annihilation spectroscopy, to obtain information on the defect depth profile of layers grown on high-quality substrates. We have applied the method to ZnO layers grown on sapphire, but it can easily be generalized to other heterostructures (or homostructures) where the positron mean diffusion length is small enough. Applying the method to the ratio of the W and S parameters obtained from Doppler broadening measurements (W/S plots), it is possible to determine the thickness of the layer and the defect profile within it, provided that mainly one type of positron-trapping defect contributes to positron trapping at the measurement temperature. Indeed, the quality of such a characterization is very important for potential technological applications of the layer.
High throughput profile-profile based fold recognition for the entire human proteome.
McGuffin, Liam J; Smith, Richard T; Bryson, Kevin; Sørensen, Søren-Aksel; Jones, David T
2006-06-07
In order to maintain the most comprehensive structural annotation databases we must carry out regular updates for each proteome using the latest profile-profile fold recognition methods. The ability to carry out these updates on demand is necessary to keep pace with the regular updates of sequence and structure databases. Providing the highest quality structural models requires the most intensive profile-profile fold recognition methods running with the very latest available sequence databases and fold libraries. However, running these methods on such a regular basis for every sequenced proteome requires large amounts of processing power. In this paper we describe and benchmark the JYDE (Job Yield Distribution Environment) system, which is a meta-scheduler designed to work above cluster schedulers, such as Sun Grid Engine (SGE) or Condor. We demonstrate the ability of JYDE to distribute the load of genomic-scale fold recognition across multiple independent Grid domains. We use the most recent profile-profile version of our mGenTHREADER software in order to annotate the latest version of the Human proteome against the latest sequence and structure databases in as short a time as possible. We show that our JYDE system is able to scale to large numbers of intensive fold recognition jobs running across several independent computer clusters. Using our JYDE system we have been able to annotate 99.9% of the protein sequences within the Human proteome in less than 24 hours, by harnessing over 500 CPUs from 3 independent Grid domains. This study clearly demonstrates the feasibility of carrying out on demand high quality structural annotations for the proteomes of major eukaryotic organisms. Specifically, we have shown that it is now possible to provide complete regular updates of profile-profile based fold recognition models for entire eukaryotic proteomes, through the use of Grid middleware such as JYDE.
Wiklund, Kristin; Olivera, Gustavo H; Brahme, Anders; Lind, Bengt K
2008-07-01
To speed up dose calculation, an analytical pencil-beam method has been developed to calculate the mean radial dose distributions due to secondary electrons that are set in motion by light ions in water. For comparison, radial dose profiles calculated using a Monte Carlo technique have also been determined. An accurate comparison of the resulting radial dose profiles of the Bragg peak for ¹H⁺, ⁴He²⁺ and ⁶Li³⁺ ions has been performed. The double differential cross sections for secondary electron production were calculated using the continuous distorted wave-eikonal initial state method (CDW-EIS). For the secondary electrons that are generated, the radial dose distribution for the analytical case is based on the generalized Gaussian pencil-beam method and the central axis depth-dose distributions are calculated using the Monte Carlo code PENELOPE. In the Monte Carlo case, the PENELOPE code was used to calculate the whole radial dose profile based on CDW data. The present pencil-beam and Monte Carlo calculations agree well at all radii. A radial dose profile that is shallower at small radii and steeper at large radii than the conventional 1/r² is clearly seen with both the Monte Carlo and pencil-beam methods. As expected, since the projectile velocities are the same, the dose profiles of Bragg-peak ions of 0.5 MeV ¹H⁺, 2 MeV ⁴He²⁺ and 3 MeV ⁶Li³⁺ are almost the same, with about 30% more delta electrons in the sub-keV range from ⁴He²⁺ and ⁶Li³⁺ compared to ¹H⁺. A similar behavior is also seen for 1 MeV ¹H⁺, 4 MeV ⁴He²⁺ and 6 MeV ⁶Li³⁺, all classically expected to have the same secondary electron cross sections. The results are promising and indicate a fast and accurate way of calculating the mean radial dose profile.
Patil, Ajeetkumar; Bhat, Sujatha; Pai, Keerthilatha M; Rai, Lavanya; Kartha, V B; Chidangil, Santhosh
2015-09-08
An ultra-sensitive high performance liquid chromatography-laser induced fluorescence (HPLC-LIF) based technique has been developed by our group at Manipal for the screening, early detection, and staging of various cancers, using protein profiling of clinical samples such as body fluids, cellular specimens, and biopsy tissue. More than 300 protein profiles of different clinical samples (serum, saliva, cellular samples and tissue homogenates) from volunteers (healthy, and with different pre-malignant/malignant conditions) were recorded using this set-up. The protein profiles were analyzed using principal component analysis (PCA) to achieve objective detection and classification of malignant, premalignant and healthy conditions with high sensitivity and specificity. HPLC-LIF protein profiling combined with PCA, as a routine method for the screening, diagnosis, and staging of cervical cancer and oral cancer, is discussed in this paper. In recent years, proteomics techniques have advanced tremendously in the life and medical sciences for the detection and identification of proteins in body fluids, tissue homogenates and cellular samples, in order to understand the biochemical mechanisms leading to different diseases. These include techniques such as high performance liquid chromatography, 2D-gel electrophoresis, MALDI-TOF-MS, SELDI-TOF-MS, CE-MS and LC-MS. We have developed an ultra-sensitive high performance liquid chromatography-laser induced fluorescence (HPLC-LIF) based technique for the screening, early detection, and staging of various cancers, using protein profiling of clinical samples such as body fluids, cellular specimens, and biopsy tissue. More than 300 protein profiles of different clinical samples (serum, saliva, cellular samples and tissue homogenates) from healthy volunteers and volunteers with different malignant conditions were recorded using this set-up. The protein profile data were analyzed using principal component analysis (PCA) for the objective classification and detection of malignant, premalignant and healthy conditions. The method is extremely sensitive, with a limit of detection of the order of femtomoles of protein. HPLC-LIF combined with PCA as a potential proteomic method for the diagnosis of oral cancer and cervical cancer is discussed in this paper. This article is part of a Special Issue entitled: Proteomics in India. Copyright © 2015 Elsevier B.V. All rights reserved.
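As a minimal illustration of the PCA step (not the Manipal group's pipeline; the chromatogram matrix below is synthetic), protein profiles are mean-centered and projected onto their first principal components before classification.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
# Toy data: 30 HPLC-LIF protein profiles, each sampled at 500 retention-time points
profiles = rng.normal(size=(30, 500))
profiles[15:] += np.linspace(0.0, 1.0, 500)    # crude stand-in for a systematic "malignant" shift

pca = PCA(n_components=3)
scores = pca.fit_transform(profiles)           # sample coordinates in principal-component space
print(pca.explained_variance_ratio_, scores.shape)
```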
Brender, Jeffrey R.; Zhang, Yang
2015-01-01
The formation of protein-protein complexes is essential for proteins to perform their physiological functions in the cell. Mutations that prevent the proper formation of the correct complexes can have serious consequences for the associated cellular processes. Since experimental determination of protein-protein binding affinity remains difficult when performed on a large scale, computational methods for predicting the consequences of mutations on binding affinity are highly desirable. We show that a scoring function based on interface structure profiles collected from analogous protein-protein interactions in the PDB is a powerful predictor of protein binding affinity changes upon mutation. As a standalone feature, the differences between the interface profile score of the mutant and wild-type proteins has an accuracy equivalent to the best all-atom potentials, despite being two orders of magnitude faster once the profile has been constructed. Due to its unique sensitivity in collecting the evolutionary profiles of analogous binding interactions and the high speed of calculation, the interface profile score has additional advantages as a complementary feature to combine with physics-based potentials for improving the accuracy of composite scoring approaches. By incorporating the sequence-derived and residue-level coarse-grained potentials with the interface structure profile score, a composite model was constructed through the random forest training, which generates a Pearson correlation coefficient >0.8 between the predicted and observed binding free-energy changes upon mutation. This accuracy is comparable to, or outperforms in most cases, the current best methods, but does not require high-resolution full-atomic models of the mutant structures. The binding interface profiling approach should find useful application in human-disease mutation recognition and protein interface design studies. PMID:26506533
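The sketch below illustrates the generic composite-model idea named above (combining an interface-profile-like score with other features in a random forest regressor). It is not the authors' trained model; the features and binding free-energy changes are synthetic stand-ins.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n_mutations = 200
# Toy features: change in interface profile score plus two coarse-grained physics terms
X = rng.normal(size=(n_mutations, 3))
ddG = 1.5 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=n_mutations)  # toy labels

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X, ddG)
print(model.feature_importances_)
print(np.corrcoef(model.predict(X), ddG)[0, 1])   # in-sample fit check only
```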
Taylor, Terence E; Lacalle Muls, Helena; Costello, Richard W; Reilly, Richard B
2018-01-01
Asthma and chronic obstructive pulmonary disease (COPD) patients are required to inhale forcefully and deeply to receive medication when using a dry powder inhaler (DPI). There is a clinical need to objectively monitor the inhalation flow profile of DPIs in order to remotely monitor patient inhalation technique. Audio-based methods have previously been employed to accurately estimate flow parameters such as the peak inspiratory flow rate of inhalations; however, these methods required multiple calibration inhalation audio recordings. In this study, an audio-based method is presented that accurately estimates the inhalation flow profile using only one calibration inhalation audio recording. Twenty healthy participants were asked to perform 15 inhalations through a placebo Ellipta™ DPI at a range of inspiratory flow rates. Inhalation flow signals were recorded using a pneumotachograph spirometer while inhalation audio signals were recorded simultaneously using the Inhaler Compliance Assessment device attached to the inhaler. The acoustic (amplitude) envelope was estimated from each inhalation audio signal. Using only one recording, linear and power law regression models were employed to determine which model best described the relationship between the inhalation acoustic envelope and the flow signal. Each model was then employed to estimate the flow signals of the remaining 14 inhalation audio recordings. This process was repeated until each of the 15 recordings had been employed to calibrate single models while testing on the remaining 14 recordings. It was observed that power law models generated the highest average flow estimation accuracy across all participants (90.89±0.9% for power law models and 76.63±2.38% for linear models). The method also generated sufficient accuracy in estimating inhalation parameters such as peak inspiratory flow rate and inspiratory capacity in the presence of noise. Estimating inhaler inhalation flow profiles using audio-based methods may be clinically beneficial for inhaler technique training and the remote monitoring of patient adherence.
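A small sketch of the power-law calibration idea (not the authors' code; the signals are simulated and the constants are illustrative): fit flow = a * envelope^b in log-log space from one calibration recording, then apply the fitted model to new acoustic envelopes.

```python
import numpy as np

rng = np.random.default_rng(4)
true_a, true_b = 120.0, 0.55                          # hypothetical calibration constants

envelope = np.linspace(0.01, 1.0, 200)                # acoustic amplitude envelope (a.u.)
flow = true_a * envelope**true_b * (1 + rng.normal(scale=0.02, size=envelope.size))  # L/min

# Power-law fit in log-log space: log(flow) = log(a) + b * log(envelope)
b_fit, log_a_fit = np.polyfit(np.log(envelope), np.log(flow), 1)
a_fit = np.exp(log_a_fit)

new_envelope = np.array([0.1, 0.3, 0.6])
estimated_flow = a_fit * new_envelope**b_fit          # estimated inhalation flow profile
print(a_fit, b_fit, estimated_flow)
```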
2010-01-01
Background: Protein-protein interaction (PPI) plays essential roles in cellular functions. The cost, time and other limitations associated with current experimental methods have motivated the development of computational methods for predicting PPIs. As protein interactions generally occur via domains rather than whole molecules, predicting domain-domain interactions (DDI) is an important step toward PPI prediction. Computational methods developed so far have utilized information from various sources at different levels, from primary sequences, to molecular structures, to evolutionary profiles. Results: In this paper, we propose a computational method to predict DDI using support vector machines (SVMs), based on domains represented as interaction profile hidden Markov models (ipHMM), where interacting residues in domains are explicitly modeled according to the three-dimensional structural information available at the Protein Data Bank (PDB). Features of the domains are first extracted as Fisher scores derived from the ipHMM and then selected using singular value decomposition (SVD). Domain pairs are represented by concatenating their selected feature vectors and classified by a support vector machine trained on these feature vectors. The method is tested by leave-one-out cross-validation experiments with a set of interacting protein pairs adopted from the 3DID database. The prediction accuracy shows significant improvement compared to InterPreTS (Interaction Prediction through Tertiary Structure), an existing method for PPI prediction that also uses the sequences and complexes of known 3D structure. Conclusions: We show that domain-domain interaction prediction can be significantly enhanced by exploiting information inherent in the domain profiles via feature selection based on Fisher scores, singular value decomposition and supervised learning based on support vector machines. Datasets and source code are freely available on the web at http://liao.cis.udel.edu/pub/svdsvm. Implemented in Matlab and supported on Linux and MS Windows. PMID:21034480
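The sketch below illustrates the generic feature pipeline described (Fisher-score-like features reduced with SVD and classified with an SVM under leave-one-out cross-validation). It is not the authors' implementation, and the random features merely stand in for ipHMM Fisher scores.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(5)
n_pairs, n_features = 60, 400
fisher_scores = rng.normal(size=(n_pairs, n_features))   # stand-in for ipHMM Fisher scores
labels = rng.integers(0, 2, size=n_pairs)                # interacting vs non-interacting pairs

# Dimensionality reduction via truncated SVD of the feature matrix
U, s, Vt = np.linalg.svd(fisher_scores, full_matrices=False)
k = 20
reduced = fisher_scores @ Vt[:k].T                       # projection onto top-k right singular vectors

clf = SVC(kernel="rbf", C=1.0, gamma="scale")
accuracy = cross_val_score(clf, reduced, labels, cv=LeaveOneOut()).mean()
print(accuracy)
```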
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jun, Se -Ran; Hauser, Loren John; Schadt, Christopher Warren
For decades there has been increasing interest in understanding the relationships between microbial communities and ecosystem functions. Current DNA sequencing technologies allow for the exploration of microbial communities in two principal ways: targeted rRNA gene surveys and shotgun metagenomics. For large study designs, it is often still prohibitively expensive to sequence metagenomes at both the breadth and depth necessary to statistically capture the true functional diversity of a community. Although rRNA gene surveys provide no direct evidence of function, they do provide a reasonable estimation of microbial diversity, while being a very cost-effective way to screen samples of interest for later shotgun metagenomic analyses. However, there is a great deal of 16S rRNA gene survey data currently available from diverse environments, and thus a need for tools to infer the functional composition of environmental samples based on 16S rRNA gene survey data. As a result, we present a computational method called pangenome-based functional profiles (PanFP), which infers functional profiles of microbial communities from 16S rRNA gene survey data for Bacteria and Archaea. PanFP is based on the pangenome reconstruction of a 16S rRNA gene operational taxonomic unit (OTU) from known genes and genomes pooled from the OTU's taxonomic lineage. From this lineage, we derive an OTU functional profile by weighting a pangenome's functional profile with the OTU's abundance observed in a given sample. We validated our method by comparing PanFP to the functional profiles obtained from the direct shotgun metagenomic measurement of 65 diverse communities via Spearman correlation coefficients. These correlations improved with increasing sequencing depth, within the range of 0.8-0.9 for the most deeply sequenced Human Microbiome Project mock community samples. PanFP is very similar in performance to another recently released tool, PICRUSt, for almost all of the survey data analysed here. But our method is unique in that any OTU building method can be used, as opposed to being limited to closed-reference OTU picking strategies against specific reference sequence databases. In conclusion, we developed an automated computational method which derives an inferred functional profile from the 16S rRNA gene surveys of microbial communities. The inferred functional profile provides a cost-effective way to study complex ecosystems through predicted comparative functional metagenomes and metadata analysis. All PanFP source code and additional documentation are freely available online at GitHub.
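A toy sketch of the weighting step described above (not the PanFP code; the OTU table and function matrix are fabricated for illustration): the inferred community functional profile is the abundance-weighted sum of per-OTU pangenome functional profiles.

```python
import numpy as np

# Rows: OTUs, columns: functional categories (e.g., orthologous gene families); values are gene counts
pangenome_profiles = np.array([
    [10, 0, 5],
    [ 2, 8, 1],
    [ 0, 3, 7],
], dtype=float)

# OTU abundances observed in one 16S rRNA gene survey sample
otu_abundance = np.array([120.0, 30.0, 50.0])

# Normalize each pangenome profile, then weight by relative OTU abundance
per_otu = pangenome_profiles / pangenome_profiles.sum(axis=1, keepdims=True)
weights = otu_abundance / otu_abundance.sum()
community_profile = weights @ per_otu
print(community_profile)    # inferred relative functional composition of the sample
```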
Dieltjes, Patrick; Mieremet, René; Zuniga, Sofia; Kraaijenbrink, Thirsa; Pijpe, Jeroen; de Knijff, Peter
2011-07-01
Exploring technological limits is a common practice in forensic DNA research. Reliable genetic profiling based on only a few cells isolated from trace material retrieved from a crime scene is nowadays more and more the rule rather than the exception. On many crime scenes, cartridges, bullets, and casings (jointly abbreviated as CBCs) are regularly found, and even after firing, these potentially carry trace amounts of biological material. Since 2003, the Forensic Laboratory for DNA Research is routinely involved in the forensic investigation of CBCs in the Netherlands. Reliable DNA profiles were frequently obtained from CBCs and used to match suspects, victims, or other crime scene-related DNA traces. In this paper, we describe the sensitive method developed by us to extract DNA from CBCs. Using PCR-based genotyping of autosomal short tandem repeats, we were able to obtain reliable and reproducible DNA profiles in 163 out of 616 criminal cases (26.5%) and in 283 out of 4,085 individual CBC items (6.9%) during the period January 2003-December 2009. We discuss practical aspects of the method and the sometimes unexpected effects of using cell lysis buffer on the subsequent investigation of striation patterns on CBCs.
McCarthy, David; Pulverer, Walter; Weinhaeusel, Andreas; Diago, Oscar R; Hogan, Daniel J; Ostertag, Derek; Hanna, Michelle M
2016-01-01
Aim: Development of a sensitive method for DNA methylation profiling and associated mutation detection in clinical samples. Materials & methods: Formalin-fixed and paraffin-embedded tumors received by clinical laboratories often contain insufficient DNA for analysis with bisulfite or methylation sensitive restriction enzymes-based methods. To increase sensitivity, methyl-CpG DNA capture and Coupled Abscription PCR Signaling detection were combined in a new assay, MethylMeter®. Gliomas were analyzed for MGMT methylation, glioma CpG island methylator phenotype and IDH1 R132H. Results: MethylMeter had 100% assay success rate measuring all five biomarkers in formalin-fixed and paraffin-embedded tissue. MGMT methylation results were supported by survival and mRNA expression data. Conclusion: MethylMeter is a sensitive and quantitative method for multitarget DNA methylation profiling and associated mutation detection. The MethylMeter-based GliomaSTRAT assay measures methylation of four targets and one mutation to simultaneously grade gliomas and predict their response to temozolomide. This information is clinically valuable in management of gliomas. PMID:27337298
NASA Astrophysics Data System (ADS)
Khamatnurova, M. Yu.; Gribanov, K. G.; Zakharov, V. I.; Rokotyan, N. V.; Imasu, R.
2017-11-01
An algorithm for the retrieval of the atmospheric methane distribution from IASI spectra has been developed. The feasibility of the Levenberg-Marquardt method for retrieving the atmospheric methane total column amount from spectra measured by IASI/METOP, modified for the case of a lack of a priori covariance matrices for methane vertical profiles, is studied in this paper. The method and algorithm were implemented in a software package together with iterative estimation of a posteriori covariance matrices and averaging kernels for each individual retrieval. This allows retrieval quality selection using the properties of both types of matrices. Methane (XCH4) retrieval by the Levenberg-Marquardt method from IASI/METOP spectra is presented in this work. NCEP/NCAR reanalysis data provided by ESRL (NOAA, Boulder, USA) were taken as the initial guess. Surface temperature and the vertical profiles of air temperature and humidity are retrieved before the methane vertical profile retrieval. Data retrieved from ground-based measurements at the Ural Atmospheric Station and the L2/IASI standard product were used for verification of the method and of the results of methane retrieval from IASI/METOP spectra.
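A generic Levenberg-Marquardt iteration on a toy nonlinear model illustrates the damped Gauss-Newton update used in such retrievals. This is not the retrieval code; the forward model, data, and damping schedule below are invented for illustration.

```python
import numpy as np

def forward(x, t):
    """Toy forward model standing in for the radiative-transfer model."""
    return x[0] * np.exp(-x[1] * t)

def levenberg_marquardt(t, y, x0, lam=1e-2, n_iter=50):
    x = np.array(x0, dtype=float)
    for _ in range(n_iter):
        r = y - forward(x, t)                                   # residuals
        # Numerical Jacobian of the forward model w.r.t. the state vector
        J = np.empty((t.size, x.size))
        eps = 1e-6
        for j in range(x.size):
            dx = np.zeros_like(x); dx[j] = eps
            J[:, j] = (forward(x + dx, t) - forward(x - dx, t)) / (2 * eps)
        A = J.T @ J + lam * np.diag(np.diag(J.T @ J))           # damped normal equations
        step = np.linalg.solve(A, J.T @ r)
        if np.sum((y - forward(x + step, t))**2) < np.sum(r**2):
            x, lam = x + step, lam * 0.7                        # accept step: relax damping
        else:
            lam *= 2.0                                          # reject step: increase damping
    return x

t = np.linspace(0.0, 5.0, 40)
y = forward([2.0, 0.8], t) + np.random.default_rng(6).normal(scale=0.01, size=t.size)
print(levenberg_marquardt(t, y, x0=[1.0, 0.3]))
```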
Spline-Based Smoothing of Airfoil Curvatures
NASA Technical Reports Server (NTRS)
Li, W.; Krist, S.
2008-01-01
Constrained fitting for airfoil curvature smoothing (CFACS) is a spline-based method of interpolating airfoil surface coordinates (and, concomitantly, airfoil thicknesses) between specified discrete design points so as to obtain smoothing of surface-curvature profiles in addition to basic smoothing of surfaces. CFACS was developed in recognition of the fact that the performance of a transonic airfoil is directly related to both the curvature profile and the smoothness of the airfoil surface. Older methods of interpolating airfoil surfaces involve various compromises between smoothing of surfaces and exact fitting of surfaces to specified discrete design points. While some of the older methods take curvature profiles into account, they nevertheless sometimes yield unfavorable results, including curvature oscillations near end points and substantial deviations from desired leading-edge shapes. In CFACS, as in most of the older methods, one seeks a compromise between smoothing and exact fitting. Unlike in the older methods, the airfoil surface is modified as little as possible from its original specified form and, instead, is smoothed in such a way that the curvature profile becomes a smooth fit of the curvature profile of the original airfoil specification. CFACS involves a combination of rigorous mathematical modeling and knowledge-based heuristics. Rigorous mathematical formulation provides assurance of removal of undesirable curvature oscillations with minimum modification of the airfoil geometry. Knowledge-based heuristics bridge the gap between theory and designers' best practices. In CFACS, one of the measures of the deviation of an airfoil surface from smoothness is the sum of squares of the jumps in the third derivatives of a cubic-spline interpolation of the airfoil data. This measure is incorporated into a formulation for minimizing an overall deviation-from-smoothness measure of the airfoil data within a specified fitting error tolerance. CFACS has been extensively tested on a number of supercritical airfoil data sets generated by inverse design and optimization computer programs. All of the smoothing results show that CFACS is able to generate unbiased smooth fits of curvature profiles, trading small modifications of geometry for increased curvature smoothness by eliminating curvature oscillations and bumps.
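A small sketch of the smoothness measure named above (not the CFACS implementation; the sample airfoil ordinates are invented): for a cubic spline the third derivative is piecewise constant, so the measure is the sum of squared jumps between adjacent spline pieces.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def third_derivative_jump_measure(x, y):
    """Sum of squared jumps in the third derivative of a cubic-spline fit."""
    spline = CubicSpline(x, y)
    # spline.c[0] holds the leading (cubic) coefficient of each piece; d3y/dx3 = 6 * c[0]
    third_derivs = 6.0 * spline.c[0]
    jumps = np.diff(third_derivs)
    return float(np.sum(jumps**2))

# Hypothetical upper-surface ordinates of an airfoil at a few chordwise stations
x = np.linspace(0.0, 1.0, 15)
y = 0.12 * (np.sqrt(x + 1e-6) - x)     # crude airfoil-like thickness distribution
print(third_derivative_jump_measure(x, y))
```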
Depth profile measurement with lenslet images of the plenoptic camera
NASA Astrophysics Data System (ADS)
Yang, Peng; Wang, Zhaomin; Zhang, Wei; Zhao, Hongying; Qu, Weijuan; Zhao, Haimeng; Asundi, Anand; Yan, Lei
2018-03-01
An approach for carrying out depth profile measurement of an object with a plenoptic camera is proposed. A single plenoptic image consists of multiple lenslet images. To begin with, these images are processed directly with a refocusing technique to obtain the depth map, which avoids the need to align and decode the plenoptic image. Then, a linear depth calibration is applied, based on the optical structure of the plenoptic camera, for depth profile reconstruction. One significant improvement of the proposed method concerns the resolution of the depth map. Unlike with the traditional method, the resolution is not limited by the number of microlenses inside the camera, and the depth map can be globally optimized. We validated the method with experiments on depth map reconstruction, depth calibration, and depth profile measurement, with the results indicating that the proposed approach is both efficient and accurate.
Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.
2013-01-01
The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
The Cross-Entropy Based Multi-Filter Ensemble Method for Gene Selection.
Sun, Yingqiang; Lu, Chengbo; Li, Xiaobo
2018-05-17
The gene expression profile has the characteristics of high dimensionality, small sample size, and continuous values, and it is a great challenge to use gene expression profile data for the classification of tumor samples. This paper proposes a cross-entropy based multi-filter ensemble (CEMFE) method for microarray data classification. Firstly, multiple filters are used to select features from the microarray data in order to obtain a plurality of pre-selected feature subsets with different classification abilities. The top N genes with the highest rank in each subset are integrated to form a new data set. Secondly, the cross-entropy algorithm is used to remove redundant data from the data set. Finally, a wrapper method based on forward feature selection is used to select the best feature subset. The experimental results show that the proposed method is more efficient than other gene selection methods and that it can achieve higher classification accuracy with fewer characteristic genes.
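A rough sketch of the overall pipeline (not the CEMFE code): several filters each pre-select their top-N genes, a redundancy-removal step prunes the union, and a forward-selection wrapper picks the final subset. Note that the redundancy step below uses a simple correlation threshold as a stand-in for the paper's cross-entropy criterion, and all data, classifiers, and thresholds are assumptions for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import f_classif, mutual_info_classif
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=80, n_features=500, n_informative=10, random_state=0)
top_n = 30

# Step 1: several filters each pre-select their top-N genes; take the union
scores = [f_classif(X, y)[0], mutual_info_classif(X, y, random_state=0)]
selected = set()
for s in scores:
    selected |= set(np.argsort(s)[-top_n:])
candidates = sorted(selected)

# Step 2 (simplified stand-in for the cross-entropy step): drop highly correlated genes
kept = []
for g in candidates:
    if all(abs(np.corrcoef(X[:, g], X[:, k])[0, 1]) < 0.9 for k in kept):
        kept.append(g)

# Step 3: wrapper forward selection using cross-validated accuracy
chosen, best = [], 0.0
improved = True
while improved and len(chosen) < 10:
    improved = False
    for g in kept:
        if g in chosen:
            continue
        acc = cross_val_score(KNeighborsClassifier(), X[:, chosen + [g]], y, cv=5).mean()
        if acc > best:
            best, best_gene, improved = acc, g, True
    if improved:
        chosen.append(best_gene)
print(chosen, best)
```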
Dietary guidelines in the Czech Republic. II.: Nutritional profiles of food groups.
Brázdová, Z; Fiala, J; Bauerová, J; Mullerová, D
2000-11-01
Modern dietary guidelines set in terms of food groups are easy to use and understand for target populations, but rather complicated from the point of view of quantification, i.e. correctly setting the number of recommended servings for different population groups according to age, sex, physical activity and physiological status on the basis of the required intake of energy and individual nutrients. It is the use of abstract, comprehensive food groups that makes it impossible to use a simple database of food tables based on the nutrient content of individual foods rather than their groups. Using groups requires that their nutritional profiles be established, i.e. that an average content of nutrients and energy be calculated for each group. To calculate nutritional profiles for the Czech dietary guidelines, the authors used three different methods: (1) Simple profiles, with all commodities with significant representation in the Czech food basket represented in equal amounts. (2) Profiles based on typical servings, with the same commodities as in (1) but in characteristic intake quantities (typical servings). (3) Food basket-based profiles, with commodities constituting the Czech food basket in quantities identical to that basket. The results showed significant differences between profiles calculated by the different methods. Calculated nutrient intakes were particularly influenced by the size of typical servings, and it is therefore essential that realistic serving sizes be used in the calculations. The consistent use of recommended food items throughout all food groups and subgroups is very important. The number of servings of foods from the five food groups is not enough if a suitable food item is not chosen within the individual groups. On the basis of their findings, the authors fully recommend the use of nutritional profiles based on typical servings, which give a realistic idea of the probable energy and nutrient content of the recommended daily intake. In view of regional cultural differences, national nutritional profiles are of vital importance. Population studies investigating the size of typical servings and the most frequently occurring commodities in the food basket should be carried out every three years. Nutritional profiles designed in this way constitute an important starting point for setting national dietary guidelines, and for their implementation and revision.
Hock, Sabrina; Hasenauer, Jan; Theis, Fabian J
2013-01-01
Diffusion is a key component of many biological processes such as chemotaxis, developmental differentiation and tissue morphogenesis. Recently, it has become possible to assess the spatial gradients caused by diffusion in vitro and in vivo using microscopy-based imaging techniques. The resulting time series of two-dimensional, high-resolution images, in combination with mechanistic models, enable the quantitative analysis of the underlying mechanisms. However, such a model-based analysis is still challenging due to measurement noise and sparse observations, which result in uncertainties in the model parameters. We introduce a likelihood function for image-based measurements with log-normally distributed noise. Based on this likelihood function we formulate the maximum likelihood estimation problem, which is solved using PDE-constrained optimization methods. To assess the uncertainty and practical identifiability of the parameters we introduce profile likelihoods for diffusion processes. As proof of concept, we model certain aspects of the guidance of dendritic cells towards lymphatic vessels, an example of haptotaxis. Using a realistic set of artificial measurement data, we estimate the five kinetic parameters of this model and compute profile likelihoods. Our novel approach for the estimation of model parameters from image data, as well as the proposed identifiability analysis approach, is widely applicable to diffusion processes. The profile likelihood based method provides more rigorous uncertainty bounds than local approximation methods.
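A minimal sketch of a log-normal measurement likelihood and of a profile likelihood (not the authors' PDE-constrained code; the model and data are toy values): the negative log-likelihood below, written up to constants independent of the parameters, is what one would minimize, and fixing one parameter on a grid while re-optimizing the others yields its profile likelihood.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
t = np.linspace(0.1, 5.0, 25)

def model(theta, t):
    """Toy exponential-decay stand-in for the diffusion model output."""
    return theta[0] * np.exp(-theta[1] * t)

# Simulated measurements with multiplicative (log-normal) noise
sigma_true = 0.1
data = model([3.0, 0.6], t) * np.exp(rng.normal(scale=sigma_true, size=t.size))

def neg_log_likelihood(theta, t, data, sigma=sigma_true):
    # log(data) ~ Normal(log(model), sigma^2); constants independent of theta are dropped
    resid = np.log(data) - np.log(model(theta, t))
    return 0.5 * np.sum(resid**2) / sigma**2 + data.size * np.log(sigma)

fit = minimize(neg_log_likelihood, x0=[1.0, 0.1], args=(t, data), method="Nelder-Mead")
print(fit.x)

# Profile likelihood of theta[1]: fix it on a grid and re-optimize theta[0]
grid = np.linspace(0.3, 0.9, 13)
profile = [minimize(lambda a: neg_log_likelihood([a[0], k], t, data), x0=[fit.x[0]],
                    method="Nelder-Mead").fun for k in grid]
print(list(zip(grid, profile)))
```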
DOT National Transportation Integrated Search
2013-06-01
The Indiana Department of Transportation (INDOT) is currently utilizing a profilograph and the profile index for measuring smoothness : assurance for newly constructed pavements. However, there are benefits to implementing a new IRI based smoothness ...
Identifying cancer biomarkers by mass spectrometry-based glycomics
Mechref, Yehia; Hu, Yunli; Garcia, Aldo; Hussein, Ahmed
2013-01-01
Correlations between aberrant glycosylation and cancer have been established for decades. The major advances in mass spectrometry (MS) and separation science have rapidly advanced detailed characterization of the changes associated with cancer development and progression. Over the past 10 years, many reports have described MS-based glycomic methods directed toward comparing the glycomic profiles of different human specimens collected from disease-free individuals and patients with cancers. Glycomic profiling of glycoproteins isolated from human specimens originating from disease-free individuals and patients with cancers have also been performed. Profiling of native, labeled, and permethylated glycans has been acquired using MALDI-MS and LC-MS. This review focuses on describing, discussing, and evaluating the different glycomic methods employed to characterize and quantify glycomic changes associated with cancers of different organs, including breast, colon, esophagus, liver, ovarian, pancreas, and prostate. PMID:22740464
Mirmohseni, A; Abdollahi, H; Rostamizadeh, K
2007-02-28
A net analyte signal (NAS)-based method called HLA/GO was applied for the selective determination of a binary mixture of ethanol and water with a quartz crystal nanobalance (QCN) sensor. A full factorial design was applied for the formation of calibration and prediction sets in the concentration ranges 5.5-22.2 microg mL(-1) for ethanol and 7.01-28.07 microg mL(-1) for water. An optimal time range was selected by a procedure based on the calculation of the net analyte signal regression plot in each considered time window for each test sample. A moving-window strategy was used to search for the region with maximum linearity of the NAS regression plot (minimum error indicator) and minimum PRESS value. On the basis of the obtained results, the differences in the adsorption profiles in the time range between 1 and 600 s were used to determine mixtures of both compounds by the HLA/GO method. The calculation of the net analyte signal using the HLA/GO method allows the determination of several figures of merit, such as selectivity, sensitivity, analytical sensitivity and limit of detection, for each component. To check the ability of the proposed method to select linear regions of the adsorption profile, a test for detecting non-linear regions of the adsorption profile data in the presence of methanol is also described. The results showed that the method was successfully applied to the determination of ethanol and water.
Target-depth estimation in active sonar: Cramer-Rao bounds for a bilinear sound-speed profile.
Mours, Alexis; Ioana, Cornel; Mars, Jérôme I; Josso, Nicolas F; Doisy, Yves
2016-09-01
This paper develops a localization method to estimate the depth of a target in the context of active sonar at long ranges. The target depth is tactical information for both strategy and classification purposes. The Cramer-Rao lower bounds for the target position, in range and depth, are derived for a bilinear sound-speed profile. The influence of the sonar parameters on the standard deviations of the target range and depth is studied. A localization method based on ray back-propagation with a probabilistic approach is then investigated. Monte-Carlo simulations applied to a summer Mediterranean sound-speed profile are performed to evaluate the efficiency of the estimator. This method is finally validated on experimental tank data.
Using Corporate-Based Methods To Assess Technical Communication Programs.
ERIC Educational Resources Information Center
Faber, Brenton; Bekins, Linn; Karis, Bill
2002-01-01
Investigates methods of program assessment used by corporate learning sites and profiles value added methods as a way to both construct and evaluate academic programs in technical communication. Examines and critiques assessment methods from corporate training environments including methods employed by corporate universities and value added…
A novel isoflavone profiling method based on UPLC-PDA-ESI-MS.
Zhang, Shuang; Zheng, Zong-Ping; Zeng, Mao-Mao; He, Zhi-Yong; Tao, Guan-Jun; Qin, Fang; Chen, Jie
2017-03-15
A novel non-targeted isoflavone profiling method was developed using a diagnostic fragment-ion-based extension strategy, based on ultra-high performance liquid chromatography coupled with a photo-diode array detector and electrospray ionization mass spectrometry (UPLC-PDA-ESI-MS). Sixteen types of isoflavones were detected in positive mode, but only 12 in negative mode due to the absence of precursor ions. Malonyldaidzin and malonylgenistin glycosylated at the 4'-O position or malonylated at the 4″-O position of glucose were indicated by their retention behavior and fragmentation pattern. Three possible quantification methods in one run, based on UPLC-PDA and UPLC-ESI-MS, were validated and compared, suggesting that the methods based on UPLC-ESI-MS possess remarkable selectivity and sensitivity. Impermissible quantitative deviations induced by linearity calibration over a 400-fold dynamic range were observed for the first time and were recalibrated with a 20-fold dynamic range. These results suggest that isoflavones and their stereoisomers can be simultaneously determined by positive-ion UPLC-ESI-MS in soymilk. Copyright © 2016. Published by Elsevier Ltd.
A Profiling Float System for the North Arabian Sea
2017-11-29
The purpose of this Defense University Research Instrumentation Program grant was to purchase a set of profiling floats to form an upper ocean observing system for the Northern Arabian Sea Circulation - autonomous research (NASCar) ... resolution numerical simulations. To achieve these goals the DRI will utilize new observational methods that do not rely on a traditional ship-based ...
NASA Astrophysics Data System (ADS)
Jeong, Jeong-Won; Kim, Tae-Seong; Shin, Dae-Chul; Do, Synho; Marmarelis, Vasilis Z.
2004-04-01
Recently it was shown that soft tissue can be differentiated with spectral unmixing and detection methods that utilize multi-band information obtained from a High-Resolution Ultrasonic Transmission Tomography (HUTT) system. In this study, we focus on tissue differentiation using the spectral target detection method based on Constrained Energy Minimization (CEM). We have developed a new tissue differentiation method called the "CEM filter bank". Statistical inference on the output of each CEM filter in the filter bank is used to make a decision based on the maximum statistical significance rather than the magnitude of each CEM filter output. We validate this method through 3-D inter/intra-phantom soft tissue classification, where target profiles obtained from an arbitrary single slice are used for differentiation in multiple tomographic slices. In addition, the spectral coherence between target and object profiles of the same tissue in different slices and phantoms is evaluated by conventional cross-correlation analysis. The performance of the proposed classifier is assessed using Receiver Operating Characteristic (ROC) analysis. Finally we apply our method to classify tiny structures inside a beef kidney such as Styrofoam balls (~1mm), chicken tissue (~5mm), and vessel-duct structures.
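As a reminder of the standard CEM filter the paper builds on (a generic formulation, not the authors' filter-bank code; the spectra below are random stand-ins): the filter w = R⁻¹d / (dᵀR⁻¹d) minimizes the output energy subject to a unit response to the target signature d, where R is the data correlation matrix.

```python
import numpy as np

rng = np.random.default_rng(8)
n_pixels, n_bands = 500, 16
data = rng.normal(size=(n_pixels, n_bands))   # multi-band observations (rows = pixels/voxels)
target = rng.normal(size=n_bands)             # target spectral profile d

# Constrained Energy Minimization filter: w = R^{-1} d / (d^T R^{-1} d)
R = data.T @ data / n_pixels                  # sample correlation matrix
Rinv_d = np.linalg.solve(R, target)
w = Rinv_d / (target @ Rinv_d)

output = data @ w                             # filter response for each pixel/voxel
print(w.shape, output[:5])
```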
NASA Astrophysics Data System (ADS)
Shah, Bhavana; Jiang, Xinzhao Grace; Chen, Louise; Zhang, Zhongqi
2014-06-01
Protein N-Glycan analysis is traditionally performed by high pH anion exchange chromatography (HPAEC), reversed phase liquid chromatography (RPLC), or hydrophilic interaction liquid chromatography (HILIC) on fluorescence-labeled glycans enzymatically released from the glycoprotein. These methods require time-consuming sample preparations and do not provide site-specific glycosylation information. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) peptide mapping is frequently used for protein structural characterization and, as a bonus, can potentially provide glycan profile on each individual glycosylation site. In this work, a recently developed glycopeptide fragmentation model was used for automated identification, based on their MS/MS, of N-glycopeptides from proteolytic digestion of monoclonal antibodies (mAbs). Experimental conditions were optimized to achieve accurate profiling of glycoforms. Glycan profiles obtained from LC-MS/MS peptide mapping were compared with those obtained from HPAEC, RPLC, and HILIC analyses of released glycans for several mAb molecules. Accuracy, reproducibility, and linearity of the LC-MS/MS peptide mapping method for glycan profiling were evaluated. The LC-MS/MS peptide mapping method with fully automated data analysis requires less sample preparation, provides site-specific information, and may serve as an alternative method for routine profiling of N-glycans on immunoglobulins as well as other glycoproteins with simple N-glycans.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bielecki, J.; Scholz, M.; Drozdowicz, K.
A method for the tomographic reconstruction of the neutron emissivity in the poloidal cross section of the Joint European Torus (JET, Culham, UK) tokamak was developed. Due to the very limited data set (two projection angles, 19 lines of sight only) provided by the neutron emission profile monitor (KN3 neutron camera), the reconstruction is an ill-posed inverse problem. The aim of this work is to contribute to the development of reliable plasma tomography reconstruction methods that could be routinely used at the JET tokamak. The proposed method is based on Phillips-Tikhonov regularization and incorporates a priori knowledge of the shape of the normalized neutron emissivity profile. For the purpose of optimal selection of the regularization parameters, the shape of the normalized neutron emissivity profile is approximated by the shape of the normalized electron density profile measured by the LIDAR or high resolution Thomson scattering JET diagnostics. In contrast with some previously developed methods for the ill-posed plasma tomography reconstruction problem, the developed algorithms do not include any post-processing of the obtained solution, and the physical constraints on the solution are imposed during the regularization process. The accuracy of the method is first evaluated by several tests with synthetic data based on various plasma neutron emissivity models (phantoms). Then, the method is applied to the neutron emissivity reconstruction for JET D plasma discharge #85100. It is demonstrated that the method shows good performance and reliability and can be routinely used for plasma neutron emissivity reconstruction at JET.
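A compact sketch of Phillips-Tikhonov-style regularized inversion (generic, not the JET implementation; the geometry matrix, phantom, and regularization weight are fabricated): the emissivity x is found by minimizing ||Ax - b||² + λ||Lx||², solved here as an augmented least-squares problem with a second-difference smoothing operator L.

```python
import numpy as np

rng = np.random.default_rng(9)
n_lines, n_cells = 19, 60                          # few lines of sight, many emissivity cells
A = rng.uniform(size=(n_lines, n_cells))           # toy geometry (chord-length) matrix
x_true = np.exp(-np.linspace(-2, 2, n_cells)**2)   # smooth phantom emissivity profile
b = A @ x_true + rng.normal(scale=0.01, size=n_lines)   # measured line-integrated signals

# Second-difference operator L penalizes non-smooth solutions (Phillips regularization)
L = np.diff(np.eye(n_cells), n=2, axis=0)
lam = 1e-1

# Solve min ||Ax - b||^2 + lam * ||Lx||^2 via the stacked least-squares system
A_aug = np.vstack([A, np.sqrt(lam) * L])
b_aug = np.concatenate([b, np.zeros(L.shape[0])])
x_hat, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))   # relative reconstruction error
```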
NASA Astrophysics Data System (ADS)
Okada, Yukimasa; Ono, Kouichi; Eriguchi, Koji
2017-06-01
Aggressive shrinkage and geometrical transition to three-dimensional structures in metal-oxide-semiconductor field-effect transistors (MOSFETs) lead to potentially serious problems regarding plasma processing such as plasma-induced physical damage (PPD). For the precise control of material processing and future device designs, it is extremely important to clarify the depth and energy profiles of PPD. Conventional methods to estimate the PPD profile (e.g., wet etching) are time-consuming. In this study, we propose an advanced method using a simple capacitance-voltage (C-V) measurement. The method first assumes the depth and energy profiles of defects in Si substrates, and then optimizes the C-V curves. We applied this methodology to evaluate the defect generation in (100), (111), and (110) Si substrates. No orientation dependence was found regarding the surface-oxide layers, whereas a large number of defects was assigned in the case of (110). The damaged layer thickness and areal density were estimated. This method provides the highly sensitive PPD prediction indispensable for designing future low-damage plasma processes.
Scanning moiré and spatial-offset phase-stepping for surface inspection of structures
NASA Astrophysics Data System (ADS)
Yoneyama, S.; Morimoto, Y.; Fujigaki, M.; Ikeda, Y.
2005-06-01
In order to develop a high-speed and accurate surface inspection system for structures such as tunnels, a new surface profile measurement method using linear array sensors is studied. A sinusoidal grating is projected onto the structure surface. The deformed grating is then scanned by linear array sensors that move together with the grating projector. The phase of the grating is analyzed by a spatial-offset phase-stepping method to perform accurate measurement. Surface profile measurements of a brick wall and of the concrete surface of a structure are demonstrated using the proposed method. Changes in the geometry or fabric of structures and defects on structure surfaces can be detected by the proposed method. It is expected that a tunnel surface inspection system that measures from a running train can be constructed based on the proposed method.
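A minimal sketch of the four-sample phase evaluation that phase-stepping methods of this kind rely on, assuming the samples are taken a quarter of the grating period apart (in spatial-offset phase-stepping they come from neighbouring pixels rather than separate exposures); the height conversion shown is a simplified triangulation, not the calibration of the actual system:

    import numpy as np

    def four_step_phase(I1, I2, I3, I4):
        # Wrapped grating phase from four intensity samples stepped by pi/2.
        return np.arctan2(I4 - I2, I1 - I3)

    def phase_to_height(unwrapped_phase, grating_period, angle_rad):
        # Illustrative conversion of unwrapped phase to height by triangulation.
        return unwrapped_phase * grating_period / (2.0 * np.pi * np.tan(angle_rad))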
NASA Astrophysics Data System (ADS)
Britavskiy, N.; Pancino, E.; Tsymbal, V.; Romano, D.; Fossati, L.
2018-03-01
We present a radial velocity analysis of 20 solar neighbourhood RR Lyrae and three Population II Cepheid variables. We obtained high-resolution, moderate-to-high signal-to-noise ratio spectra for most stars; these spectra covered different pulsation phases for each star. To estimate the gamma (centre-of-mass) velocities of the programme stars, we use two independent methods. The first, `classic' method is based on RR Lyrae radial velocity curve templates. The second method determines both pulsational and gamma velocities from the analysis of absorption-line profile asymmetry; it is based on the least-squares deconvolution (LSD) technique applied to analyse the line asymmetry that occurs in the spectra. We obtain measurements of the pulsation component of the radial velocity with an accuracy of ±3.5 km s-1. The gamma velocity was determined with an accuracy of ±10 km s-1, even for those stars having a small number of spectra. The main advantage of this method is the possibility of estimating the gamma velocity even from a single spectroscopic observation with uncertain pulsation phase. A detailed investigation of LSD profile asymmetry shows that the projection factor p varies as a function of the pulsation phase; this key parameter converts observed spectral-line radial velocity variations into photospheric pulsation velocities. As a by-product of our study, we present 41 densely spaced synthetic grids of LSD profile bisectors based on atmospheric models of RR Lyr covering all pulsation phases.
A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis
Down, Thomas A.; Rakyan, Vardhman K.; Turner, Daniel J.; Flicek, Paul; Li, Heng; Kulesha, Eugene; Gräf, Stefan; Johnson, Nathan; Herrero, Javier; Tomazou, Eleni M.; Thorne, Natalie P.; Bäckdahl, Liselotte; Herberth, Marlis; Howe, Kevin L.; Jackson, David K.; Miretti, Marcos M.; Marioni, John C.; Birney, Ewan; Hubbard, Tim J. P.; Durbin, Richard; Tavaré, Simon; Beck, Stephan
2009-01-01
DNA methylation is an indispensable epigenetic modification of mammalian genomes. Consequently, there is great interest in strategies for genome-wide/whole-genome DNA methylation analysis, and immunoprecipitation-based methods have proven to be a powerful option. Such methods are rapidly shifting the bottleneck from data generation to data analysis, necessitating the development of better analytical tools. Until now, a major analytical difficulty associated with immunoprecipitation-based DNA methylation profiling has been the inability to estimate absolute methylation levels. Here we report the development of a novel cross-platform algorithm – Bayesian Tool for Methylation Analysis (Batman) – for analyzing Methylated DNA Immunoprecipitation (MeDIP) profiles generated using arrays (MeDIP-chip) or next-generation sequencing (MeDIP-seq). The latter is an approach we have developed to elucidate the first high-resolution whole-genome DNA methylation profile (DNA methylome) of any mammalian genome. MeDIP-seq/MeDIP-chip combined with Batman represent robust, quantitative, and cost-effective functional genomic strategies for elucidating the function of DNA methylation. PMID:18612301
Profiling optimization for big data transfer over dedicated channels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yun, D.; Wu, Qishi; Rao, Nageswara S
The transfer of big data is increasingly supported by dedicated channels in high-performance networks, where transport protocols play an important role in maximizing application-level throughput and link utilization. The performance of transport protocols largely depends on their control parameter settings, but it is prohibitively time-consuming to conduct an exhaustive search in a large parameter space to find the best set of parameter values. We propose FastProf, a stochastic approximation-based transport profiler, to quickly determine the optimal operational zone of a given data transfer protocol/method over dedicated channels. We implement and test the proposed method using both emulations based on real-life performance measurements and experiments over physical connections with short (2 ms) and long (380 ms) delays. Both the emulation and experimental results show that FastProf significantly reduces the profiling overhead while achieving a level of end-to-end throughput performance comparable with the exhaustive search-based approach.
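FastProf's algorithm is not detailed in the abstract; purely as an illustration of a stochastic-approximation profiler, the sketch below tunes transfer parameters (e.g. stream count, buffer size) from noisy throughput measurements using simultaneous-perturbation stochastic approximation (SPSA). The callable and all parameter names are assumptions, not the authors' design.

    import numpy as np

    def spsa_maximize(throughput, theta0, n_iter=50, a=0.2, c=0.1, seed=None):
        # throughput: callable returning a measured transfer rate for a parameter vector
        rng = np.random.default_rng(seed)
        theta = np.asarray(theta0, dtype=float)
        for k in range(1, n_iter + 1):
            ak, ck = a / k, c / k ** 0.101
            delta = rng.choice([-1.0, 1.0], size=theta.shape)
            # two-sided perturbation gives a noisy estimate of the gradient
            g = (throughput(theta + ck * delta) - throughput(theta - ck * delta)) / (2 * ck * delta)
            theta = theta + ak * g          # ascend towards higher throughput
        return theta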
Shilling Attacks Detection in Recommender Systems Based on Target Item Analysis
Zhou, Wei; Wen, Junhao; Koh, Yun Sing; Xiong, Qingyu; Gao, Min; Dobbie, Gillian; Alam, Shafiq
2015-01-01
Recommender systems are highly vulnerable to shilling attacks, both by individuals and groups. Attackers who introduce biased ratings in order to affect recommendations have been shown to negatively affect collaborative filtering (CF) algorithms. Previous research focuses only on the differences between genuine profiles and attack profiles, ignoring the group characteristics in attack profiles. In this paper, we study the use of statistical metrics to detect rating patterns of attackers and group characteristics in attack profiles. A further issue is that most existing detection methods are model-specific. Two metrics, Rating Deviation from Mean Agreement (RDMA) and Degree of Similarity with Top Neighbors (DegSim), are used for analyzing rating patterns between malicious profiles and genuine profiles in attack models. Building upon this, we also propose and evaluate a detection structure called RD-TIA for detecting shilling attacks in recommender systems using a statistical approach. In order to detect more complicated attack models, we propose a novel metric called DegSim’ based on DegSim. The experimental results show that our detection model based on target item analysis is an effective approach for detecting shilling attacks. PMID:26222882
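A minimal sketch of the RDMA metric used above, computed from a user-item rating matrix in which NaN marks unrated items; the DegSim/DegSim' metrics and the RD-TIA detection structure are not reproduced here.

    import numpy as np

    def rdma(ratings):
        # ratings: (n_users, n_items) array, np.nan for unrated items.
        # RDMA_u = mean over rated items i of |r_ui - mean_i| / n_i,
        # where mean_i and n_i are item i's average rating and rating count.
        item_mean = np.nanmean(ratings, axis=0)
        item_count = np.sum(~np.isnan(ratings), axis=0)
        deviation = np.abs(ratings - item_mean) / item_count
        return np.nanmean(deviation, axis=1)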
Using ontology-based annotation to profile disease research
Coulet, Adrien; LePendu, Paea; Shah, Nigam H
2012-01-01
Background Profiling the allocation and trend of research activity is of interest to funding agencies, administrators, and researchers. However, the lack of a common classification system hinders the comprehensive and systematic profiling of research activities. This study introduces ontology-based annotation as a method to overcome this difficulty. Analyzing over a decade of funding data and publication data, the trends of disease research are profiled across topics, across institutions, and over time. Results This study introduces and explores the notions of research sponsorship and allocation and shows that leaders of research activity can be identified within specific disease areas of interest, such as those with high mortality or high sponsorship. The funding profiles of disease topics readily cluster themselves in agreement with the ontology hierarchy and closely mirror the funding agency priorities. Finally, four temporal trends are identified among research topics. Conclusions This work utilizes disease ontology (DO)-based annotation to profile effectively the landscape of biomedical research activity. By using DO in this manner a use-case driven mechanism is also proposed to evaluate the utility of classification hierarchies. PMID:22494789
Profiling Student Use of Calculators in the Learning of High School Mathematics
ERIC Educational Resources Information Center
Crowe, Cheryll E.; Ma, Xin
2010-01-01
Using data from the 2005 National Assessment of Educational Progress, students' use of calculators in the learning of high school mathematics was profiled based on their family background, curriculum background, and advanced mathematics coursework. A statistical method new to educational research--classification and regression trees--was applied…
Reconstruction of refractive index profile of a stratified medium
NASA Astrophysics Data System (ADS)
Vogelzang, E.; Ferwerda, H. A.; Yevick, D.
In this paper, a method for determining the permittivity profile of a stratified medium terminated by a perfect conductor from the (complex) reflectivity is presented. The calculations are based on the Gelfand-Levitan and the Marchenko equations. The bound modes of the system are explicitly taken into account.
DMirNet: Inferring direct microRNA-mRNA association networks.
Lee, Minsu; Lee, HyungJune
2016-12-05
MicroRNAs (miRNAs) play important regulatory roles in a wide range of biological processes by inducing target mRNA degradation or translational repression. Based on the correlation between expression profiles of a miRNA and its target mRNA, various computational methods have previously been proposed to identify miRNA-mRNA association networks by incorporating the matched miRNA and mRNA expression profiles. However, there remain three major issues to be resolved in the conventional computational approaches for inferring miRNA-mRNA association networks from expression profiles. 1) Inferred correlations from the observed expression profiles using conventional correlation-based methods include numerous erroneous links or over-estimated edge weights due to the transitive information flow among direct associations. 2) Due to the high-dimension-low-sample-size problem on the microarray dataset, it is difficult to obtain an accurate and reliable estimate of the empirical correlations between all pairs of expression profiles. 3) Because the previously proposed computational methods usually suffer from varying performance across different datasets, a more reliable model that guarantees optimal or suboptimal performance across different datasets is highly needed. In this paper, we present DMirNet, a new framework for identifying direct miRNA-mRNA association networks. To tackle the aforementioned issues, DMirNet incorporates 1) three direct correlation estimation methods (namely Corpcor, SPACE, Network deconvolution) to infer direct miRNA-mRNA association networks, 2) the bootstrapping method to fully utilize insufficient training expression profiles, and 3) a rank-based Ensemble aggregation to build a reliable and robust model across different datasets. Our empirical experiments on three datasets demonstrate the combinatorial effects of necessary components in DMirNet. Additional performance comparison experiments show that DMirNet outperforms the state-of-the-art Ensemble-based model [1], which has shown the best performance across the same three datasets, by a factor of up to 1.29. Further, we identify 43 putative novel multi-cancer-related miRNA-mRNA association relationships from an inferred Top 1000 direct miRNA-mRNA association network. We believe that DMirNet is a promising method to identify novel direct miRNA-mRNA relations and to elucidate the direct miRNA-mRNA association networks. Since DMirNet infers direct relationships from the observed data, DMirNet can contribute to reconstructing various direct regulatory pathways, including, but not limited to, the direct miRNA-mRNA association networks.
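Of the three direct-correlation estimators named above, network deconvolution has a compact closed form; the sketch below follows the usual eigenvalue formulation and assumes the observed association matrix is symmetric with eigenvalues in (-1, 1). Corpcor, SPACE, the bootstrapping step, and the rank-based ensemble aggregation are not shown.

    import numpy as np

    def network_deconvolution(G_obs):
        # Removes transitive (indirect) contributions by inverting the series
        # G_obs = G_dir + G_dir^2 + ... , i.e. G_dir = G_obs (I + G_obs)^(-1).
        w, V = np.linalg.eigh(G_obs)        # G_obs must be symmetric
        w_dir = w / (1.0 + w)               # closed-form inversion per eigenvalue
        return (V * w_dir) @ V.T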
Liu, Qiang; Chai, Tianyou; Wang, Hong; Qin, Si-Zhao Joe
2011-12-01
The continuous annealing process line (CAPL) of cold rolling is an important unit to improve the mechanical properties of steel strips in steel making. In continuous annealing processes, strip tension is an important factor, which indicates whether the line operates steadily. Abnormal tension profile distribution along the production line can lead to strip break and roll slippage. Therefore, it is essential to estimate the whole tension profile in order to prevent the occurrence of faults. However, in real annealing processes, only a limited number of strip tension sensors are installed along the machine direction. Since the effects of strip temperature, gas flow, bearing friction, strip inertia, and roll eccentricity can lead to nonlinear tension dynamics, it is difficult to apply the first-principles induced model to estimate the tension profile distribution. In this paper, a novel data-based hybrid tension estimation and fault diagnosis method is proposed to estimate the unmeasured tension between two neighboring rolls. The main model is established by an observer-based method using a limited number of measured tensions, speeds, and currents of each roll, where the tension error compensation model is designed by applying neural networks principal component regression. The corresponding tension fault diagnosis method is designed using the estimated tensions. Finally, the proposed tension estimation and fault diagnosis method was applied to a real CAPL in a steel-making company, demonstrating the effectiveness of the proposed method.
Geng, Haijiang; Li, Zhihui; Li, Jiabing; Lu, Tao; Yan, Fangrong
2015-01-01
BACKGROUND Personalized cancer treatments depend on the determination of a patient's genetic status according to known genetic profiles for which targeted treatments exist. Such genetic profiles must be scientifically validated before they are applied to the general patient population. Reproducibility of findings that support such genetic profiles is a fundamental challenge in validation studies. The percentage of overlapping genes (POG) criterion and derivative methods produce unstable and misleading results. Furthermore, in a complex disease, comparisons between different tumor subtypes can produce high POG scores that do not capture the consistencies in the functions. RESULTS We focused on the quality rather than the quantity of the overlapping genes. We defined the rank value of each gene according to importance or quality using PageRank on the basis of a particular topological structure. Then, we used the p-value of the rank-sum of the overlapping genes (PRSOG) to evaluate the quality of reproducibility. Though the POG scores were low in different studies of the same disease, the PRSOG was statistically significant, which suggests that sets of differentially expressed genes might be highly reproducible. CONCLUSIONS Evaluations of eight datasets from breast cancer, lung cancer and four other disorders indicate that the quality-based PRSOG method performs better than a quantity-based method. Our analysis of the components of the sets of overlapping genes supports the utility of the PRSOG method. PMID:26556852
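The abstract defines PRSOG as the p-value of the rank-sum of the overlapping genes; as an illustration (not the authors' exact test), the sketch below estimates such a p-value by permutation, taking precomputed PageRank scores as the per-gene rank values.

    import numpy as np

    def prsog_pvalue(rank_values, overlap_idx, n_perm=10000, seed=None):
        # rank_values: importance scores (e.g. PageRank) for all genes
        # overlap_idx: indices of genes reported by both studies
        rng = np.random.default_rng(seed)
        observed = rank_values[overlap_idx].sum()
        k = len(overlap_idx)
        null = np.array([
            rank_values[rng.choice(rank_values.size, k, replace=False)].sum()
            for _ in range(n_perm)
        ])
        # one-sided: probability that a random gene set scores at least as high
        return (1 + np.sum(null >= observed)) / (n_perm + 1)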
Detailed T1-Weighted Profiles from the Human Cortex Measured in Vivo at 3 Tesla MRI.
Ferguson, Bart; Petridou, Natalia; Fracasso, Alessio; van den Heuvel, Martijn P; Brouwer, Rachel M; Hulshoff Pol, Hilleke E; Kahn, René S; Mandl, René C W
2018-04-01
Studies into cortical thickness in psychiatric diseases based on T1-weighted MRI frequently report on aberrations in the cerebral cortex. Due to limitations in image resolution for studies conducted at conventional MRI field strengths (e.g., 3 Tesla (T)), this information cannot be used to establish which of the cortical layers may be implicated. Here we propose a new analysis method that computes one high-resolution average cortical profile per brain region, extracting myeloarchitectural information from T1-weighted MRI scans that are routinely acquired at a conventional field strength. To assess this new method, we acquired standard T1-weighted scans at 3 T and compared them with state-of-the-art ultra-high resolution T1-weighted scans optimised for intracortical myelin contrast acquired at 7 T. Average cortical profiles were computed for seven different brain regions. Besides a qualitative comparison between the 3 T scans, 7 T scans, and results from the literature, we tested whether the results from dynamic time warping-based clustering are similar for the cortical profiles computed from 7 T and 3 T data. In addition, we quantitatively compared cortical profiles computed for V1, V2 and V7 for both 7 T and 3 T data using a priori information on their relative myelin concentration. Although qualitative comparisons show that at an individual level average profiles computed for 7 T have more pronounced features than 3 T profiles, the results from the quantitative analyses suggest that average cortical profiles computed from T1-weighted scans acquired at 3 T indeed contain myeloarchitectural information similar to profiles computed from the scans acquired at 7 T. The proposed method therefore provides a step forward for studying cortical myeloarchitecture in vivo at conventional magnetic field strength both in health and disease.
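A minimal sketch of the dynamic time warping distance on which the profile clustering above is based (the clustering itself and any warping constraints used in the study are not shown):

    import numpy as np

    def dtw_distance(profile_a, profile_b):
        # Classic O(n*m) dynamic time warping between two 1-D cortical profiles.
        n, m = len(profile_a), len(profile_b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(profile_a[i - 1] - profile_b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]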
Unsupervised user similarity mining in GSM sensor networks.
Shad, Shafqat Ali; Chen, Enhong
2013-01-01
Mobility data has attracted researchers for the past few years because of its rich context and spatiotemporal nature; this information can be used for potential applications such as early warning systems, route prediction, traffic management, advertisement, social networking, and community finding. All the mentioned applications are based on mobility profile building and user trend analysis, where mobility profile building is done through significant place extraction, prediction of the user's actual movement, and context awareness. However, significant place extraction and prediction of a user's actual movement for mobility profile building are not trivial tasks. In this paper, we present a user similarity mining methodology based on user mobility profile building that uses the semantic tagging information provided by the user and basic GSM network architecture properties within an unsupervised clustering approach. As the mobility information is in low-level raw form, our proposed methodology successfully converts it into high-level, meaningful information by using cell-ID location information, rather than the previously used location-capture methods such as GPS, infrared, and WiFi, for profile mining and user similarity mining.
Pabon, Peter; Ternström, Sten; Lamarche, Anick
2011-06-01
The aim is to describe a method for the unified description, statistical modeling, and comparison of voice range profile (VRP) contours, even from diverse sources. A morphologic modeling technique based on Fourier descriptors (FDs) is applied to the VRP contour. The technique, which essentially involves resampling the curve of the contour, is assessed and compared to density-based VRP averaging methods that use the overlap count. VRP contours can be usefully described and compared using FDs. The method also permits visualization of the local covariation along the contour average. For example, the FD-based analysis shows that the population variance for ensembles of VRP contours is usually smallest at the upper left part of the VRP. To illustrate the method's advantages and possible further applications, graphs are given that compare the averaged contours from different authors and recording devices, for normal, trained, and untrained male and female voices as well as for child voices. The proposed technique allows any VRP shape to be brought to the same uniform base, on which VRP contours or contour elements coming from a variety of sources may be placed within the same graph for comparison and for statistical analysis.
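A rough sketch of how a closed VRP contour can be resampled by arc length and reduced to low-order Fourier descriptors; the alignment and statistical modeling steps of the paper are not reproduced, and the parameter values are illustrative.

    import numpy as np

    def fourier_descriptors(contour_xy, n_descriptors=20, n_resample=256):
        # contour_xy: (N, 2) points (e.g. fundamental frequency, SPL) along the contour.
        arc = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(contour_xy, axis=0), axis=1))]
        t = np.linspace(0.0, arc[-1], n_resample)
        x = np.interp(t, arc, contour_xy[:, 0])
        y = np.interp(t, arc, contour_xy[:, 1])
        z = x + 1j * y                       # encode the contour as complex samples
        coeffs = np.fft.fft(z) / n_resample
        return coeffs[:n_descriptors]        # low-order descriptors of the shape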
Optimal control of CPR procedure using hemodynamic circulation model
Lenhart, Suzanne M.; Protopopescu, Vladimir A.; Jung, Eunok
2007-12-25
A method for determining a chest pressure profile for cardiopulmonary resuscitation (CPR) includes the steps of representing a hemodynamic circulation model based on a plurality of difference equations for a patient, applying an optimal control (OC) algorithm to the circulation model, and determining a chest pressure profile. The chest pressure profile defines a timing pattern of externally applied pressure to a chest of the patient to maximize blood flow through the patient. A CPR device includes a chest compressor, a controller communicably connected to the chest compressor, and a computer communicably connected to the controller. The computer determines the chest pressure profile by applying an OC algorithm to a hemodynamic circulation model based on the plurality of difference equations.
NASA Astrophysics Data System (ADS)
Roger-Estrade, Jean; Boizard, Hubert; Peigné, Josephine; Sasal, Maria Carolina; Guimaraes, Rachel; Piron, Denis; Tomis, Vincent; Vian, Jean-François; Cadoux, Stephane; Ralisch, Ricardo; Filho, Tavares; Heddadj, Djilali; de Battista, Juan; Duparque, Annie
2016-04-01
In France, agronomists have studied the effects of cropping systems on soil structure, using a field method based on a visual description of soil structure. The "profil cultural" method (Manichon and Gautronneau, 1987) has been designed to perform a field diagnostic of the effects of tillage and compaction on soil structure dynamics. This method is of great use to agronomists improving crop management for a better preservation of soil structure. However, this method was developed and mainly used in conventional tillage systems, with ploughing. As several forms of reduced, minimum and no tillage systems are expanding in many parts of the world, it is necessary to re-evaluate the ability of this method to describe and interpret soil macrostructure in unploughed situations. In unploughed fields, the soil structure dynamics of untilled layers are mainly driven by compaction and regeneration by natural agents (climatic conditions, root growth and macrofauna), and it is of major importance to evaluate the contribution of these natural processes to soil structure regeneration. These concerns have led us to adapt the standard method and to propose amendments based on a series of field observations and experimental work in different situations of cropping systems, soil types and climatic conditions. We improved the description of crack type and we introduced an index of biological activity, based on the visual examination of clods. To test the improved method, a comparison with the reference method was carried out and the ability of the "profil cultural" method to make a diagnosis was tested on five experiments in France, Brazil and Argentina. Using the improved method, the impact of cropping systems on soil functioning was better assessed when natural processes were integrated into the description.
Science Teaching Orientations and Technology-Enhanced Tools for Student Learning
NASA Astrophysics Data System (ADS)
Campbell, Todd; Longhurst, Max; Duffy, Aaron M.; Wolf, Paul G.; Shelton, Brett E.
2013-10-01
This qualitative study examines teacher orientations and technology-enhanced tools for student learning within a science literacy framework. Data for this study came from a group of 10 eighth grade science teachers. Each of these teachers was a participant in a professional development (PD) project focused on reformed and technology-enhanced science instruction shaped by national standards documents. The research is focused on identifying teacher orientations and use of technology-enhanced tools prior to or unaffected by PD. The primary data sources for this study are drawn from learning journals and classroom observations. Qualitative methods were used to analyze learning journals, while descriptive statistics were used from classroom observations to further explore and triangulate the emergent qualitative findings. Two teacher orientation teacher profiles were developed to reveal the emergent teacher orientation dimensions and technology-enhanced tool categories found: "more traditional teacher orientation profile" and "toward a reformed-based teacher orientation profile." Both profiles were founded on "knowledge of" beliefs about the goals and purposes for science education, while neither profile revealed sophisticated beliefs about the nature of science. The "traditional" profile revealed more teacher-centered beliefs about science teaching and learning, and the "towards reformed-based" profile revealed student-centered beliefs. Finally, only technology-enhanced tools supportive of collaborative construction of science knowledge were found connected to the "towards reformed-based" profile. This research is concluded with a proposed "reformed-based teacher orientation profile" as a future target for science teaching and learning with technology-enhanced tools in a science literacy framework.
Davoren, Jon; Vanek, Daniel; Konjhodzić, Rijad; Crews, John; Huffine, Edwin; Parsons, Thomas J.
2007-01-01
Aim To quantitatively compare a silica extraction method with a commonly used phenol/chloroform extraction method for DNA analysis of specimens exhumed from mass graves. Methods DNA was extracted from twenty randomly chosen femur samples, using the International Commission on Missing Persons (ICMP) silica method, based on the Qiagen Blood Maxi Kit, and compared with the DNA extracted by the standard phenol/chloroform-based method. The efficacy of the extraction methods was compared by real-time polymerase chain reaction (PCR) to measure DNA quantity and the presence of inhibitors and by amplification with the PowerPlex 16 (PP16) multiplex nuclear short tandem repeat (STR) kit. Results DNA quantification showed that the silica-based method extracted on average 1.94 ng of DNA per gram of bone (range 0.25-9.58 ng/g), compared with only 0.68 ng/g extracted by the organic method (range 0.0016-4.4880 ng/g). Inhibition tests showed that there were on average significantly lower levels of PCR inhibitors in DNA isolated by the organic method. When amplified with PP16, all samples extracted by the silica-based method produced full 16-locus profiles, while only 75% of the DNA extracts obtained by the organic technique yielded full 16-locus profiles. Conclusions The silica-based extraction method showed better results in nuclear STR typing from degraded bone samples than a commonly used phenol/chloroform method. PMID:17696302
Dense module enumeration in biological networks
NASA Astrophysics Data System (ADS)
Tsuda, Koji; Georgii, Elisabeth
2009-12-01
Analysis of large networks is a central topic in various research fields including biology, sociology, and web mining. Detection of dense modules (a.k.a. clusters) is an important step in analyzing these networks. Though numerous methods have been proposed for this purpose, they often lack mathematical rigor. Namely, there is no guarantee that all dense modules are detected. Here, we present a novel reverse-search-based method for enumerating all dense modules. Furthermore, constraints from additional data sources such as gene expression profiles or customer profiles can be integrated, so that we can systematically detect dense modules with interesting profiles. We report successful applications in human protein interaction network analyses.
Spectroscopic and Statistical Techniques for Information Recovery in Metabonomics and Metabolomics
NASA Astrophysics Data System (ADS)
Lindon, John C.; Nicholson, Jeremy K.
2008-07-01
Methods for generating and interpreting metabolic profiles based on nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and chemometric analysis methods are summarized and the relative strengths and weaknesses of NMR and chromatography-coupled MS approaches are discussed. Given that all data sets measured to date only probe subsets of complex metabolic profiles, we describe recent developments for enhanced information recovery from the resulting complex data sets, including integration of NMR- and MS-based metabonomic results and combination of metabonomic data with data from proteomics, transcriptomics, and genomics. We summarize the breadth of applications, highlight some current activities, discuss the issues relating to metabonomics, and identify future trends.
Kakuta, Shoji; Yamashita, Toshiyuki; Nishiumi, Shin; Yoshida, Masaru; Fukusaki, Eiichiro; Bamba, Takeshi
2015-01-01
A dynamic headspace extraction method (DHS) with high-pressure injection is described. This dynamic extraction method has superior sensitivity to solid-phase microextraction (SPME) and is capable of extracting the entire gas phase by purging the headspace of a vial. Optimization of the DHS parameters resulted in a highly sensitive volatile profiling system with the ability to detect various volatile components including alcohols at nanogram levels. The average LOD for a standard volatile mixture was 0.50 ng mL−1, and the average LOD for alcohols was 0.66 ng mL−1. This method was used for the analysis of volatile components from biological samples, comparing acute and chronic inflammation models. The method permitted the identification of volatiles with the same profile pattern as in vitro oxidized lipid-derived volatiles. In addition, the concentrations of alcohols and aldehydes in the acute inflammation model samples were significantly higher than those in the chronic inflammation model samples. The different profiles between these samples could also be identified by this method. Finally, it was possible to analyze, with high sensitivity, alcohols and low-molecular-weight volatiles that are difficult to analyze by SPME, and to demonstrate volatile profiling based on simultaneous multi-volatile analysis. PMID:26819905
Estimation of retinal vessel caliber using model fitting and random forests
NASA Astrophysics Data System (ADS)
Araújo, Teresa; Mendonça, Ana Maria; Campilho, Aurélio
2017-03-01
Retinal vessel caliber changes are associated with several major diseases, such as diabetes and hypertension. These caliber changes can be evaluated using eye fundus images. However, the clinical assessment is tiresome and prone to errors, motivating the development of automatic methods. An automatic method based on vessel cross-section intensity profile model fitting for the estimation of vessel caliber in retinal images is herein proposed. First, vessels are segmented from the image, vessel centerlines are detected and individual segments are extracted and smoothed. Intensity profiles are extracted perpendicularly to the vessel, and the profile lengths are determined. Then, model fitting is applied to the smoothed profiles. A novel parametric model (DoG-L7) is used, consisting of a Difference-of-Gaussians multiplied by a line, which is able to describe profile asymmetry. Finally, the parameters of the best-fit model are used for determining the vessel width through regression using ensembles of bagged regression trees with random sampling of the predictors (random forests). The method is evaluated on the REVIEW public dataset. A precision close to that of the observers is achieved, outperforming other state-of-the-art methods. The method is robust and reliable for width estimation in images with pathologies and artifacts, with performance independent of the range of diameters.
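The exact DoG-L7 parameterization is not given in the abstract; the sketch below shows one plausible seven-parameter "Difference-of-Gaussians multiplied by a line" profile and how it could be fitted with SciPy. Parameter names and the commented initial guess are assumptions.

    import numpy as np
    from scipy.optimize import curve_fit

    def dog_times_line(x, a1, s1, a2, s2, mu, slope, intercept):
        # Difference of two Gaussians sharing a centre mu, multiplied by a line
        # so that asymmetric cross-section profiles can be described.
        dog = (a1 * np.exp(-(x - mu) ** 2 / (2.0 * s1 ** 2))
               - a2 * np.exp(-(x - mu) ** 2 / (2.0 * s2 ** 2)))
        return dog * (slope * x + intercept)

    # x: positions along the profile perpendicular to the vessel, y: intensities
    # params, _ = curve_fit(dog_times_line, x, y, p0=initial_guess, maxfev=5000)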
Statistical inference methods for sparse biological time series data.
Ndukum, Juliet; Fonseca, Luís L; Santos, Helena; Voit, Eberhard O; Datta, Susmita
2011-04-25
Comparing metabolic profiles under different biological perturbations has become a powerful approach to investigating the functioning of cells. The profiles can be taken as single snapshots of a system, but more information is gained if they are measured longitudinally over time. The results are short time series consisting of relatively sparse data that cannot be analyzed effectively with standard time series techniques, such as autocorrelation and frequency domain methods. In this work, we study longitudinal time series profiles of glucose consumption in the yeast Saccharomyces cerevisiae under different temperatures and preconditioning regimens, which we obtained with methods of in vivo nuclear magnetic resonance (NMR) spectroscopy. For the statistical analysis, we first fit several nonlinear mixed-effects regression models to the longitudinal profiles and then used an ANOVA likelihood ratio method in order to test for significant differences between the profiles. The proposed methods are capable of distinguishing metabolic time trends resulting from different treatments and associate significance levels to these differences. Among several nonlinear mixed-effects regression models tested, a three-parameter logistic function represents the data with highest accuracy. ANOVA and likelihood ratio tests suggest that there are significant differences between the glucose consumption rate profiles for cells that had been, or had not been, preconditioned by heat during growth. Furthermore, pair-wise t-tests reveal significant differences in the longitudinal profiles for glucose consumption rates between optimal conditions and heat stress, optimal and recovery conditions, and heat stress and recovery conditions (p-values <0.0001). We have developed a nonlinear mixed-effects model that is appropriate for the analysis of sparse metabolic and physiological time profiles. The model permits sound statistical inference procedures, based on ANOVA likelihood ratio tests, for testing the significance of differences between short time course data under different biological perturbations.
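A minimal sketch of fitting the three-parameter logistic function that the study found to represent the profiles best (the mixed-effects structure and the ANOVA likelihood ratio tests are not reproduced; variable names are illustrative):

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic3(t, asymptote, rate, t_half):
        # Three-parameter logistic: plateau, growth rate, and inflection time.
        return asymptote / (1.0 + np.exp(-rate * (t - t_half)))

    # t: sampling times, y: glucose consumed at those times for one profile
    # params, _ = curve_fit(logistic3, t, y, p0=(y.max(), 1.0, np.median(t)))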
Identification of salivary Lactobacillus rhamnosus species by DNA profiling and a specific probe.
Richard, B; Groisillier, A; Badet, C; Dorignac, G; Lonvaud-Funel, A
2001-03-01
The Lactobacillus genus has been shown to be associated with the dental carious process, but little is known about the species related to the decay, although Lactobacillus rhamnosus is suspected to be the most implicated species. Conventional identification methods based on biochemical criteria lead to ambiguous results, since the Lactobacillus species found in saliva are phenotypically close. To clarify the role of this genus in the evolution of carious disease, this work aimed to find a rapid and reliable method for identifying the L. rhamnosus species. Methods based on hybridization with DNA probes and DNA amplification by PCR were used. The dominant salivary Lactobacillus species (reference strains from the ATCC) were selected for this purpose as well as some wild strains isolated from children's saliva. DNA profiling using semirandom polymorphic DNA amplification (semi-RAPD) generated specific patterns for L. rhamnosus ATCC 7469. The profiles of all L. rhamnosus strains tested were similar and could be grouped; these strains shared four common fragments. Wild strains first identified with classic methods shared common patterns with the L. rhamnosus species and could be reclassified. One fragment of the profile was purified, cloned, used as a probe and found to be specific to the L. rhamnosus species. These results may help to localize this species within its ecological niche and to elucidate the progression of the carious process.
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIMs specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.
Godinho, Bruno M D C; Gilbert, James W; Haraszti, Reka A; Coles, Andrew H; Biscans, Annabelle; Roux, Loic; Nikan, Mehran; Echeverria, Dimas; Hassler, Matthew; Khvorova, Anastasia
2017-12-01
Therapeutic oligonucleotides, such as small interfering RNAs (siRNAs), hold great promise for the treatment of incurable genetically defined disorders by targeting cognate toxic gene products for degradation. To achieve meaningful tissue distribution and efficacy in vivo, siRNAs must be conjugated or formulated. A clear understanding of the pharmacokinetic (PK)/pharmacodynamic behavior of these compounds is necessary to optimize and characterize the performance of therapeutic oligonucleotides in vivo. In this study, we describe a simple and reproducible methodology for the evaluation of in vivo blood/plasma PK profiles and tissue distribution of oligonucleotides. The method is based on serial blood microsampling from the saphenous vein, coupled to a peptide nucleic acid hybridization assay for quantification of guide strands. Performed with a minimal number of animals, this method allowed unequivocal detection and sensitive quantification without the need for amplification or further modification of the oligonucleotides. Using this methodology, we compared plasma clearances and tissue distribution profiles of two different hydrophobically modified siRNAs (hsiRNAs). Notably, cholesterol-hsiRNA presented slow plasma clearances and mainly accumulated in the liver, whereas phosphocholine-docosahexaenoic acid-hsiRNA was rapidly cleared from the plasma and preferentially accumulated in the kidney. These data suggest that the PK/biodistribution profiles of modified hsiRNAs are determined by the chemical nature of the conjugate. Importantly, the method described in this study constitutes a simple platform to conduct pilot assessments of the basic clearance and tissue distribution profiles, which can be broadly applied for evaluation of new chemical variants of siRNAs and micro-RNAs.
Method for 3D profilometry measurement based on contouring moire fringe
NASA Astrophysics Data System (ADS)
Shi, Zhiwei; Lin, Juhua
2007-12-01
3D shape measurement has recently been one of the most active branches of optical research. A method for 3D profilometry that combines the moiré projection method with phase-shifting technology, based on SCM (Single Chip Microcomputer) control, is presented in this paper. Automatic measurement of 3D surface profiles can be carried out by applying this method with high speed and high precision.
2012-01-01
Background This study illustrates an evidence-based method for the segmentation analysis of patients that could greatly improve the approach to population-based medicine, by filling a gap in the empirical analysis of this topic. Segmentation facilitates individual patient care in the context of the culture, health status, and the health needs of the entire population to which that patient belongs. Because many health systems are engaged in developing better chronic care management initiatives, patient profiles are critical to understanding whether some patients can move toward effective self-management and can play a central role in determining their own care, which fosters a sense of responsibility for their own health. A review of the literature on patient segmentation provided the background for this research. Method First, we conducted a literature review on patient satisfaction and segmentation to build a survey. Then, we administered 3,461 surveys to users of outpatient services. The key structures on which the subjects’ perception of outpatient services was based were extrapolated using principal component factor analysis with varimax rotation. After the factor analysis, segmentation was performed through cluster analysis to better analyze the influence of individual attitudes on the results. Results Four segments were identified through factor and cluster analysis: the “unpretentious,” the “informed and supported,” the “experts” and the “advanced” patients. Their policy and managerial implications are outlined. Conclusions With this research, we provide the following: – a method for profiling patients based on common patient satisfaction surveys that is easily replicable in all health systems and contexts; – a proposal for segments based on the results of a broad-based analysis conducted in the Italian National Health System (INHS). Segments represent profiles of patients requiring different strategies for delivering health services. Knowledge and analysis of these segments might support an effort to build an effective population-based medicine approach. PMID:23256543
Kilborn, Joshua P; Jones, David L; Peebles, Ernst B; Naar, David F
2017-04-01
Clustering data continues to be a highly active area of data analysis, and resemblance profiles are being incorporated into ecological methodologies as a hypothesis testing-based approach to clustering multivariate data. However, these new clustering techniques have not been rigorously tested to determine the performance variability based on the algorithm's assumptions or any underlying data structures. Here, we use simulation studies to estimate the statistical error rates for the hypothesis test for multivariate structure based on dissimilarity profiles (DISPROF). We concurrently tested a widely used algorithm that employs the unweighted pair group method with arithmetic mean (UPGMA) to estimate the proficiency of clustering with DISPROF as a decision criterion. We simulated unstructured multivariate data from different probability distributions with increasing numbers of objects and descriptors, and grouped data with increasing overlap, overdispersion for ecological data, and correlation among descriptors within groups. Using simulated data, we measured the resolution and correspondence of clustering solutions achieved by DISPROF with UPGMA against the reference grouping partitions used to simulate the structured test datasets. Our results highlight the dynamic interactions between dataset dimensionality, group overlap, and the properties of the descriptors within a group (i.e., overdispersion or correlation structure) that are relevant to resemblance profiles as a clustering criterion for multivariate data. These methods are particularly useful for multivariate ecological datasets that benefit from distance-based statistical analyses. We propose guidelines for using DISPROF as a clustering decision tool that will help future users avoid potential pitfalls during the application of methods and the interpretation of results.
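A minimal sketch of the UPGMA step evaluated above, using SciPy's average-linkage clustering on a Bray-Curtis dissimilarity matrix (a common ecological choice); the DISPROF test, which supplies the decision criterion for accepting clusters, is not implemented here.

    import numpy as np
    from scipy.cluster.hierarchy import average, fcluster
    from scipy.spatial.distance import pdist

    def upgma_groups(data, n_groups, metric="braycurtis"):
        # data: (n_objects, n_descriptors) multivariate samples
        d = pdist(data, metric=metric)       # condensed dissimilarity matrix
        linkage = average(d)                 # UPGMA (average-linkage) dendrogram
        return fcluster(linkage, t=n_groups, criterion="maxclust")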
Electric potential calculation in molecular simulation of electric double layer capacitors
NASA Astrophysics Data System (ADS)
Wang, Zhenxing; Olmsted, David L.; Asta, Mark; Laird, Brian B.
2016-11-01
For the molecular simulation of electric double layer capacitors (EDLCs), a number of methods have been proposed and implemented to determine the one-dimensional electric potential profile between the two electrodes at a fixed potential difference. In this work, we compare several of these methods for a model LiClO4-acetonitrile/graphite EDLC simulated using both the traditional fixed-charge method (FCM), in which a fixed charge is assigned a priori to the electrode atoms, and the recently developed constant potential method (CPM) (2007 J. Chem. Phys. 126 084704), where the electrode charges are allowed to fluctuate to keep the potential fixed. Based on an analysis of the full three-dimensional electric potential field, we suggest a method for determining the averaged one-dimensional electric potential profile that can be applied to both the FCM and CPM simulations. Compared to traditional methods based on numerically solving the one-dimensional Poisson's equation, this method yields better accuracy and requires no supplemental assumptions.
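For reference, the traditional approach the paper compares against, integrating an averaged one-dimensional charge-density profile twice to solve Poisson's equation, can be sketched as follows (grid, units, and boundary choices are assumptions):

    import numpy as np
    from scipy.integrate import cumulative_trapezoid

    def potential_from_charge_density(z, rho, eps0=8.8541878128e-12):
        # Solve d^2(phi)/dz^2 = -rho/eps0 between the electrodes.
        # Referenced so that the field and potential are zero at z[0].
        field = cumulative_trapezoid(rho, z, initial=0.0) / eps0
        phi = -cumulative_trapezoid(field, z, initial=0.0)
        return phi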
Application of activity-based protein profiling to study enzyme function in adipocytes.
Galmozzi, Andrea; Dominguez, Eduardo; Cravatt, Benjamin F; Saez, Enrique
2014-01-01
Activity-based protein profiling (ABPP) is a chemical proteomics approach that utilizes small-molecule probes to determine the functional state of enzymes directly in native systems. ABPP probes selectively label active enzymes, but not their inactive forms, facilitating the characterization of changes in enzyme activity that occur without alterations in protein levels. ABPP can be a tool superior to conventional gene expression and proteomic profiling methods to discover new enzymes active in adipocytes and to detect differences in the activity of characterized enzymes that may be associated with disorders of adipose tissue function. ABPP probes have been developed that react selectively with most members of specific enzyme classes. Here, using as an example the serine hydrolase family that includes many enzymes with critical roles in adipocyte physiology, we describe methods to apply ABPP analysis to the study of adipocyte enzymatic pathways. © 2014 Elsevier Inc. All rights reserved.
Schumann, A; Priegnitz, M; Schoene, S; Enghardt, W; Rohling, H; Fiedler, F
2016-10-07
Range verification and dose monitoring in proton therapy are considered highly desirable. Different methods have been developed worldwide, such as particle therapy positron emission tomography (PT-PET) and prompt gamma imaging (PGI). In general, these methods allow for a verification of the proton range. However, quantification of the dose from these measurements remains challenging. For the first time, we present an approach for estimating the dose from prompt γ-ray emission profiles. It combines a filtering procedure based on Gaussian-powerlaw convolution with an evolutionary algorithm. By means of convolving depth dose profiles with an appropriate filter kernel, prompt γ-ray depth profiles are obtained. In order to reverse this step, the evolutionary algorithm is applied. The feasibility of this approach is demonstrated for a spread-out Bragg peak in a water target.
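A toy sketch of the two ingredients named above: a forward convolution of a depth dose profile with a filter kernel, and a simple evolutionary loop that inverts it. The Gaussian-powerlaw kernel and the actual algorithm of the paper are not reproduced; population size, mutation scale, and other settings are arbitrary.

    import numpy as np

    def forward_gamma(depth_dose, kernel):
        # Predicted prompt gamma depth profile as a convolution of the dose profile.
        return np.convolve(depth_dose, kernel, mode="same")

    def evolve_dose(gamma_meas, kernel, n_gen=200, pop=50, sigma=0.05, seed=None):
        # Toy evolutionary search for a non-negative dose profile whose forward
        # model matches the measured prompt gamma profile (gamma_meas: 1-D array).
        rng = np.random.default_rng(seed)
        best = rng.random(gamma_meas.size)
        best_err = np.inf
        for _ in range(n_gen):
            candidates = np.clip(best + sigma * rng.standard_normal((pop, best.size)), 0.0, None)
            errors = [np.sum((forward_gamma(c, kernel) - gamma_meas) ** 2) for c in candidates]
            i = int(np.argmin(errors))
            if errors[i] < best_err:
                best, best_err = candidates[i], errors[i]
        return best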
2014-01-01
Background Definitions and assessment methods of fussy/picky eating are heterogeneous and remain unclear. We aimed to identify an eating behavior profile reflecting fussy/picky eating in children and to describe characteristics of fussy eaters. Methods Eating behavior was assessed with the Child Eating Behavior Questionnaire (CEBQ) in 4914 4-year olds in a population-based birth cohort study. Latent Profile Analysis (LPA) was used to identify eating behavior profiles based on CEBQ subscales. Results and discussion We found a “fussy” eating behavior profile (5.6% of children) characterized by high food fussiness, slowness in eating, and satiety responsiveness in combination with low enjoyment of food and food responsiveness. Fussy eaters were more often from families with low household income than non-fussy eaters (42% vs. 31.8% respectively; Χ 2 (1) = 9.97, p < .01). When they were 14 months old, fussy eaters had a lower intake of vegetables (t [3008] = 2.42, p < .05) and fish (t [169.77] = 2.40, p < .05) but higher intake of savory snacks (t [153.69] = -2.03, p < .05) and sweets (t [3008] = -2.30, p < .05) compared to non-fussy eaters. Also, fussy eaters were more likely to be underweight at 4 years of age (19.3%) than non-fussy eaters (12.3%; Χ 2 (1) = 7.71, p < .01). Conclusions A distinct fussy eating behavior profile was identified by LPA, which was related to family and child characteristics, food intake, and BMI. This behavior profile might be used in future research and the development of interventions. PMID:24512388
Lu, Feng; Matsushita, Yasuyuki; Sato, Imari; Okabe, Takahiro; Sato, Yoichi
2015-10-01
We propose an uncalibrated photometric stereo method that works with general and unknown isotropic reflectances. Our method uses a pixel intensity profile, which is a sequence of radiance intensities recorded at a pixel under unknown varying directional illumination. We show that for general isotropic materials and uniformly distributed light directions, the geodesic distance between intensity profiles is linearly related to the angular difference of their corresponding surface normals, and that the intensity distribution of the intensity profile reveals reflectance properties. Based on these observations, we develop two methods for surface normal estimation; one for a general setting that uses only the recorded intensity profiles, the other for the case where a BRDF database is available while the exact BRDF of the target scene is still unknown. Quantitative and qualitative evaluations are conducted using both synthetic and real-world scenes, which show state-of-the-art accuracy of better than 10 degrees without using reference data and 5 degrees with reference data for all 100 materials in the MERL database.
An Inter-Personal Information Sharing Model Based on Personalized Recommendations
NASA Astrophysics Data System (ADS)
Kamei, Koji; Funakoshi, Kaname; Akahani, Jun-Ichi; Satoh, Tetsuji
In this paper, we propose an inter-personal information sharing model among individuals based on personalized recommendations. In the proposed model, we define an information resource as shared between people when both of them consider it important --- not merely when they both possess it. In other words, the model defines the importance of information resources based on personalized recommendations from identifiable acquaintances. The proposed method is based on a collaborative filtering system that focuses on evaluations from identifiable acquaintances. It utilizes both user evaluations for documents and their contents. In other words, each user profile is represented as a matrix of credibility to the other users' evaluations on each domain of interests. We extended the content-based collaborative filtering method to distinguish other users to whom the documents should be recommended. We also applied a concept-based vector space model to represent the domain of interests instead of the previous method which represented them by a term-based vector space model. We introduce a personalized concept-base compiled from each user's information repository to improve the information retrieval in the user's environment. Furthermore, the concept-spaces change from user to user since they reflect the personalities of the users. Because of different concept-spaces, the similarity between a document and a user's interest varies for each user. As a result, a user receives recommendations from other users who have different view points, achieving inter-personal information sharing based on personalized recommendations. This paper also describes an experimental simulation of our information sharing model. In our laboratory, five participants accumulated a personal repository of e-mails and web pages from which they built their own concept-base. Then we estimated the user profiles according to personalized concept-bases and sets of documents which others evaluated. We simulated inter-personal recommendation based on the user profiles and evaluated the performance of the recommendation method by comparing the recommended documents to the result of the content-based collaborative filtering.
Smith, Lindsey; Ng, Shu Wen; Popkin, Barry M.
2015-01-01
Healthier foods initiatives (HFIs) by national food retailers offer an opportunity to improve the nutritional profile of packaged food purchases (PFPs). Using a longitudinal dataset of US household PFPs, with methods to account for the selectivity of shopping at a specific retailer, we modeled the effect of Walmart's HFI using counterfactual simulations to examine observed vs. expected changes in the nutritional profile of Walmart PFPs. From 2000 to 2013, Walmart PFPs showed major declines in energy, sodium, and sugar density, as well as declines in sugary beverages, grain-based desserts, snacks, and candy, beyond trends at similar retailers. However, post-HFI declines were similar to what we expected based on pre-HFI trends, suggesting that these changes were not attributable to Walmart's HFI. These results suggest that food retailer-based HFIs may not be sufficient to improve the nutritional profile of food purchases. PMID:26526244
Grouping Youth With Similar Symptoms: A Person-Centered Approach to Transdiagnostic Subgroups.
Bonadio, F Tony; Dynes, Morgan; Lackey, Jennifer; Tompsett, Carolyn; Amrhein, Kelly
2016-07-01
The present study extracted symptom profiles based on parent and youth report on a broad symptom checklist. Profiles based on parent-reported symptoms were compared to those based on adolescent self-report to clarify discrepancies. The current study used archival data from 1,269 youth and parent dyads whose youth received services at a community mental health center. The mean age of the sample was 14.31 years (standard deviation = 1.98), and the youth sample was half male (50.1%) and primarily Caucasian (86.8%). Latent profile analysis was used to extract models based on parent and self-reported emotional and behavioral problems. Results indicated that a 5-class solution was the best fitting model for youth-reported symptoms and an adequate fit for parent-reported symptoms. For 46.5% of the sample, class membership matched for both parent and youth. Latent profile analysis provides an alternative method for exploring transdiagnostic subgroups within clinic-referred samples. © 2016 Wiley Periodicals, Inc.
Yang, Xiaoxia; Wang, Jia; Sun, Jun; Liu, Rong
2015-01-01
Protein-nucleic acid interactions are central to various fundamental biological processes. Automated methods capable of reliably identifying DNA- and RNA-binding residues in protein sequence are assuming ever-increasing importance. The majority of current algorithms rely on feature-based prediction, but their accuracy remains to be further improved. Here we propose a sequence-based hybrid algorithm SNBRFinder (Sequence-based Nucleic acid-Binding Residue Finder) by merging a feature predictor SNBRFinderF and a template predictor SNBRFinderT. SNBRFinderF was established using the support vector machine whose inputs include sequence profile and other complementary sequence descriptors, while SNBRFinderT was implemented with the sequence alignment algorithm based on profile hidden Markov models to capture the weakly homologous template of query sequence. Experimental results show that SNBRFinderF was clearly superior to the commonly used sequence profile-based predictor and SNBRFinderT can achieve comparable performance to the structure-based template methods. Leveraging the complementary relationship between these two predictors, SNBRFinder reasonably improved the performance of both DNA- and RNA-binding residue predictions. More importantly, the sequence-based hybrid prediction reached competitive performance relative to our previous structure-based counterpart. Our extensive and stringent comparisons show that SNBRFinder has obvious advantages over the existing sequence-based prediction algorithms. The value of our algorithm is highlighted by establishing an easy-to-use web server that is freely accessible at http://ibi.hzau.edu.cn/SNBRFinder.
Profiling users in the UNIX os environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dao, V N P; Vemuri, R; Templeton, S J
2000-09-29
This paper presents results obtained by using a method of profiling a user based on the login host, the login time, the command set, and the command set execution time of the profiled user. It is assumed that the user is logging onto a UNIX host on a computer network. The paper concentrates on two areas: short-term and long-term profiling. In short-term profiling the focus is on profiling the user at a given session where user characteristics do not change much. In long-term profiling, the duration of observation is over a much longer period of time. The latter is more challenging because of a phenomenon called concept or profile drift. Profile drift occurs when a user logs onto a host for an extended period of time (over several sessions).
Duncan C. Lutes; Robert E. Keane
2006-01-01
The Fuel Load method (FL) is used to sample dead and down woody debris, determine depth of the duff/litter profile, estimate the proportion of litter in the profile, and estimate total vegetative cover and dead vegetative cover. Down woody debris (DWD) is sampled using the planar intercept technique based on the methodology developed by Brown (1974). Pieces of dead...
Species-specific predictive models of developmental toxicity using the ToxCast chemical library
EPA’s ToxCast™ project is profiling the in vitro bioactivity of chemicals to generate predictive models that correlate with observed in vivo toxicity. In vitro profiling methods are based on ToxCast data, consisting of over 600 high-throughput screening (HTS) and high-content sc...
Estimating Soil Organic Carbon Stocks and Spatial Patterns with Statistical and GIS-Based Methods
Zhi, Junjun; Jing, Changwei; Lin, Shengpan; Zhang, Cao; Liu, Qiankun; DeGloria, Stephen D.; Wu, Jiaping
2014-01-01
Accurately quantifying soil organic carbon (SOC) is considered fundamental to studying soil quality, modeling the global carbon cycle, and assessing global climate change. This study evaluated the uncertainties caused by up-scaling of soil properties from the county scale to the provincial scale and from lower-level classification of Soil Species to Soil Group, using four methods: the mean, median, Soil Profile Statistics (SPS), and pedological professional knowledge based (PKB) methods. For the SPS method, SOC stock is calculated at the county scale by multiplying the mean SOC density value of each soil type in a county by its corresponding area. For the mean or median method, SOC density value of each soil type is calculated using provincial arithmetic mean or median. For the PKB method, SOC density value of each soil type is calculated at the county scale considering soil parent materials and spatial locations of all soil profiles. A newly constructed 1∶50,000 soil survey geographic database of Zhejiang Province, China, was used for evaluation. Results indicated that with soil classification levels up-scaling from Soil Species to Soil Group, the variation of estimated SOC stocks among different soil classification levels was obviously lower than that among different methods. The difference in the estimated SOC stocks among the four methods was lowest at the Soil Species level. The differences in SOC stocks among the mean, median, and PKB methods for different Soil Groups resulted from the differences in the procedure of aggregating soil profile properties to represent the attributes of one soil type. Compared with the other three estimation methods (i.e., the SPS, mean and median methods), the PKB method holds significant promise for characterizing spatial differences in SOC distribution because spatial locations of all soil profiles are considered during the aggregation procedure. PMID:24840890
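To make the aggregation step concrete, the following sketch applies the SPS-style rule described above (county SOC stock = sum over soil types of mean SOC density times area) to a hypothetical county table; the soil type names, densities and areas are invented for illustration and are not values from the study.

# Sketch of the Soil Profile Statistics (SPS) aggregation:
# SOC stock = sum over soil types of (mean SOC density in the county) x (area of that soil type).
# The soil types, densities (kg C / m^2) and areas (m^2) below are hypothetical.

county_soc_density = {"Red soil": 8.2, "Paddy soil": 11.5, "Yellow soil": 9.7}   # mean per soil type
county_soil_area = {"Red soil": 2.4e9, "Paddy soil": 1.1e9, "Yellow soil": 0.8e9}

def sps_soc_stock(density_by_type, area_by_type):
    """Return the county SOC stock (kg C) by the area-weighted SPS rule."""
    return sum(density_by_type[t] * area_by_type[t] for t in density_by_type)

stock_kg = sps_soc_stock(county_soc_density, county_soil_area)
print(f"County SOC stock: {stock_kg / 1e9:.1f} Tg C")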
NASA Astrophysics Data System (ADS)
Yang, Weitao; Li, Yuxiang; Ying, Sanjiu
2015-04-01
A fabrication process to produce graded porous and skin-core structure propellants via a supercritical CO2 concentration profile is reported in this article. It utilizes a partial gas saturation technique to obtain nonequilibrium gas concentration profiles in propellants. Once foamed, the propellant obtains a graded porous or skin-core structure. This fabrication method was studied with an RDX (Hexogen)-based propellant under an SC-CO2 saturation condition. The principle was analyzed and a one-dimensional diffusion model was employed to estimate the gas diffusion coefficient and to predict the gas concentration profiles inside the propellant. Scanning electron microscopy images were used to analyze the effects of partial saturation on the inner structure. The results also suggested that the sorption time and desorption time played an important role in gas profile generation and controlled the inner structure of propellants.
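As an illustration of the one-dimensional diffusion estimate mentioned above, the sketch below evaluates the classical erfc solution for gas uptake into a semi-infinite slab held at a constant surface concentration; the diffusion coefficient, sorption time and depths are placeholder values, not those of the study, and desorption is ignored.

# Relative CO2 concentration vs. depth after a finite sorption time,
# assuming a semi-infinite slab with constant surface concentration C_s (erfc solution).
import math

def co2_profile(depths_mm, D_mm2_per_s, t_s, C_s=1.0):
    """Relative CO2 concentration at each depth for sorption time t_s."""
    return [C_s * math.erfc(x / (2.0 * math.sqrt(D_mm2_per_s * t_s))) for x in depths_mm]

depths = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]                 # mm below the surface
profile = co2_profile(depths, D_mm2_per_s=5e-5, t_s=3600)
for x, c in zip(depths, profile):
    print(f"{x:.1f} mm: C/C_s = {c:.3f}")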
NASA Astrophysics Data System (ADS)
Phillips, Michael G.
Human exposure to blast waves, including blast-induced traumatic brain injury, is a developing field in medical research. Experiments with explosives have many disadvantages including safety, cost, and the required area for trials. Shock tubes provide an alternative method to produce free-field blast wave profiles. A compressed nitrogen shock tube experiment instrumented with static and reflective pressure taps is modeled using a numerical simulation. The geometry of the numerical model is simplified and blast wave characteristics are derived based upon the static and reflected pressure profiles. The pressure profiles are analyzed along the shock tube centerline and radially away from the tube axis. The blast wave parameters found from the pressure profiles provide guidelines for the spatial location of a specimen. The location could be based on multiple parameters and provides a distribution of anticipated pressure profiles experienced by the specimen.
Intact Cell MALDI-TOF MS on Sperm: A Molecular Test For Male Fertility Diagnosis.
Soler, Laura; Labas, Valérie; Thélie, Aurore; Grasseau, Isabelle; Teixeira-Gomes, Ana-Paula; Blesbois, Elisabeth
2016-06-01
Currently, evaluation of sperm quality is primarily based on in vitro measures of sperm function such as motility, viability and/or acrosome reaction. However, results are often poorly correlated with fertility, and alternative diagnostic tools are therefore needed both in veterinary and human medicine. In a recent pilot study, we demonstrated that MALDI-TOF MS profiles from intact chicken sperm could detect significant differences between fertile and subfertile spermatozoa, showing that such profiles could be useful for in vitro male fertility testing. In the present study, we performed larger standardized experimental procedures designed for the development of fertility-predictive mathematical models based on sperm cell MALDI-TOF MS profiles acquired through a fast, automated method. This intact cell MALDI-TOF MS-based method showed high diagnostic accuracy in identifying fertile/subfertile males in a large male population of known fertility from two distinct genetic lineages (meat and egg laying lines). We additionally identified 40% of the m/z peaks observed in sperm MS profiles through a top-down high-resolution protein identification analysis. This revealed that the MALDI-TOF MS spectra obtained from intact sperm cells contained a large proportion of protein degradation products, many implicated in important functional pathways in sperm such as energy metabolism, structure and movement. Proteins identified by our predictive model included diverse and important functional classes providing new insights into sperm function as it relates to fertility differences in this experimental system. Thus, in addition to the chicken model system developed here, with the use of appropriate models these methods should effectively translate to other animal taxa where similar tests for fertility are warranted. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
NASA Astrophysics Data System (ADS)
Che, Yunfei; Ma, Shuqing; Xing, Fenghua; Li, Siteng; Dai, Yaru
2018-03-01
This paper focuses on an improvement of the retrieval of atmospheric temperature and relative humidity profiles through combining active and passive remote sensing. Ground-based microwave radiometer and millimeter-wavelength cloud radar were used to acquire the observations. Cloud base height and cloud thickness determinations from cloud radar were added into the atmospheric profile retrieval process, and a back-propagation neural network method was used as the retrieval tool. Because a substantial amount of data are required to train a neural network, and as microwave radiometer data are insufficient for this purpose, 8 years of radiosonde data from Beijing were used as the database. The monochromatic radiative transfer model was used to calculate the brightness temperatures in the same channels as the microwave radiometer. Parts of the cloud base heights and cloud thicknesses in the training data set were also estimated using the radiosonde data. The accuracy of the results was analyzed through a comparison with L-band sounding radar data and quantified using the mean bias, root-mean-square error (RMSE), and correlation coefficient. The statistical results showed that an inversion with cloud information was the optimal method. Compared with the inversion profiles without cloud information, the RMSE values after adding cloud information reduced to varying degrees for the vast majority of height layers. These reductions were particularly clear in layers with clouds. The maximum reduction in the RMSE for the temperature profile was 2.2 K, while that for the humidity profile was 16%.
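The accuracy metrics quoted above (mean bias, RMSE and correlation coefficient) can be computed as in the following sketch, where the retrieved and radiosonde temperature profiles are invented numbers used only to show the calculation.

# Compare a retrieved temperature profile against reference sounding values (synthetic data).
import numpy as np

retrieved = np.array([288.1, 284.0, 279.6, 273.2, 266.5])   # K, one value per height layer
sounding  = np.array([287.4, 283.9, 280.3, 272.1, 265.0])   # K, reference (e.g., L-band sonde)

bias = np.mean(retrieved - sounding)
rmse = np.sqrt(np.mean((retrieved - sounding) ** 2))
corr = np.corrcoef(retrieved, sounding)[0, 1]
print(f"bias = {bias:.2f} K, RMSE = {rmse:.2f} K, r = {corr:.3f}")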
Movassaghi, Masoud; Shabihkhani, Maryam; Hojat, Seyed A; Williams, Ryan R; Chung, Lawrance K; Im, Kyuseok; Lucey, Gregory M; Wei, Bowen; Mareninov, Sergey; Wang, Michael W; Ng, Denise W; Tashjian, Randy S; Magaki, Shino; Perez-Rosendahl, Mari; Yang, Isaac; Khanlou, Negar; Vinters, Harry V; Liau, Linda M; Nghiemphu, Phioanh L; Lai, Albert; Cloughesy, Timothy F; Yong, William H
2017-08-01
Commercial targeted genomic profiling with next generation sequencing using formalin-fixed paraffin embedded (FFPE) tissue has recently entered into clinical use for diagnosis and for the guiding of therapy. However, there is limited independent data regarding the accuracy or robustness of commercial genomic profiling in gliomas. As part of patient care, FFPE samples of gliomas from 71 patients were submitted for targeted genomic profiling to one commonly used commercial vendor, Foundation Medicine. Genomic alterations were determined for the following grades or groups of gliomas: Grade I/II, Grade III, primary glioblastomas (GBMs), recurrent primary GBMs, and secondary GBMs. In addition, FFPE samples from the same patients were independently assessed with conventional methods such as immunohistochemistry (IHC), quantitative real-time PCR (qRT-PCR), or fluorescence in situ hybridization (FISH) for three genetic alterations: IDH1 mutations, EGFR amplification, and EGFRvIII expression. A total of 100 altered genes were detected by the aforementioned targeted genomic profiling assay. The number of different genomic alterations was significantly different between the five groups of gliomas and consistent with the literature. CDKN2A/B, TP53, and TERT were the most common genomic alterations seen in primary GBMs, whereas IDH1, TP53, and PIK3CA were the most common in secondary GBMs. Targeted genomic profiling demonstrated 92.3%-100% concordance with conventional methods. The targeted genomic profiling report provided an average of 5.5 drugs, and listed an average of 8.4 clinical trials for the 71 glioma patients studied, but only a third of the trials were appropriate for glioma patients. In this limited comparison study, this commercial next generation sequencing-based targeted genomic profiling showed a high concordance rate with conventional methods for the 3 genetic alterations and identified mutations expected for the type of glioma. While it may not be feasible to exhaustively independently validate a commercial genomic profiling assay, examination of a few markers provides some reassurance of its robustness. While potential targeted drugs are recommended based on genetic alterations, to date most targeted therapies have failed in glioblastomas, so the usefulness of such recommendations will increase with the development of novel and efficacious drugs. Copyright © 2017. Published by Elsevier Inc.
Jang, Jae-Wook; Yun, Jaesung; Mohaisen, Aziz; Woo, Jiyoung; Kim, Huy Kang
2016-01-01
Mass-market mobile security threats have increased recently due to the growth of mobile technologies and the popularity of mobile devices. Accordingly, techniques have been introduced for identifying, classifying, and defending against mobile threats utilizing static, dynamic, on-device, and off-device techniques. Static techniques are easy to evade, while dynamic techniques are expensive. On-device techniques are prone to evasion, while off-device techniques need to be always online. To address some of those shortcomings, we introduce Andro-profiler, a hybrid behavior-based analysis and classification system for mobile malware. Andro-profiler's main goals are efficiency, scalability, and accuracy. For that, Andro-profiler classifies malware by exploiting the behavior profiling extracted from the integrated system logs including system calls. Andro-profiler executes a malicious application on an emulator in order to generate the integrated system logs, and creates human-readable behavior profiles by analyzing the integrated system logs. By comparing the behavior profile of a malicious application with the representative behavior profile for each malware family using a weighted similarity matching technique, Andro-profiler detects and classifies it into malware families. The experimental results demonstrate that Andro-profiler is scalable, performs well in detecting and classifying malware with accuracy greater than 98%, outperforms the existing state-of-the-art work, and is capable of identifying 0-day mobile malware samples.
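The following sketch illustrates one way a weighted similarity match between behavior profiles could look, using system-call count vectors and a weighted cosine similarity; the call names, weights and decision threshold are hypothetical and do not reproduce the actual Andro-profiler scheme.

# Weighted cosine similarity between a sample's behavior profile and a family profile (illustrative).
import math

def weighted_cosine(p, q, w):
    """Weighted cosine similarity between two behavior profiles p and q (dicts of counts)."""
    keys = set(p) | set(q)
    dot = sum(w.get(k, 1.0) * p.get(k, 0) * q.get(k, 0) for k in keys)
    norm_p = math.sqrt(sum(w.get(k, 1.0) * p.get(k, 0) ** 2 for k in keys))
    norm_q = math.sqrt(sum(w.get(k, 1.0) * q.get(k, 0) ** 2 for k in keys))
    return dot / (norm_p * norm_q) if norm_p and norm_q else 0.0

sample  = {"sendto": 120, "open": 40, "read_sms": 8}
family  = {"sendto": 100, "open": 35, "read_sms": 10, "connect": 5}
weights = {"read_sms": 3.0, "sendto": 2.0}          # emphasize suspicious behaviors (hypothetical)

score = weighted_cosine(sample, family, weights)
print("family match" if score > 0.9 else "no match", f"(similarity = {score:.3f})")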
Trunk density profile estimates from dual X-ray absorptiometry.
Wicke, Jason; Dumas, Geneviève A; Costigan, Patrick A
2008-01-01
Accurate body segment parameters are necessary to estimate joint loads when using biomechanical models. Geometric methods can provide individualized data for these models, but the accuracy of the geometric methods depends on accurate segment density estimates. The trunk, which is important in many biomechanical models, has the largest variability in density along its length. Therefore, the objectives of this study were to: (1) develop a new method for modeling trunk density profiles based on dual X-ray absorptiometry (DXA) and (2) develop a trunk density function for college-aged females and males that can be used in geometric methods. To this end, the density profiles of 25 females and 24 males were determined by combining the measurements from a photogrammetric method and DXA readings. A discrete Fourier transformation was then used to develop the density functions for each sex. The individual density and average density profiles compare well with the literature. There were distinct differences between the profiles of two of the participants (one female and one male) and the average for their sex. It is believed that the variations in these two participants' density profiles were a result of the amount and distribution of fat they possessed. Further studies are needed to support this possibility. The new density functions eliminate the uniform density assumption associated with some geometric models, thus providing more accurate trunk segment parameter estimates. In turn, more accurate moments and forces can be estimated for the kinetic analyses of certain human movements.
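A minimal sketch of representing a sampled density profile by a truncated discrete Fourier series, in the spirit of the approach described above; the sampled density values and the number of retained harmonics are illustrative, not taken from the study.

# Fit a smooth density function to a sampled trunk density profile via a truncated DFT.
import numpy as np

density = np.array([1.01, 1.03, 1.05, 1.02, 0.98, 0.95, 0.97, 1.00,
                    1.04, 1.06, 1.05, 1.02])           # g/cm^3 along the trunk length (synthetic)

coeffs = np.fft.rfft(density)                           # DFT of the sampled profile
coeffs[4:] = 0                                          # keep the mean + first 3 harmonics
smooth_profile = np.fft.irfft(coeffs, n=density.size)   # truncated-series density function

for z, d in zip(np.linspace(0, 1, density.size), smooth_profile):
    print(f"relative trunk height {z:.2f}: {d:.3f} g/cm^3")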
A metabolomics-based method for studying the effect of yfcC gene in Escherichia coli on metabolism.
Wang, Xiyue; Xie, Yuping; Gao, Peng; Zhang, Sufang; Tan, Haidong; Yang, Fengxu; Lian, Rongwei; Tian, Jing; Xu, Guowang
2014-04-15
Metabolomics is a potent tool to assist in identifying the function of unknown genes through analysis of metabolite changes in the context of varied genetic backgrounds. However, the availability of a universal unbiased profiling analysis is still a big challenge. In this study, we report an optimized metabolic profiling method based on gas chromatography-mass spectrometry for Escherichia coli. It was found that physiological saline at -80°C could ensure satisfactory metabolic quenching with less metabolite leakage. A solution of methanol/water (21:79, v/v) proved to be efficient for intracellular metabolite extraction. This method was applied to investigate the metabolome differences among wild-type E. coli and its yfcC deletion and overexpression mutants. Statistical and bioinformatic analysis of the metabolic profiling data indicated that the expression of yfcC potentially affected the metabolism of the glyoxylate shunt. This finding was further validated by real-time quantitative polymerase chain reactions showing that expression of aceA and aceB, the key genes in the glyoxylate shunt, was upregulated by yfcC. This study exemplifies the robustness of the proposed metabolic profiling analysis strategy and its potential roles in investigating unknown gene functions in view of metabolome differences. Copyright © 2014 Elsevier Inc. All rights reserved.
Dai, Sheng-Yun; Xu, Bing; Shi, Xin-Yuan; Xu, Xiang; Sun, Ying-Qiang; Qiao, Yan-Jiang
2017-03-01
This study aimed to propose a continual improvement strategy based on quality by design (QbD). As an example, an ultra high performance liquid chromatography (UPLC) method was developed to accomplish the method transformation from HPLC to UPLC for Panax notoginseng saponins (PNS) and to achieve the continual improvement of the PNS method based on QbD. A Plackett-Burman screening design and a Box-Behnken optimization design were employed to further understand the relationship between the critical method parameters (CMPs) and critical method attributes (CMAs), and a Bayesian design space was then built. The separation degree of the critical peaks (ginsenoside Rg₁ and ginsenoside Re) was over 2.0 and the analysis time was less than 17 min for a method chosen from the design space, with an initial acetonitrile concentration of 20%, an isocratic time of 10 min and a gradient slope of 6%•min⁻¹. Finally, the optimal method was validated by an accuracy profile. Based on the same analytical target profile (ATP), a comparison of HPLC and UPLC covering the chromatographic method, CMA identification, the CMP-CMA model and the system suitability test (SST) indicated that the UPLC method could shorten the analysis time, improve the critical separation and satisfy the requirements of the SST. In all, the HPLC method could be replaced by UPLC for the quantitative analysis of PNS. Copyright© by the Chinese Pharmaceutical Association.
NASA Astrophysics Data System (ADS)
Lin, Pei-Chun; Yu, Chun-Chang; Chen, Charlie Chung-Ping
2015-01-01
As one of the critical stages of a very large scale integration fabrication process, postexposure bake (PEB) plays a crucial role in determining the final three-dimensional (3-D) profiles and lessening the standing wave effects. However, the full 3-D chemically amplified resist simulation is not widely adopted during the postlayout optimization due to the long run-time and huge memory usage. An efficient simulation method is proposed to simulate the PEB while considering standing wave effects and resolution enhancement techniques, such as source mask optimization and subresolution assist features based on the Sylvester equation and Abbe-principal component analysis method. Simulation results show that our algorithm is 20× faster than the conventional Gaussian convolution method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hödemann, S., E-mail: siim.hodemann@ut.ee; Möls, P.; Kiisk, V.
2015-12-28
A new optical method is presented for evaluation of the stress profile in chemically tempered (chemically strengthened) glass based on confocal detection of a scattered laser beam. Theoretically, a lateral resolution of 0.2 μm and a depth resolution of 0.6 μm could be achieved by using a confocal microscope with a high-NA immersion objective. The stress profile in the 250 μm thick surface layer of chemically tempered lithium aluminosilicate glass was measured with a high spatial resolution to illustrate the capability of the method. The confocal method is validated using transmission photoelastic and Na⁺ ion concentration profile measurements. Compositional influence on the stress-optic coefficient is calculated and discussed. Our method opens up new possibilities for three-dimensional scattered light tomography of mechanical imaging in birefringent materials.
Electrolyte for EC-V profiling of InP and GaAs based structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faur, M.; Faur, M.; Goradia, M.
Electrochemical C-V (EC-V) profiling is the most often used and convenient method for accurate majority carrier concentration depth profiling of semiconductors. Although, according to the authors, FAP is the best electrolyte for accurate profiling of InP structures, it does not work well with other III-V compounds. To overcome this, recently, the authors have developed a new electrolyte, which they call UNIEL (UNIversal ELectrolyte), which works well with all the materials. However, as with the FAP electrolyte, the presence of HF makes the UNIEL incompatible with the electrochemical cell of Polaron EC-V profilers manufactured by BIO-RAD. By slightly modifying the electrochemical cell configuration the authors are able to use both the FAP and UNIEL electrolytes, without destroying the calomel electrode. Recently, they have, nevertheless, experimented with variations of the UNIEL with no HF content for EC-V profiling of structures based on InP and GaAs. Presently available results are presented here.
Modeling Electrokinetic Flows by the Smoothed Profile Method
Luo, Xian; Beskok, Ali; Karniadakis, George Em
2010-01-01
We propose an efficient modeling method for electrokinetic flows based on the Smoothed Profile Method (SPM) [1–4] and spectral element discretizations. The new method allows for arbitrary differences in the electrical conductivities between the charged surfaces and the surrounding electrolyte solution. The electrokinetic forces are included into the flow equations so that the Poisson-Boltzmann and electric charge continuity equations are cast into forms suitable for SPM. The method is validated by benchmark problems of electroosmotic flow in straight channels and electrophoresis of charged cylinders. We also present simulation results of electrophoresis of charged microtubules, and show that the simulated electrophoretic mobility and anisotropy agree with the experimental values. PMID:20352076
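Smoothed Profile Methods are built around a smooth indicator ("profile") function that is close to 1 inside a particle and 0 in the surrounding fluid. The sketch below uses a common tanh-based form; the exact functional form and parameters used in the paper may differ, so this is illustrative only.

# Tanh-based smoothed indicator for a circular particle of given radius and interface width xi.
import numpy as np

def smoothed_profile(x, y, center, radius, xi):
    """Indicator ~1 inside the particle, ~0 outside, smoothed over width xi."""
    r = np.hypot(x - center[0], y - center[1])
    return 0.5 * (np.tanh((radius - r) / xi) + 1.0)

x, y = np.meshgrid(np.linspace(-2, 2, 5), np.linspace(-2, 2, 5))
phi = smoothed_profile(x, y, center=(0.0, 0.0), radius=1.0, xi=0.2)
print(np.round(phi, 2))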
Zhang, Li; Liu, Haiyu; Qin, Lingling; Zhang, Zhixin; Wang, Qing; Zhang, Qingqing; Lu, Zhiwei; Wei, Shengli; Gao, Xiaoyan; Tu, Pengfei
2015-02-01
A global chemical profiling based quality evaluation approach using ultra performance liquid chromatography with tandem quadrupole time-of-flight mass spectrometry was developed for the quality evaluation of three rhubarb species, including Rheum palmatum L., Rheum tanguticum Maxim. ex Balf., and Rheum officinale Baill. Considering that comprehensive detection of chemical components is crucial for the global profile, a systemic column performance evaluation method was developed. Based on this, a Cortecs column was used to acquire the chemical profile, and Chempattern software was employed to conduct similarity evaluation and hierarchical cluster analysis. The results showed R. tanguticum could be differentiated from R. palmatum and R. officinale at the similarity value 0.65, but R. palmatum and R. officinale could not be distinguished effectively. Therefore, a common pattern based on three rhubarb species was developed to conduct the quality evaluation, and the similarity value 0.50 was set as an appropriate threshold to control the quality of rhubarb. A total of 88 common peaks were identified by their accurate mass and fragmentation, and partially verified by reference standards. Through the verification, the newly developed method could be successfully used for evaluating the holistic quality of rhubarb. It would provide a reference for the quality control of other herbal medicines. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
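The similarity evaluation and hierarchical cluster analysis step can be sketched as below, using cosine similarity between binned chromatographic profiles and average-linkage clustering in SciPy; the three short example profiles and the clustering threshold are invented, and the paper's Chempattern software may use different measures.

# Similarity evaluation and hierarchical clustering of binned chromatographic profiles (synthetic).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform

profiles = np.array([
    [0.9, 0.1, 0.4, 0.7, 0.2],    # e.g., an R. palmatum sample (binned peak areas)
    [0.8, 0.2, 0.5, 0.6, 0.3],    # e.g., an R. officinale sample
    [0.1, 0.9, 0.2, 0.1, 0.8],    # e.g., an R. tanguticum sample
])

dist = pdist(profiles, metric="cosine")            # 1 - cosine similarity, condensed form
tree = linkage(dist, method="average")             # hierarchical cluster analysis
labels = fcluster(tree, t=0.35, criterion="distance")
print("pairwise similarities:\n", np.round(1.0 - squareform(dist), 2))
print("cluster labels:", labels)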
Discharge measurements at gaging stations
Turnipseed, D. Phil; Sauer, Vernon B.
2010-01-01
The techniques and standards for making discharge measurements at streamflow gaging stations are described in this publication. The vertical axis rotating-element current meter, principally the Price current meter, has been traditionally used for most measurements of discharge; however, advancements in acoustic technology have led to important developments in the use of acoustic Doppler current profilers, acoustic Doppler velocimeters, and other emerging technologies for the measurement of discharge. These new instruments, based on acoustic Doppler theory, have the advantage of no moving parts, and in the case of the acoustic Doppler current profiler, quickly and easily provide three-dimensional stream-velocity profile data through much of the vertical water column. For much of the discussion of acoustic Doppler current profiler moving-boat methodology, the reader is referred to U.S. Geological Survey Techniques and Methods 3-A22 (Mueller and Wagner, 2009). Personal digital assistants (PDAs), electronic field notebooks, and other personal computers provide fast and efficient data-collection methods that are more error-free than traditional hand methods. The use of portable weirs and flumes, floats, volumetric tanks, indirect methods, and tracers in measuring discharge are briefly described.
Zhang, Baixia; He, Shuaibing; Lv, Chenyang; Zhang, Yanling; Wang, Yun
2018-01-01
The identification of bioactive components in traditional Chinese medicine (TCM) is an important part of the TCM material foundation research. Recently, molecular docking technology has been extensively used for the identification of TCM bioactive components. However, target proteins that are used in molecular docking may not be the actual TCM target. For this reason, the bioactive components would likely be omitted or incorrect. To address this problem, this study proposed the GEPSI method that identified the target proteins of TCM based on the similarity of gene expression profiles. The similarity of the gene expression profiles affected by TCM and small molecular drugs was calculated. The pharmacological action of TCM may be similar to that of small molecule drugs that have a high similarity score. Indeed, the target proteins of the small molecule drugs could be considered TCM targets. Thus, we identified the bioactive components of a TCM by molecular docking and verified the reliability of this method by a literature investigation. Using the target proteins that TCM actually affected as targets, the identification of the bioactive components was more accurate. This study provides a fast and effective method for the identification of TCM bioactive components.
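A hedged sketch of the core idea of ranking small-molecule drugs by the similarity of their gene expression profiles to a TCM-induced profile; the abstract does not state the similarity measure used by GEPSI, so Spearman correlation is used here purely as an illustration, and all profiles and drug names are invented.

# Rank drugs by the similarity of their expression signatures to a TCM-induced signature.
import numpy as np
from scipy.stats import spearmanr

tcm_profile = np.array([2.1, -0.5, 0.8, -1.7, 0.3, 1.2])          # log fold-changes (synthetic)
drug_profiles = {
    "drug_A": np.array([1.9, -0.4, 0.6, -1.5, 0.2, 1.0]),
    "drug_B": np.array([-1.0, 0.9, -0.3, 1.2, -0.2, -0.8]),
}

scores = {name: spearmanr(tcm_profile, prof)[0] for name, prof in drug_profiles.items()}
for name, s in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: similarity = {s:.2f}")
# Targets of the highest-scoring drugs would then be taken as candidate TCM targets for docking.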
Profiling and Quantification of Regioisomeric Caffeoyl Glucoses in Berry Fruits.
Patras, Maria A; Jaiswal, Rakesh; McDougall, Gordon J; Kuhnert, Nikolai
2018-02-07
On the basis of a recently developed tandem mass spectrometry-based hierarchical scheme for the identification of regioisomeric caffeoyl glucoses, selected berry fruits were profiled for their caffeoyl glucose ester content. Fresh edible berries profiled, including strawberries, raspberries, blueberries, blackberries, red currant, black currant, lingonberries, gooseberries, and juices of elderberries, goji berries, chokeberries, cranberries, açai berries, sea buckthorn berries, Montmorency sour cherries, and pomegranates, were investigated. 1-Caffeoyl glucose was found to be the predominant isomer in the majority of samples, with further profiling revealing the presence of additional hydroxycinnamoyl glucose esters and O-glycosides with p-coumaroyl, feruloyl, and sinapoyl substituents. A quantitative liquid chromatography-mass spectrometry-based method was developed and validated, and all caffeoyl glucose isomers were quantified for the first time in edible berries.
A ranking index for quality assessment of forensic DNA profiles
2010-01-01
Background Assessment of DNA profile quality is vital in forensic DNA analysis, both in order to determine the evidentiary value of DNA results and to compare the performance of different DNA analysis protocols. Generally the quality assessment is performed through manual examination of the DNA profiles based on empirical knowledge, or by comparing the intensities (allelic peak heights) of the capillary electrophoresis electropherograms. Results We recently developed a ranking index for unbiased and quantitative quality assessment of forensic DNA profiles, the forensic DNA profile index (FI) (Hedman et al. Improved forensic DNA analysis through the use of alternative DNA polymerases and statistical modeling of DNA profiles, Biotechniques 47 (2009) 951-958). FI uses electropherogram data to combine the intensities of the allelic peaks with the balances within and between loci, using Principal Components Analysis. Here we present the construction of FI. We explain the mathematical and statistical methodologies used and present details about the applied data reduction method. Thereby we show how to adapt the ranking index for any Short Tandem Repeat-based forensic DNA typing system through validation against a manual grading scale and calibration against a specific set of DNA profiles. Conclusions The developed tool provides unbiased quality assessment of forensic DNA profiles. It can be applied for any DNA profiling system based on Short Tandem Repeat markers. Apart from crime related DNA analysis, FI can therefore be used as a quality tool in paternal or familial testing as well as in disaster victim identification. PMID:21062433
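One way such a PCA-based index can be assembled is sketched below: standardized electropherogram features (mean peak height and balance measures) are projected onto the first principal component and the score is used for ranking; the feature values are invented and the published FI has its own validated construction, so this is illustrative only.

# PCA-based quality score from electropherogram features (synthetic, illustrative).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# rows = DNA profiles, columns = [mean peak height (RFU), intra-locus balance, inter-locus balance]
features = np.array([
    [1800, 0.92, 0.85],
    [ 600, 0.70, 0.55],
    [1200, 0.88, 0.80],
    [ 300, 0.55, 0.40],
])

scores = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(features))
index = scores.ravel()
ranking = np.argsort(index)[::-1]        # higher score taken here (by convention) as better quality
print("quality ranking (best first):", ranking)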
Unfolding sphere size distributions with a density estimator based on Tikhonov regularization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weese, J.; Korat, E.; Maier, D.
1997-12-01
This report proposes a method for unfolding sphere size distributions given a sample of radii that combines the advantages of a density estimator with those of Tikhonov regularization methods. The following topics are discussed in this report to develop this method: the relation between the profile and the sphere size distribution; the method for unfolding sphere size distributions; the results based on simulations; and the experimental data comparison.
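A minimal sketch of the Tikhonov-regularized inversion at the heart of such an unfolding: given a discretized kernel A mapping the sphere-size distribution x to observed profile data b, the regularized estimate solves (AᵀA + λ²I)x = Aᵀb. The kernel, data and regularization parameter below are synthetic placeholders, not the report's.

# Tikhonov-regularized unfolding of a synthetic size distribution.
import numpy as np

rng = np.random.default_rng(0)
n = 40
radii = np.linspace(0.1, 1.0, n)
A = np.tril(np.ones((n, n))) / n                 # placeholder forward kernel
x_true = np.exp(-0.5 * ((radii - 0.5) / 0.1) ** 2)
b = A @ x_true + 0.01 * rng.standard_normal(n)   # noisy "profile" data

lam = 0.05                                       # regularization parameter (illustrative)
x_reg = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)
print("peak of recovered distribution at radius ~", radii[np.argmax(x_reg)])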
NASA Astrophysics Data System (ADS)
Zhao, G.; Zhao, C.
2016-12-01
Micro-pulse lidar (MPL) measurements have been widely used to profile the ambient aerosol extinction coefficient. The lidar ratio (LR), which depends strongly on the particle number size distribution (PNSD) and aerosol hygroscopicity, is the most important factor in retrieving the extinction coefficient profile. A constant, AOD-constrained LR is usually used in current algorithms, which would lead to large biases when the relative humidity (RH) in the mixed layer is high. In this research, the influences of the PNSD, aerosol hygroscopicity and RH profiles on the vertical variation of the LR were investigated based on datasets from field measurements in the North China Plain (NCP). Results show that the LR can have an enhancement factor of more than 120% when the RH reaches 92%. A new algorithm for retrieving the extinction coefficient profile is proposed based on the variation of the LR due to aerosol hygroscopicity. The magnitude and vertical structure of the extinction coefficient profile retrieved using this method can differ significantly from those of the fixed-LR method; the relative difference can reach up to 40% when the RH in the mixed layer is higher than 90%. Sensitivity studies show that the RH profile and the PNSD have the strongest influence on the profile retrieved by the fixed-LR method. In view of this, a scheme relating the LR enhancement factor to RH is proposed for the NCP. The relative difference between the extinction coefficients calculated using this scheme and the new algorithm with variable LR can be less than 10%.
Ostrowski, Michalł; Wilkowska, Ewa; Baczek, Tomasz
2010-12-01
In vivo-in vitro correlation (IVIVC) is an effective tool to predict the absorption behavior of active substances from pharmaceutical dosage forms. The model for an immediate release dosage form containing amoxicillin was used in the presented study to check whether the calculation method of absorption profiles can influence the final results achieved. The comparison showed that averaging of individual absorption profiles obtained by the Wagner-Nelson (WN) conversion method can lead to a loss of the discrimination properties of the model. The approach considering individual plasma concentration versus time profiles enabled averaging prior to WN conversion, which in turn made it possible to find differences between dispersible tablets and capsules. It was concluded that in the case of an immediate release dosage form, the decision to use an averaging method should be based on the individual situation; however, it seems that the influence of such a procedure on the discrimination properties of the model is then more significant. © 2010 Wiley-Liss, Inc. and the American Pharmacists Association
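For reference, the Wagner-Nelson conversion mentioned above computes the fraction absorbed at time t as (C(t) + ke·AUC(0..t)) / (ke·AUC(0..∞)); the sketch below applies it to an invented plasma concentration-time profile with an assumed elimination rate constant.

# Wagner-Nelson fraction absorbed from a concentration-time profile (synthetic data).
import numpy as np

t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0, 8.0])      # h
C = np.array([0.0, 2.1, 4.0, 4.8, 4.5, 3.2, 2.1, 0.9, 0.4])      # mg/L
ke = 0.45                                                          # 1/h, assumed elimination rate

auc_t = np.concatenate(([0.0], np.cumsum(np.diff(t) * (C[1:] + C[:-1]) / 2)))   # trapezoidal AUC(0..t)
auc_inf = auc_t[-1] + C[-1] / ke                                   # tail extrapolation to infinity
frac_absorbed = (C + ke * auc_t) / (ke * auc_inf)
for ti, fa in zip(t, frac_absorbed):
    print(f"t = {ti:4.1f} h  F_abs = {fa:.2f}")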
Cowan, Lauren S; Diem, Lois; Brake, Mary Catherine; Crawford, Jack T
2004-01-01
Spoligotyping using Luminex technology was shown to be a highly reproducible method suitable for high-throughput analysis. Spoligotyping of 48 isolates using the traditional membrane-based assay and the Luminex assay yielded concordant results for all isolates. The Luminex platform provides greater flexibility and cost effectiveness than the membrane-based assay.
Inter- and Intra-method Variability of VS Profiles and VS30 at ARRA-funded Sites
NASA Astrophysics Data System (ADS)
Yong, A.; Boatwright, J.; Martin, A. J.
2015-12-01
The 2009 American Recovery and Reinvestment Act (ARRA) funded geophysical site characterizations at 191 seismographic stations in California and in the central and eastern United States. Shallow boreholes were considered cost- and environmentally-prohibitive, thus non-invasive methods (passive and active surface- and body-wave techniques) were used at these stations. The drawback, however, is that these techniques measure seismic properties indirectly and introduce more uncertainty than borehole methods. The principal methods applied were Array Microtremor (AM), Multi-channel Analysis of Surface Waves (MASW; Rayleigh and Love waves), Spectral Analysis of Surface Waves (SASW), Refraction Microtremor (ReMi), and P- and S-wave refraction tomography. Depending on the apparent geologic or seismic complexity of the site, field crews applied one or a combination of these methods to estimate the shear-wave velocity (VS) profile and calculate VS30, the time-averaged VS to a depth of 30 meters. We study the inter- and intra-method variability of VS and VS30 at each seismographic station where combinations of techniques were applied. For each site, we find both types of variability in VS30 remain insignificant (5-10% difference) despite substantial variability observed in the VS profiles. We also find that reliable VS profiles are best developed using a combination of techniques, e.g., surface-wave VS profiles correlated against P-wave tomography to constrain variables (Poisson's ratio and density) that are key depth-dependent parameters used in modeling VS profiles. The most reliable results are based on surface- or body-wave profiles correlated against independent observations such as material properties inferred from outcropping geology nearby. For example, mapped geology describes station CI.LJR as a hard rock site (VS30 > 760 m/s). However, decomposed rock outcrops were found nearby and support the estimated VS30 of 303 m/s derived from the MASW (Love wave) profile.
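VS30 as used above is the time-averaged shear-wave velocity over the top 30 meters, i.e. 30 m divided by the vertical travel time through the layers; the sketch below applies this definition to a hypothetical layered profile.

# VS30 = 30 / sum(h_i / VS_i) over layers within the top 30 m (example profile is hypothetical).
def vs30(layers):
    """layers: list of (thickness_m, vs_m_per_s) tuples, ordered from the surface down."""
    depth, travel_time = 0.0, 0.0
    for h, vs in layers:
        use = min(h, 30.0 - depth)              # only count material above 30 m depth
        travel_time += use / vs
        depth += use
        if depth >= 30.0:
            break
    return 30.0 / travel_time

profile = [(5.0, 180.0), (10.0, 300.0), (20.0, 450.0)]   # (thickness m, VS m/s)
print(f"VS30 = {vs30(profile):.0f} m/s")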
Kück, Patrick; Meusemann, Karen; Dambach, Johannes; Thormann, Birthe; von Reumont, Björn M; Wägele, Johann W; Misof, Bernhard
2010-03-31
Methods of alignment masking, which refer to the technique of excluding alignment blocks prior to tree reconstructions, have been successful in improving the signal-to-noise ratio in sequence alignments. However, the lack of formally well defined methods to identify randomness in sequence alignments has prevented a routine application of alignment masking. In this study, we compared the effects on tree reconstructions of the most commonly used profiling method (GBLOCKS), which uses a predefined set of rules in combination with alignment masking, with a new profiling approach (ALISCORE) based on Monte Carlo resampling within a sliding window, using different data sets and alignment methods. While the GBLOCKS approach excludes variable sections above a certain threshold, the choice of which is left arbitrary, the ALISCORE algorithm is free of a priori rating of parameter space and is therefore more objective. ALISCORE was successfully extended to amino acids using a proportional model and empirical substitution matrices to score randomness in multiple sequence alignments. A complex bootstrap resampling leads to an even distribution of scores of randomly similar sequences to assess randomness of the observed sequence similarity. Testing performance on real data, both masking methods, GBLOCKS and ALISCORE, helped to improve tree resolution. The sliding window approach was less sensitive to different alignments of identical data sets and performed equally well on all data sets. Concurrently, ALISCORE is capable of dealing with different substitution patterns and heterogeneous base composition. ALISCORE and the most relaxed GBLOCKS gap parameter setting performed best on all data sets. Correspondingly, Neighbor-Net analyses showed the largest decrease in conflict. Alignment masking improves the signal-to-noise ratio in multiple sequence alignments prior to phylogenetic reconstruction. Given the robust performance of alignment profiling, alignment masking should routinely be used to improve tree reconstructions. Parametric methods of alignment profiling can be easily extended to more complex likelihood-based models of sequence evolution, which opens the possibility of further improvements.
Mantini, Dante; Petrucci, Francesca; Del Boccio, Piero; Pieragostino, Damiana; Di Nicola, Marta; Lugaresi, Alessandra; Federici, Giorgio; Sacchetta, Paolo; Di Ilio, Carmine; Urbani, Andrea
2008-01-01
Independent component analysis (ICA) is a signal processing technique that can be utilized to recover independent signals from a set of their linear mixtures. We propose ICA for the analysis of signals obtained from large proteomics investigations such as clinical multi-subject studies based on MALDI-TOF MS profiling. The method is validated on simulated and experimental data to demonstrate its capability of correctly extracting protein profiles from MALDI-TOF mass spectra. A comparison of peak detection with an open-source and two commercial methods shows its superior reliability in reducing the false discovery rate of protein peak masses. Moreover, the integration of ICA and statistical tests for detecting the differences in peak intensities between experimental groups allows the identification of protein peaks that could be indicators of a diseased state. This data-driven approach proves to be a promising tool for biomarker-discovery studies based on MALDI-TOF MS technology. The MATLAB implementation of the method described in the article and both simulated and experimental data are freely available at http://www.unich.it/proteomica/bioinf/.
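A hedged sketch of applying ICA to a set of spectra treated as linear mixtures of independent source profiles, using scikit-learn's FastICA on synthetic data; the paper's own MATLAB implementation may use a different ICA variant and preprocessing.

# Recover independent source profiles from synthetic mixed MALDI-TOF-like spectra with FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
mz = np.linspace(1000, 10000, 500)
source1 = np.exp(-0.5 * ((mz - 3000) / 50) ** 2)           # synthetic protein peak
source2 = np.exp(-0.5 * ((mz - 7000) / 80) ** 2)
mixing = rng.uniform(0.2, 1.0, size=(12, 2))               # 12 subjects' spectra as linear mixtures
spectra = mixing @ np.vstack([source1, source2]) + 0.01 * rng.standard_normal((12, mz.size))

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(spectra.T)                     # recovered independent profiles along m/z
print("recovered component peak positions (m/z):",
      [int(mz[np.argmax(np.abs(sources[:, k]))]) for k in range(2)])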
Numerical restoration of surface vortices in Nb films measured by a scanning SQUID microscope
NASA Astrophysics Data System (ADS)
Ito, Atsuki; Thanh Huy, Ho; Dang, Vu The; Miyoshi, Hiroki; Hayashi, Masahiko; Ishida, Takekazu
2017-07-01
In the present work, we investigated a vortex profile that appeared on a pure Nb film (500 nm in thickness, 10 mm x 10 mm) by using a scanning SQUID microscope. We found that the local magnetic distribution thus observed is broadened compared to a true vortex profile in the superconducting film. We therefore applied a numerical method to improve the spatial resolution of the scanning SQUID microscope. The method is based on the inverse Biot-Savart law and the Fourier transformation to recover a real-space image. We found that the numerical analyses give a smaller vortex than the raw vortex profile observed by the scanning microscope.
Derivation of the physical parameters for strong and weak flares from the Hα line
NASA Astrophysics Data System (ADS)
Semeida, M. A.; Rashed, M. G.
2016-06-01
The two flares of 19 and 30 July 1999 were observed in the Hα line using the multichannel flare spectrograph (MFS) at the Astronomical Institute in Ondřejov, Czech Republic. We use a modified cloud method, which avoids using the background profile, to fit the Hα line profiles. We obtain the four parameters of the two flares: the source function, the optical thickness at line center, the line-of-sight velocity and the Doppler width. The observed asymmetric profiles have been reproduced by theoretical ones based on our model. The results obtained for the strong and weak flares with the present method are discussed.
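For orientation, the classical cloud-model relation underlying such fits is I(Δλ) = I₀(Δλ)e^(−τ(Δλ)) + S(1 − e^(−τ(Δλ))), with a Gaussian optical-thickness profile set by τ₀, a Doppler shift (line-of-sight velocity) and a Doppler width. The sketch below evaluates this relation with illustrative parameters; it does not reproduce the paper's modification that avoids the background profile.

# Classical cloud-model emergent intensity over a placeholder background Halpha profile.
import numpy as np

def cloud_model(dl, I0, S, tau0, dl_shift, dl_doppler):
    """Emergent intensity for a cloud of source function S over background profile I0(dl)."""
    tau = tau0 * np.exp(-((dl - dl_shift) / dl_doppler) ** 2)
    return I0 * np.exp(-tau) + S * (1.0 - np.exp(-tau))

dl = np.linspace(-1.5, 1.5, 7)                       # wavelength offset from line center (Angstrom)
I0 = 1.0 - 0.8 * np.exp(-(dl / 0.6) ** 2)            # placeholder background Halpha profile
I = cloud_model(dl, I0, S=0.3, tau0=1.2, dl_shift=0.2, dl_doppler=0.45)
print(np.round(I, 3))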
Automated prediction of protein function and detection of functional sites from structure.
Pazos, Florencio; Sternberg, Michael J E
2004-10-12
Current structural genomics projects are yielding structures for proteins whose functions are unknown. Accordingly, there is a pressing requirement for computational methods for function prediction. Here we present PHUNCTIONER, an automatic method for structure-based function prediction using automatically extracted functional sites (residues associated with functions). The method relates proteins with the same function through structural alignments and extracts 3D profiles of conserved residues. Functional features to train the method are extracted from the Gene Ontology (GO) database. The method extracts these features from the entire GO hierarchy and hence is applicable across the whole range of function specificity. 3D profiles associated with 121 GO annotations were extracted. We tested the power of the method both for the prediction of function and for the extraction of functional sites. The success of function prediction by our method was compared with that of the standard homology-based method. In the zone of low sequence similarity (approximately 15%), our method assigns the correct GO annotation in 90% of the protein structures considered, approximately 20% higher than inheritance of function from the closest homologue.
Detecting the Water-soluble Chloride Distribution of Cement Paste in a High-precision Way.
Chang, Honglei; Mu, Song
2017-11-21
To improve the accuracy of the chloride distribution along the depth of cement paste under cyclic wet-dry conditions, a new method is proposed to obtain a high-precision chloride profile. Firstly, paste specimens are molded, cured, and exposed to cyclic wet-dry conditions. Then, powder samples at different specimen depths are grinded when the exposure age is reached. Finally, the water-soluble chloride content is detected using a silver nitrate titration method, and chloride profiles are plotted. The key to improving the accuracy of the chloride distribution along the depth is to exclude the error in the powderization, which is the most critical step for testing the distribution of chloride. Based on the above concept, the grinding method in this protocol can be used to grind powder samples automatically layer by layer from the surface inward, and it should be noted that a very thin grinding thickness (less than 0.5 mm) with a minimum error less than 0.04 mm can be obtained. The chloride profile obtained by this method better reflects the chloride distribution in specimens, which helps researchers to capture the distribution features that are often overlooked. Furthermore, this method can be applied to studies in the field of cement-based materials, which require high chloride distribution accuracy.
Characterisation of group behaviour surface texturing with multi-layers fitting method
NASA Astrophysics Data System (ADS)
Kang, Zhengyang; Fu, Yonghong; Ji, Jinghu; Wang, Hao
2016-07-01
Surface texturing has been widely applied to improve the tribological properties of mechanical components, but studies on the measurement of such textures remain insufficient. This study proposed the multi-layers fitting (MLF) method to characterise dimple-array textured surfaces. Based on the synergistic effect among the dimples, the 3D morphology of the textured surface was rebuilt from 2D stylus profiler data in the MLF method. The feasible regions of texture patterns and the sensitive parameters were confirmed by non-linear programming, and the processing software for the MLF method was developed in Matlab®. The characterisation parameter system for the dimples was defined mathematically, and the accuracy of the MLF method was investigated by a comparison experiment. The surface texture specimens were made by laser surface texturing technology, in which high consistency of dimple size and distribution was achieved. Then, 2D profiles of different dimples were captured with a Hommel-T1000 stylus profiler, and the data were further processed by the MLF software to rebuild the 3D morphology of a single dimple. The experimental results indicated that the MLF characterisation results were similar to those of the Wyko T1100 white-light interference microscope. It was also found that the stability of the MLF characterisation results depended strongly on the number of captured cross-sections.
An efficient solid modeling system based on a hand-held 3D laser scan device
NASA Astrophysics Data System (ADS)
Xiong, Hanwei; Xu, Jun; Xu, Chenxi; Pan, Ming
2014-12-01
Hand-held 3D laser scanners sold on the market are appealing because they are portable and convenient to use, but their price is high. Developing such a system from cheap devices using the same principles as the commercial systems is impossible. In this paper, a simple hand-held 3D laser scanner is developed based on a volume reconstruction method using cheap devices. Unlike conventional laser scanners that collect a point cloud of the object surface, the proposed method scans only a few key profile curves on the surface. A planar section curve network can be generated from these profile curves to construct a volume model of the object. The details of the design are presented and illustrated with the example of a complex-shaped object.
Virtual environment assessment for laser-based vision surface profiling
NASA Astrophysics Data System (ADS)
ElSoussi, Adnane; Al Alami, Abed ElRahman; Abu-Nabah, Bassam A.
2015-03-01
Oil and gas businesses have been raising the demand on original equipment manufacturers (OEMs) to implement a reliable metrology method for assessing surface profiles of welds before and after grinding. This certainly mandates a deviation from the commonly used surface measurement gauges, which are not only operator dependent but also limited to discrete measurements along the weld. Due to their potential accuracy and speed, laser-based vision surface profiling systems have been progressively adopted as part of manufacturing quality control. This effort presents a virtual environment that lends itself to developing and evaluating existing laser vision sensor (LVS) calibration and measurement techniques. A combination of two known calibration techniques is implemented to deliver a calibrated LVS system. System calibration is implemented virtually and experimentally to scan simulated and 3D printed features of known profiles, respectively. Scanned data are inverted and compared with the input profiles to validate the virtual environment's capability for LVS surface profiling and to provide a preliminary assessment of the measurement technique for weld profiling applications. Moreover, this effort brings 3D scanning capability a step closer towards robust quality control applications in a manufacturing environment.
Wan, B; Yarbrough, J W; Schultz, T W
2008-01-01
This study was undertaken to test the hypothesis that structurally similar PAHs induce similar gene expression profiles. THP-1 cells were exposed to a series of 12 selected PAHs at 50 microM for 24 hours and gene expression profiles were analyzed using both unsupervised and supervised methods. Clustering analysis of gene expression profiles revealed that the 12 tested chemicals were grouped into five clusters. Within each cluster, the gene expression profiles are more similar to each other than to the ones outside the cluster. One-methylanthracene and 1-methylfluorene were found to have the most similar profiles; dibenzothiophene and dibenzofuran were found to share common profiles with fluorene. As expression pattern comparisons were expanded, similarity in genomic fingerprint dropped off dramatically. Prediction analysis of microarrays (PAM) based on the clustering pattern generated 49 predictor genes that can be used for sample discrimination. Moreover, a Significance Analysis of Microarrays (SAM) identified 598 genes modulated by the tested chemicals, with a variety of biological processes, such as cell cycle, metabolism, and protein binding, as well as KEGG pathways, being significantly (p < 0.05) affected. It is feasible to distinguish structurally different PAHs based on their genomic fingerprints, which are mechanism based.
Epstein, Leonard H; Finkelstein, Eric A; Katz, David L; Jankowiak, Noelle; Pudlewski, Corrin; Paluch, Rocco A
2016-08-01
The goal of the present study was to apply experimental economic methods in an online supermarket to examine the effects of nutrient profiling, and differential pricing based on the nutrient profile, on the overall diet quality, energy and macronutrients of the foods purchased, and diet cost. Participants were provided nutrient profiling scores or price adjustments based on nutrient profile scores while completing a hypothetical grocery shopping task. Prices of foods in the top 20 % of nutrient profiling scores were reduced (subsidized) by 25 % while those in the bottom 20 % of scores were increased (taxed) by 25 %. We evaluated the independent and interactive effects of nutrient profiling or price adjustments on overall diet quality of foods purchased as assessed by the NuVal® score, energy and macronutrients purchased and diet cost in a 2×2 factorial design. A large (>10 000 food items) online experimental supermarket in the USA. Seven hundred and eighty-one women. Providing nutrient profiling scores improved overall diet quality of foods purchased. Price changes were associated with an increase in protein purchased, an increase in energy cost, and reduced carbohydrate and protein costs. Price changes and nutrient profiling combined were associated with no unique benefits beyond price changes or nutrient profiling alone. Providing nutrient profile score increased overall NuVal® score without a reduction in energy purchased. Combining nutrient profiling and price changes did not show an overall benefit to diet quality and may be less useful than nutrient profiling alone to consumers who want to increase overall diet quality of foods purchased.
Phillips, Jeffrey D.
2018-01-10
PDEPTH is an interactive, graphical computer program used to construct interpreted geological source models for observed potential-field geophysical profile data. The current version of PDEPTH has been adapted to the Windows platform from an earlier DOS-based version. The input total-field magnetic anomaly and vertical gravity anomaly profiles can be filtered to produce derivative products such as reduced-to-pole magnetic profiles, pseudogravity profiles, pseudomagnetic profiles, and upward-or-downward-continued profiles. A variety of source-location methods can be applied to the original and filtered profiles to estimate (and display on a cross section) the locations and physical properties of contacts, sheet edges, horizontal line sources, point sources, and interface surfaces. Two-and-a-half-dimensional source bodies having polygonal cross sections can be constructed using a mouse and keyboard. These bodies can then be adjusted until the calculated gravity and magnetic fields of the source bodies are close to the observed profiles. Auxiliary information such as the topographic surface, bathymetric surface, seismic basement, and geologic contact locations can be displayed on the cross section using optional input files. Test data files, used to demonstrate the source location methods in the report, and several utility programs are included.
Lung tumor diagnosis and subtype discovery by gene expression profiling.
Wang, Lu-yong; Tu, Zhuowen
2006-01-01
The optimal treatment of patients with complex diseases, such as cancers, depends on accurate diagnosis using a combination of clinical and histopathological data. In many scenarios, diagnosis becomes tremendously difficult because of the limitations of clinical presentation and histopathology. To diagnose complex diseases accurately, molecular classification based on gene or protein expression profiles is indispensable for modern medicine. Moreover, many heterogeneous diseases consist of various potential subtypes at the molecular level and differ remarkably in their response to therapies. It is therefore critical to accurately predict subgroups from disease gene expression profiles. More fundamental knowledge of the molecular basis and classification of disease could aid in the prediction of patient outcome, the informed selection of therapies, and the identification of novel molecular targets for therapy. In this paper, we propose a new disease diagnostic method, the probabilistic boosting tree (PB tree) method, on gene expression profiles of lung tumors. It enables accurate disease classification and subtype discovery in disease. It automatically constructs a tree in which each node combines a number of weak classifiers into a strong classifier. Also, subtype discovery is naturally embedded in the learning process. Our algorithm achieves excellent diagnostic performance, and meanwhile it is capable of detecting the disease subtype based on gene expression profile.
Modelling gene expression profiles related to prostate tumor progression using binary states
2013-01-01
Background Cancer is a complex disease commonly characterized by the disrupted activity of several cancer-related genes such as oncogenes and tumor-suppressor genes. Previous studies suggest that the process of tumor progression to malignancy is dynamic and can be traced by changes in gene expression. Despite the enormous efforts made for differential expression detection and biomarker discovery, few methods have been designed to model gene expression levels across tumor stages during malignancy progression. Such models could help us understand the dynamics and simplify or reveal the complexity of tumor progression. Methods We have modeled an on-off state of gene activation per sample and then per stage to select gene expression profiles associated with tumor progression. The selection is guided by the statistical significance of profiles based on randomly permuted datasets. Results We show that our method identifies expected profiles corresponding to oncogenes and tumor suppressor genes in a prostate tumor progression dataset. Comparisons with other methods support our findings and indicate that a considerable proportion of significant profiles is found neither by statistical tests commonly used to detect differential expression between tumor stages nor by other tailored methods. Ontology and pathway analysis concurred with these findings. Conclusions Results suggest that our methodology may be a valuable tool to study tumor malignancy progression, which might reveal novel cancer therapies. PMID:23721350
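The on-off modelling idea lends itself to a small sketch: binarize one gene's expression per sample, summarize the binary state per tumor stage, and judge the resulting stage profile against profiles obtained from randomly permuted stage labels. The threshold, the majority rule per stage and the scoring by exact profile matches are illustrative assumptions rather than the authors' exact procedure.

import numpy as np

rng = np.random.default_rng(1)
stages = np.repeat([0, 1, 2, 3], 10)                 # four tumor stages, 10 samples each
expr = rng.normal(size=stages.size) + 0.6 * stages   # toy expression values for one gene

def stage_profile(values, stage_labels, threshold):
    """Binary on/off state per stage: 'on' if most samples exceed the threshold."""
    on = values > threshold
    return np.array([on[stage_labels == s].mean() > 0.5
                     for s in np.unique(stage_labels)], dtype=int)

observed = stage_profile(expr, stages, threshold=np.median(expr))

# permutation test: how often does a random relabelling reproduce the same profile?
n_perm, hits = 2000, 0
for _ in range(n_perm):
    if np.array_equal(stage_profile(expr, rng.permutation(stages), np.median(expr)), observed):
        hits += 1
print("observed profile:", observed, " empirical p ~", (hits + 1) / (n_perm + 1))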
NASA Astrophysics Data System (ADS)
Hein, Annette; Larsen, Jakob Juul; Parsekian, Andrew D.
2017-02-01
Surface nuclear magnetic resonance (NMR) is a unique geophysical method due to its direct sensitivity to water. A key limitation to overcome is the difficulty of making surface NMR measurements in environments with anthropogenic electromagnetic noise, particularly constant frequency sources such as powerlines. Here we present a method of removing harmonic noise by utilizing frequency domain symmetry of surface NMR signals to reconstruct portions of the spectrum corrupted by frequency-domain noise peaks. This method supplements the existing NMR processing workflow and is applicable after despiking, coherent noise cancellation, and stacking. The symmetry based correction is simple, grounded in mathematical theory describing NMR signals, does not introduce errors into the data set, and requires no prior knowledge about the harmonics. Modelling and field examples show that symmetry based noise removal reduces the effects of harmonics. In one modelling example, symmetry based noise removal improved signal-to-noise ratio in the data by 10 per cent. This improvement had noticeable effects on inversion parameters including water content and the decay constant T2*. Within water content profiles, aquifer boundaries and water content are more accurate after harmonics are removed. Fewer spurious water content spikes appear within aquifers, which is especially useful for resolving multilayered structures. Within T2* profiles, estimates are more accurate after harmonics are removed, especially in the lower half of profiles.
NASA Astrophysics Data System (ADS)
Gu, Myojeong; Enell, Carl-Fredrik; Hendrick, François; Pukite, Janis; Van Roozendael, Michel; Platt, Ulrich; Raffalski, Uwe; Wagner, Thomas
2015-04-01
Stratospheric NO2 not only destroys ozone but also acts as a buffer against halogen-catalyzed ozone loss by converting halogen species into stable nitrates. These two roles of stratospheric NO2 depend on the altitude. Hence, the objective of this study is to investigate the vertical distribution of stratospheric NO2. We compare the NO2 profiles derived from zenith sky DOAS with those obtained from SAOZ balloon measurements and satellite limb observations. Vertical profiles of stratospheric NO2 are retrieved from ground-based zenith sky DOAS observations operated at Kiruna, Sweden (68.84°N, 20.41°E) since 1996. To determine the profile of stratospheric NO2 measured from ground-based zenith sky DOAS, we apply the Optimal Estimation Method (OEM), developed by IASB-BIRA, to the retrieval of vertical profiles of stratospheric NO2. The basic principle behind this profiling approach is the dependence of the mean scattering height on solar zenith angle (SZA). We compare the retrieved profiles to two additional datasets of stratospheric NO2 profiles. The first one is derived from satellite limb observations by SCIAMACHY (Scanning Imaging Absorption spectrometer for Atmospheric CHartographY) on EnviSAT. The second is derived from the SAOZ balloon measurements (using a UV/Visible spectrometer) performed at Kiruna in Sweden.
Unsupervised User Similarity Mining in GSM Sensor Networks
Shad, Shafqat Ali; Chen, Enhong
2013-01-01
Mobility data has attracted researchers for the past few years because of its rich context and spatiotemporal nature, where this information can be used for potential applications like early warning systems, route prediction, traffic management, advertisement, social networking, and community finding. All the mentioned applications are based on mobility profile building and user trend analysis, where mobility profile building is done through significant places extraction, prediction of users' actual movement, and context awareness. However, significant places extraction and prediction of users' actual movement for mobility profile building are not trivial tasks. In this paper, we present a user similarity mining-based methodology through user mobility profile building by using the semantic tagging information provided by the user and basic GSM network architecture properties, based on an unsupervised clustering approach. As the mobility information is in low-level raw form, our proposed methodology successfully converts it to high-level meaningful information by using cell-Id location information rather than previously used location capturing methods like GPS, Infrared, and Wi-Fi for profile mining and user similarity mining. PMID:23576905
Brian C. Dietterick; Russell White; Ryan Hilburn
2012-01-01
Airborne Light Detection and Ranging (LiDAR) holds promise to provide an alternative to traditional ground-based survey methods for stream channel characterization and some change detection purposes, even under challenging landscape conditions. This study compared channel characteristics measured at 53 ground-surveyed and LiDAR-derived cross-sectional profiles located...
The Infant Motor Profile: A Standardized and Qualitative Method to Assess Motor Behaviour in Infancy
ERIC Educational Resources Information Center
Heineman, Kirsten R.; Bos, Arend F.; Hadders-Algra, Mijna
2008-01-01
A reliable and valid instrument to assess neuromotor condition in infancy is a prerequisite for early detection of developmental motor disorders. We developed a video-based assessment of motor behaviour, the Infant Motor Profile (IMP), to evaluate motor abilities, movement variability, ability to select motor strategies, movement symmetry, and…
ERIC Educational Resources Information Center
Vannucci, Anna; Tanofsky-Kraff, Marian; Crosby, Ross D.; Ranzenhofer, Lisa M.; Shomaker, Lauren B.; Field, Sara E.; Mooreville, Mira; Reina, Samantha A.; Kozlosky, Merel; Yanovski, Susan Z.; Yanovski, Jack A.
2013-01-01
Objective: We used latent profile analysis (LPA) to classify children and adolescents into subtypes based on the overlap of disinhibited eating behaviors--eating in the absence of hunger, emotional eating, and subjective and objective binge eating. Method: Participants were 411 youths (8-18 years) from the community who reported on their…
A high-throughput assay for the comprehensive profiling of DNA ligase fidelity
Lohman, Gregory J. S.; Bauer, Robert J.; Nichols, Nicole M.; Mazzola, Laurie; Bybee, Joanna; Rivizzigno, Danielle; Cantin, Elizabeth; Evans, Thomas C.
2016-01-01
DNA ligases have broad application in molecular biology, from traditional cloning methods to modern synthetic biology and molecular diagnostics protocols. Ligation-based detection of polynucleotide sequences can be achieved by the ligation of probe oligonucleotides when annealed to a complementary target sequence. In order to achieve a high sensitivity and low background, the ligase must efficiently join correctly base-paired substrates, while discriminating against the ligation of substrates containing even one mismatched base pair. In the current study, we report the use of capillary electrophoresis to rapidly generate mismatch fidelity profiles that interrogate all 256 possible base-pair combinations at a ligation junction in a single experiment. Rapid screening of ligase fidelity in a 96-well plate format has allowed the study of ligase fidelity in unprecedented depth. As an example of this new method, herein we report the ligation fidelity of Thermus thermophilus DNA ligase at a range of temperatures, buffer pH and monovalent cation strength. This screen allows the selection of reaction conditions that maximize fidelity without sacrificing activity, while generating a profile of specific mismatches that ligate detectably under each set of conditions. PMID:26365241
A study on the application of Fourier series in IMRT treatment planning.
Almeida-Trinidad, R; Garnica-Garza, H M
2007-12-01
In intensity-modulated radiotherapy, a set of x-ray fluence profiles is iteratively adjusted until a desired absorbed dose distribution is obtained. The purpose of this article is to present a method that allows the optimization of fluence profiles based on the Fourier series decomposition of an initial approximation to the profile. The method has the advantage that a new fluence profile can be obtained in a precise and controlled way with the tuning of only two parameters, namely the phase of the sine and cosine terms of one of the Fourier components, in contrast to the point-by-point tuning of the profile. Also, because the method uses analytical functions, the resultant profiles do not exhibit numerical artifacts. A test case consisting of a mathematical phantom with a target wrapped around a critical structure is discussed to illustrate the algorithm. It is shown that the degree of conformality of the absorbed dose distribution can be tailored by varying the number of Fourier terms made available to the optimization algorithm. For the test case discussed here, it is shown that the number of Fourier terms to be modified depends on the number of radiation beams incident on the target but is generally on the order of 10 terms.
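The low-parameter tuning described above can be sketched as follows: project an initial fluence profile onto a truncated Fourier basis, then regenerate the profile after rescaling the amplitude and shifting the phase of a single component. The starting profile, the number of terms and the component being tuned are illustrative assumptions, not the article's test case.

import numpy as np

x = np.linspace(0.0, 1.0, 200)                          # normalized field position
initial = np.clip(1.0 - 4.0 * (x - 0.5) ** 2, 0, None)  # toy starting fluence profile

n_terms = 10
k = np.arange(1, n_terms + 1)
a0 = initial.mean()
a = np.array([2 * np.mean(initial * np.cos(2 * np.pi * m * x)) for m in k])
b = np.array([2 * np.mean(initial * np.sin(2 * np.pi * m * x)) for m in k])

def retuned_profile(a, b, term=3, amp_scale=1.0, phase_shift=0.0):
    """Rebuild the fluence after adjusting the amplitude and phase of one component."""
    aa, bb = a.copy(), b.copy()
    r = np.hypot(aa[term], bb[term]) * amp_scale
    phi = np.arctan2(bb[term], aa[term]) + phase_shift
    aa[term], bb[term] = r * np.cos(phi), r * np.sin(phi)
    series = a0 + sum(aa[m - 1] * np.cos(2 * np.pi * m * x)
                      + bb[m - 1] * np.sin(2 * np.pi * m * x) for m in k)
    return np.clip(series, 0, None)                     # fluence cannot be negative

new_fluence = retuned_profile(a, b, amp_scale=1.2, phase_shift=0.3)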
NASA Astrophysics Data System (ADS)
Kalvin, Alan D.
2002-06-01
The importance of using perceptual colormaps for visualizing numerical data is well established in the fields of scientific visualization, computer graphics and color science and related areas of research. In practice however, the use of perceptual colormaps tends to be the exception rather than the rule. In general it is difficult for end-users to find suitable colormaps. In addition, even when such colormaps are available, the inherent variability in color reproduction among computer displays makes it very difficult for the users to verify that these colormaps do indeed preserve their perceptual characteristics when used on different displays. Generally, verification requires display profiling (evaluating the display's color reproduction characteristics), using a colorimeter or a similar type of measuring device. With the growth of the Internet, and the resulting proliferation of remote, client-based displays, the profiling problem has become even more difficult, and in many cases, impossible. We present a method for enumerating and generating perceptual colormaps in a way that ensures that the perceptual characteristics of the colormaps are maintained over a wide range of different displays. This method constructs colormaps that are guaranteed to be 'perceptually correct' for a given display by using whatever partial profile information of the display is available. We use the term 'graduated profiling' to describe this method of partial profiling.
HMM-ModE: implementation, benchmarking and validation with HMMER3
2014-01-01
Background HMM-ModE is a computational method that generates family specific profile HMMs using negative training sequences. The method optimizes the discrimination threshold using 10 fold cross validation and modifies the emission probabilities of profiles to reduce common fold based signals shared with other sub-families. The protocol depends on the program HMMER for HMM profile building and sequence database searching. The recent release of HMMER3 has improved database search speed by several orders of magnitude, allowing for the large scale deployment of the method in sequence annotation projects. We have rewritten our existing scripts both at the level of parsing the HMM profiles and modifying emission probabilities to upgrade HMM-ModE using HMMER3 that takes advantage of its probabilistic inference with high computational speed. The method is benchmarked and tested on a GPCR dataset as an accurate and fast method for functional annotation. Results The implementation of this method, which now works with HMMER3, is benchmarked with the earlier version of HMMER, to show that the effect of local-local alignments is marked only in the case of profiles containing a large number of discontinuous match states. The method is tested on a gold standard set of families and we have reported a significant reduction in the number of false positive hits over the default HMM profiles. When implemented on GPCR sequences, the results showed an improvement in the accuracy of classification compared with other methods used to classify the family at different levels of their classification hierarchy. Conclusions The present findings show that the new version of HMM-ModE is a highly specific method used to differentiate between fold (superfamily) and function (family) specific signals, which helps in the functional annotation of protein sequences. The use of modified profile HMMs of GPCR sequences provides a simple yet highly specific method for classification of the family, being able to predict the sub-family specific sequences with high accuracy even though sequences share common physicochemical characteristics between sub-families. PMID:25073805
Abbiati, Milena; Baroffio, Anne; Gerbase, Margaret W.
2016-01-01
Introduction A consistent body of literature highlights the importance of a broader approach to selecting medical school candidates, assessing both cognitive capacity and individual characteristics. However, selection in a great number of medical schools worldwide is still based on knowledge exams, a procedure that might neglect students with needed personal characteristics for future medical practice. We investigated whether the personal profile of students selected through a knowledge-based exam differed from those not selected. Methods Students applying for medical school (N=311) completed questionnaires assessing motivations for becoming a doctor, learning approaches, personality traits, empathy, and coping styles. Selection was based on the results of MCQ tests. Principal component analysis was used to draw a profile of the students. Differences between selected and non-selected students were examined by multivariate ANOVAs, and their impact on selection by logistic regression analysis. Results Students demonstrating a profile of diligence with higher conscientiousness, deep learning approach, and task-focused coping were more frequently selected (p=0.01). Other personal characteristics such as motivation, sociability, and empathy did not significantly differ between selected and non-selected students. Conclusion Selection through a knowledge-based exam privileged diligent students. It neither advantaged nor precluded candidates with a more humane profile. PMID:27079886
Grudinin, Sergei; Garkavenko, Maria; Kazennov, Andrei
2017-05-01
A new method called Pepsi-SAXS is presented that calculates small-angle X-ray scattering profiles from atomistic models. The method is based on the multipole expansion scheme and is significantly faster compared with other tested methods. In particular, using the Nyquist-Shannon-Kotelnikov sampling theorem, the multipole expansion order is adapted to the size of the model and the resolution of the experimental data. It is argued that by using the adaptive expansion order, this method has the same quadratic dependence on the number of atoms in the model as the Debye-based approach, but with a much smaller prefactor in the computational complexity. The method has been systematically validated on a large set of over 50 models collected from the BioIsis and SASBDB databases. Using a laptop, it was demonstrated that Pepsi-SAXS is about seven, 29 and 36 times faster compared with CRYSOL, FoXS and the three-dimensional Zernike method in SAStbx, respectively, when tested on data from the BioIsis database, and is about five, 21 and 25 times faster compared with CRYSOL, FoXS and SAStbx, respectively, when tested on data from SASBDB. On average, Pepsi-SAXS demonstrates comparable accuracy in terms of χ2 to CRYSOL and FoXS when tested on BioIsis and SASBDB profiles. Together with a small allowed variation of adjustable parameters, this demonstrates the effectiveness of the method. Pepsi-SAXS is available at http://team.inria.fr/nano-d/software/pepsi-saxs.
Burki, Umar; Straub, Volker
2017-01-01
Determining the concentration of oligonucleotide in biological samples such as tissue lysate and serum is essential for determining the biodistribution and pharmacokinetic profile, respectively. ELISA-based assays have shown far greater sensitivities compared to other methods such as HPLC and LC/MS. Here, we describe a novel ultrasensitive hybridization-based ELISA method for quantitating morpholino oligonucleotides in mouse tissue lysate and serum samples. The assay has a linear detection range of 5-250 pM (R2 > 0.99).
A chemogenomic analysis of the human proteome: application to enzyme families.
Bernasconi, Paul; Chen, Min; Galasinski, Scott; Popa-Burke, Ioana; Bobasheva, Anna; Coudurier, Louis; Birkos, Steve; Hallam, Rhonda; Janzen, William P
2007-10-01
Sequence-based phylogenies (SBP) are well-established tools for describing relationships between proteins. They have been used extensively to predict the behavior and sensitivity toward inhibitors of enzymes within a family. The utility of this approach diminishes when comparing proteins with little sequence homology. Even within an enzyme family, SBPs must be complemented by an orthogonal method that is independent of sequence to better predict enzymatic behavior. A chemogenomic approach is demonstrated here that uses the inhibition profile of a 130,000 diverse molecule library to uncover relationships within a set of enzymes. The profile is used to construct a semimetric additive distance matrix. This matrix, in turn, defines a sequence-independent phylogeny (SIP). The method was applied to 97 enzymes (kinases, proteases, and phosphatases). SIP does not use structural information from the molecules used for establishing the profile, thus providing a more heuristic method than the current approaches, which require knowledge of the specific inhibitor's structure. Within enzyme families, SIP shows a good overall correlation with SBP. More interestingly, SIP uncovers distances within families that are not recognizable by sequence-based methods. In addition, SIP allows the determination of distance between enzymes with no sequence homology, thus uncovering novel relationships not predicted by SBP. This chemogenomic approach, used in conjunction with SBP, should prove to be a powerful tool for choosing target combinations for drug discovery programs as well as for guiding the selection of profiling and liability targets.
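A minimal sketch of the sequence-independent phylogeny (SIP) step, under simplifying assumptions: treat each enzyme's inhibition profile across the compound library as a numeric vector, compute a pairwise distance matrix, and group the enzymes hierarchically. The enzyme names, profile values and the Euclidean distance are illustrative stand-ins for the semimetric additive distance used in the study.

import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage

rng = np.random.default_rng(2)
enzymes = ["kinase_A", "kinase_B", "protease_C", "phosphatase_D"]
# rows = enzymes, columns = compounds; entries = % inhibition in the screen (toy data)
inhibition = rng.uniform(0, 100, size=(len(enzymes), 1000))

dist = pdist(inhibition, metric="euclidean")   # stand-in for the study's distance measure
tree = linkage(dist, method="average")         # UPGMA-style grouping into a phylogeny
print(squareform(dist).round(1))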
NASA Astrophysics Data System (ADS)
Gierens, Rosa T.; Henriksson, Svante; Josipovic, Micky; Vakkari, Ville; van Zyl, Pieter G.; Beukes, Johan P.; Wood, Curtis R.; O'Connor, Ewan J.
2018-05-01
The atmospheric boundary layer (BL) is the atmospheric layer coupled to the Earth's surface at relatively short timescales. A key quantity is the BL depth, which is important in many applied areas of weather and climate such as air-quality forecasting. Studying BLs in climates and biomes across the globe is important, particularly in the under-sampled southern hemisphere. The present study is based on a grazed grassland-savannah area in northwestern South Africa during October 2012-August 2014. Ceilometers are probably the cheapest method for measuring continuous aerosol profiles up to several kilometers above ground and are thus an ideal tool for long-term studies of BLs. A ceilometer-estimated BL depth is based on profiles of attenuated backscattering coefficients from atmospheric aerosols; the sharpest drop often occurs at BL top. Based on this, we developed a new method for layer detection that we call the signal-limited layer method. The new algorithm was applied to ceilometer profiles which thus classified BL into classic regime types: daytime convective mixing, and a double layer at night of surface-based stable with a residual layer above it. We employed wavelet fitting to increase successful BL estimation for noisy profiles. The layer-detection algorithm was supported by an eddy-flux station, rain gauges, and manual inspection. Diurnal cycles were often clear, with BL depth detected for 50% of the daytime typically being 1-3 km, and for 80% of the night-time typically being a few hundred meters. Variability was also analyzed with respect to seasons and years. Finally, BL depths were compared with ERA-Interim estimates of BL depth to show reassuring agreement.
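The gradient idea behind ceilometer-based layer detection can be sketched simply: the height at which the attenuated backscatter decreases most sharply is taken as a first estimate of the boundary-layer top. The synthetic profile below is for illustration; the signal-limited layer method adds wavelet fitting, noise handling and regime classification that are not shown here.

import numpy as np

z = np.arange(0.0, 4000.0, 10.0)                         # height above ground (m)
backscatter = 1.0 / (1.0 + np.exp((z - 1500.0) / 80.0))  # toy profile: sharp drop near 1.5 km
backscatter += np.random.default_rng(3).normal(0.0, 0.01, z.size)

gradient = np.gradient(backscatter, z)
bl_top = z[np.argmin(gradient)]                          # most negative vertical gradient
print(f"estimated boundary-layer depth ~ {bl_top:.0f} m")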
Reconstruction of SAXS Profiles from Protein Structures
Putnam, Daniel K.; Lowe, Edward W.
2013-01-01
Small angle X-ray scattering (SAXS) is used for low resolution structural characterization of proteins often in combination with other experimental techniques. After briefly reviewing the theory of SAXS we discuss computational methods based on 1) the Debye equation and 2) Spherical Harmonics to compute intensity profiles from a particular macromolecular structure. Further, we review how these formulas are parameterized for solvent density and hydration shell adjustment. Finally we introduce our solution to compute SAXS profiles utilizing GPU acceleration. PMID:24688746
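The Debye-equation approach mentioned above computes the scattering intensity as I(q) = sum_i sum_j f_i f_j sin(q r_ij)/(q r_ij) over all atom pairs. A brute-force sketch is given below; the coordinates and constant form factors are toy values, and the solvent and hydration-shell corrections discussed in the review are omitted.

import numpy as np

def debye_intensity(q_values, coords, form_factors):
    """Brute-force Debye sum, O(N^2) per q value."""
    diff = coords[:, None, :] - coords[None, :, :]
    r = np.linalg.norm(diff, axis=-1)                    # pairwise distances r_ij
    ff = np.outer(form_factors, form_factors)
    intensities = []
    for q in q_values:
        x = q * r
        sinc = np.where(x > 0, np.sin(x) / np.where(x > 0, x, 1.0), 1.0)
        intensities.append(np.sum(ff * sinc))
    return np.array(intensities)

rng = np.random.default_rng(4)
coords = rng.normal(scale=10.0, size=(50, 3))            # 50 dummy atom positions (angstrom)
f = np.full(50, 6.0)                                     # constant form factor for illustration
q = np.linspace(0.01, 0.5, 50)                           # momentum transfer (1/angstrom)
profile = debye_intensity(q, coords, f)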
Antimicrobial breakpoint estimation accounting for variability in pharmacokinetics.
Bi, Goue Denis Gohore; Li, Jun; Nekka, Fahima
2009-06-26
Pharmacokinetic and pharmacodynamic (PK/PD) indices are increasingly being used in the microbiological field to assess the efficacy of a dosing regimen. In contrast to methods using MIC, PK/PD-based methods reflect in vivo conditions and are more predictive of efficacy. Unfortunately, they entail the use of one PK-derived value such as AUC or Cmax and may thus lead to biased efficiency information when the variability is large. The aim of the present work was to evaluate the efficacy of a treatment by adjusting classical breakpoint estimation methods to the situation of variable PK profiles. We propose a logical generalisation of the usual AUC methods by introducing the concept of "efficiency" for a PK profile, which involves the efficacy function as a weight. We formulated these methods for both classes of concentration- and time-dependent antibiotics. Using drug models and in silico approaches, we provide a theoretical basis for characterizing the efficiency of a PK profile under in vivo conditions. We also used the particular case of variable drug intake to assess the effect of the variable PK profiles generated and to analyse the implications for breakpoint estimation. Compared to traditional methods, our weighted AUC approach gives a more powerful PK/PD link and reveals, through examples, interesting issues about the uniqueness of therapeutic outcome indices and antibiotic resistance problems.
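The "efficiency" idea can be sketched numerically: weight a concentration-time profile C(t) by an efficacy function E(C) before integrating, instead of using the plain AUC. The one-compartment PK model and the Hill-type weight below are assumptions made for illustration, not the authors' exact formulation.

import numpy as np

def concentration(t, dose=500.0, ke=0.2, vd=30.0):
    """Toy one-compartment IV bolus profile (mg/L)."""
    return dose / vd * np.exp(-ke * t)

def efficacy(c, ec50=5.0, hill=2.0):
    """Hill-type efficacy weight in [0, 1]."""
    return c ** hill / (c ** hill + ec50 ** hill)

t = np.linspace(0.0, 24.0, 1000)                 # hours
c = concentration(t)
auc = np.trapz(c, t)                             # classical AUC
efficiency = np.trapz(efficacy(c) * c, t)        # efficacy-weighted AUC ("efficiency")
print(f"AUC = {auc:.1f}, efficacy-weighted AUC = {efficiency:.1f}")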
Rapin, Nicolas; Bagger, Frederik Otzen; Jendholm, Johan; Mora-Jensen, Helena; Krogh, Anders; Kohlmann, Alexander; Thiede, Christian; Borregaard, Niels; Bullinger, Lars; Winther, Ole; Theilgaard-Mönch, Kim; Porse, Bo T
2014-02-06
Gene expression profiling has been used extensively to characterize cancer, identify novel subtypes, and improve patient stratification. However, it has largely failed to identify transcriptional programs that differ between cancer and corresponding normal cells and has not been efficient in identifying expression changes fundamental to disease etiology. Here we present a method that facilitates the comparison of any cancer sample to its nearest normal cellular counterpart, using acute myeloid leukemia (AML) as a model. We first generated a gene expression-based landscape of the normal hematopoietic hierarchy, using expression profiles from normal stem/progenitor cells, and next mapped the AML patient samples to this landscape. This allowed us to identify the closest normal counterpart of individual AML samples and determine gene expression changes between cancer and normal. We find the cancer vs normal method (CvN method) to be superior to conventional methods in stratifying AML patients with aberrant karyotype and in identifying common aberrant transcriptional programs with potential importance for AML etiology. Moreover, the CvN method uncovered a novel poor-outcome subtype of normal-karyotype AML, which allowed for the generation of a highly prognostic survival signature. Collectively, our CvN method holds great potential as a tool for the analysis of gene expression profiles of cancer patients.
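The matching step at the heart of the cancer-vs-normal comparison can be sketched as a nearest-neighbour search: each cancer sample is compared against a panel of normal stem/progenitor profiles and paired with the one it correlates with best, and expression changes are then taken relative to that counterpart. The reference names, profile values and the use of Pearson correlation below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(5)
normal_refs = {name: rng.normal(size=2000)
               for name in ["HSC", "CMP", "GMP", "MEP", "monocyte"]}
aml_sample = rng.normal(size=2000)

def nearest_normal(sample, references):
    """Return the reference profile with the highest Pearson correlation to the sample."""
    scores = {name: np.corrcoef(sample, ref)[0, 1] for name, ref in references.items()}
    return max(scores, key=scores.get), scores

best, scores = nearest_normal(aml_sample, normal_refs)
changes = aml_sample - normal_refs[best]         # cancer vs nearest-normal differences
print("closest normal counterpart:", best)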
Performance evaluation of DNA copy number segmentation methods.
Pierre-Jean, Morgane; Rigaill, Guillem; Neuvial, Pierre
2015-07-01
A number of bioinformatic or biostatistical methods are available for analyzing DNA copy number profiles measured from microarray or sequencing technologies. In the absence of rich enough gold standard data sets, the performance of these methods is generally assessed using unrealistic simulation studies, or based on small real data analyses. To make an objective and reproducible performance assessment, we have designed and implemented a framework to generate realistic DNA copy number profiles of cancer samples with known truth. These profiles are generated by resampling publicly available SNP microarray data from genomic regions with known copy-number state. The original data have been extracted from dilution series of tumor cell lines with matched blood samples at several concentrations. Therefore, the signal-to-noise ratio of the generated profiles can be controlled through the (known) percentage of tumor cells in the sample. This article describes this framework and its application to a comparison study between methods for segmenting DNA copy number profiles from SNP microarrays. This study indicates that no single method is uniformly better than all others. It also helps identify pros and cons of the compared methods as a function of biologically informative parameters, such as the fraction of tumor cells in the sample and the proportion of heterozygous markers. This comparison study may be reproduced using the open source and cross-platform R package jointseg, which implements the proposed data generation and evaluation framework: http://r-forge.r-project.org/R/?group_id=1562. © The Author 2014. Published by Oxford University Press.
432-μm laser's beam-waist measurement for the polarimeter/interferometer on the EAST tokamak
NASA Astrophysics Data System (ADS)
Wang, Z. X.; Liu, H. Q.; Jie, Y. X.; Wu, M. Q.; Lan, T.; Zhu, X.; Zou, Z. Y.; Yang, Y.; Wei, X. C.; Zeng, L.; Li, G. S.; Gao, X.
2014-10-01
A far-infrared (FIR) polarimeter/interferometer (PI) system is under development for measurements of the current-density and electron-density profiles in the EAST tokamak. The system will utilize three identical 432-μm HCOOH lasers pumped by a CO2 laser. Measuring the laser beam's waist size and position is a fundamental preparatory task. This paper introduces three methods based on a beam profiler and several focusing optical elements. The beam profiler can be used to show the spatial energy distribution of the laser beam. The active area of the profiler is 12.4 × 12.4 mm2. Some focusing optical elements are needed to focus the beam in order for the beam profiler to receive the entire laser beam. Two principles and three methods are used in the measurement. The first and the third methods are based on the same principle, while the second method adopts another principle. Although the first method is a special form of the third and can only give the size of the beam waist, its fast and convenient measurement makes it essential to the development of the experiment, and it can provide guidance for choosing the sizes of the optical elements in the next step. A concave mirror, a high-density polyethylene (HDPE) lens and a polymethylpentene (TPX) lens are each used in the measurement process. The results of these methods are close enough for the design of the PI system's optical path.
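A sketch of the waist determination common to such measurements: the beam radius measured at several positions along the axis is fitted to the Gaussian-beam relation w(z) = w0 * sqrt(1 + ((z - z0)/zR)^2), which yields the waist size w0 and its position z0. The measurement values below are illustrative, not data from the EAST system.

import numpy as np
from scipy.optimize import curve_fit

wavelength = 432e-6                                      # 432 um expressed in meters

def beam_radius(z, w0, z0):
    zr = np.pi * w0 ** 2 / wavelength                    # Rayleigh range
    return w0 * np.sqrt(1.0 + ((z - z0) / zr) ** 2)

# toy measurements: axial position (m) and measured 1/e^2 beam radius (m)
z = np.array([0.0, 0.1, 0.2, 0.25, 0.3, 0.4, 0.5])
w = np.array([11.9, 7.6, 3.9, 3.0, 3.8, 7.4, 12.0]) * 1e-3

(w0_fit, z0_fit), _ = curve_fit(beam_radius, z, w, p0=[3e-3, 0.25])
print(f"waist w0 ~ {w0_fit * 1e3:.2f} mm at z0 ~ {z0_fit:.2f} m")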
Zhou, Wei; Song, Xiang-gang; Chen, Chao; Wang, Shu-mei; Liang, Sheng-wang
2015-08-01
Action mechanism and material base of compound Danshen dripping pills in the treatment of carotid atherosclerosis were discussed based on gene expression profiles and molecular fingerprints in this paper. First, gene expression profiles of atherosclerotic carotid artery tissues and histologically normal tissues in the human body were collected and screened using significance analysis of microarrays (SAM) to identify differentially expressed genes; the differential genes were then analyzed by Gene Ontology (GO) analysis and KEGG pathway analysis. To avoid missing genes with modest differential expression but biological importance, Gene Set Enrichment Analysis (GSEA) was performed, and 7 chemical ingredients with higher negative enrichment scores were obtained by the Cmap method, implying that they could reversely regulate the gene expression profiles of pathological tissues. Last, based on the hypothesis that similar structures have similar activities, 336 ingredients of compound Danshen dripping pills were compared with the 7 drug molecules using the 2D molecular fingerprint method. The results showed that 147 differential genes, including 60 up-regulated genes and 87 down-regulated genes, were screened out by SAM. In the GO analysis, Biological Process (BP) is mainly concerned with biological adhesion, response to wounding and inflammatory response; Cellular Component (CC) is mainly concerned with extracellular region, extracellular space and plasma membrane; while Molecular Function (MF) is mainly concerned with antigen binding, metalloendopeptidase activity and peptide binding. KEGG pathway analysis is mainly concerned with the JAK-STAT, RIG-I-like receptor and PPAR signaling pathways. There were 10 compounds, such as hexadecane, with Tanimoto coefficients greater than 0.85, which implied that they may be the active ingredients (AIs) of compound Danshen dripping pills in the treatment of carotid atherosclerosis (CAs). The present method can be applied to research on the material base and molecular action mechanism of TCM.
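The final similarity screen can be sketched with a plain Tanimoto comparison: each candidate ingredient's binary 2D fingerprint is compared against the fingerprint of a Cmap hit, and pairs above 0.85 are kept. The random bit vectors below are placeholders; in practice the fingerprints would come from a cheminformatics toolkit.

import numpy as np

def tanimoto(a, b):
    """Tanimoto similarity between two binary fingerprints."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

rng = np.random.default_rng(6)
reference = rng.integers(0, 2, 1024)                       # fingerprint of one Cmap hit
ingredients = {f"ingredient_{i}": rng.integers(0, 2, 1024) for i in range(336)}

matches = {}
for name, fingerprint in ingredients.items():
    score = tanimoto(reference, fingerprint)
    if score > 0.85:                                       # threshold used in the study
        matches[name] = score
print(len(matches), "ingredients above the 0.85 threshold")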
Handwriting Examination: Moving from Art to Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarman, K.H.; Hanlen, R.C.; Manzolillo, P.A.
In this document, we present a method for validating the premises and methodology of forensic handwriting examination. This method is intuitively appealing because it relies on quantitative measurements currently used qualitatively by FDEs in making comparisons, and it is scientifically rigorous because it exploits the power of multivariate statistical analysis. This approach uses measures of both central tendency and variation to construct a profile for a given individual. (Central tendency and variation are important for characterizing an individual's writing and both are currently used by FDEs in comparative analyses). Once constructed, different profiles are then compared for individuality using cluster analysis; they are grouped so that profiles within a group cannot be differentiated from one another based on the measured characteristics, whereas profiles between groups can. The cluster analysis procedure used here exploits the power of multivariate hypothesis testing. The result is not only a profile grouping but also an indication of the statistical significance of the groups generated.
High-resolution mapping of transcription factor binding sites on native chromatin
Kasinathan, Sivakanthan; Orsi, Guillermo A.; Zentner, Gabriel E.; Ahmad, Kami; Henikoff, Steven
2014-01-01
Sequence-specific DNA-binding proteins including transcription factors (TFs) are key determinants of gene regulation and chromatin architecture. Formaldehyde cross-linking and sonication followed by Chromatin ImmunoPrecipitation (X-ChIP) is widely used for profiling of TF binding, but is limited by low resolution and poor specificity and sensitivity. We present a simple protocol that starts with micrococcal nuclease-digested uncross-linked chromatin and is followed by affinity purification of TFs and paired-end sequencing. The resulting ORGANIC (Occupied Regions of Genomes from Affinity-purified Naturally Isolated Chromatin) profiles of Saccharomyces cerevisiae Abf1 and Reb1 provide highly accurate base-pair resolution maps that are not biased toward accessible chromatin, and do not require input normalization. We also demonstrate the high specificity of our method when applied to larger genomes by profiling Drosophila melanogaster GAGA Factor and Pipsqueak. Our results suggest that ORGANIC profiling is a widely applicable high-resolution method for sensitive and specific profiling of direct protein-DNA interactions. PMID:24336359
Cowan, Lauren S.; Diem, Lois; Brake, Mary Catherine; Crawford, Jack T.
2004-01-01
Spoligotyping using Luminex technology was shown to be a highly reproducible method suitable for high-throughput analysis. Spoligotyping of 48 isolates using the traditional membrane-based assay and the Luminex assay yielded concordant results for all isolates. The Luminex platform provides greater flexibility and cost effectiveness than the membrane-based assay. PMID:14715809
Doppler Lidar Measurements of Tropospheric Wind Profiles Using the Aerosol Double Edge Technique
NASA Technical Reports Server (NTRS)
Gentry, Bruce M.; Li, Steven X.; Mathur, Savyasachee; Korb, C. Laurence; Chen, Huailin
2000-01-01
The development of a ground based direct detection Doppler lidar based on the recently described aerosol double edge technique is reported. A pulsed, injection seeded Nd:YAG laser operating at 1064 nm is used to make range resolved measurements of atmospheric winds in the free troposphere. The wind measurements are determined by measuring the Doppler shift of the laser signal backscattered from atmospheric aerosols. The lidar instrument and double edge method are described and initial tropospheric wind profile measurements are presented. Wind profiles are reported for both day and night operation. The measurements extend to altitudes as high as 14 km and are compared to rawinsonde wind profile data from Dulles airport in Virginia. Vertical resolution of the lidar measurements is 330 m and the rms precision of the measurements is as low as 0.6 m/s.
Continuous Water Vapor Profiles from Operational Ground-Based Active and Passive Remote Sensors
NASA Technical Reports Server (NTRS)
Turner, D. D.; Feltz, W. F.; Ferrare, R. A.
2000-01-01
The Atmospheric Radiation Measurement program's Southern Great Plains Cloud and Radiation Testbed site central facility near Lamont, Oklahoma, offers unique operational water vapor profiling capabilities, including active and passive remote sensors as well as traditional in situ radiosonde measurements. Remote sensing technologies include an automated Raman lidar and an automated Atmospheric Emitted Radiance Interferometer (AERI), which are able to retrieve water vapor profiles operationally through the lower troposphere throughout the diurnal cycle. Comparisons of these two water vapor remote sensing methods to each other and to radiosondes over an 8-month period are presented and discussed, highlighting the accuracy and limitations of each method. Additionally, the AERI is able to retrieve profiles of temperature while the Raman lidar is able to retrieve aerosol extinction profiles operationally. These data, coupled with hourly wind profiles from a 915-MHz wind profiler, provide complete specification of the state of the atmosphere in noncloudy skies. Several case studies illustrate the utility of these high temporal resolution measurements in the characterization of mesoscale features within a 3-day time period in which passage of a dryline, warm air advection, and cold front occurred.
WE-AB-207A-12: HLCC Based Quantitative Evaluation Method of Image Artifact in Dental CBCT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Y; Wu, S; Qi, H
Purpose: Image artifacts are usually evaluated qualitatively via visual observation of the reconstructed images, which is susceptible to subjective factors due to the lack of an objective evaluation criterion. In this work, we propose a Helgason-Ludwig consistency condition (HLCC) based evaluation method to quantify the severity level of different image artifacts in dental CBCT. Methods: Our evaluation method consists of four steps: 1) Acquire cone beam CT (CBCT) projection; 2) Convert 3D CBCT projection to fan-beam projection by extracting its central plane projection; 3) Convert fan-beam projection to parallel-beam projection utilizing a sinogram-based rebinning algorithm or a detail-based rebinning algorithm; 4) Obtain the HLCC profile by integrating the parallel-beam projection per view and calculate the wave percentage and variance of the HLCC profile, which can be used to describe the severity level of image artifacts. Results: Several sets of dental CBCT projections containing only one type of artifact (i.e. geometry, scatter, beam hardening, lag and noise artifact) were simulated using gDRR, a GPU tool developed for efficient, accurate, and realistic simulation of CBCT projections. These simulated CBCT projections were used to test our proposed method. HLCC profile wave percentage and variance induced by geometry distortion are about 3∼21 times and 16∼393 times as large as those of the artifact-free projection, respectively. The increase factors of wave percentage and variance are 6 and 133 times for beam hardening, 19 and 1184 times for scatter, and 4 and 16 times for lag artifacts, respectively. In contrast, for the noisy projection the wave percentage, variance and inconsistency level are almost the same as those of the noise-free one. Conclusion: We have proposed a quantitative evaluation method of image artifact based on HLCC theory. According to our simulation results, the severity of different artifact types is found to be in the following order: Scatter > Geometry > Beam hardening > Lag > Noise > Artifact-free in dental CBCT.
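The zeroth-order consistency check behind the HLCC profile can be sketched directly: for ideal parallel-beam data, the integral of each projection over the detector is identical for every view, so the per-view integrals form a flat profile whose ripple and variance grow with artifact severity. The synthetic sinogram and the particular definition of the wave percentage below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(7)
n_views, n_bins = 360, 512
sinogram = np.ones((n_views, n_bins))                    # ideal, perfectly consistent data
sinogram += 0.02 * np.sin(np.linspace(0, 4 * np.pi, n_views))[:, None]  # toy inconsistency

per_view_sum = sinogram.sum(axis=1)                      # HLCC profile: one value per view
wave_percentage = (per_view_sum.max() - per_view_sum.min()) / per_view_sum.mean() * 100
variance = per_view_sum.var()
print(f"wave percentage = {wave_percentage:.2f} %, variance = {variance:.2f}")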
Determining protein function and interaction from genome analysis
Eisenberg, David; Marcotte, Edward M.; Thompson, Michael J.; Pellegrini, Matteo; Yeates, Todd O.
2004-08-03
A computational method, system, and computer program are provided for inferring functional links from genome sequences. One method is based on the observation that some pairs of proteins A' and B' have homologs in another organism fused into a single protein chain AB. A trans-genome comparison of sequences can reveal these AB sequences, which are Rosetta Stone sequences because they decipher an interaction between A' and B'. Another method compares the genomic sequence of two or more organisms to create a phylogenetic profile for each protein indicating its presence or absence across all the genomes. The profile provides information regarding functional links between different families of proteins. In yet another method a combination of the above two methods is used to predict functional links.
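The phylogenetic-profile method can be sketched in a few lines: encode each protein as a bit vector of presence (1) or absence (0) across the compared genomes, and flag pairs of proteins with identical or near-identical profiles as candidates for a functional link. The genomes, proteins and profiles below are invented for illustration.

import numpy as np

genomes = ["E.coli", "B.subtilis", "M.tuberculosis", "S.cerevisiae", "H.sapiens"]
profiles = {
    "protA": np.array([1, 1, 1, 0, 0]),
    "protB": np.array([1, 1, 1, 0, 0]),   # identical profile -> candidate link with protA
    "protC": np.array([1, 0, 1, 1, 1]),
}

def profile_distance(p, q):
    """Hamming distance between two presence/absence profiles."""
    return int(np.sum(p != q))

names = list(profiles)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        d = profile_distance(profiles[a], profiles[b])
        note = "  <- candidate functional link" if d == 0 else ""
        print(f"{a} vs {b}: distance {d}{note}")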
Zappalà, G; Motta, V; Tuccitto, N; Vitale, S; Torrisi, A; Licciardello, A
2015-12-15
Secondary ion mass spectrometry (SIMS) with polyatomic primary ions provides a successful tool for molecular depth profiling of polymer systems, relevant in many technological applications. Widespread C60 sources, however, cause in some polymers extensive damage with loss of molecular information along depth. We study a method, based on the use of a radical scavenger, for inhibiting ion-beam-induced reactions causing sample damage. Layered polystyrene sulfonate and polyacrylic acid based polyelectrolyte films, behaving differently towards C60 beam-induced damage, were selected and prepared as model systems. They were depth profiled by means of time-of-flight (TOF)-SIMS in dual beam mode, using fullerene ions for sputtering. Nitric oxide was introduced into the analysis chamber as a radical scavenger. The effect of sample cooling combined with NO-dosing on the quality of depth profiles was explored. NO-dosing during C60-SIMS depth profiling of >1 micrometer-thick multilayered polyelectrolytes allows detection, along depth, of characteristic fragments from systems otherwise damaged by C60 bombardment, and increases sputtering yield by more than one order of magnitude. By contrast, NO has little influence on those layers that are well profiled with C60 alone. Such leveling effect, more pronounced at low temperature, leads to a dramatic improvement of profile quality, with a clear definition of interfaces. NO-dosing provides a tool for extending the applicability, in SIMS depth profiling, of the widely spread fullerene ion sources. In view of the acceptable erosion rates on inorganics, obtainable with C60, the method could be of relevance also in connection with the 3D-imaging of hybrid polymer/inorganic systems. Copyright © 2015 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Yunita; Galinium, M.; Lukas
2017-01-01
New product development in the real estate industry is a challenging process since it involves long-term concepts and high costs. A newly proposed product development should meet customer needs and preferences appropriate to customer buying power and company value. This research uses data mining to profile customer transactions and the Analytic Hierarchy Process (AHP) method for product selection in new product development. This research utilizes Weka as open source data mining software to profile customer data. The analysis correlated product preferences with demographic profiles such as city, age, gender and occupation. Demographic profiles describe buying power and product preferences. The proposed products are based on customer profiles and the ranking of products by the AHP method. The product with the highest score will be proposed as the new product development. Case studies of this research are real estate projects in Serang, Makassar, and Balikpapan. Makassar and Balikpapan are projects that have already succeeded, while Serang is a new project for which new product development will be proposed. Based on customer profiling and product preferences in Balikpapan and Makassar, and the prospects of the Serang market, the new product developments proposed are a house type of 120/200 m2 with a price around Rp1.300.000.000 and a house type of 71/120 m2 with a price around Rp800.000.000. The Serang and Balikpapan markets have similar profiles as urban cities, so the new product development will adopt the success story of the Balikpapan project.
Analysis of high-throughput biological data using their rank values.
Dembélé, Doulaye
2018-01-01
High-throughput biological technologies are routinely used to generate gene expression profiling or cytogenetics data. To achieve high performance, methods available in the literature have become more specialized and often require high computational resources. Here, we propose a new versatile method based on data-ordering rank values. We use linear algebra and the Perron-Frobenius theorem, and also extend a method presented earlier for searching differentially expressed genes to the detection of recurrent copy number aberrations. A result derived from the proposed method is a one-sample Student's t-test based on rank values. The proposed method is, to our knowledge, the only one that applies to both gene expression profiling and cytogenetics data sets. This new method is fast, deterministic, and requires a low computational load. Probabilities are associated with genes to allow a statistically significant subset selection in the data set. Stability scores are also introduced as quality parameters. The performance and comparative analyses were carried out using real data sets. The proposed method can be accessed through an R package available from the CRAN (Comprehensive R Archive Network) website: https://cran.r-project.org/web/packages/fcros .
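One reading of the rank-based one-sample t-test is sketched below: each sample's values are replaced by their within-sample ranks, and a gene is tested for deviation of its ranks from the mean rank expected when nothing changes. This is an illustrative interpretation, not the fcros implementation itself.

import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n_genes, n_samples = 1000, 12
expr = rng.normal(size=(n_genes, n_samples))
expr[0] += 1.5                                           # make one gene clearly shifted

ranks = stats.rankdata(expr, axis=0)                     # rank values within each sample
expected = (n_genes + 1) / 2.0                           # mean rank under no change

t_stat, p_value = stats.ttest_1samp(ranks[0], popmean=expected)
print(f"gene 0: t = {t_stat:.2f}, p = {p_value:.3g}")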
Airborne Doppler Wind Lidar Post Data Processing Software DAPS-LV
NASA Technical Reports Server (NTRS)
Kavaya, Michael J. (Inventor); Beyon, Jeffrey Y. (Inventor); Koch, Grady J. (Inventor)
2015-01-01
Systems, methods, and devices of the present invention enable post processing of airborne Doppler wind LIDAR data. In an embodiment, airborne Doppler wind LIDAR data software written in LabVIEW may be provided and may run two versions of different airborne wind profiling algorithms. A first algorithm may be the Airborne Wind Profiling Algorithm for Doppler Wind LIDAR ("APOLO") using airborne wind LIDAR data from two orthogonal directions to estimate wind parameters, and a second algorithm may be a five direction based method using pseudo inverse functions to estimate wind parameters. The various embodiments may enable wind profiles to be compared using different algorithms, may enable wind profile data for long haul color displays to be generated, may display long haul color displays, and/or may enable archiving of data at user-selectable altitudes over a long observation period for data distribution and population.
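The pseudo-inverse idea used in multi-direction wind profiling can be sketched as a least-squares problem: each line-of-sight Doppler velocity is the projection of the wind vector onto the beam direction, and several beams give an overdetermined linear system. The beam geometry and velocities below are illustrative, not the DAPS-LV configuration.

import numpy as np

# five beams: azimuth (degrees from north) and elevation (degrees above horizontal)
azimuth = np.radians([0, 90, 180, 270, 0])
elevation = np.radians([75, 75, 75, 75, 90])

# unit vectors along each beam (east, north, up components)
A = np.column_stack([np.cos(elevation) * np.sin(azimuth),
                     np.cos(elevation) * np.cos(azimuth),
                     np.sin(elevation)])

true_wind = np.array([8.0, -3.0, 0.2])                   # (u, v, w) in m/s for the toy case
vlos = A @ true_wind + np.random.default_rng(9).normal(0.0, 0.3, 5)  # noisy LOS velocities

wind_estimate, *_ = np.linalg.lstsq(A, vlos, rcond=None) # pseudo-inverse solution
print("estimated (u, v, w):", wind_estimate.round(2))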
HHsvm: fast and accurate classification of profile–profile matches identified by HHsearch
Dlakić, Mensur
2009-01-01
Motivation: Recently developed profile–profile methods rival structural comparisons in their ability to detect homology between distantly related proteins. Despite this tremendous progress, many genuine relationships between protein families cannot be recognized as comparisons of their profiles result in scores that are statistically insignificant. Results: Using known evolutionary relationships among protein superfamilies in SCOP database, support vector machines were trained on four sets of discriminatory features derived from the output of HHsearch. Upon validation, it was shown that the automatic classification of all profile–profile matches was superior to fixed threshold-based annotation in terms of sensitivity and specificity. The effectiveness of this approach was demonstrated by annotating several domains of unknown function from the Pfam database. Availability: Programs and scripts implementing the methods described in this manuscript are freely available from http://hhsvm.dlakiclab.org/. Contact: mdlakic@montana.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19773335
STEM-based science learning implementation to identify student’s personal intelligences profiles
NASA Astrophysics Data System (ADS)
Wiguna, B. J. P. K.; Suwarma, I. R.; Liliawati, W.
2018-05-01
The rapid development of science and technology needs to be balanced with human resources that have the required abilities: not only cognitive ability, but also the soft skills that support 21st century skills. Science, Technology, Engineering, and Mathematics (STEM) education is a solution to improve the quality of learning and prepare students to train 21st century skills. This study aims to analyse the implementation of STEM-based science learning on Newton's laws of motion by identifying the personal intelligence profiles of junior high school students. The method used in this research is a pre-experiment with a one-group pre-test post-test design. Samples in this study were 26 junior high school students taken using convenience sampling. Students' personal intelligence profiles after STEM-based science learning were assessed using two instruments, self-assessment and peer assessment. The intrapersonal intelligence profiles based on self-assessment and peer assessment are 69.38 and 64.08, respectively. For interpersonal intelligence, the self-assessment score is 73 and the peer-assessment score is 60.23.
Eisenberg, David; Marcotte, Edward M.; Pellegrini, Matteo; Thompson, Michael J.; Yeates, Todd O.
2002-10-15
A computational method, system, and computer program are provided for inferring functional links from genome sequences. One method is based on the observation that some pairs of proteins A' and B' have homologs in another organism fused into a single protein chain AB. A trans-genome comparison of sequences can reveal these AB sequences, which are Rosetta Stone sequences because they decipher an interaction between A' and B'. Another method compares the genomic sequence of two or more organisms to create a phylogenetic profile for each protein indicating its presence or absence across all the genomes. The profile provides information regarding functional links between different families of proteins. In yet another method a combination of the above two methods is used to predict functional links.
Katagiri, Fumiaki; Glazebrook, Jane
2003-01-01
A major task in computational analysis of mRNA expression profiles is definition of relationships among profiles on the basis of similarities among them. This is generally achieved by pattern recognition in the distribution of data points representing each profile in a high-dimensional space. Some drawbacks of commonly used pattern recognition algorithms stem from their use of a globally linear space and/or limited degrees of freedom. A pattern recognition method called Local Context Finder (LCF) is described here. LCF uses nonlinear dimensionality reduction for pattern recognition. Then it builds a network of profiles based on the nonlinear dimensionality reduction results. LCF was used to analyze mRNA expression profiles of the plant host Arabidopsis interacting with the bacterial pathogen Pseudomonas syringae. In one case, LCF revealed two dimensions essential to explain the effects of the NahG transgene and the ndr1 mutation on resistant and susceptible responses. In another case, plant mutants deficient in responses to pathogen infection were classified on the basis of LCF analysis of their profiles. The classification by LCF was consistent with the results of biological characterization of the mutants. Thus, LCF is a powerful method for extracting information from expression profile data. PMID:12960373
Antimicrobial breakpoint estimation accounting for variability in pharmacokinetics
Bi, Goue Denis Gohore; Li, Jun; Nekka, Fahima
2009-01-01
Background Pharmacokinetic and pharmacodynamic (PK/PD) indices are increasingly being used in the microbiological field to assess the efficacy of a dosing regimen. In contrast to methods using MIC, PK/PD-based methods reflect in vivo conditions and are more predictive of efficacy. Unfortunately, they entail the use of one PK-derived value such as AUC or Cmax and may thus lead to biased efficiency information when the variability is large. The aim of the present work was to evaluate the efficacy of a treatment by adjusting classical breakpoint estimation methods to the situation of variable PK profiles. Methods and results We propose a logical generalisation of the usual AUC methods by introducing the concept of "efficiency" for a PK profile, which involves the efficacy function as a weight. We formulated these methods for both classes of concentration- and time-dependent antibiotics. Using drug models and in silico approaches, we provide a theoretical basis for characterizing the efficiency of a PK profile under in vivo conditions. We also used the particular case of variable drug intake to assess the effect of the variable PK profiles generated and to analyse the implications for breakpoint estimation. Conclusion Compared to traditional methods, our weighted AUC approach gives a more powerful PK/PD link and reveals, through examples, interesting issues about the uniqueness of therapeutic outcome indices and antibiotic resistance problems. PMID:19558679
Zhang, Shu-Dong; Gant, Timothy W
2009-07-31
Connectivity mapping is a process to recognize novel pharmacological and toxicological properties in small molecules by comparing their gene expression signatures with others in a database. A simple and robust method for connectivity mapping with increased specificity and sensitivity was recently developed, and its utility demonstrated using experimentally derived gene signatures. This paper introduces sscMap (statistically significant connections' map), a Java application designed to undertake connectivity mapping tasks using the recently published method. The software is bundled with a default collection of reference gene-expression profiles based on the publicly available dataset from the Broad Institute Connectivity Map 02, which includes data from over 7000 Affymetrix microarrays, for over 1000 small-molecule compounds, and 6100 treatment instances in 5 human cell lines. In addition, the application allows users to add their custom collections of reference profiles and is applicable to a wide range of other 'omics technologies. The utility of sscMap is two fold. First, it serves to make statistically significant connections between a user-supplied gene signature and the 6100 core reference profiles based on the Broad Institute expanded dataset. Second, it allows users to apply the same improved method to custom-built reference profiles which can be added to the database for future referencing. The software can be freely downloaded from http://purl.oclc.org/NET/sscMap.
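A simplified sketch of how a connection between a query gene signature and a reference profile can be scored and tested: sum signed ranks of the signature genes in the reference profile, then compare against scores from random signatures of the same size. This illustrates the general idea only and is not the exact sscMap statistic.

import numpy as np

rng = np.random.default_rng(10)
n_genes = 6000
reference = rng.normal(size=n_genes)                     # one reference expression profile
ranks = np.argsort(np.argsort(np.abs(reference))) + 1    # rank genes by effect magnitude
signed_ranks = ranks * np.sign(reference)

signature_genes = rng.choice(n_genes, size=30, replace=False)
signature_signs = rng.choice([-1, 1], size=30)           # up/down status in the query signature

def connection_score(signed_ranks, genes, signs):
    return float(np.sum(signs * signed_ranks[genes]))

observed = connection_score(signed_ranks, signature_genes, signature_signs)
null = [connection_score(signed_ranks, rng.choice(n_genes, 30, replace=False), signature_signs)
        for _ in range(2000)]
p = (np.sum(np.abs(null) >= abs(observed)) + 1) / (len(null) + 1)
print(f"connection score = {observed:.0f}, empirical p = {p:.3f}")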
Lee, So-Yeon; Ha, Eun-Ju; Woo, Seung-Kyun; Lee, So-Min; Lim, Kyung-Hee; Eom, Yong-Bin
2017-07-01
Telogen hairs present at the crime scene are commonly encountered as trace evidence. However, short tandem repeat (STR) profiling of such hairs currently has limited use due to its poor success rate. To increase the success rate of STR profiling of telogen hairs, we developed a rapid and cost-effective method to estimate the number of nuclei in the hair roots. Five cationic dyes, Methyl green (MG), Harris hematoxylin (HH), Methylene blue (MB), Toluidine blue (TB), and Safranin O (SO), were evaluated in this study. We conducted a screening test based on microscopy and the percentage of nuclear DNA loss in order to select the best dye. MG was selected based on its specific nuclei staining and low adverse effect on the hair-associated nuclear DNA. We examined 330 scalp and 100 pubic telogen hairs with MG. Stained hairs were classified into five groups and analyzed by STR. The fast staining method yielded full (30 alleles) or high-partial (18-29 alleles) STR profiles for 70% of head hairs and 33.4% of pubic hairs in the lowest nuclei count group (one to ten nuclei). The results of this study demonstrated a rapid, specific, nondestructive, and high-yield DNA profiling method applicable for screening telogen hairs. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Bernardes, Juliana; Zaverucha, Gerson; Vaquero, Catherine; Carbone, Alessandra
2016-01-01
Traditional protein annotation methods describe known domains with probabilistic models representing consensus among homologous domain sequences. However, when relevant signals become too weak to be identified by a global consensus, attempts at annotation fail. Here we address the fundamental question of domain identification for highly divergent proteins. By using high performance computing, we demonstrate that the limits of state-of-the-art annotation methods can be bypassed. We design a new strategy based on the observation that many structural and functional protein constraints are not globally conserved through all species but might be locally conserved in separate clades. We propose a novel exploitation of the large amount of data available: 1. for each known protein domain, several probabilistic clade-centered models are constructed from a large and differentiated panel of homologous sequences, 2. a decision-making protocol combines outcomes obtained from multiple models, 3. a multi-criteria optimization algorithm finds the most likely protein architecture. The method is evaluated for domain and architecture prediction over several datasets and statistical hypothesis tests. Its performance is compared against HMMScan and HHblits, two widely used search methods based on sequence-profile and profile-profile comparison. Due to their closeness to actual protein sequences, clade-centered models are shown to be more specific and functionally predictive than the broadly used consensus models. Based on them, we improved annotation of Plasmodium falciparum protein sequences on a scale not previously possible. We successfully predict at least one domain for 72% of P. falciparum proteins against 63% achieved previously, corresponding to a 30% improvement over the total number of Pfam domain predictions on the whole genome. The method is applicable to any genome and opens new avenues to tackle evolutionary questions such as the reconstruction of ancient domain duplications, the reconstruction of the history of protein architectures, and the estimation of protein domain age. Website and software: http://www.lcqb.upmc.fr/CLADE. PMID:27472895
An Optimal Estimation Method to Obtain Surface Layer Turbulent Fluxes from Profile Measurements
NASA Astrophysics Data System (ADS)
Kang, D.
2015-12-01
In the absence of direct turbulence measurements, the turbulence characteristics of the atmospheric surface layer are often derived from measurements of the surface layer mean properties based on Monin-Obukhov Similarity Theory (MOST). This approach requires two levels of the ensemble mean wind, temperature, and water vapor, from which the fluxes of momentum, sensible heat, and water vapor can be obtained. When only one measurement level is available, the roughness heights and the assumed properties of the corresponding variables at the respective roughness heights are used. In practice, the temporal mean with a large number of samples is used in place of the ensemble mean. However, in many situations the samples of data are taken from multiple levels. It is thus desirable to derive the boundary layer flux properties using all measurements. In this study, we used an optimal estimation approach to derive surface layer properties based on all available measurements. This approach assumes that the samples are taken from a population whose ensemble mean profile follows the MOST. An optimized estimate is obtained when the results yield a minimum cost function, defined as a weighted summation of the error variances at each sample altitude. The weights are based on the sample data variance and the altitude of the measurements. This method was applied to measurements in the marine atmospheric surface layer from a small boat using a radiosonde on a tethered balloon, where temperature and relative humidity profiles in the lowest 50 m were made repeatedly in about 30 minutes. We will present the resultant fluxes and the derived MOST mean profiles using different sets of measurements. The advantage of this method over the 'traditional' methods will be illustrated. Some limitations of this optimization method will also be discussed. Its application to quantify the effects of the marine surface layer environment on radar and communication signal propagation will be shown as well.
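A stripped-down version of the weighted-fit idea can be written as a least-squares problem; the sketch below assumes neutral stability (a plain log wind profile instead of the full MOST stability functions) and synthetic multi-level data, so it only illustrates the optimization setup, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import least_squares

KAPPA = 0.4  # von Karman constant

def log_wind(z, ustar, z0):
    """Neutral-stability log wind profile (a simplification of full MOST)."""
    return (ustar / KAPPA) * np.log(z / z0)

def fit_fluxes(z_obs, u_obs, u_var):
    """Weighted least-squares fit of (u*, z0) using all available measurement levels."""
    def residuals(p):
        ustar, z0 = p
        return (log_wind(z_obs, ustar, z0) - u_obs) / np.sqrt(u_var)
    sol = least_squares(residuals, x0=[0.3, 0.01],
                        bounds=([1e-3, 1e-5], [2.0, 1.0]))
    return sol.x

# toy multi-level wind observations with noise
z = np.array([2.0, 5.0, 10.0, 20.0, 50.0])          # heights, m
u = log_wind(z, 0.35, 0.02) + np.random.default_rng(0).normal(0, 0.1, z.size)
print(fit_fluxes(z, u, u_var=np.full(z.size, 0.01)))  # recovers u* ~ 0.35, z0 ~ 0.02
```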
Tan, Yen Hock; Huang, He; Kihara, Daisuke
2006-08-15
Aligning distantly related protein sequences is a long-standing problem in bioinformatics, and a key for successful protein structure prediction. Its importance has increased recently in the context of structural genomics projects because more and more experimentally solved structures are available as templates for protein structure modeling. Toward this end, recent structure prediction methods employ profile-profile alignments, and various ways of aligning two profiles have been developed. More fundamentally, a better amino acid similarity matrix can improve a profile itself, thereby resulting in more accurate profile-profile alignments. Here we have developed novel amino acid similarity matrices from knowledge-based amino acid contact potentials. Contact potentials are used because the contact propensity to the other amino acids would be one of the most conserved features of each position of a protein structure. The derived amino acid similarity matrices are tested on benchmark alignments at three different levels, namely, the family, the superfamily, and the fold level. Compared to BLOSUM45 and the other existing matrices, the contact potential-based matrices perform comparably in the family-level alignments, but clearly outperform them in the fold-level alignments. The contact potential-based matrices perform even better when suboptimal alignments are considered. Comparing the matrices themselves with each other revealed that the contact potential-based matrices are very different from BLOSUM45 and the other matrices, indicating that they are located in a different basin in the amino acid similarity matrix space.
Novel method to sample very high power CO2 lasers: II Continuing Studies
NASA Astrophysics Data System (ADS)
Eric, John; Seibert, Daniel B., II; Green, Lawrence I.
2005-04-01
For the past 28 years, the Laser Hardened Materials Evaluation Laboratory (LHMEL) at Wright-Patterson Air Force Base, OH, has worked with CO2 lasers capable of producing continuous output of up to 150 kW. These lasers are used in a number of advanced materials processing applications that require accurate spatial energy measurements of the laser. Conventional non-electronic methods are not satisfactory for determining the spatial energy profile. This paper describes continuing efforts in qualifying the new method in which a continuous, real-time electronic spatial energy profile can be obtained for very high power (VHP) CO2 lasers.
Piao, Yongjun; Piao, Minghao; Ryu, Keun Ho
2017-01-01
Cancer classification has been a crucial topic of research in cancer treatment. In the last decade, messenger RNA (mRNA) expression profiles have been widely used to classify different types of cancers. With the discovery of a new class of small non-coding RNAs known as microRNAs (miRNAs), various studies have shown that the expression patterns of miRNAs can also accurately classify human cancers. Therefore, there is a great demand for the development of machine learning approaches to accurately classify various types of cancers using miRNA expression data. In this article, we propose a feature subset-based ensemble method in which each model is learned from a different projection of the original feature space to classify multiple cancers. In our method, feature relevance and redundancy are considered to generate multiple feature subsets, the base classifiers are learned from each independent miRNA subset, and the average posterior probability is used to combine the base classifiers. To test the performance of our method, we used bead-based and sequence-based miRNA expression datasets and conducted 10-fold and leave-one-out cross validations. The experimental results show that the proposed method yields good results and has higher prediction accuracy than popular ensemble methods. The Java program and source code of the proposed method and the datasets in the experiments are freely available at https://sourceforge.net/projects/mirna-ensemble/. Copyright © 2016 Elsevier Ltd. All rights reserved.
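A minimal sketch of the feature-subset ensemble idea, assuming F-score relevance, a correlation-based redundancy filter, logistic-regression base classifiers, and synthetic data; the actual filters and classifiers used in the paper may differ.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import f_classif
from sklearn.linear_model import LogisticRegression

def relevance_redundancy_subsets(X, y, n_subsets=5, size=20, corr_max=0.9):
    """Build several feature subsets: take features in order of relevance (F-score),
    skipping any feature highly correlated with one already chosen; each subset
    starts from a different point in the relevance ranking."""
    order = np.argsort(-f_classif(X, y)[0])
    corr = np.abs(np.corrcoef(X, rowvar=False))
    subsets = []
    for s in range(n_subsets):
        chosen = []
        for f in order[s:]:
            if len(chosen) == size:
                break
            if all(corr[f, c] < corr_max for c in chosen):
                chosen.append(f)
        subsets.append(np.array(chosen))
    return subsets

# synthetic stand-in for a miRNA expression matrix: 120 samples x 200 features
X, y = make_classification(n_samples=120, n_features=200, n_informative=15, random_state=0)
subsets = relevance_redundancy_subsets(X, y)
models = [LogisticRegression(max_iter=1000).fit(X[:, s], y) for s in subsets]
# combine base classifiers by averaging their posterior probabilities
proba = np.mean([m.predict_proba(X[:, s]) for m, s in zip(models, subsets)], axis=0)
print("ensemble training accuracy:", (proba.argmax(axis=1) == y).mean())
```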
Zhao, Yuancun; Chen, Xiaogang; Yang, Yiwen; Zhao, Xiaohong; Zhang, Shu; Gao, Zehua; Fang, Ting; Wang, Yufang; Zhang, Ji
2018-05-07
Diatom examination has always been used for the diagnosis of drowning in forensic practice. However, traditional examination of the microscopic features of diatom frustules is time-consuming and requires taxonomic expertise. In this study, we demonstrate a potential DNA-based method of inferring suspected drowning site using pyrosequencing (PSQ) of the V7 region of 18S ribosome DNA (18S rDNA) as a diatom DNA barcode. By employing a sparse representation-based AdvISER-M-PYRO algorithm, the original PSQ signals of diatom DNA mixtures were deciphered to determine the corresponding taxa of the composite diatoms. Additionally, we evaluated the possibility of correlating water samples to collection sites by analyzing the PSQ signal profiles of diatom mixtures contained in the water samples via multidimensional scaling. The results suggest that diatomaceous PSQ profile analysis could be used as a cost-effective method to deduce the geographical origin of an environmental bio-sample.
Molecular filter-based diagnostics in high speed flows
NASA Technical Reports Server (NTRS)
Elliott, Gregory S.; Samimy, MO; Arnette, Stephen A.
1993-01-01
The use of iodine molecular filters in nonintrusive planar velocimetry methods is examined. Detailed absorption profiles are obtained to highlight the effects that determine the profile shape. It is shown that pressure broadening induced by the presence of a nonabsorbing vapor can be utilized to significantly change the slopes bounding the absorbing region while remaining in the optically-thick regime.
Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems
Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda
2015-01-01
In recent years, recommender systems have become an effective method to process information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. There are two characteristics of all types of shilling attacks: 1) item abnormality: the rating of target items is always the maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need to be further improved. To solve these problems, this paper proposes an item anomaly detection method based on dynamic partitioning for time series. This method first dynamically partitions item-rating time series based on important points. Then, the chi-square distribution (χ2) is used to detect abnormal intervals. The experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable toward different attack models and filler sizes. PMID:26267477
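A toy illustration of the partition-then-test idea follows; the "important point" rule is simplified to sharp rating jumps and the chi-square test is applied to the share of extreme ratings per interval, so this is only a sketch of the approach, not the published algorithm.

```python
import numpy as np
from scipy.stats import chi2

def partition(ratings, jump=2):
    """Dynamic partitioning at 'important points', here simplified to sharp rating jumps."""
    cuts = [i for i in range(1, len(ratings)) if abs(int(ratings[i]) - int(ratings[i - 1])) >= jump]
    return [0] + cuts + [len(ratings)]

def abnormal_intervals(ratings, scale=(1, 5), alpha=0.05):
    """Flag intervals whose share of extreme ratings departs from the global share (chi-square, 1 df)."""
    lo, hi = scale
    extreme = (ratings == lo) | (ratings == hi)
    p_global = extreme.mean()
    flagged, bounds = [], partition(ratings)
    for a, b in zip(bounds[:-1], bounds[1:]):
        n = b - a
        if n < 3 or p_global in (0.0, 1.0):
            continue
        obs, exp = extreme[a:b].sum(), p_global * n
        stat = (obs - exp) ** 2 / exp + (obs - exp) ** 2 / (n - exp)   # 2x2 chi-square statistic
        if stat > chi2.ppf(1 - alpha, df=1):
            flagged.append((a, b))
    return flagged

# toy item-rating time series with a suspicious burst of maximum ratings
r = np.array([3, 4, 3, 3, 3, 5, 5, 5, 5, 5, 3, 4, 3, 3, 4])
print(abnormal_intervals(r))   # flags the interval covering the burst of 5s
```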
Permeability profiles in granular aquifers using flowmeters in direct-push wells.
Paradis, Daniel; Lefebvre, René; Morin, Roger H; Gloaguen, Erwan
2011-01-01
Numerical hydrogeological models should ideally be based on the spatial distribution of hydraulic conductivity (K), a property rarely defined on the basis of sufficient data due to the lack of efficient characterization methods. Electromagnetic borehole flowmeter measurements during pumping in uncased wells can effectively provide a continuous vertical distribution of K in consolidated rocks. However, relatively few studies have used the flowmeter in screened wells penetrating unconsolidated aquifers, and tests conducted in gravel-packed wells have shown that flowmeter data may yield misleading results. This paper describes the practical application of flowmeter profiles in direct-push wells to measure K and delineate hydrofacies in heterogeneous unconsolidated aquifers having low-to-moderate K (10^-6 to 10^-4 m/s). The effect of direct-push well installation on K measurements in unconsolidated deposits is first assessed based on previous work indicating that such installations minimize disturbance to the aquifer fabric. The installation and development of long-screen wells are then used in a case study validating K profiles from flowmeter tests at high-resolution intervals (15 cm) with K profiles derived from multilevel slug tests between packers at identical intervals. For 119 intervals tested in five different wells, the difference in log K values obtained from the two methods is consistently below 10%. Finally, a graphical approach to the interpretation of flowmeter profiles is proposed to delineate intervals corresponding to distinct hydrofacies, thus providing a method whereby both the scale and magnitude of K contrasts in heterogeneous unconsolidated aquifers may be represented. Journal compilation © 2010 National Ground Water Association. No claim to original US government works.
Liu, Xin
2014-01-01
This study describes a deterministic method for simulating the first-order scattering in a medical computed tomography scanner. The method was developed based on a physics model of x-ray photon interactions with matter and a ray tracing technique. The results from simulated scattering were compared to the ones from an actual scattering measurement. Two phantoms with homogeneous and heterogeneous material distributions were used in the scattering simulation and measurement. It was found that the simulated scatter profile was in agreement with the measurement result, with an average difference of 25% or less. Finally, tomographic images with artifacts caused by scatter were corrected based on the simulated scatter profiles. The image quality improved significantly.
Ncube, Efficient N; Mhlongo, Msizi I; Piater, Lizelle A; Steenkamp, Paul A; Dubery, Ian A; Madala, Ntakadzeni E
2014-01-01
Chlorogenic acids (CGAs) are a class of phytochemicals that are formed as esters between different derivatives of cinnamic acid and quinic acid molecules. In plants, accumulation of these compounds has been linked to several physiological responses against various stress factors; however, biochemical synthesis differs from one plant to another. Although structurally simple, the analysis of CGA molecules with modern analytical platforms poses an analytical challenge. The objective of the study was to perform a comparison of the CGA profiles and related derivatives from differentiated tobacco leaf tissues and undifferentiated cell suspension cultures. Using an UHPLC-Q-TOF-MS/MS fingerprinting method based on the in-source collision induced dissociation (ISCID) approach, a total of 19 different metabolites with a cinnamic acid core moiety were identified. These metabolites were either present in both leaf tissue and cell suspension samples or in only one of the two plant systems. Profile differences point to underlying biochemical similarities or differences thereof. Using this method, the regio- and geometric-isomer profiles of chlorogenic acids of the two tissue types of Nicotiana tabacum were achieved. The method was also shown to be applicable for the detection of other related molecules containing a cinnamic acid core.
Assessment of geothermal energy potential by geophysical methods: Nevşehir Region, Central Anatolia
NASA Astrophysics Data System (ADS)
Kıyak, Alper; Karavul, Can; Gülen, Levent; Pekşen, Ertan; Kılıç, A. Rıza
2015-03-01
In this study, geothermal potential of the Nevşehir region (Central Anatolia) was assessed by using vertical electrical sounding (VES), self-potential (SP), magnetotelluric (MT), gravity and gravity 3D Euler deconvolution structure analysis methods. Extensive volcanic activity occurred in this region from Upper Miocene to Holocene time. Due to the young volcanic activity Nevşehir region can be viewed as a potential geothermal area. We collected data from 54 VES points along 5 profiles, from 28 MT measurement points along 2 profiles (at frequency range between 320 and 0.0001 Hz), and from 4 SP profiles (total 19 km long). The obtained results based on different geophysical methods are consistent with each other. Joint interpretation of all geological and geophysical data suggests that this region has geothermal potential and an exploration well validated this assessment beyond doubt.
Zheng, Junyu; Yu, Yufan; Mo, Ziwei; Zhang, Zhou; Wang, Xinming; Yin, Shasha; Peng, Kang; Yang, Yang; Feng, Xiaoqiong; Cai, Huihua
2013-07-01
Industrial sector-based VOC source profiles are reported for the Pearl River Delta (PRD) region, China, based on source samples (stack emissions and fugitive emissions) collected from sources operating under normal conditions. The industrial sectors considered are printing (letterpress, offset and gravure printing processes), wood furniture coating, shoemaking, paint manufacturing and metal surface coating. More than 250 VOC species were detected following US EPA methods TO-14 and TO-15. The results indicated that benzene and toluene were the major species associated with letterpress printing, while ethyl acetate and isopropyl alcohol were the most abundant compounds of the other two printing processes. Acetone and 2-butanone were the major species observed in the shoemaking sector. The source profile patterns were found to be similar for the paint manufacturing, wood furniture coating, and metal surface coating sectors, with aromatics being the most abundant group and oxygenated VOCs (OVOCs) as the second largest contributor in the profiles. While OVOCs were one of the most significant VOC groups detected in these five industrial sectors in the PRD region, they have not been reported in most other source profile studies. Such comparisons with other studies show that there are differences in source profiles for different regions or countries, indicating the importance of developing local source profiles. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.
Metz, Thomas O.; Zhang, Qibin; Page, Jason S.; Shen, Yufeng; Callister, Stephen J.; Jacobs, Jon M.; Smith, Richard D.
2008-01-01
The future utility of liquid chromatography-mass spectrometry (LC-MS) in metabolic profiling and metabolomic studies for biomarker discovery will be discussed, beginning with a brief description of the evolution of metabolomics and the utilization of the three most popular analytical platforms in such studies: NMR, GC-MS, and LC-MS. Emphasis is placed on recent developments in high-efficiency LC separations, sensitive electrospray ionization approaches, and the benefits of incorporating both in LC-MS-based approaches. The advantages and disadvantages of various quantitative approaches are reviewed, followed by the current LC-MS-based tools available for candidate biomarker characterization and identification. Finally, a brief prediction on the future path of LC-MS-based methods in metabolic profiling and metabolomic studies is given. PMID:19177179
Solution algorithm of dwell time in slope-based figuring model
NASA Astrophysics Data System (ADS)
Li, Yong; Zhou, Lin
2017-10-01
The surface slope profile is commonly used to evaluate X-ray reflective optics for synchrotron radiation beamlines. Moreover, the output of measuring instruments for X-ray reflective optics is usually the surface slope profile rather than the surface height profile. To avoid the conversion error, the slope-based figuring model is introduced instead of processing the X-ray reflective optics with the surface height-based model. However, the pulse iteration method, which can quickly obtain the dwell time solution of the traditional height-based figuring model, does not apply to the slope-based figuring model, because the slope removal function has both positive and negative values and a complex asymmetric structure. To overcome this problem, we established an optimal mathematical model for the dwell time solution by introducing upper and lower limits on the dwell time and a time gradient constraint. We then used a constrained least squares algorithm to solve for the dwell time in the slope-based figuring model. To validate the proposed algorithm, simulations and experiments were conducted. A flat mirror with an effective aperture of 80 mm was polished on an ion beam machine. After three polishing iterations, the surface slope profile error of the workpiece converged from RMS 5.65 μrad to RMS 1.12 μrad.
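The constrained least-squares step can be sketched as a bounded linear least-squares problem; the influence matrix, slope-error profile, and bounds below are toy values, and the time-gradient constraint mentioned in the abstract is omitted for brevity.

```python
import numpy as np
from scipy.optimize import lsq_linear

# toy 1-D setting: slope error e(x) to remove at m surface points, and a bipolar
# slope removal function r(x) (derivative of a Gaussian beam) scanned over n dwell positions
m, n = 200, 121
x = np.linspace(-1.0, 1.0, m)
r = np.gradient(np.exp(-(x / 0.08) ** 2), x)                 # has both positive and negative lobes
offsets = np.linspace(-60, 60, n).astype(int)
A = np.array([np.roll(r, k) for k in offsets]).T             # influence matrix (circular shift as a crude scan stand-in)
e = 5.0 * np.sin(2 * np.pi * x)                              # slope error profile, arbitrary units

# bounded least squares: the dwell time at each position must stay between 0 and t_max
sol = lsq_linear(A, e, bounds=(0.0, 10.0))
residual = A @ sol.x - e
print("residual slope RMS:", np.sqrt(np.mean(residual ** 2)))
```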
Seurinck, Sylvie; Deschepper, Ellen; Deboch, Bishaw; Verstraete, Willy; Siciliano, Steven
2006-03-01
Microbial source tracking (MST) methods need to be rapid, inexpensive and accurate. Unfortunately, many MST methods provide a wealth of information that is difficult to interpret by the regulators who use this information to make decisions. This paper describes the use of classification tree analysis to interpret the results of an MST method based on fatty acid methyl ester (FAME) profiles of Escherichia coli isolates, and to present results in a format readily interpretable by water quality managers. Raw sewage E. coli isolates and animal E. coli isolates from cow, dog, gull, and horse were isolated and their FAME profiles collected. Correct classification rates determined with leave-one-out cross-validation resulted in an overall low correct classification rate of 61%. A higher overall correct classification rate of 85% was obtained when the animal isolates were pooled together and compared to the raw sewage isolates. Bootstrap aggregation or adaptive resampling and combining of the FAME profile data increased correct classification rates substantially. Other MST methods may be better suited to differentiate between different fecal sources but classification tree analysis has enabled us to distinguish raw sewage from animal E. coli isolates, which previously had not been possible with other multivariate methods such as principal component analysis and cluster analysis.
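The gain from bootstrap aggregation can be illustrated with a small sketch using scikit-learn; the FAME data here are synthetic stand-ins, so the numbers are not comparable to the study's classification rates.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score, LeaveOneOut

# synthetic stand-in for FAME profiles: rows are E. coli isolates, columns are fatty acid fractions;
# class 0 = raw sewage, class 1 = pooled animal sources
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (40, 12)), rng.normal(0.7, 1.0, (40, 12))])
y = np.repeat([0, 1], 40)

tree = DecisionTreeClassifier(max_depth=4, random_state=0)
bagged = BaggingClassifier(tree, n_estimators=100, random_state=0)   # bootstrap aggregation
for name, clf in (("single tree", tree), ("bagged trees", bagged)):
    acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
    print(f"{name}: leave-one-out correct classification = {acc:.2f}")
```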
Ruffner, Judith Alison
1999-01-01
A method for coating (flat or non-flat) optical substrates with high-reflectivity multi-layer coatings for use at Deep Ultra-Violet ("DUV") and Extreme Ultra-Violet ("EUV") wavelengths. The method results in a product with minimum feature sizes of less than 0.10 μm for the shortest wavelength (13.4 nm). The present invention employs a computer-based modeling and deposition method to enable lateral and vertical thickness control by scanning the position of the substrate with respect to the sputter target during deposition. The thickness profile of the sputter targets is modeled before deposition and then an appropriate scanning algorithm is implemented to produce any desired, radially-symmetric thickness profile. The present invention offers the ability to predict and achieve a wide range of thickness profiles on flat or figured substrates, i.e., account for the 1/R² factor in a model, and the ability to predict and accommodate changes in deposition rate as a result of plasma geometry, i.e., over figured substrates.
Xie, Xin-Ping; Xie, Yu-Feng; Wang, Hong-Qiang
2017-08-23
Large-scale accumulation of omics data poses a pressing challenge of integrative analysis of multiple data sets in bioinformatics. An open question of such integrative analysis is how to pinpoint consistent but subtle gene activity patterns across studies. Study heterogeneity needs to be addressed carefully for this goal. This paper proposes a regulation probability model-based meta-analysis, jGRP, for identifying differentially expressed genes (DEGs). The method integrates multiple transcriptomics data sets in a gene regulatory space instead of in a gene expression space, which makes it easy to capture and manage data heterogeneity across studies from different laboratories or platforms. Specifically, we transform gene expression profiles into a united gene regulation profile across studies by mathematically defining two gene regulation events between two conditions and estimating their occurring probabilities in a sample. Finally, a novel differential expression statistic is established based on the gene regulation profiles, realizing accurate and flexible identification of DEGs in gene regulation space. We evaluated the proposed method on simulation data and real-world cancer datasets and showed the effectiveness and efficiency of jGRP in DEG identification in the context of meta-analysis. Data heterogeneity largely influences the performance of meta-analysis for DEG identification. Different existing meta-analysis methods were shown to exhibit very different degrees of sensitivity to study heterogeneity. The proposed method, jGRP, can be a standalone tool due to its united framework and controllable way to deal with study heterogeneity.
NASA Technical Reports Server (NTRS)
Zhao, Feng; Yang, Xiaoyuan; Schull, Mithcell A.; Roman-Colon, Miguel O.; Yao, Tian; Wang, Zhuosen; Zhang, Qingling; Jupp, David L. B.; Lovell, Jenny L.; Culvenor, Darius;
2011-01-01
Effective leaf area index (LAI) retrievals from a scanning, ground-based, near-infrared (1064 nm) lidar that digitizes the full return waveform, the Echidna Validation Instrument (EVI), are in good agreement with those obtained from both hemispherical photography and the Li-Cor LAI-2000 Plant Canopy Analyzer. We conducted trials at 28 plots within six stands of hardwoods and conifers of varying height and stocking densities at Harvard Forest, Massachusetts, Bartlett Experimental Forest, New Hampshire, and Howland Experimental Forest, Maine, in July 2007. Effective LAI values retrieved by four methods, which ranged from 3.42 to 5.25 depending on the site and method, were not significantly different among the four methods. The LAI values also matched published values well. Foliage profiles (leaf area with height) retrieved from the lidar scans, although not independently validated, were consistent with stand structure as observed and as measured by conventional methods. Canopy mean top height, as determined from the foliage profiles, deviated from mean RH100 values obtained from the Lidar Vegetation Imaging Sensor (LVIS) airborne large-footprint lidar system at 27 plots by 0.91 m with RMSE = 2.04 m, documenting the ability of the EVI to retrieve stand height. The Echidna Validation Instrument is the first realization of the Echidna lidar concept, devised by Australia's Commonwealth Scientific and Industrial Research Organization (CSIRO), for measuring forest structure using full-waveform, ground-based, scanning lidar.
Dynamic optimization of open-loop input signals for ramp-up current profiles in tokamak plasmas
NASA Astrophysics Data System (ADS)
Ren, Zhigang; Xu, Chao; Lin, Qun; Loxton, Ryan; Teo, Kok Lay
2016-03-01
Establishing a good current spatial profile in tokamak fusion reactors is crucial to effective steady-state operation. The evolution of the current spatial profile is related to the evolution of the poloidal magnetic flux, which can be modeled in the normalized cylindrical coordinates using a parabolic partial differential equation (PDE) called the magnetic diffusion equation. In this paper, we consider the dynamic optimization problem of attaining the best possible current spatial profile during the ramp-up phase of the tokamak. We first use the Galerkin method to obtain a finite-dimensional ordinary differential equation (ODE) model based on the original magnetic diffusion PDE. Then, we combine the control parameterization method with a novel time-scaling transformation to obtain an approximate optimal parameter selection problem, which can be solved using gradient-based optimization techniques such as sequential quadratic programming (SQP). This control parameterization approach involves approximating the tokamak input signals by piecewise-linear functions whose slopes and break-points are decision variables to be optimized. We show that the gradient of the objective function with respect to the decision variables can be computed by solving an auxiliary dynamic system governing the state sensitivity matrix. Finally, we conclude the paper with simulation results for an example problem based on experimental data from the DIII-D tokamak in San Diego, California.
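The control parameterization idea, piecewise-linear inputs whose node values are optimized by a gradient-based solver, can be sketched on a toy scalar ODE standing in for the Galerkin-reduced model; everything below (dynamics, target, penalty weights) is illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

# piecewise-linear input u(t) with fixed break-points; the node values are the decision variables
t_nodes = np.linspace(0.0, 1.0, 6)

def u_of_t(t, nodes):
    return np.interp(t, t_nodes, nodes)

def objective(nodes, target=1.5):
    """Toy scalar ODE standing in for the Galerkin-reduced magnetic diffusion model."""
    rhs = lambda t, y: -2.0 * y + u_of_t(t, nodes)
    sol = solve_ivp(rhs, (0.0, 1.0), [0.0])
    terminal_error = (sol.y[0, -1] - target) ** 2          # miss distance to the desired end state
    smoothness_penalty = 1e-3 * np.sum(np.diff(nodes) ** 2)  # discourage ragged inputs
    return terminal_error + smoothness_penalty

res = minimize(objective, x0=np.ones(len(t_nodes)), method='SLSQP',
               bounds=[(0.0, 5.0)] * len(t_nodes))
print("optimized input nodes:", np.round(res.x, 3))
```

Here SLSQP plays the role of the SQP solver mentioned in the abstract, with numerically estimated gradients; the paper instead derives gradients from an auxiliary sensitivity system, which is more efficient for larger models.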
Developmental Profiles of Eczema, Wheeze, and Rhinitis: Two Population-Based Birth Cohort Studies
2014-01-01
Background The term “atopic march” has been used to imply a natural progression of a cascade of symptoms from eczema to asthma and rhinitis through childhood. We hypothesize that this expression does not adequately describe the natural history of eczema, wheeze, and rhinitis during childhood. We propose that this paradigm arose from cross-sectional analyses of longitudinal studies, and may reflect a population pattern that may not predominate at the individual level. Methods and Findings Data from 9,801 children in two population-based birth cohorts were used to determine individual profiles of eczema, wheeze, and rhinitis and whether the manifestations of these symptoms followed an atopic march pattern. Children were assessed at ages 1, 3, 5, 8, and 11 y. We used Bayesian machine learning methods to identify distinct latent classes based on individual profiles of eczema, wheeze, and rhinitis. This approach allowed us to identify groups of children with similar patterns of eczema, wheeze, and rhinitis over time. Using a latent disease profile model, the data were best described by eight latent classes: no disease (51.3%), atopic march (3.1%), persistent eczema and wheeze (2.7%), persistent eczema with later-onset rhinitis (4.7%), persistent wheeze with later-onset rhinitis (5.7%), transient wheeze (7.7%), eczema only (15.3%), and rhinitis only (9.6%). When latent variable modelling was carried out separately for the two cohorts, similar results were obtained. Highly concordant patterns of sensitisation were associated with different profiles of eczema, rhinitis, and wheeze. The main limitation of this study was the difference in wording of the questions used to ascertain the presence of eczema, wheeze, and rhinitis in the two cohorts. Conclusions The developmental profiles of eczema, wheeze, and rhinitis are heterogeneous; only a small proportion of children (∼7% of those with symptoms) follow trajectory profiles resembling the atopic march. Please see later in the article for the Editors' Summary PMID:25335105
NASA Astrophysics Data System (ADS)
Murgan, I.; Candel, I.; Ioana, C.; Digulescu, A.; Bunea, F.; Ciocan, G. D.; Anghel, A.; Vasile, G.
2016-11-01
In this paper, we present a novel approach to non-intrusive flow velocity profiling using a multi-element sensor array and wide-band signal processing methods. Conventional techniques for measuring flow velocity profiles are usually based on intrusive instruments (current meters, acoustic Doppler profilers, Pitot tubes, etc.) that take point velocity readings. Although very efficient, these choices are limited in practical applications, especially when non-intrusive measurement techniques and/or spatially accurate velocity profiling are required. This is due to factors related to hydraulic machinery downtime, the often long duration needed to explore the entire section area, the frequently cumbersome number of devices that must be handled simultaneously, or the impossibility of performing intrusive tests. In the case of non-intrusive flow profiling methods based on acoustic techniques, previous methods concentrated on using a large number of acoustic transducers placed around the measured section. Although feasible, this approach presents several major drawbacks, such as a complicated signal timing, transmission, acquisition and recording system, resulting in a relatively high cost of operation. In addition, because of geometrical constraints, the desired number of sensors may not be installable. Recent results in acoustic flow metering based on wide-band signals and adaptive beamforming proved that it is possible to obtain flow velocity profiles using fewer acoustic transducers. In a conventional acoustic time-of-flight path, the transducers are both emitters and receivers, sequentially changing their roles. In the new configuration proposed in this paper, two new receivers are added on each side. Since the beam angle of each acoustic transducer is wide enough, the newly added transducers can receive the transmitted signals and additional time-of-flight estimates can be obtained. Thus, several flow velocities can be computed. Analytically defined emitted wide-band signals make it possible to identify the signals coming from each transducer. Using the adaptive beamforming algorithm, the receiving transducers can record different signals from the emitter, equivalent to different propagation paths. Therefore, different time-of-flight measurements are possible, leading to additional flow velocity measurements. Experiments carried out in a test facility belonging to ICPE-CA, Bucharest, Romania, allowed validation of the flow velocities computed using this new technique under symmetric, asymmetric and uneven flow conditions. The acoustically derived values were referenced against those provided by a Pitot tube probe installed in the test channel, and the results obtained by the method proposed in this paper are relatively close to this reference.
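For a single emitter-receiver pair, the path-averaged velocity follows from the downstream and upstream transit times; a minimal sketch with made-up geometry and sound speed:

```python
import numpy as np

def path_velocity(t_down, t_up, L, theta_deg):
    """Mean flow velocity along one acoustic path from downstream/upstream
    transit times: v = L / (2 cos(theta)) * (1/t_down - 1/t_up)."""
    theta = np.radians(theta_deg)
    return L / (2.0 * np.cos(theta)) * (1.0 / t_down - 1.0 / t_up)

# toy example: 0.5 m path at 45 degrees to the flow, water (c ~ 1480 m/s), v ~ 2 m/s
c, v, L, theta = 1480.0, 2.0, 0.5, 45.0
t_down = L / (c + v * np.cos(np.radians(theta)))   # sound travels with the flow component
t_up = L / (c - v * np.cos(np.radians(theta)))     # and against it on the return path
print(path_velocity(t_down, t_up, L, theta))       # recovers ~2 m/s
```

The extra receivers described in the paper simply add more such paths, and therefore more independent velocity estimates across the section.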
Comparison of two methods of MMPI-2 profile classification.
Munley, P H; Germain, J M
2000-10-01
The present study investigated the extent of agreement of the highest scale method and the best-fit method in matching MMPI-2 profiles to database code-type profiles and considered profile characteristics that may relate to agreement or disagreement of code-type matches by these two methods. A sample of 519 MMPI-2 profiles that had been classified into database profile code types by these two methods was studied. Resulting code-type matches were classified into three groups: identical (30%), similar (39%), and different (31%), and the profile characteristics of profile elevation, dispersion, and profile code-type definition were studied. Profile code-type definition was significantly different across the three groups with identical and similar match profile groups showing greater profile code-type definition and the different group consisting of profiles that were less well-defined.
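The two matching rules can be mocked up in a few lines; the code-type prototypes and T-scores below are invented, and the "highest scale" rule is reduced to the two highest scales for brevity.

```python
import numpy as np

def highest_scale_match(profile, codetypes):
    """'Highest scale' rule, simplified to matching the two highest clinical scales."""
    top2 = set(np.argsort(profile)[-2:])
    for name, proto in codetypes.items():
        if set(np.argsort(proto)[-2:]) == top2:
            return name
    return None

def best_fit_match(profile, codetypes):
    """'Best fit' rule: the database code type whose profile correlates most highly."""
    return max(codetypes, key=lambda name: np.corrcoef(profile, codetypes[name])[0, 1])

# invented prototype code-type profiles over 8 scales (T-scores) and one test profile
codetypes = {"2-7": np.array([55, 78, 60, 58, 52, 60, 75, 57]),
             "4-9": np.array([50, 55, 52, 80, 54, 56, 58, 79])}
profile = np.array([53, 74, 62, 60, 50, 63, 72, 55])
print(highest_scale_match(profile, codetypes), best_fit_match(profile, codetypes))
```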
Development and In Vitro Bioactivity Profiling of Alternative Sustainable Nanomaterials
Sustainable, environmentally benign nanomaterials (NMs) are being designed as alternatives based on functionality to conventional metal-based nanomaterials (NMs) in order to minimize potential risk to human health and the environment. Development of rapid methods to evaluate the ...
Expectation maximization for hard X-ray count modulation profiles
NASA Astrophysics Data System (ADS)
Benvenuto, F.; Schwartz, R.; Piana, M.; Massone, A. M.
2013-07-01
Context. This paper is concerned with the image reconstruction problem when the measured data are solar hard X-ray modulation profiles obtained from the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) instrument. Aims: Our goal is to demonstrate that a statistical iterative method classically applied to the image deconvolution problem is very effective when utilized to analyze count modulation profiles in solar hard X-ray imaging based on rotating modulation collimators. Methods: The algorithm described in this paper solves the maximum likelihood problem iteratively and encodes a positivity constraint into the iterative optimization scheme. The result is therefore a classical expectation maximization method this time applied not to an image deconvolution problem but to image reconstruction from count modulation profiles. The technical reason that makes our implementation particularly effective in this application is the use of a very reliable stopping rule which is able to regularize the solution providing, at the same time, a very satisfactory Cash-statistic (C-statistic). Results: The method is applied to both reproduce synthetic flaring configurations and reconstruct images from experimental data corresponding to three real events. In this second case, the performance of expectation maximization, when compared to Pixon image reconstruction, shows a comparable accuracy and a notably reduced computational burden; when compared to CLEAN, shows a better fidelity with respect to the measurements with a comparable computational effectiveness. Conclusions: If optimally stopped, expectation maximization represents a very reliable method for image reconstruction in the RHESSI context when count modulation profiles are used as input data.
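The positivity-preserving EM update for Poisson count data can be sketched as a multiplicative (Richardson-Lucy style) iteration; the modulation matrix and source below are synthetic, and the C-statistic stopping rule described in the abstract is replaced by a fixed iteration count.

```python
import numpy as np

def em_reconstruct(H, counts, n_iter=200):
    """Multiplicative EM for Poisson data: counts ~ Poisson(H @ image).
    Positivity is preserved by the multiplicative update; here the stopping
    rule is a fixed iteration count rather than the C-statistic criterion."""
    img = np.full(H.shape[1], counts.mean() / H.shape[1])
    norm = H.sum(axis=0)
    for _ in range(n_iter):
        model = H @ img
        img *= (H.T @ (counts / np.maximum(model, 1e-12))) / np.maximum(norm, 1e-12)
    return img

# toy modulation matrix (cosine modulation patterns) and synthetic counts from two point sources
rng = np.random.default_rng(1)
H = 0.5 * (1 + np.cos(np.outer(np.arange(64), np.linspace(0, 2 * np.pi, 32))))
true = np.zeros(32); true[[8, 20]] = [50.0, 30.0]
counts = rng.poisson(H @ true).astype(float)
print("reconstructed peaks near pixels:", np.sort(np.argsort(em_reconstruct(H, counts))[-2:]))
```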
Towards a Quality Assessment Method for Learning Preference Profiles in Negotiation
NASA Astrophysics Data System (ADS)
Hindriks, Koen V.; Tykhonov, Dmytro
In automated negotiation, information gained about an opponent's preference profile by means of learning techniques may significantly improve an agent's negotiation performance. It therefore is useful to gain a better understanding of how various negotiation factors influence the quality of learning. The quality of learning techniques in negotiation are typically assessed indirectly by means of comparing the utility levels of agreed outcomes and other more global negotiation parameters. An evaluation of learning based on such general criteria, however, does not provide any insight into the influence of various aspects of negotiation on the quality of the learned model itself. The quality may depend on such aspects as the domain of negotiation, the structure of the preference profiles, the negotiation strategies used by the parties, and others. To gain a better understanding of the performance of proposed learning techniques in the context of negotiation and to be able to assess the potential to improve the performance of such techniques a more systematic assessment method is needed. In this paper we propose such a systematic method to analyse the quality of the information gained about opponent preferences by learning in single-instance negotiations. The method includes measures to assess the quality of a learned preference profile and proposes an experimental setup to analyse the influence of various negotiation aspects on the quality of learning. We apply the method to a Bayesian learning approach for learning an opponent's preference profile and discuss our findings.
Evaluation and Application of Satellite-Based Latent Heating Profile Estimation Methods
NASA Technical Reports Server (NTRS)
Olson, William S.; Grecu, Mircea; Yang, Song; Tao, Wei-Kuo
2004-01-01
In recent years, methods for estimating atmospheric latent heating vertical structure from both passive and active microwave remote sensing have matured to the point where quantitative evaluation of these methods is the next logical step. Two approaches for heating algorithm evaluation are proposed: First, application of heating algorithms to synthetic data, based upon cloud-resolving model simulations, can be used to test the internal consistency of heating estimates in the absence of systematic errors in physical assumptions. Second, comparisons of satellite-retrieved vertical heating structures to independent ground-based estimates, such as rawinsonde-derived analyses of heating, provide an additional test. The two approaches are complementary, since systematic errors in heating indicated by the second approach may be confirmed by the first. A passive microwave and combined passive/active microwave heating retrieval algorithm are evaluated using the described approaches. In general, the passive microwave algorithm heating profile estimates are subject to biases due to the limited vertical heating structure information contained in the passive microwave observations. These biases may be partly overcome by including more environment-specific a priori information into the algorithm's database of candidate solution profiles. The combined passive/active microwave algorithm utilizes the much higher-resolution vertical structure information provided by spaceborne radar data to produce less biased estimates; however, the global spatio-temporal sampling by spaceborne radar is limited. In the present study, the passive/active microwave algorithm is used to construct a more physically-consistent and environment-specific set of candidate solution profiles for the passive microwave algorithm and to help evaluate errors in the passive algorithm's heating estimates. Although satellite estimates of latent heating are based upon instantaneous, footprint-scale data, suppression of random errors requires averaging to at least half-degree resolution. Analysis of mesoscale and larger space-time scale phenomena based upon passive and passive/active microwave heating estimates from TRMM, SSMI, and AMSR data will be presented at the conference.
NASA Astrophysics Data System (ADS)
Zasova, L.; Formisano, V.; Grassi, D.; Igantiev, N.; Moroz, V.
Thermal IR spectrometry is one of the methods for investigating the Martian atmosphere below 55 km. The temperature profiles retrieved from the 15 μm CO2 band may be used for the MIRA database. This approach gives a vertical resolution of several kilometers and an accuracy of several Kelvins. The aerosol abundance, which influences the temperature profiles, is obtained from the continuum of the same spectrum and is taken into account in the temperature retrieval procedure in a self-consistent way. Although this method has limited vertical resolution, it has some advantages. For example, the radio occultation method gives temperature profiles with higher vertical resolution, but the radio observations are sparse in space and local time. Direct measurements, which give the most accurate results, provide temperature profiles only for a few chosen points (landing sites). Thermal IR spectrometry is in fact the only method that allows monitoring of temperature profiles with good coverage in both space and local time. The first measurements of this kind were made by IRIS, installed on board Mariner 9. This spectrometer had a rather high spectral resolution (2.4 cm-1). Temperature profiles versus local time were retrieved for different latitudes and seasons, including dust storm conditions, the north polar night, and the Tharsis volcanoes. The obtained temperature profiles have been compared with temperature profiles for the same conditions taken from the Climate Data Base (European GCM). The Planetary Fourier Spectrometer onboard Mars Express (planned to be launched in 2003) has a spectral range of 1.2-45 μm and a spectral resolution of 1.5 cm-1. Temperature retrieval is one of the main scientific goals of the experiment. It opens the possibility of obtaining a series of temperature profiles for different conditions, which can later be used in producing the MIRA database.
Gerhardt, Natalie; Birkenmeier, Markus; Schwolow, Sebastian; Rohn, Sascha; Weller, Philipp
2018-02-06
This work describes a simple approach for the untargeted profiling of volatile compounds for the authentication of the botanical origins of honey based on resolution-optimized HS-GC-IMS combined with optimized chemometric techniques, namely PCA, LDA, and kNN. A direct comparison of the PCA-LDA models between the HS-GC-IMS and 1H NMR data demonstrated that HS-GC-IMS profiling could be used as a complementary tool to NMR-based profiling of honey samples. Whereas NMR profiling still requires comparatively precise sample preparation, pH adjustment in particular, HS-GC-IMS fingerprinting may be considered an alternative approach for a truly fully automatable, cost-efficient, and in particular highly sensitive method. It was demonstrated that all tested honey samples could be distinguished on the basis of their botanical origins. Loading plots revealed the volatile compounds responsible for the differences among the monofloral honeys. The HS-GC-IMS-based PCA-LDA model was composed of two linear functions of discrimination and 10 selected PCs that discriminated canola, acacia, and honeydew honeys with a predictive accuracy of 98.6%. Application of the LDA model to an external test set of 10 authentic honeys clearly proved the high predictive ability of the model by correctly classifying them into three variety groups with 100% correct classifications. The constructed model presents a simple and efficient method of analysis and may serve as a basis for the authentication of other food types.
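A PCA-LDA pipeline of the kind described can be assembled directly in scikit-learn; the fingerprints below are random stand-ins for HS-GC-IMS data, so only the modelling structure (10 PCs feeding an LDA) mirrors the paper.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# synthetic stand-in for HS-GC-IMS fingerprints: 60 honeys x 500 signal features, 3 botanical origins
rng = np.random.default_rng(0)
y = np.repeat([0, 1, 2], 20)
X = rng.normal(size=(60, 500)) + y[:, None] * rng.normal(size=500) * 0.3

# 10 principal components feeding a linear discriminant model
model = make_pipeline(StandardScaler(), PCA(n_components=10), LinearDiscriminantAnalysis())
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```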
Nurius, Paula S; Macy, Rebecca J
2010-06-01
Violence researchers have called for the use of person-oriented methods to understand differences that have been found in biopsychosocial consequences among those who experience intimate partner violence (IPV). To address this issue, we apply a person-oriented statistical method, latent profile analysis (LPA), to test for meaningful subgroups of a sample of 448 battered women based on participants' appraisals of their vulnerability relative to their violent partner, depressive symptoms, physical injuries, overall physical health functioning, and their positive and negative social relationships with friends and family. The LPA established five significantly distinct subgroups. Using MANOVA, we examined these subgroups and their respective IPV exposure, both concomitant and separate incidents within the past year. Those with the most intensive violence exposure show the greatest level of challenge and impairment. However, the groups with comparable levels of IPV exposure manifest distinctly different configurations of biopsychosocial profiles, indicating a need for adaptive interventions commensurate with these profiles. We discuss the implications these findings have for developing adaptive interventions for battered women, as well as the potential utility of person-oriented tools for violence researchers.
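Latent profile analysis on continuous indicators can be approximated with a diagonal-covariance Gaussian mixture whose number of classes is chosen by BIC; the indicator data below are simulated, so this only illustrates the workflow, not the study's five-class solution.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# simulated stand-in for the study's indicators (appraisals, depressive symptoms,
# injuries, health functioning, social relations): three well-separated subgroups
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 1.0, size=(150, 6)) for m in (0.0, 1.5, 3.0)])

# fit mixtures with 1..8 classes and keep the BIC-best solution, as an LPA-like workflow
fits = [GaussianMixture(k, covariance_type='diag', random_state=0).fit(X) for k in range(1, 9)]
best = min(fits, key=lambda g: g.bic(X))
print("classes selected by BIC:", best.n_components)
print("class sizes:", np.bincount(best.predict(X)))
```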
Jasińska-Stroschein, Magdalena; Kurczewska, Urszula; Orszulak-Michalak, Daria
2017-05-01
When performing in vitro dissolution testing, especially in the area of biowaivers, it is necessary to follow regulatory guidelines to minimize the risk of an unsafe or ineffective product being approved. The present study examines model-independent and model-dependent methods of comparing dissolution profiles, comparing and contrasting various international guidelines. Dissolution profiles for immediate-release solid oral dosage forms were generated. The test material comprised tablets containing several substances, with at least 85% of the labeled amount dissolved within 15 min, 20-30 min, or 45 min. Dissolution profile similarity can vary with regard to the following criteria: time point selection (including the last time point), coefficient of variation, and statistical method selection. Variation between regulatory guidance and statistical methods can raise methodological questions and potentially result in a different outcome when reporting dissolution profile testing. The harmonization of existing guidelines would address existing problems concerning the interpretation of regulatory recommendations and research findings. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
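One widely used model-independent criterion in this area is the similarity factor f2; a short sketch, with made-up percent-dissolved profiles, shows how it is computed and read against the usual f2 >= 50 similarity threshold.

```python
import numpy as np

def f2_similarity(ref, test):
    """Model-independent similarity factor:
    f2 = 50 * log10(100 / sqrt(1 + mean squared difference)).
    Profiles are usually considered similar when f2 >= 50."""
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    msd = np.mean((ref - test) ** 2)
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

# toy percent-dissolved profiles at common sampling times (min)
reference = [35, 62, 84, 95]
candidate = [30, 58, 80, 93]
print(round(f2_similarity(reference, candidate), 1))   # ~69.7, i.e. similar under the f2 >= 50 rule
```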
Lamsa, Anne; Lopez-Garrido, Javier; Quach, Diana; Riley, Eammon P; Pogliano, Joe; Pogliano, Kit
2016-08-19
Increasing antimicrobial resistance has become a major public health crisis. New antimicrobials with novel mechanisms of action (MOA) are desperately needed. We previously developed a method, bacterial cytological profiling (BCP), which utilizes fluorescence microscopy to rapidly identify the MOA of antimicrobial compounds. BCP is based upon our discovery that cells treated with antibiotics affecting different metabolic pathways generate different cytological signatures, providing quantitative information that can be used to determine a compound's MOA. Here, we describe a system, rapid inhibition profiling (RIP), for creating cytological profiles of new antibiotic targets for which there are currently no chemical inhibitors. RIP consists of the fast, inducible degradation of a target protein followed by BCP. We demonstrate that degrading essential proteins in the major metabolic pathways for DNA replication, transcription, fatty acid biosynthesis, and peptidoglycan biogenesis in Bacillus subtilis rapidly produces cytological profiles closely matching that of antimicrobials targeting the same pathways. Additionally, RIP and antibiotics targeting different steps in fatty acid biosynthesis can be differentiated from each other. We utilize RIP and BCP to show that the antibacterial MOA of four nonsteroidal anti-inflammatory antibiotics differs from that proposed based on in vitro data. RIP is a versatile method that will extend our knowledge of phenotypes associated with inactivating essential bacterial enzymes and thereby allow for screening for molecules that inhibit novel essential targets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zafar, A., E-mail: zafara@ornl.gov; Oak Ridge National Laboratory, Oak Ridge, Tennessee 37830; Martin, E. H.
2016-11-15
An electron density diagnostic (≥10¹⁰ cm⁻³) capable of high temporal (ms) and spatial (mm) resolution is currently under development at Oak Ridge National Laboratory. The diagnostic is based on measuring the Stark-broadened, Doppler-free spectral line profile of the n = 6–2 hydrogen Balmer series transition. The profile is then fit to a fully quantum mechanical model including the appropriate electric and magnetic field operators. The quasi-static approach used to calculate the Doppler-free spectral line profile is outlined here, and results from the model are presented for H-δ spectra for electron densities of 10¹⁰–10¹³ cm⁻³. The profile shows complex behavior due to the interaction between the magnetic substates of the atom.
Advantages and Pitfalls of Mass Spectrometry Based Metabolome Profiling in Systems Biology.
Aretz, Ina; Meierhofer, David
2016-04-27
Mass spectrometry-based metabolome profiling has become the method of choice in systems biology approaches and aims to enhance biological understanding of complex biological systems. Genomics, transcriptomics, and proteomics are well established technologies and are commonly used by many scientists. In comparison, metabolomics is an emerging field and has not reached the same throughput, routine use, and coverage as other omics technologies. Nevertheless, substantial improvements have been achieved in recent years. Integrated data derived from multi-omics approaches will provide a deeper understanding of entire biological systems. Metabolome profiling is mainly hampered by metabolite diversity, variation of metabolite concentrations over several orders of magnitude, and biological data interpretation. Thus, multiple approaches are required to cover most of the metabolites. No software tool is yet capable of comprehensively translating all the data into a biologically meaningful context. In this review, we discuss the advantages of metabolome profiling and the main obstacles limiting progress in systems biology.
Retrieval of the Nitrous Oxide Profiles using the AIRS Data in China
NASA Astrophysics Data System (ADS)
Chen, L.; Ma, P.; Tao, J.; Li, X.; Zhang, Y.; Wang, Z.; Li, S.; Xiong, X.
2014-12-01
Nitrous oxide (N2O) is an important greenhouse gas and ozone-depleting substance, with a 100-year global warming potential almost 300 times higher than that of carbon dioxide. However, there are still large uncertainties about quantitative N2O emissions and their feedback to climate change because the ground-based observation network is sparse. This study attempts to retrieve N2O profiles from Atmospheric InfraRed Sounder (AIRS) data. First, the sensitivity to atmospheric temperature and humidity profiles and surface parameters in two spectral absorption bands was simulated using a radiative transfer model. Second, an eigenvector regression algorithm is used to construct the a priori state. Third, an optimal estimation method was developed based on the selection of N2O bands. Finally, we compared our retrieved AIRS profiles with HIPPO data and analyzed the seasonal and annual N2O distribution in China from 2004 to 2013.
Lin, Long-Ze; Harnly, James M
2008-11-12
A screening method using LC-DAD-ESI/MS was developed for the identification of common hydroxycinnamoylquinic acids based on direct comparison with standards. A complete standard set for mono-, di-, and tricaffeoylquinic isomers was assembled from commercially available standards, positively identified compounds in common plants (artichokes, asparagus, coffee bean, honeysuckle flowers, sweet potato, and Vernonia amygdalina leaves) and chemically modified standards. Four C18 reversed phase columns were tested using the standardized profiling method (based on LC-DAD-ESI/MS) for 30 phenolic compounds, and their elution order and retention times were evaluated. Using only two columns under standardized LC condition and the collected phenolic compound database, it was possible to separate all of the hydroxycinnamoylquinic acid conjugates and to identify 28 and 18 hydroxycinnamoylquinic acids in arnica flowers (Arnica montana L.) and burdock roots (Arctium lappa L.), respectively. Of these, 22 are reported for the first time.
A laser based frequency modulated NL-OSL phenomenon
NASA Astrophysics Data System (ADS)
Mishra, D. R.; Bishnoi, A. S.; Soni, Anuj; Rawat, N. S.; Bhatt, B. C.; Kulkarni, M. S.; Babu, D. A. R.
2015-01-01
A detailed theoretical and experimental treatment of the novel pulse frequency modulated stimulation (PFMS) technique is described for the NL-OSL phenomenon. The method involves pulsed frequency modulation with respect to time at a fixed pulse width of 532 nm continuous wave (CW) laser light. The linearly modulated (LM) and non-linearly modulated (NL) stimulation profiles have been generated using a fast electromagnetic optical shutter. The PFMS parameters have been determined for the present experimental setup. PFMS-based LM- and NL-OSL studies have been carried out on dosimetry-grade single crystal α-Al2O3:C. The photoionization cross section of α-Al2O3:C has been found to be ∼9.97 × 10^−19 cm^2 for 532 nm laser light using PFMS LM-OSL studies under the assumption of first-order kinetics. PFMS is thus found to be a potential alternative for generating different stimulation profiles using CW light sources.
Activity-based protein profiling: from enzyme chemistry to proteomic chemistry.
Cravatt, Benjamin F; Wright, Aaron T; Kozarich, John W
2008-01-01
Genome sequencing projects have provided researchers with a complete inventory of the predicted proteins produced by eukaryotic and prokaryotic organisms. Assignment of functions to these proteins represents one of the principal challenges for the field of proteomics. Activity-based protein profiling (ABPP) has emerged as a powerful chemical proteomic strategy to characterize enzyme function directly in native biological systems on a global scale. Here, we review the basic technology of ABPP, the enzyme classes addressable by this method, and the biological discoveries attributable to its application.
NASA Astrophysics Data System (ADS)
Loughman, Robert; Bhartia, Pawan K.; Chen, Zhong; Xu, Philippe; Nyaku, Ernest; Taha, Ghassan
2018-05-01
The theoretical basis of the Ozone Mapping and Profiler Suite (OMPS) Limb Profiler (LP) Version 1 aerosol extinction retrieval algorithm is presented. The algorithm uses an assumed bimodal lognormal aerosol size distribution to retrieve aerosol extinction profiles at 675 nm from OMPS LP radiance measurements. A first-guess aerosol extinction profile is updated by iteration using the Chahine nonlinear relaxation method, based on comparisons between the measured radiance profile at 675 nm and the radiance profile calculated by the Gauss-Seidel limb-scattering (GSLS) radiative transfer model for a spherical-shell atmosphere. This algorithm is discussed in the context of previous limb-scattering aerosol extinction retrieval algorithms, and the most significant error sources are enumerated. The retrieval algorithm is limited primarily by uncertainty about the aerosol phase function. Horizontal variations in aerosol extinction, which violate the spherical-shell atmosphere assumed in the version 1 algorithm, may also limit the quality of the retrieved aerosol extinction profiles significantly.
A field-to-desktop toolchain for X-ray CT densitometry enables tree ring analysis
De Mil, Tom; Vannoppen, Astrid; Beeckman, Hans; Van Acker, Joris; Van den Bulcke, Jan
2016-01-01
Background and Aims Disentangling tree growth requires more than ring width data alone. Densitometry is considered a valuable proxy, yet laborious wood sample preparation and lack of dedicated software limit the widespread use of density profiling for tree ring analysis. An X-ray computed tomography-based toolchain for tree increment cores is presented, which results in profile data sets suitable for visual exploration as well as density-based pattern matching. Methods Two temperate (Quercus petraea, Fagus sylvatica) and one tropical species (Terminalia superba) were used for density profiling using an X-ray computed tomography facility with custom-made sample holders and dedicated processing software. Key Results Density-based pattern matching is developed and is able to detect anomalies in ring series that can be corrected via interactive software. Conclusions A digital workflow allows generation of structure-corrected profiles of large sets of cores in a short time span that provide sufficient intra-annual density information for tree ring analysis. Furthermore, visual exploration of such data sets is of high value. The dated profiles can be used for high-resolution chronologies and also offer opportunities for fast screening of lesser studied tropical tree species. PMID:27107414
A novel method for fabrication of continuous-relief optical elements
NASA Astrophysics Data System (ADS)
Guo, Xiaowei; Du, Jinglei; Chen, Mingyong; Ma, Yanqin; Zhu, Jianhua; Peng, Qinjun; Guo, Yongkang; Du, Chunlei
2005-08-01
A novel method for the fabrication of continuous-relief micro-optical components is presented in this paper. It employs a computer-controlled spatial light modulator (SLM) as a switchable projection mask and silver-halide sensitized gelatin (SHSG) as the recording material. By etching the SHSG with an enzyme solution, micro-optical components with relief modulation can be generated through special processing procedures. The principles of digital SLM-based lithography and enzyme etching of SHSG are discussed in detail, and microlens arrays, micro axicon-lens arrays, and gratings with good profiles were achieved. The method is simple and inexpensive, and aberrations introduced during processing can be corrected in situ at the mask-design step, so it is a practical method for fabricating continuous-relief profiles in low-volume production.
A Direct Cell Quenching Method for Cell-Culture Based Metabolomics
A crucial step in metabolomic analysis of cellular extracts is the cell quenching process. The conventional method first uses trypsin to detach cells from their growth surface. This inevitably changes the profile of cellular metabolites since the detachment of cells from the extr...
Monakhova, Yulia B; Mushtakova, Svetlana P
2017-05-01
A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and the ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on the analysis of vitamins and caffeine in energy drinks and of aromatic hydrocarbons in motor fuel, with errors within 10%. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
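As a rough illustration of this ICA-plus-calibration idea, the sketch below resolves synthetic two-component UV-vis-like spectra with scikit-learn's FastICA and then regresses known concentrations on the resolved component scores, so that an "unknown" mixture can be quantified without reference solutions. The spectral shapes, concentrations, and noise level are invented for the example and are not the authors' data or calibration design.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
wavelengths = np.linspace(200, 400, 201)

def band(center, width):
    # Idealized pure-component absorption band (synthetic, for illustration only).
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

pure = np.vstack([band(260, 15), band(280, 20)])         # two overlapping "analytes"
conc_cal = rng.uniform(0.1, 1.0, size=(12, 2))           # known calibration concentrations
spectra_cal = conc_cal @ pure + rng.normal(0, 1e-3, (12, wavelengths.size))

# Step 1: ICA resolves the mixed calibration spectra into per-sample component scores.
ica = FastICA(n_components=2, random_state=0)
scores_cal = ica.fit_transform(spectra_cal)

# Step 2: regress the known concentrations on the ICA scores (the quantitative model).
design = np.column_stack([scores_cal, np.ones(len(scores_cal))])
coef, *_ = np.linalg.lstsq(design, conc_cal, rcond=None)

# Step 3: quantify an "unknown" mixture without reference solutions.
conc_true = np.array([[0.4, 0.7]])
spectrum_unknown = conc_true @ pure + rng.normal(0, 1e-3, (1, wavelengths.size))
scores_unknown = ica.transform(spectrum_unknown)
conc_pred = np.column_stack([scores_unknown, np.ones(1)]) @ coef
print("true:", conc_true.ravel(), "predicted:", np.round(conc_pred.ravel(), 3))
```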
Image analysis of pubic bone for age estimation in a computed tomography sample.
López-Alcaraz, Manuel; González, Pedro Manuel Garamendi; Aguilera, Inmaculada Alemán; López, Miguel Botella
2015-03-01
Radiology has demonstrated great utility for age estimation, but most studies are based on metrical and morphological methods for building an identification profile. A simple image analysis-based method is presented, aimed at correlating bony tissue ultrastructure with several variables obtained from the grey-level histogram (GLH) of computed tomography (CT) sagittal sections of the pubic symphysis surface and the pubic body, and relating them to age. The CT sample consisted of 169 hospital Digital Imaging and Communications in Medicine (DICOM) archives of known sex and age. The calculated multiple regression models showed a maximum R^2 of 0.533 for females and 0.726 for males, with high intra- and inter-observer agreement. The suggested method is useful not only for building an identification profile during virtopsy, but also for further studies seeking a quantitative correlate of tissue ultrastructure characteristics without complex and expensive methods beyond image analysis.
An Effective News Recommendation Method for Microblog User
Gu, Wanrong; Dong, Shoubin; Zeng, Zhizhao; He, Jinchao
2014-01-01
Recommending news stories to users based on their preferences has long been a favourite domain for recommender systems research. Traditional systems strive to satisfy their users by tracing their reading history and choosing suitable candidate news articles to recommend. However, most news websites do not require users to register before reading news. Moreover, the latent relations between news and microblogs, the popularity of particular news items, and news organization are not addressed or solved efficiently by previous approaches. To solve these issues, we propose an effective personalized news recommendation method based on microblog user profile building and subclass popularity prediction, in which we propose a news organization method using hybrid classification and clustering, implement a subclass popularity prediction method, and construct user profiles accordingly. We designed several experiments comparing our method with state-of-the-art approaches on a real-world dataset, and the experimental results demonstrate that our system significantly improves accuracy and diversity on mass text data. PMID:24983011
González, Nerea; Iloro, Ibon; Durán, Juan A.; Elortza, Félix
2012-01-01
Purpose To characterize the tear film peptidome and low molecular weight protein profiles of healthy control individuals, and to evaluate changes due to day-to-day and individual variation and tear collection methods, by using solid phase extraction coupled to matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) profiling. Methods The tear protein profiles of six healthy volunteers were analyzed over seven days and inter-day and inter-individual variability was evaluated. The bilaterality of tear film and the effect of tear collection methods on protein profiles were also analyzed in some of these patients. MALDI-TOF MS analyses were performed on tear samples purified by using a solid phase extraction (SPE) method based on C18 functionalized magnetic beads for peptide and low molecular weight protein enrichment, focusing spectra acquisition on the 1 to 20 kDa range. Spectra were analyzed using principal component analysis (PCA) with MultiExperiment Viewer (TMeV) software. Volunteers were examined in terms of tear production status (Schirmer I test), clinical assessment of palpebral lids and meibomian glands, and a subjective OSD questionnaire before tear collection by a glass micro-capillary. Results Analysis of peptides and proteins in the 1–20 kDa range showed no significant inter-day differences in tear samples collected from six healthy individuals during seven days of monitoring, but revealed subtle intrinsic inter-individual differences. Profile analyses of tears collected from the right and left eyes confirmed tear bilaterality in four healthy patients. The addition of physiologic serum for tear sample collection did not affect the peptide and small protein profiles with respect to the number of resolved peaks, but it did reduce the signal intensity of the peaks, and increased variability. Magnetic beads were found to be a suitable method for tear film purification for the profiling study. Conclusions No significant variability in tear peptide and protein profiles below 20 kDa was found in healthy controls over a seven day period, nor in right versus left eye profiles from the same individual. Subtle inter-individual differences can be observed upon tear profiling analysis and confirm intrinsic variability between control subjects. Addition of physiologic serum for tear collection affects the proteome and peptidome in terms of peak intensities, but not in the composition of the profiles themselves. This work shows that MALDI-TOF MS coupled with C18 magnetic beads is an effective and reproducible methodology for tear profiling studies in the clinical monitoring of patients. PMID:22736947
Controller for thermostatically controlled loads
Lu, Ning; Zhang, Yu; Du, Pengwei; Makarov, Yuri V.
2016-06-07
A system and method of controlling aggregated thermostatically controlled appliances (TCAs) for demand response is disclosed. A targeted load profile is formulated and a forecasted load profile is generated. The TCAs within an "on" or "off" control group are prioritized based on their operating temperatures. The "on" or "off" status of the TCAs is determined. Command signals are sent to turn on or turn off the TCAs.
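The prioritized on/off dispatch described in this abstract can be sketched with a toy cooling fleet: units are ranked by how far their room temperature sits above the setpoint, and just enough units are toggled to move the aggregate power toward the target. All parameters below are hypothetical, and the logic is only a simplified reading of the idea, not the patent's actual control law.

```python
from dataclasses import dataclass

@dataclass
class TCA:
    name: str
    temp: float        # current room temperature (deg C)
    setpoint: float    # thermostat setpoint (deg C)
    power_kw: float    # power drawn when running
    on: bool

def dispatch(tcas, target_kw):
    """Toggle cooling TCAs so aggregate power tracks target_kw.

    Units closest to their setpoint are shed first; the warmest idle
    units are turned on first, which preserves occupant comfort."""
    current_kw = sum(t.power_kw for t in tcas if t.on)
    if current_kw > target_kw:
        # Shed load: turn off the running units with the smallest temperature margin first.
        for t in sorted((t for t in tcas if t.on), key=lambda t: t.temp - t.setpoint):
            if current_kw <= target_kw:
                break
            t.on = False
            current_kw -= t.power_kw
    else:
        # Add load: turn on the warmest idle units first, without overshooting the target.
        for t in sorted((t for t in tcas if not t.on),
                        key=lambda t: t.temp - t.setpoint, reverse=True):
            if current_kw + t.power_kw > target_kw:
                break
            t.on = True
            current_kw += t.power_kw
    return current_kw

fleet = [TCA("a", 24.8, 23.0, 3.0, True), TCA("b", 23.2, 23.0, 3.0, True),
         TCA("c", 25.5, 23.0, 3.0, False), TCA("d", 23.9, 23.0, 3.0, True)]
print(dispatch(fleet, target_kw=6.0), [t.name for t in fleet if t.on])
```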
ERIC Educational Resources Information Center
Barbot, Baptiste; Haeffel, Gerald J.; Macomber, Donna; Hart, Lesley; Chapman, John; Grigorenko, Elena L.
2012-01-01
The "Delinquency Reduction Outcome Profile" ("DROP") is a novel situational-judgment test (SJT) designed to measure social decision making in delinquent youth. The DROP includes both a typical SJT scoring method, which captures the deviation of an individual response from an "ideal" expert-based response pattern, as well as a novel…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barraclough, Brendan; Lebron, Sharon; Li, Jonathan G.
2016-05-15
Purpose: To investigate the geometry dependence of the detector response function (DRF) of three commonly used scanning ionization chambers and its impact on a convolution-based method to address the volume averaging effect (VAE). Methods: A convolution-based approach has been proposed recently to address the ionization chamber VAE. It simulates the VAE in the treatment planning system (TPS) by iteratively convolving the calculated beam profiles with the DRF while optimizing the beam model. Since the convolved and the measured profiles are subject to the same VAE, the calculated profiles match the implicit “real” ones when the optimization converges. Three DRFs (Gaussian, Lorentzian, and parabolic function) were used for three ionization chambers (CC04, CC13, and SNC125c) in this study. Geometry dependent/independent DRFs were obtained by minimizing the difference between the ionization chamber-measured profiles and the diode-measured profiles convolved with the DRFs. These DRFs were used to obtain eighteen beam models for a commercial TPS. Accuracy of the beam models was evaluated by assessing the 20%–80% penumbra width difference (PWD) between the computed and diode-measured beam profiles. Results: The convolution-based approach was found to be effective for all three ionization chambers with significant improvement for all beam models. Up to 17% geometry dependence of the three DRFs was observed for the studied ionization chambers. With geometry dependent DRFs, the PWD was within 0.80 mm for the parabolic function and CC04 combination and within 0.50 mm for other combinations; with geometry independent DRFs, the PWD was within 1.00 mm for all cases. When using the Gaussian function as the DRF, accounting for geometry dependence led to marginal improvement (PWD < 0.20 mm) for CC04; the improvement ranged from 0.38 to 0.65 mm for CC13; for SNC125c, the improvement was slightly above 0.50 mm. Conclusions: Although all three DRFs were found adequate to represent the response of the studied ionization chambers, the Gaussian function was favored due to its superior overall performance. The geometry dependence of the DRFs can be significant for clinical applications involving small fields such as stereotactic radiotherapy.
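The core convolution step can be reproduced in a few lines: a sharp calculated profile is convolved with a normalized Gaussian DRF, and the 20%–80% penumbra widths are compared before and after. The field size, grid spacing, and chamber response width below are illustrative assumptions rather than values from the study.

```python
import numpy as np
from scipy.special import erf

x = np.arange(-50.0, 50.0, 0.1)            # off-axis position (mm)
half_width = 25.0                          # hypothetical field edge at +/-25 mm
# "Calculated" (diode-like) profile with error-function edges.
profile = 0.5 * (erf((x + half_width) / 1.5) - erf((x - half_width) / 1.5))

sigma = 2.0                                # hypothetical chamber response width (mm)
xk = np.arange(-10.0, 10.0 + 1e-9, 0.1)    # kernel grid, odd length so it is centered
drf = np.exp(-0.5 * (xk / sigma) ** 2)
drf /= drf.sum()                           # normalize so convolution preserves signal level
chamber_like = np.convolve(profile, drf, mode="same")   # simulated chamber measurement

def penumbra_width(x, p):
    """20%-80% distance on the left (rising) field edge."""
    p = p / p.max()
    half = len(x) // 2
    def crossing(level):
        i = int(np.argmax(p[:half] >= level))
        return float(np.interp(level, p[i - 1:i + 1], x[i - 1:i + 1]))
    return crossing(0.8) - crossing(0.2)

print(f"penumbra, calculated profile : {penumbra_width(x, profile):.2f} mm")
print(f"penumbra, convolved with DRF : {penumbra_width(x, chamber_like):.2f} mm")
```

In the actual workflow the beam model would then be re-optimized until the convolved calculated profile matches the chamber-measured one, rather than simply reporting the widened penumbra.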
Radiofrequency pulse design using nonlinear gradient magnetic fields.
Kopanoglu, Emre; Constable, R Todd
2015-09-01
An iterative k-space trajectory and radiofrequency (RF) pulse design method is proposed for excitation using nonlinear gradient magnetic fields. The spatial encoding functions (SEFs) generated by nonlinear gradient fields are linearly dependent in Cartesian coordinates. Left uncorrected, this may lead to flip angle variations in excitation profiles. In the proposed method, SEFs (k-space samples) are selected using a matching pursuit algorithm, and the RF pulse is designed using a conjugate gradient algorithm. Three variants of the proposed approach are given: the full algorithm, a computationally cheaper version, and a third version for designing spoke-based trajectories. The method is demonstrated for various target excitation profiles using simulations and phantom experiments. The method is compared with other iterative (matching pursuit and conjugate gradient) and noniterative (coordinate-transformation and Jacobian-based) pulse design methods as well as uniform density spiral and EPI trajectories. The results show that the proposed method can increase excitation fidelity. An iterative method for designing k-space trajectories and RF pulses using nonlinear gradient fields is proposed. The method can either be used for selecting the SEFs individually to guide trajectory design, or can be adapted to design and optimize specific trajectories of interest. © 2014 Wiley Periodicals, Inc.
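The greedy selection step can be illustrated with a generic matching pursuit in one dimension: at each iteration the dictionary element most correlated with the current residual of the target profile is added and the residual is updated. The toy dictionary of Gaussian bumps below stands in for candidate encoding functions and is not an actual nonlinear-gradient SEF set.

```python
import numpy as np

x = np.linspace(-1, 1, 256)
target = np.where(np.abs(x) < 0.3, 1.0, 0.0)          # desired excitation profile (a slab)

# Toy dictionary of candidate encoding functions: Gaussian bumps of various centers/widths.
centers, widths = np.meshgrid(np.linspace(-0.9, 0.9, 25), [0.05, 0.1, 0.2])
atoms = np.exp(-0.5 * ((x[None, :] - centers.ravel()[:, None]) / widths.ravel()[:, None]) ** 2)
atoms /= np.linalg.norm(atoms, axis=1, keepdims=True)

def matching_pursuit(target, atoms, n_select=12):
    """Greedily pick the atom most correlated with the current residual."""
    residual, picks = target.astype(float).copy(), []
    for _ in range(n_select):
        corr = atoms @ residual
        k = int(np.argmax(np.abs(corr)))
        picks.append((k, corr[k]))
        residual = residual - corr[k] * atoms[k]
    return picks, residual

picks, residual = matching_pursuit(target, atoms)
print(f"selected {len({k for k, _ in picks})} distinct atoms; "
      f"residual norm {np.linalg.norm(residual):.2f} vs target norm {np.linalg.norm(target):.2f}")
```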
Jiang, Hongzhi; Zhao, Huijie; Li, Xudong; Quan, Chenggen
2016-03-07
We propose a novel hyper thin 3D edge measurement technique to measure the profile of 3D outer envelope of honeycomb core structures. The width of the edges of the honeycomb core is less than 0.1 mm. We introduce a triangular layout design consisting of two cameras and one projector to measure hyper thin 3D edges and eliminate data interference from the walls. A phase-shifting algorithm and the multi-frequency heterodyne phase-unwrapping principle are applied for phase retrievals on edges. A new stereo matching method based on phase mapping and epipolar constraint is presented to solve correspondence searching on the edges and remove false matches resulting in 3D outliers. Experimental results demonstrate the effectiveness of the proposed method for measuring the 3D profile of honeycomb core structures.
Nonparametric estimates of drift and diffusion profiles via Fokker-Planck algebra.
Lund, Steven P; Hubbard, Joseph B; Halter, Michael
2014-11-06
Diffusion processes superimposed upon deterministic motion play a key role in understanding and controlling the transport of matter, energy, momentum, and even information in physics, chemistry, material science, biology, and communications technology. Given functions defining these random and deterministic components, the Fokker-Planck (FP) equation is often used to model these diffusive systems. Many methods exist for estimating the drift and diffusion profiles from one or more identifiable diffusive trajectories; however, when many identical entities diffuse simultaneously, it may not be possible to identify individual trajectories. Here we present a method capable of simultaneously providing nonparametric estimates for both drift and diffusion profiles from evolving density profiles, requiring only the validity of Langevin/FP dynamics. This algebraic FP manipulation provides a flexible and robust framework for estimating stationary drift and diffusion coefficient profiles, is not based on fluctuation theory or solved diffusion equations, and may facilitate predictions for many experimental systems. We illustrate this approach on experimental data obtained from a model lipid bilayer system exhibiting free diffusion and electric field induced drift. The wide range over which this approach provides accurate estimates for drift and diffusion profiles is demonstrated through simulation.
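A much simpler toy version of reading transport coefficients off evolving density profiles (assuming constant drift and diffusion, unlike the authors' nonparametric treatment) is sketched below: an ensemble is propagated with Langevin dynamics, density histograms are taken at two times, and the drift and diffusion coefficient are recovered from the shift of the mean and the growth of the variance.

```python
import numpy as np

rng = np.random.default_rng(1)
v_true, D_true = 0.5, 0.2                 # drift (um/s) and diffusion coefficient (um^2/s)
dt, n_steps, n_particles = 2e-3, 1000, 20_000

# Langevin simulation: dx = v*dt + sqrt(2*D*dt)*xi, starting from a narrow Gaussian.
x = rng.normal(0.0, 0.1, n_particles)
snapshots = {}
for step in range(1, n_steps + 1):
    x = x + v_true * dt + np.sqrt(2 * D_true * dt) * rng.standard_normal(n_particles)
    if step in (250, 1000):
        snapshots[step * dt] = np.histogram(x, bins=200, range=(-3, 6), density=True)

def moments(hist):
    """Mean and variance of a density profile given as (values, bin_edges)."""
    p, edges = hist
    centers = 0.5 * (edges[:-1] + edges[1:])
    w = p / p.sum()
    mean = np.sum(w * centers)
    return mean, np.sum(w * (centers - mean) ** 2)

# Recover the coefficients from the two density profiles alone:
# the mean shifts as v*(t2 - t1), the variance grows as 2*D*(t2 - t1).
(t1, h1), (t2, h2) = sorted(snapshots.items())
(m1, s1), (m2, s2) = moments(h1), moments(h2)
print(f"drift    : true {v_true}, estimated {(m2 - m1) / (t2 - t1):.3f}")
print(f"diffusion: true {D_true}, estimated {(s2 - s1) / (2 * (t2 - t1)):.3f}")
```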
Leveraging cues from person-generated health data for peer matching in online communities
Hartzler, Andrea L; Taylor, Megan N; Park, Albert; Griffiths, Troy; Backonja, Uba; McDonald, David W; Wahbeh, Sam; Brown, Cory; Pratt, Wanda
2016-01-01
Objective Online health communities offer a diverse peer support base, yet users can struggle to identify suitable peer mentors as these communities grow. To facilitate mentoring connections, we designed a peer-matching system that automatically profiles and recommends peer mentors to mentees based on person-generated health data (PGHD). This study examined the profile characteristics that mentees value when choosing a peer mentor. Materials and Methods Through a mixed-methods user study, in which cancer patients and caregivers evaluated peer mentor recommendations, we examined the relative importance of four possible profile elements: health interests, language style, demographics, and sample posts. Playing the role of mentees, the study participants ranked mentors, then rated both the likelihood that they would hypothetically contact each mentor and the helpfulness of each profile element in helping them make that decision. We analyzed the participants’ ratings with linear regression and qualitatively analyzed participants’ feedback for emerging themes about choosing mentors and improving profile design. Results Of the four profile elements, only sample posts were a significant predictor for the likelihood of a mentee contacting a mentor. Communication cues embedded in posts were critical for helping the participants choose a compatible mentor. Qualitative themes offer insight into the interpersonal characteristics that mentees sought in peer mentors, including being knowledgeable, sociable, and articulate. Additionally, the participants emphasized the need for streamlined profiles that minimize the time required to choose a mentor. Conclusion Peer-matching systems in online health communities offer a promising approach for leveraging PGHD to connect patients. Our findings point to interpersonal communication cues embedded in PGHD that could prove critical for building mentoring relationships among the growing membership of online health communities. PMID:26911825
Taguchi, Y-H
2018-05-08
Even though coexistence of multiple phenotypes sharing the same genomic background is interesting, it remains incompletely understood. Epigenomic profiles may represent key factors, with unknown contributions to the development of multiple phenotypes, and social-insect castes are a good model for elucidation of the underlying mechanisms. Nonetheless, previous studies have failed to identify genes associated with aberrant gene expression and methylation profiles because of the lack of suitable methodology that can address this problem properly. A recently proposed principal component analysis (PCA)-based and tensor decomposition (TD)-based unsupervised feature extraction (FE) can solve this problem because these two approaches can deal with gene expression and methylation profiles even when a small number of samples is available. PCA-based and TD-based unsupervised FE methods were applied to the analysis of gene expression and methylation profiles in the brains of two social insects, Polistes canadensis and Dinoponera quadriceps. Genes associated with differential expression and methylation between castes were identified, and analysis of enrichment of Gene Ontology terms confirmed reliability of the obtained sets of genes from the biological standpoint. Biologically relevant genes, shown to be associated with significant differential gene expression and methylation between castes, were identified here for the first time. The identification of these genes may help understand the mechanisms underlying epigenetic control of development of multiple phenotypes under the same genomic conditions.
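A loose sketch of the PCA-based unsupervised feature-extraction idea is given below with synthetic expression data: genes are embedded by PCA, extreme standardized scores on a leading component are converted to chi-squared P values, and genes are selected by Benjamini-Hochberg correction. The exact recipe, component choice, and thresholds of the published method may differ; this is only an assumption-laden illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_genes, n_samples = 2000, 10
expr = rng.standard_normal((n_genes, n_samples))     # background expression
expr[:50, :5] += 3.0                                 # 50 genes shifted in the first 5 samples

# Embed genes with PCA (no class labels are used anywhere).
centered = expr - expr.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
gene_scores = u * s                                  # genes x components

# Take the leading component and flag genes whose standardized score is improbably
# large under a Gaussian null (chi-squared P values on the squared z-scores).
z = gene_scores[:, 0] / gene_scores[:, 0].std()
p = stats.chi2.sf(z ** 2, df=1)

# Benjamini-Hochberg selection at FDR 0.05.
order = np.argsort(p)
bh_line = 0.05 * np.arange(1, n_genes + 1) / n_genes
passed = p[order] <= bh_line
k = passed.nonzero()[0].max() + 1 if passed.any() else 0
selected = np.sort(order[:k])
print(f"{k} genes selected; first indices: {selected[:8]}")
```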
Portable Just-in-Time Specialization of Dynamically Typed Scripting Languages
NASA Astrophysics Data System (ADS)
Williams, Kevin; McCandless, Jason; Gregg, David
In this paper, we present a portable approach to JIT compilation for dynamically typed scripting languages. At runtime we generate ANSI C code and use the system's native C compiler to compile this code. The C compiler runs on a separate thread to the interpreter allowing program execution to continue during JIT compilation. Dynamic languages have variables which may change type at any point in execution. Our interpreter profiles variable types at both whole method and partial method granularity. When a frequently executed region of code is discovered, the compilation thread generates a specialized version of the region based on the profiled types. In this paper, we evaluate the level of instruction specialization achieved by our profiling scheme as well as the overall performance of our JIT.
Goldberg, Tony L; Gillespie, Thomas R; Singer, Randall S
2006-09-01
Repetitive-element PCR (rep-PCR) is a method for genotyping bacteria based on the selective amplification of repetitive genetic elements dispersed throughout bacterial chromosomes. The method has great potential for large-scale epidemiological studies because of its speed and simplicity; however, objective guidelines for inferring relationships among bacterial isolates from rep-PCR data are lacking. We used multilocus sequence typing (MLST) as a "gold standard" to optimize the analytical parameters for inferring relationships among Escherichia coli isolates from rep-PCR data. We chose 12 isolates from a large database to represent a wide range of pairwise genetic distances, based on the initial evaluation of their rep-PCR fingerprints. We conducted MLST with these same isolates and systematically varied the analytical parameters to maximize the correspondence between the relationships inferred from rep-PCR and those inferred from MLST. Methods that compared the shapes of densitometric profiles ("curve-based" methods) yielded consistently higher correspondence values between data types than did methods that calculated indices of similarity based on shared and different bands (maximum correspondences of 84.5% and 80.3%, respectively). Curve-based methods were also markedly more robust in accommodating variations in user-specified analytical parameter values than were "band-sharing coefficient" methods, and they enhanced the reproducibility of rep-PCR. Phylogenetic analyses of rep-PCR data yielded trees with high topological correspondence to trees based on MLST and high statistical support for major clades. These results indicate that rep-PCR yields accurate information for inferring relationships among E. coli isolates and that accuracy can be enhanced with the use of analytical methods that consider the shapes of densitometric profiles.
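The difference between curve-based and band-sharing comparisons can be made concrete: the former correlates whole densitometric traces, the latter counts matched band positions. The synthetic traces, peak widths, and matching tolerance below are arbitrary illustrations, not rep-PCR data.

```python
import numpy as np

pos = np.arange(0.0, 100.0, 0.1)          # gel migration coordinate (arbitrary units)

def trace(band_centers, width=0.8):
    """Synthetic densitometric profile: one Gaussian peak per band."""
    return sum(np.exp(-0.5 * ((pos - c) / width) ** 2) for c in band_centers)

bands_a = [12, 30, 31.5, 55, 78]
bands_b = [12, 31, 55, 80, 90]            # shifted, merged, and extra bands
prof_a, prof_b = trace(bands_a), trace(bands_b)

# Curve-based similarity: Pearson correlation of the whole densitometric profiles.
curve_similarity = np.corrcoef(prof_a, prof_b)[0, 1]

# Band-sharing (Dice) coefficient: two bands match if their centers agree within a tolerance.
def dice(b1, b2, tol=1.0):
    shared = sum(any(abs(x - y) <= tol for y in b2) for x in b1)
    return 2 * shared / (len(b1) + len(b2))

print(f"curve-based similarity      : {curve_similarity:.2f}")
print(f"band-sharing (Dice, tol=1.0): {dice(bands_a, bands_b):.2f}")
```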
Cruz-Monteagudo, Maykel; Borges, Fernanda; Cordeiro, M Natália D S; Cagide Fajin, J Luis; Morell, Carlos; Ruiz, Reinaldo Molina; Cañizares-Carmenate, Yudith; Dominguez, Elena Rosa
2008-01-01
Up to now, very few applications of multiobjective optimization (MOOP) techniques to quantitative structure-activity relationship (QSAR) studies have been reported in the literature. However, none of them report the optimization of objectives related directly to the final pharmaceutical profile of a drug. In this paper, a MOOP method based on Derringer's desirability function that allows conducting global QSAR studies, simultaneously considering the potency, bioavailability, and safety of a set of drug candidates, is introduced. The results of the desirability-based MOOP (the levels of the predictor variables concurrently producing the best possible compromise between the properties determining an optimal drug candidate) are used for the implementation of a ranking method that is also based on the application of desirability functions. This method allows ranking drug candidates with unknown pharmaceutical properties from combinatorial libraries according to the degree of similarity with the previously determined optimal candidate. Application of this method will make it possible to filter the most promising drug candidates of a library (the best-ranked candidates), which should have the best pharmaceutical profile (the best compromise between potency, safety and bioavailability). In addition, a validation method of the ranking process, as well as a quantitative measure of the quality of a ranking, the ranking quality index (Psi), is proposed. The usefulness of the desirability-based methods of MOOP and ranking is demonstrated by its application to a library of 95 fluoroquinolones, reporting their gram-negative antibacterial activity and mammalian cell cytotoxicity. Finally, the combined use of the desirability-based methods of MOOP and ranking proposed here seems to be a valuable tool for rational drug discovery and development.
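Derringer's desirability functions map each property onto a 0-1 scale and combine them with a geometric mean, after which candidates can be ranked by overall desirability. The sketch below uses made-up potency, cytotoxicity, and bioavailability numbers and simple one-sided desirability ramps; it is not the fluoroquinolone library or the Psi ranking-quality index from the paper.

```python
import numpy as np

def d_larger_is_better(y, low, high, s=1.0):
    """Derringer one-sided desirability: 0 below `low`, 1 above `high`."""
    return np.clip((y - low) / (high - low), 0.0, 1.0) ** s

def d_smaller_is_better(y, low, high, s=1.0):
    return np.clip((high - y) / (high - low), 0.0, 1.0) ** s

# Hypothetical candidates: (potency, cytotoxicity, oral bioavailability in %).
candidates = {
    "cand-1": (7.2, 4.1, 62.0),
    "cand-2": (8.0, 5.9, 48.0),
    "cand-3": (6.1, 3.8, 83.0),
}

def overall_desirability(potency, cytotox, bioavail):
    d1 = d_larger_is_better(potency, low=5.0, high=8.5)     # want high potency
    d2 = d_smaller_is_better(cytotox, low=3.5, high=6.0)    # want low cytotoxicity
    d3 = d_larger_is_better(bioavail, low=30.0, high=90.0)  # want high bioavailability
    return (d1 * d2 * d3) ** (1.0 / 3.0)                    # geometric mean

ranking = sorted(candidates, key=lambda k: overall_desirability(*candidates[k]), reverse=True)
for name in ranking:
    print(name, round(float(overall_desirability(*candidates[name])), 3))
```

The geometric mean makes the overall score collapse to zero whenever any single property is unacceptable, which is the usual argument for using it instead of an arithmetic mean.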
Temperature and pressure effects on capacitance probe cryogenic liquid level measurement accuracy
NASA Technical Reports Server (NTRS)
Edwards, Lawrence G.; Haberbusch, Mark
1993-01-01
The inaccuracies of liquid nitrogen and liquid hydrogen level measurements by use of a coaxial capacitance probe were investigated as a function of fluid temperatures and pressures. Significant liquid level measurement errors were found to occur due to the changes in the fluids' dielectric constants which develop over the operating temperature and pressure ranges of the cryogenic storage tanks. The level measurement inaccuracies can be reduced by using fluid dielectric correction factors based on measured fluid temperatures and pressures. The errors in the corrected liquid level measurements were estimated based on the reported calibration errors of the temperature and pressure measurement systems. Experimental liquid nitrogen (LN2) and liquid hydrogen (LH2) level measurements were obtained using the calibrated capacitance probe equations and also by the dielectric constant correction factor method. The liquid levels obtained by the capacitance probe for the two methods were compared with the liquid level estimated from the fluid temperature profiles. Results show that the dielectric constant corrected liquid levels agreed within 0.5 percent of the temperature profile estimated liquid level. The uncorrected dielectric constant capacitance liquid level measurements deviated from the temperature profile level by more than 5 percent. This paper identifies the magnitude of liquid level measurement error that can occur for LN2 and LH2 fluids due to temperature and pressure effects on the dielectric constants over the tank storage conditions from 5 to 40 psia. A method of reducing the level measurement errors by using dielectric constant correction factors based on fluid temperature and pressure measurements is derived. The improved accuracy by use of the correction factors is experimentally verified by comparing liquid levels derived from fluid temperature profiles.
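The correction idea can be sketched with a simple coaxial-probe model in which the measured capacitance is a level-weighted mix of liquid and vapor dielectric constants; using a fixed calibration dielectric then biases the inferred level when the tank state changes. The probe geometry, dielectric values, and the linear temperature dependence below are illustrative assumptions, not the paper's calibration.

```python
def level_fraction(c_meas, c_empty, eps_liquid, eps_vapor):
    """Coaxial probe model: C = C_empty * (eps_v * (1 - f) + eps_l * f),
    where f is the fraction of the probe length covered by liquid."""
    return (c_meas / c_empty - eps_vapor) / (eps_liquid - eps_vapor)

# Hypothetical LN2 example: dielectric constant assumed to decrease slightly as the liquid warms.
def eps_ln2(temp_k):
    return 1.454 - 0.004 * (temp_k - 77.0)   # assumed linear correction (illustrative)

c_empty = 100.0          # pF, probe response with eps = 1 everywhere (assumed)
eps_vapor = 1.0

# Probe reading taken with the tank warmer than the fixed calibration condition (77 K).
true_fraction, temp_k = 0.60, 90.0
c_meas = c_empty * (eps_vapor * (1 - true_fraction) + eps_ln2(temp_k) * true_fraction)

f_uncorrected = level_fraction(c_meas, c_empty, eps_liquid=eps_ln2(77.0), eps_vapor=eps_vapor)
f_corrected = level_fraction(c_meas, c_empty, eps_liquid=eps_ln2(temp_k), eps_vapor=eps_vapor)
print(f"level, fixed-dielectric calibration      : {f_uncorrected:.3f}")
print(f"level, temperature-corrected dielectric  : {f_corrected:.3f}  (true {true_fraction})")
```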
Nims, Raymond W; Sykes, Greg; Cottrill, Karin; Ikonomi, Pranvera; Elmore, Eugene
2010-12-01
The role of cell authentication in biomedical science has received considerable attention, especially within the past decade. This quality control attribute is now beginning to be given the emphasis it deserves by granting agencies and by scientific journals. Short tandem repeat (STR) profiling, one of a few DNA profiling technologies now available, is being proposed for routine identification (authentication) of human cell lines, stem cells, and tissues. The advantage of this technique over methods such as isoenzyme analysis, karyotyping, human leukocyte antigen typing, etc., is that STR profiling can establish identity to the individual level, provided that the appropriate number and types of loci are evaluated. To best employ this technology, a standardized protocol and a data-driven, quality-controlled, and publically searchable database will be necessary. This public STR database (currently under development) will enable investigators to rapidly authenticate human-based cultures to the individual from whom the cells were sourced. Use of similar approaches for non-human animal cells will require developing other suitable loci sets. While implementing STR analysis on a more routine basis should significantly reduce the frequency of cell misidentification, additional technologies may be needed as part of an overall authentication paradigm. For instance, isoenzyme analysis, PCR-based DNA amplification, and sequence-based barcoding methods enable rapid confirmation of a cell line's species of origin while screening against cross-contaminations, especially when the cells present are not recognized by the species-specific STR method. Karyotyping may also be needed as a supporting tool during establishment of an STR database. Finally, good cell culture practices must always remain a major component of any effort to reduce the frequency of cell misidentification.
Xu, Wei; Chen, Deying; Wang, Nan; Zhang, Ting; Zhou, Ruokun; Huan, Tao; Lu, Yingfeng; Su, Xiaoling; Xie, Qing; Li, Liang; Li, Lanjuan
2015-01-20
Human fecal samples contain endogenous human metabolites, gut microbiota metabolites, and other compounds. Profiling the fecal metabolome can produce metabolic information that may be used not only for disease biomarker discovery, but also for providing an insight about the relationship of the gut microbiome and human health. In this work, we report a chemical isotope labeling liquid chromatography-mass spectrometry (LC-MS) method for comprehensive and quantitative analysis of the amine- and phenol-containing metabolites in fecal samples. Differential (13)C2/(12)C2-dansyl labeling of the amines and phenols was used to improve LC separation efficiency and MS detection sensitivity. Water, methanol, and acetonitrile were examined as an extraction solvent, and a sequential water-acetonitrile extraction method was found to be optimal. A step-gradient LC-UV setup and a fast LC-MS method were evaluated for measuring the total concentration of dansyl labeled metabolites that could be used for normalizing the sample amounts of individual samples for quantitative metabolomics. Knowing the total concentration was also useful for optimizing the sample injection amount into LC-MS to maximize the number of metabolites detectable while avoiding sample overloading. For the first time, dansylation isotope labeling LC-MS was performed in a simple time-of-flight mass spectrometer, instead of high-end equipment, demonstrating the feasibility of using a low-cost instrument for chemical isotope labeling metabolomics. The developed method was applied for profiling the amine/phenol submetabolome of fecal samples collected from three families. An average of 1785 peak pairs or putative metabolites were found from a 30 min LC-MS run. From 243 LC-MS runs of all the fecal samples, a total of 6200 peak pairs were detected. Among them, 67 could be positively identified based on the mass and retention time match to a dansyl standard library, while 581 and 3197 peak pairs could be putatively identified based on mass match using MyCompoundID against a Human Metabolome Database and an Evidence-based Metabolome Library, respectively. This represents the most comprehensive profile of the amine/phenol submetabolome ever detected in human fecal samples. The quantitative metabolome profiles of individual samples were shown to be useful to separate different groups of samples, illustrating the possibility of using this method for fecal metabolomics studies.
Profile of Pre-Service Science Teachers Based on STEM Career Interest Survey
NASA Astrophysics Data System (ADS)
Winarno, N.; Widodo, A.; Rusdiana, D.; Rochintaniawati, D.; Afifah, R. M. A.
2017-09-01
This study aims to investigate the profile of pre-service science teachers based on the STEM (Science, Technology, Engineering, and Mathematics) Career Interest Survey. The study uses a descriptive survey method as the research design. Samples were collected from 66 pre-service science teachers at a university located in Bandung, Indonesia. The results show that the average career interest scores are 4.08 for technology, 3.80 for science, 3.39 for mathematics, and 3.30 for engineering, indicating that pre-service science teachers have interests in the STEM career fields. This research is necessary because there are many instances of people choosing majors or studies that are not in accordance with their interests and talents. The recommendation of this study is to develop learning for pre-service science teachers by using a STEM approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Xiangqi; Wang, Jiyu; Mulcahy, David
This paper presents a voltage-load sensitivity matrix (VLSM) based voltage control method to deploy demand response resources for controlling voltage in high solar penetration distribution feeders. The IEEE 123-bus system in OpenDSS is used for testing the performance of the preliminary VLSM-based voltage control approach. A load disaggregation process is applied to disaggregate the total load profile at the feeder head to each load node along the feeder so that loads are modeled at residential house level. Measured solar generation profiles are used in the simulation to model the impact of solar power on distribution feeder voltage profiles. Different case studies involving various PV penetration levels and installation locations have been performed. Simulation results show that the VLSM algorithm performance meets the voltage control requirements and is an effective voltage control strategy.
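In outline, a sensitivity-matrix scheme computes how much demand-response load must be shed, and where, to lift a violated voltage back above its limit using the linearized relation ΔV = S ΔP. The 3-node sensitivity matrix and voltages below are invented for illustration and are unrelated to the IEEE 123-bus/OpenDSS study.

```python
import numpy as np

# Hypothetical sensitivity matrix S[i, j] = dV_i / dP_j (p.u. volts per kW of load reduction),
# e.g. extracted from perturb-and-observe runs of a feeder model.
S = np.array([[0.0008, 0.0004, 0.0002],
              [0.0004, 0.0010, 0.0005],
              [0.0002, 0.0005, 0.0012]])

v_now = np.array([0.948, 0.941, 0.952])        # measured node voltages (p.u.)
v_min = 0.95
violation = np.clip(v_min - v_now, 0.0, None)  # required voltage rise at each node

# Suppose demand-response resources are available only at node index 1.
dr_node = 1
dP = np.zeros(3)
dP[dr_node] = (violation / S[:, dr_node]).max()   # kW shed needed to clear the worst violation
v_after = v_now + S @ dP

print(f"shed {dP[dr_node]:.1f} kW at node {dr_node}")
print("voltages after dispatch (p.u.):", np.round(v_after, 4))
```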
Study on numerical simulation of asymmetric structure aluminum profile extrusion based on ALE method
NASA Astrophysics Data System (ADS)
Chen, Kun; Qu, Yuan; Ding, Siyi; Liu, Changhui; Yang, Fuyong
2018-05-01
Using the HyperXtrude module based on the Arbitrary Lagrangian-Eulerian (ALE) finite element method, this paper successfully simulates the steady extrusion process of an asymmetric-structure aluminum profile die. An experiment is carried out to verify the simulation results. Analysis of the stress-strain field, temperature field, and extrusion velocity of the metal confirms that the simulation predictions are consistent with the experiment. Schemes for die correction and optimization are discussed last. By adjusting the bearing length and core thickness, and by adopting a protective feeder plate, a short shunt bridge in the upper die, and a three-level bonding container in the lower die to control the metal flow, a qualified aluminum profile can be obtained.
Goodwin, Cody R; Sherrod, Stacy D; Marasco, Christina C; Bachmann, Brian O; Schramm-Sapyta, Nicole; Wikswo, John P; McLean, John A
2014-07-01
A metabolic system is composed of inherently interconnected metabolic precursors, intermediates, and products. The analysis of untargeted metabolomics data has conventionally been performed through the use of comparative statistics or multivariate statistical analysis-based approaches; however, each falls short in representing the related nature of metabolic perturbations. Herein, we describe a complementary method for the analysis of large metabolite inventories using a data-driven approach based upon a self-organizing map algorithm. This workflow allows for the unsupervised clustering, and subsequent prioritization of, correlated features through Gestalt comparisons of metabolic heat maps. We describe this methodology in detail, including a comparison to conventional metabolomics approaches, and demonstrate the application of this method to the analysis of the metabolic repercussions of prolonged cocaine exposure in rat sera profiles.
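A minimal self-organizing map written directly in NumPy can illustrate the clustering step: metabolite fold-change profiles are mapped onto a small grid so that correlated profiles land on neighboring nodes of the resulting heat map. The synthetic profiles, map size, and learning schedule below are arbitrary choices, not the cocaine-exposure sera data or the authors' exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "metabolite x condition" fold-change profiles: two correlated groups plus noise.
n_each, n_cond = 40, 6
group_up = np.tile(np.linspace(0, 2, n_cond), (n_each, 1)) + rng.normal(0, 0.2, (n_each, n_cond))
group_down = np.tile(np.linspace(0, -2, n_cond), (n_each, 1)) + rng.normal(0, 0.2, (n_each, n_cond))
data = np.vstack([group_up, group_down])

def train_som(data, rows=4, cols=4, iters=2000, lr0=0.5, sigma0=2.0):
    """Minimal rectangular SOM with a Gaussian neighborhood and decaying rates."""
    weights = rng.normal(0, 0.1, (rows, cols, data.shape[1]))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    for t in range(iters):
        frac = t / iters
        lr, sigma = lr0 * (1 - frac), sigma0 * np.exp(-3 * frac)
        x = data[rng.integers(len(data))]
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(dists.argmin(), dists.shape)      # best-matching unit
        h = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
        weights += lr * h[..., None] * (x - weights)
    return weights

weights = train_som(data)
bmus = [np.unravel_index(np.linalg.norm(weights - x, axis=-1).argmin(), weights.shape[:2])
        for x in data]
print("nodes hit by up-regulated profiles  :", sorted(set(bmus[:n_each])))
print("nodes hit by down-regulated profiles:", sorted(set(bmus[n_each:])))
```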
NASA Astrophysics Data System (ADS)
Seraji, Faramarz E.; Rashidi, Mahnaz; Khasheie, Vajieh
2006-08-01
Photonic crystal fibers (PCFs) with a stepped raised-core profile and one layer of equally spaced holes in the cladding are analyzed. Using the effective index method and considering a raised step refractive index difference between the index of the core and the effective index of the cladding, we improve characteristic parameters such as the numerical aperture and V-parameter, and reduce the bending loss to about one tenth of that of a conventional PCF. Implementing such a structure in PCFs may be a step toward achieving low-loss PCFs for communication applications.
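For reference, the effective-index description uses the standard step-index expressions NA = sqrt(n_core^2 - n_eff^2) and V = (2*pi*a/lambda)*NA, with the cladding index replaced by the effective cladding index. The short sketch below evaluates them for purely illustrative values of core radius, wavelength, and indices, not the analyzed raised-core design.

```python
import math

def pcf_parameters(n_core, n_eff_clad, core_radius_um, wavelength_um):
    """Effective-index-method figures of merit for a PCF (standard step-index formulas)."""
    na = math.sqrt(n_core**2 - n_eff_clad**2)
    v = 2 * math.pi * core_radius_um / wavelength_um * na
    return na, v

# Illustrative values only (not from the analyzed raised-core design).
na, v = pcf_parameters(n_core=1.460, n_eff_clad=1.444, core_radius_um=2.5, wavelength_um=1.55)
print(f"NA = {na:.3f}, V = {v:.2f}  (V < 2.405 suggests single-mode guidance)")
```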
The analysis of professional competencies of a lecturer in adult education.
Žeravíková, Iveta; Tirpáková, Anna; Markechová, Dagmar
2015-01-01
In this article, we present an andragogical research project and an evaluation of its results using nonparametric statistical methods and the semantic differential method. The research was carried out in 2012-2013 as part of the dissertation of I. Žeravíková, Analysis of professional competencies of a lecturer and creating his competence profile (Žeravíková 2013); its purpose was to identify, based on an analysis of a lecturer's work activities, the lecturer's most important professional competencies and to propose a competence profile of a lecturer in adult education.
New geometric design consistency model based on operating speed profiles for road safety evaluation.
Camacho-Torregrosa, Francisco J; Pérez-Zuriaga, Ana M; Campoy-Ungría, J Manuel; García-García, Alfredo
2013-12-01
To assist in the on-going effort to reduce road fatalities as much as possible, this paper presents a new methodology to evaluate road safety in both the design and redesign stages of two-lane rural highways. This methodology is based on the analysis of road geometric design consistency, a value which will be a surrogate measure of the safety level of the two-lane rural road segment. The consistency model presented in this paper is based on the consideration of continuous operating speed profiles. The models used for their construction were obtained by using an innovative GPS-data collection method that is based on continuous operating speed profiles recorded from individual drivers. This new methodology allowed the researchers to observe the actual behavior of drivers and to develop more accurate operating speed models than was previously possible with spot-speed data collection, thereby enabling a more accurate approximation to the real phenomenon and thus a better consistency measurement. Operating speed profiles were built for 33 Spanish two-lane rural road segments, and several consistency measurements based on the global and local operating speed were checked. The final consistency model takes into account not only the global dispersion of the operating speed, but also some indexes that consider both local speed decelerations and speeds over posted speeds as well. For the development of the consistency model, the crash frequency for each study site was considered, which allowed estimating the number of crashes on a road segment by means of the calculation of its geometric design consistency. Consequently, the presented consistency evaluation method is a promising innovative tool that can be used as a surrogate measure to estimate the safety of a road segment. Copyright © 2012 Elsevier Ltd. All rights reserved.
Sieracki, M E; Reichenbach, S E; Webb, K L
1989-01-01
The accurate measurement of bacterial and protistan cell biomass is necessary for understanding their population and trophic dynamics in nature. Direct measurement of fluorescently stained cells is often the method of choice. The tedium of making such measurements visually on the large numbers of cells required has prompted the use of automatic image analysis for this purpose. Accurate measurements by image analysis require an accurate, reliable method of segmenting the image, that is, distinguishing the brightly fluorescing cells from a dark background. This is commonly done by visually choosing a threshold intensity value which most closely coincides with the outline of the cells as perceived by the operator. Ideally, an automated method based on the cell image characteristics should be used. Since the optical nature of edges in images of light-emitting, microscopic fluorescent objects is different from that of images generated by transmitted or reflected light, it seemed that automatic segmentation of such images may require special considerations. We tested nine automated threshold selection methods using standard fluorescent microspheres ranging in size and fluorescence intensity and fluorochrome-stained samples of cells from cultures of cyanobacteria, flagellates, and ciliates. The methods included several variations based on the maximum intensity gradient of the sphere profile (first derivative), the minimum in the second derivative of the sphere profile, the minimum of the image histogram, and the midpoint intensity. Our results indicated that thresholds determined visually and by first-derivative methods tended to overestimate the threshold, causing an underestimation of microsphere size. The method based on the minimum of the second derivative of the profile yielded the most accurate area estimates for spheres of different sizes and brightnesses and for four of the five cell types tested. A simple model of the optical properties of fluorescing objects and the video acquisition system is described which explains how the second derivative best approximates the position of the edge. PMID:2516431
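The second-derivative criterion can be demonstrated on a one-dimensional synthetic profile: the projected emission of a uniformly fluorescing sphere is blurred by a Gaussian point-spread function, and the segmentation threshold is taken as the intensity at the minimum of the second derivative along the edge. The sphere radius and PSF width below are arbitrary, and the sketch is only a 1D caricature of the image-analysis procedure.

```python
import numpy as np

x = np.arange(-15.0, 15.0, 0.02)                  # position along a line scan (um)
radius, sigma_psf = 5.0, 0.5                      # sphere radius and PSF width (illustrative)

# Projected emission of a uniformly fluorescing sphere (proportional to chord length),
# blurred by a Gaussian point-spread function.
ideal = np.where(np.abs(x) < radius, 2.0 * np.sqrt(np.clip(radius**2 - x**2, 0, None)), 0.0)
psf = np.exp(-0.5 * (x / sigma_psf) ** 2)
profile = np.convolve(ideal, psf / psf.sum(), mode="same")

# Threshold selection from the minimum of the second derivative on the object's edge.
d2 = np.gradient(np.gradient(profile, x), x)
edge_region = x > 0                               # analyze the right-hand edge
threshold = profile[edge_region][np.argmin(d2[edge_region])]

segmented = x[profile >= threshold]
print(f"selected threshold: {threshold:.2f} (peak intensity {profile.max():.2f})")
print(f"estimated diameter: {segmented.max() - segmented.min():.2f} um (true {2 * radius:.1f} um)")
```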
Source-to-incident-flux relation in a Tokamak blanket module
NASA Astrophysics Data System (ADS)
Imel, G. R.
The next-generation Tokamak experiments, including the Tokamak fusion test reactor (TFTR), will utilize small blanket modules to measure performance parameters such as tritium breeding profiles, power deposition profiles, and neutron flux profiles. Specifically, a neutron calorimeter (simply a neutron moderating blanket module) which permits inferring the incident 14 MeV flux based on measured temperature profiles was proposed for TFTR. The problem of how to relate this total scalar flux to the fusion neutron source is addressed. This relation is necessary since the calorimeter is proposed as a total fusion energy monitor. The methods and assumptions presented are valid for the TFTR Lithium Breeding Module (LBM), as well as for other modules on larger Tokamak reactors.
Random-Profiles-Based 3D Face Recognition System
Joongrock, Kim; Sunjin, Yu; Sangyoun, Lee
2014-01-01
In this paper, a novel nonintrusive three-dimensional (3D) face modeling system for random-profile-based 3D face recognition is presented. Although recent two-dimensional (2D) face recognition systems can achieve a reliable recognition rate under certain conditions, their performance is limited by internal and external changes, such as illumination and pose variation. To address these issues, 3D face recognition, which uses 3D face data, has recently received much attention. However, the performance of 3D face recognition highly depends on the precision of the acquired 3D face data, while also requiring more computational power and storage capacity than 2D face recognition systems. In this paper, we present a nonintrusive 3D face modeling system composed of a stereo vision system and an invisible near-infrared line laser, which can be directly applied to profile-based 3D face recognition. We further propose a novel random-profile-based 3D face recognition method that is memory-efficient and pose-invariant. The experimental results demonstrate that the reconstructed 3D face data consist of more than 50 k 3D points and that the method achieves a reliable recognition rate under pose variation. PMID:24691101
Mass spectrometry-based cDNA profiling as a potential tool for human body fluid identification.
Donfack, Joseph; Wiley, Anissa
2015-05-01
Several mRNA markers have been exhaustively evaluated for the identification of human venous blood, saliva, and semen in forensic genetics. As new candidate human body fluid specific markers are discovered, evaluated, and reported in the scientific literature, there is an increasing trend toward determining the ideal markers for cDNA profiling of body fluids of forensic interest. However, it has not been determined which molecular genetics-based technique(s) should be utilized to assess the performance of these markers. In recent years, only a few confirmatory, mRNA/cDNA-based methods have been evaluated for applications in body fluid identification. The most frequently described methods tested to date include quantitative polymerase chain reaction (qPCR) and capillary electrophoresis (CE). However, these methods, in particular qPCR, often favor narrow multiplex PCR due to the availability of a limited number of fluorescent dyes/tags. In an attempt to address this technological constraint, this study explored matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) for human body fluid identification via cDNA profiling of venous blood, saliva, and semen. Using cDNA samples at 20 pg input phosphoglycerate kinase 1 (PGK1) amounts, body fluid specific markers for the candidate genes were amplified in their corresponding body fluid (i.e., venous blood, saliva, or semen) and absent in the remaining two (100% specificity). The results of this study provide an initial indication that MALDI-TOF MS is a potential fluorescent dye-free alternative method for body fluid identification in forensic casework. However, the inherent issues of low amounts of mRNA and the damage caused to mRNA by environmental exposures, extraction processes, and storage conditions are important factors that significantly hinder the implementation of cDNA profiling into forensic casework. Published by Elsevier Ireland Ltd.
Watanabe, Manabu; Kusano, Junko; Ohtaki, Shinsaku; Ishikura, Takashi; Katayama, Jin; Koguchi, Akira; Paumen, Michael; Hayashi, Yoshiharu
2014-09-01
Combining single-cell methods and next-generation sequencing should provide a powerful means to understand single-cell biology and obviate the effects of sample heterogeneity. Here we report a single-cell identification method and seamless cancer gene profiling using semiconductor-based massively parallel sequencing. A549 cells (adenocarcinomic human alveolar basal epithelial cell line) were used as a model. Single-cell capture was performed using laser capture microdissection (LCM) with an Arcturus® XT system, and a captured single cell and a bulk population of A549 cells (≈ 10(6) cells) were subjected to whole genome amplification (WGA). For cell identification, a multiplex PCR method (AmpliSeq™ SNP HID panel) was used to enrich 136 highly discriminatory SNPs with a genotype concordance probability of 10(31-35). For cancer gene profiling, we used mutation profiling that was performed in parallel using a hotspot panel for 50 cancer-related genes. Sequencing was performed using a semiconductor-based bench top sequencer. The distribution of sequence reads for both HID and Cancer panel amplicons was consistent across these samples. For the bulk population of cells, the percentages of sequence covered at coverage of more than 100 × were 99.04% for the HID panel and 98.83% for the Cancer panel, while for the single cell percentages of sequence covered at coverage of more than 100 × were 55.93% for the HID panel and 65.96% for the Cancer panel. Partial amplification failure or randomly distributed non-amplified regions across samples from single cells during the WGA procedures or random allele drop out probably caused these differences. However, comparative analyses showed that this method successfully discriminated a single A549 cancer cell from a bulk population of A549 cells. Thus, our approach provides a powerful means to overcome tumor sample heterogeneity when searching for somatic mutations.
Lyng, Maria B; Kodahl, Annette R; Binder, Harald; Ditzel, Henrik J
2016-12-01
Mammography is the predominant screening method for early detection of breast cancer, but has limitations and could be rendered more accurate by combination with a blood-based biomarker profile. Circulating microRNAs (miRNAs) are increasingly recognized as strong biomarkers, and we previously developed a 9-miRNA profile using serum and LNA-based qPCR that effectively stratified patients with early stage breast cancer vs. healthy women. To further develop the test into routine clinical practice, we collected serum of women examined by clinical mammography (N = 197) according to standard operational procedures (SOPs) of the Danish Cancer Biobank. The performance of the circulating 9-miRNA profile was analyzed in 116 of these women, including 36 with breast cancer (aged 50-74), following a standardized protocol that mimicked a routine clinical set-up. We confirmed that the profile is significantly different between women with breast cancer and controls (p-value <0.0001), with an AUC of 0.61. Significantly, one woman whose 9-miRNA profile predicted a 73% probability of having breast cancer indeed developed the disease within one year despite being categorized as clinically healthy at the time of blood sample collection and mammography. We propose that this miRNA profile combined with mammography will increase the overall accuracy of early detection of breast cancer. Copyright © 2016 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Pätzold, M.; Bird, M. K.; Häusler, B.; Peter, K.; Tellmann, S.; Tyler, G. L.
2016-10-01
In their recent paper, Grandin et al. (2014) claim to have developed a novel approach, principally a ray tracing method, to analyze radio sounding data from spacecraft signals occulted by planetary atmospheres without the usual assumptions of the radio occultation inversion method of a stratified, layered, symmetric atmosphere. They apply their "new approach" to observations of the Mars Express Radio Science (MaRS) experiment and compare their resulting temperature, neutral number density, and electron density profiles with those from MaRS, claiming that there is good agreement with the observations. The fact is, however, that there are serious disagreements in the most important altitude ranges. Their temperature profile shows a 30 K shift or a 300σ (1σ standard deviation = 0.1 K for the MaRS profile near the surface) difference toward warmer temperatures at the surface when compared with MaRS, while the MaRS profile is in best agreement with the profile from the Mars Climate Data Base V5.0 (MCD V5.0). Their full temperature profile from the surface to 250 km altitude deviates significantly from the MCD V5.0 profile. Their ionospheric electron density profile is considerably different from that derived from the MaRS observations. Although Grandin et al. (2014) claim to derive the neutral number density and temperature profiles above 200 km, including the asymptotic exosphere temperature, it is simply not possible to derive this information from what is essentially noise.
NASA Astrophysics Data System (ADS)
Pascoe, D. J.; Anfinogentov, S. A.; Goddard, C. R.; Nakariakov, V. M.
2018-06-01
The shape of the damping profile of kink oscillations in coronal loops has recently allowed the transverse density profile of the loop to be estimated. This requires accurate measurement of the damping profile that can distinguish the Gaussian and exponential damping regimes, otherwise there are more unknowns than observables. Forward modeling of the transverse intensity profile may also be used to estimate the width of the inhomogeneous layer of a loop, providing an independent estimate of one of these unknowns. We analyze an oscillating loop for which the seismological determination of the transverse structure is inconclusive except when supplemented by additional spatial information from the transverse intensity profile. Our temporal analysis describes the motion of a coronal loop as a kink oscillation damped by resonant absorption, and our spatial analysis is based on forward modeling the transverse EUV intensity profile of the loop under the isothermal and optically thin approximations. We use Bayesian analysis and Markov chain Monte Carlo sampling to apply our spatial and temporal models both individually and simultaneously to our data and compare the results with numerical simulations. Combining the two methods allows both the inhomogeneous layer width and density contrast to be calculated, which is not possible for the same data when each method is applied individually. We demonstrate that the assumption of an exponential damping profile leads to a significantly larger error in the inferred density contrast ratio compared with a Gaussian damping profile.
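As an illustration of the temporal analysis described above, the sketch below fits both a Gaussian and an exponential damping envelope to an oscillation amplitude series and compares the residuals. The functional forms are the standard ones for the two damping regimes, but the data, parameter values and the simple chi-square comparison are assumptions for illustration and do not reproduce the paper's Bayesian and MCMC analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative damping envelopes for a kink oscillation amplitude A(t);
# the Gaussian and exponential regimes differ in how fast the amplitude decays early on.
def gaussian_envelope(t, a0, tau_g):
    return a0 * np.exp(-t**2 / (2.0 * tau_g**2))

def exponential_envelope(t, a0, tau_d):
    return a0 * np.exp(-t / tau_d)

def compare_damping_models(t, amplitude):
    """Fit both envelopes to a measured amplitude series and report chi-square."""
    results = {}
    for name, model in [("gaussian", gaussian_envelope), ("exponential", exponential_envelope)]:
        popt, _ = curve_fit(model, t, amplitude, p0=[amplitude[0], t[-1] / 3.0])
        residuals = amplitude - model(t, *popt)
        results[name] = {"params": popt, "chi2": float(np.sum(residuals**2))}
    return results

# Example with synthetic data (hypothetical values, not the paper's observations)
t = np.linspace(0.0, 30.0, 60)            # time in minutes
amp = gaussian_envelope(t, 1.0, 10.0) + 0.02 * np.random.randn(t.size)
print(compare_damping_models(t, amp))
```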
Quantum Dot Platform for Single-Cell Molecular Profiling
NASA Astrophysics Data System (ADS)
Zrazhevskiy, Pavel S.
In-depth understanding of the nature of cell physiology and the ability to diagnose and control the progression of pathological processes rely heavily on untangling the complexity of intracellular molecular mechanisms and pathways. Therefore, comprehensive molecular profiling of individual cells within the context of their natural tissue or cell culture microenvironment is essential. In principle, this goal can be achieved by tagging each molecular target with a unique reporter probe and detecting its localization with high sensitivity at sub-cellular resolution, primarily via microscopy-based imaging. Yet, neither widely used conventional methods nor more advanced nanoparticle-based techniques have been able to address this task to date. The high multiplexing potential of fluorescent probes is heavily restrained by the inability to uniquely match probes with corresponding molecular targets. This issue is especially relevant for quantum dot probes: while simultaneous spectral imaging of up to 10 different probes is possible, only a few can be used concurrently for staining with existing methods. To fully utilize the multiplexing potential of quantum dots, it is necessary to design a new staining platform featuring unique assignment of each target to a corresponding quantum dot probe. This dissertation presents two complementary versatile approaches towards achieving comprehensive single-cell molecular profiling and describes the engineering of quantum dot probes specifically tailored for each staining method. Analysis of expanded molecular profiles is achieved by augmenting the parallel multiplexing capacity with several staining cycles performed on the same specimen in a sequential manner. In contrast to other methods utilizing quantum dots or other nanoparticles, which often involve sophisticated probe synthesis, the platform technology presented here takes advantage of simple covalent bioconjugation and non-covalent self-assembly mechanisms for straightforward probe preparation and specimen labeling, requiring no advanced technical skills and being directly applicable to a wide range of molecular profiling studies. Utilization of the quantum dot platform for single-cell molecular profiling promises to greatly benefit both biomedical research and clinical diagnostics by providing a tool for addressing phenotypic heterogeneity within large cell populations, opening access to studying low-abundance events often masked or completely erased by batch processing, and elucidating biomarker signatures of diseases critical for accurate diagnostics and targeted therapy.
Sub-microradian Surface Slope Metrology with the ALS Developmental Long Trace Profiler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yashchuk, Valeriy V.; Barber, Samuel; Domning, Edward E.
2009-06-15
Development of X-ray optics for 3rd and 4th generation X-ray light sources with a level of surface slope precision of 0.1-0.2 µrad requires the development of adequate fabrication technologies and dedicated metrology instrumentation and methods. Currently, the best performance of surface slope measurement has been achieved with the NOM (Nanometer Optical Component Measuring Machine) slope profiler at BESSY (Germany) [1] and the ESAD (Extended Shear Angle Difference) profiler at the PTB (Germany) [2]. Both instruments are based on electronic autocollimators (AC) precisely calibrated for the specific application [3] with small apertures of 2.5-5 mm in diameter. In the present work, we describe the design, initial alignment and calibration procedures, the instrumental control and data acquisition system, as well as the measurement performance of the Developmental Long Trace Profiler (DLTP) slope measuring instrument recently brought into operation at the Advanced Light Source (ALS) Optical Metrology Laboratory (OML). Similar to the NOM and ESAD, the DLTP is based on a precisely calibrated autocollimator. However, this is a reasonably low budget instrument used at the ALS OML for the development and testing of new measuring techniques and methods. Some of the developed methods have been implemented into the ALS LTP-II (slope measuring long trace profiler [4]) which was recently upgraded and has demonstrated a capability for 0.25 µrad surface metrology [5]. Performance of the DLTP was verified via a number of measurements with high quality reference mirrors. A comparison with the corresponding results obtained with the world's best slope measuring instrument, the BESSY NOM, proves the accuracy of the DLTP measurements on the level of 0.1-0.2 µrad depending on the curvature of a surface under test. The directions of future work to develop a surface slope measuring profiler with nano-radian performance are also discussed.
Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F; Vehik, Kendra; Huang, Shuai
2016-08-26
Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions.
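A minimal sketch of the general idea behind rule-based subgroup identification is given below: a shallow decision tree yields interpretable rules that combine risk factors with explicit ranges. The feature names, toy data and tree settings are hypothetical, and this is not the rule-mining algorithm used in the TEDDY analysis.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical risk-factor data: the columns are illustrative, not TEDDY variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                      # e.g. age, antibody titer, genetic risk score
feature_names = ["age", "antibody_titer", "genetic_risk"]
# Outcome driven by an interaction between two factors within specific ranges
y = ((X[:, 1] > 0.5) & (X[:, 2] > 0.0)).astype(int)

# A shallow tree yields rules of the form "IF titer > 0.5 AND risk > 0 THEN high-risk subgroup"
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20).fit(X, y)
print(export_text(tree, feature_names=feature_names))
```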
Schorstein, Kai; Popescu, Alexandru; Göbel, Marco; Walther, Thomas
2008-01-01
Temperature profiles of the ocean are of interest for weather forecasts, climate studies and oceanography in general. Currently, mostly in situ techniques such as fixed buoys or bathythermographs deliver oceanic temperature profiles. A LIDAR method based on Brillouin scattering is an attractive alternative for remote sensing of such water temperature profiles. It makes it possible to deliver cost-effective on-line data covering an extended region of the ocean. The temperature measurement is based on spontaneous Brillouin scattering in water. In this contribution, we present the first water temperature measurements using a Yb:doped pulsed fiber amplifier. The fiber amplifier is a custom designed device which can be operated in a vibrational environment while emitting narrow bandwidth laser pulses. The device shows promising performance and demonstrates the feasibility of this approach. Furthermore, the current status of the receiver is briefly discussed; it is based on an excited state Faraday anomalous dispersion optical filter. PMID:27873842
A formal protocol test procedure for the Survivable Adaptable Fiber Optic Embedded Network (SAFENET)
NASA Astrophysics Data System (ADS)
High, Wayne
1993-03-01
This thesis focuses upon a new method for verifying the correct operation of a complex, high speed fiber optic communication network. These networks are of growing importance to the military because of their increased connectivity, survivability, and reconfigurability. With the introduction of and increased dependence on sophisticated software and protocols, it is essential that their operation be correct. Because of the speed and complexity of fiber optic networks being designed today, they are becoming increasingly difficult to test. Previously, testing was accomplished by application of conformance test methods which had little connection with an implementation's specification. The major goal of conformance testing is to ensure that the implementation of a profile is consistent with its specification. Formal specification is needed to ensure that the implementation performs its intended operations while exhibiting desirable behaviors. The new conformance test method presented is based upon the System of Communicating Machine model, which uses a formal protocol specification to generate a test sequence. The major contribution of this thesis is the application of the System of Communicating Machine model to formal profile specifications of the Survivable Adaptable Fiber Optic Embedded Network (SAFENET) standard, which results in the derivation of test sequences for a SAFENET profile. The results of applying this new method to SAFENET's OSI and Lightweight profiles are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xue, Haizhou; Zhang, Yanwen; Zhu, Zihua
Single crystalline 6H-SiC samples were irradiated at 150 K using 2 MeV Pt ions. Local volume swelling is determined by electron energy loss spectroscopy (EELS), and a nearly sigmoidal dependence on irradiation dose is observed. The disorder profiles and ion distribution are determined by Rutherford backscattering spectrometry (RBS), transmission electron microscopy and secondary ion mass spectrometry. Since the volume swelling reaches 12% over the damage region under high ion fluence, lattice expansion is considered and corrected during the data analysis of the RBS spectra to obtain depth profiles. Projectile and damage profiles are estimated by SRIM (Stopping and Range of Ions in Matter). Compared with the measured profiles, the SRIM code significantly overestimates the electronic stopping power for the slow heavy Pt ions, and large deviations are observed in the predicted ion distribution and damage profiles. Utilizing the reciprocity method, which is based on the invariance of the inelastic excitation in ion-atom collisions against interchange of projectile and target, much lower electronic stopping is deduced. A simple approach based on reducing the density of the SiC target in the SRIM simulation is proposed to compensate for the overestimated SRIM electronic stopping power values. Better damage profiles and ion ranges are predicted.
Compton profiles of some composite materials normalized by a new method
NASA Astrophysics Data System (ADS)
Sankarshan, B. M.; Umesh, T. K.
2018-03-01
Recently, we have shown that, as a novel approach, in the case of samples which can be treated as pure incoherent scatterers, the effective atomic number Zeff itself could be conveniently used to normalize their un-normalized Compton profiles. In the present investigation, we have attempted to examine the efficacy of this approach. For this purpose, we have first determined the single differential Compton scattering cross sections (SDCS) of the elements C and Al as well as of some H, C, N and O based polymer samples such as bakelite, epoxy, nylon and teflon, which are pure incoherent scatterers. The measurements were made at 120° in a goniometer assembly that employs a high resolution high purity germanium detector. The SDCS values were used to obtain the Zeff and the un-normalized Compton profiles. These Compton profiles were separately normalized with their Zeff values (for Compton scattering) as well as with the normalization constant obtained by integrating their Hartree–Fock Compton profiles of Biggs et al. based on the mixture rule. These two sets of values agreed well within the range of experimental errors, implying that Zeff can be conveniently used to normalize the experimental Compton profiles of pure incoherent scatterers.
Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-09-20
These are slides from a seminar given to the University of Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety validation. It uses the sensitivity profile data for an application as computed by MCNP6, along with covariance files for the nuclear data, to determine a baseline upper subcritical limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6, and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation, making full use of today's computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.
NASA Astrophysics Data System (ADS)
Hazreek, Z. A. M.; Kamarudin, A. F.; Rosli, S.; Fauziah, A.; Akmal, M. A. K.; Aziman, M.; Azhar, A. T. S.; Ashraf, M. I. M.; Shaylinda, M. Z. N.; Rais, Y.; Ishak, M. F.; Alel, M. N. A.
2018-04-01
Geotechnical site investigation, also known as subsurface profile evaluation, is the process of determining subsurface layer characteristics for use in the design and construction phases. Traditionally, site investigation was performed using drilling techniques and thus suffers from several limitations related to cost, time, data coverage and sustainability. In order to overcome these problems, this study adopted surface techniques using the seismic refraction and ambient vibration methods for subsurface profile depth evaluation. Seismic refraction data acquisition and processing were performed using ABEM Terraloc and OPTIM software, respectively. Meanwhile, ambient vibration data acquisition and processing were performed using CityShark II, Lennartz and GEOPSY software, respectively. It was found that the studied area consists of two layers representing overburden and bedrock geomaterials based on the p-wave velocity values (vp = 300 – 2500 m/s and vp > 2500 m/s) and natural frequency values (Fo = 3.37 – 3.90 Hz) analyzed. Further analysis found that both methods show good similarity in terms of depth and thickness, with percentage accuracy of 60 – 97%. Consequently, this study has demonstrated that the seismic refraction and ambient vibration methods are applicable to subsurface profile depth and thickness estimation. Moreover, the surface techniques, considered non-destructive methods, adopted in this study are able to complement the conventional drilling method in terms of cost, time, data coverage and environmental sustainability.
Parameterization of cloud lidar backscattering profiles by means of asymmetrical Gaussians
NASA Astrophysics Data System (ADS)
del Guasta, Massimo; Morandi, Marco; Stefanutti, Leopoldo
1995-06-01
A fitting procedure for cloud lidar data processing is shown that is based on the computation of the first three moments of the vertical-backscattering (or -extinction) profile. Single-peak clouds or single cloud layers are approximated to asymmetrical Gaussians. The algorithm is particularly stable with respect to noise and processing errors, and it is much faster than the equivalent least-squares approach. Multilayer clouds can easily be treated as a sum of single asymmetrical Gaussian peaks. The method is suitable for cloud-shape parametrization in noisy lidar signatures (like those expected from satellite lidars). It also permits an improvement of cloud radiative-property computations that are based on huge lidar data sets for which storage and careful examination of single lidar profiles can't be carried out.
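The sketch below illustrates the moment-based parameterization: the zeroth, first and side-wise second moments of a single-peak backscatter profile yield the area, the center and the two half-widths of an asymmetrical Gaussian. The grid handling, variable names and the normalization are assumptions; the exact parameter definitions in the original algorithm may differ.

```python
import numpy as np

def asymmetric_gaussian_from_moments(z, beta):
    """Approximate a single-peak lidar backscatter profile beta(z) by an
    asymmetric Gaussian using moment calculations (a sketch; the original
    algorithm's definitions may differ). Assumes a uniform height grid z.
    Returns (area, center, sigma_below, sigma_above)."""
    dz = z[1] - z[0]
    area = np.sum(beta) * dz                       # zeroth moment
    center = np.sum(z * beta) * dz / area          # first moment (centroid)
    below, above = z <= center, z >= center
    # Second moments taken separately on each side of the centroid give the
    # two half-widths of the asymmetric Gaussian.
    sigma_below = np.sqrt(np.sum((z[below] - center) ** 2 * beta[below]) / np.sum(beta[below]))
    sigma_above = np.sqrt(np.sum((z[above] - center) ** 2 * beta[above]) / np.sum(beta[above]))
    return area, center, sigma_below, sigma_above

def asymmetric_gaussian(z, area, center, s_lo, s_hi):
    """Evaluate the fitted asymmetric Gaussian on the height grid z."""
    s = np.where(z < center, s_lo, s_hi)
    peak = area / (0.5 * np.sqrt(2.0 * np.pi) * (s_lo + s_hi))  # two half-Gaussians share the area
    return peak * np.exp(-0.5 * ((z - center) / s) ** 2)
```

Multilayer clouds would then be treated as a sum of such peaks, one per detected layer.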
Mentasti, Massimo; Tewolde, Rediat; Aslett, Martin; Harris, Simon R.; Afshar, Baharak; Underwood, Anthony; Harrison, Timothy G.
2016-01-01
Sequence-based typing (SBT), analogous to multilocus sequence typing (MLST), is the current “gold standard” typing method for investigation of legionellosis outbreaks caused by Legionella pneumophila. However, as common sequence types (STs) cause many infections, some investigations remain unresolved. In this study, various whole-genome sequencing (WGS)-based methods were evaluated according to published guidelines, including (i) a single nucleotide polymorphism (SNP)-based method, (ii) extended MLST using different numbers of genes, (iii) determination of gene presence or absence, and (iv) a kmer-based method. L. pneumophila serogroup 1 isolates (n = 106) from the standard “typing panel,” previously used by the European Society for Clinical Microbiology Study Group on Legionella Infections (ESGLI), were tested together with another 229 isolates. Over 98% of isolates were considered typeable using the SNP- and kmer-based methods. Percentages of isolates with complete extended MLST profiles ranged from 99.1% (50 genes) to 86.8% (1,455 genes), while only 41.5% produced a full profile with the gene presence/absence scheme. Replicates demonstrated that all methods offer 100% reproducibility. Indices of discrimination range from 0.972 (ribosomal MLST) to 0.999 (SNP based), and all values were higher than that achieved with SBT (0.940). Epidemiological concordance is generally inversely related to discriminatory power. We propose that an extended MLST scheme with ∼50 genes provides optimal epidemiological concordance while substantially improving the discrimination offered by SBT and can be used as part of a hierarchical typing scheme that should maintain backwards compatibility and increase discrimination where necessary. This analysis will be useful for the ESGLI to design a scheme that has the potential to become the new gold standard typing method for L. pneumophila. PMID:27280420
David, Sophia; Mentasti, Massimo; Tewolde, Rediat; Aslett, Martin; Harris, Simon R; Afshar, Baharak; Underwood, Anthony; Fry, Norman K; Parkhill, Julian; Harrison, Timothy G
2016-08-01
Sequence-based typing (SBT), analogous to multilocus sequence typing (MLST), is the current "gold standard" typing method for investigation of legionellosis outbreaks caused by Legionella pneumophila. However, as common sequence types (STs) cause many infections, some investigations remain unresolved. In this study, various whole-genome sequencing (WGS)-based methods were evaluated according to published guidelines, including (i) a single nucleotide polymorphism (SNP)-based method, (ii) extended MLST using different numbers of genes, (iii) determination of gene presence or absence, and (iv) a kmer-based method. L. pneumophila serogroup 1 isolates (n = 106) from the standard "typing panel," previously used by the European Society for Clinical Microbiology Study Group on Legionella Infections (ESGLI), were tested together with another 229 isolates. Over 98% of isolates were considered typeable using the SNP- and kmer-based methods. Percentages of isolates with complete extended MLST profiles ranged from 99.1% (50 genes) to 86.8% (1,455 genes), while only 41.5% produced a full profile with the gene presence/absence scheme. Replicates demonstrated that all methods offer 100% reproducibility. Indices of discrimination range from 0.972 (ribosomal MLST) to 0.999 (SNP based), and all values were higher than that achieved with SBT (0.940). Epidemiological concordance is generally inversely related to discriminatory power. We propose that an extended MLST scheme with ∼50 genes provides optimal epidemiological concordance while substantially improving the discrimination offered by SBT and can be used as part of a hierarchical typing scheme that should maintain backwards compatibility and increase discrimination where necessary. This analysis will be useful for the ESGLI to design a scheme that has the potential to become the new gold standard typing method for L. pneumophila. Copyright © 2016 David et al.
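The indices of discrimination quoted above are conventionally computed as Simpson's (Hunter-Gaston) index of diversity over the typing results. A minimal sketch, with a hypothetical list of sequence-type assignments rather than the study data:

```python
from collections import Counter

def simpson_index_of_discrimination(type_assignments):
    """Hunter-Gaston / Simpson's index of discrimination for a typing scheme:
    D = 1 - sum(n_j*(n_j-1)) / (N*(N-1)), where n_j is the number of isolates
    assigned to type j and N is the total number of typed isolates."""
    counts = Counter(type_assignments).values()
    n_total = sum(counts)
    return 1.0 - sum(n * (n - 1) for n in counts) / (n_total * (n_total - 1))

# Hypothetical sequence types for a small panel of isolates (illustrative only)
sts = ["ST1", "ST1", "ST37", "ST47", "ST47", "ST47", "ST62", "ST62", "ST154", "ST578"]
print(round(simpson_index_of_discrimination(sts), 3))
```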
Luo, Xia; Jellison, Kristen L; Huynh, Kevin; Widmer, Giovanni
2015-01-01
Multiple rotating annular reactors were seeded with biofilms flushed from water distribution systems to assess (1) whether biofilms grown in bioreactors are representative of biofilms flushed from the water distribution system in terms of bacterial composition and diversity, and (2) whether the biofilm sampling method affects the population profile of the attached bacterial community. Biofilms were grown in bioreactors until thickness stabilized (9 to 11 weeks) and harvested from reactor coupons by sonication, stomaching, bead-beating, and manual scraping. High-throughput sequencing of 16S rRNA amplicons was used to profile bacterial populations from flushed biofilms seeded into bioreactors as well as biofilms recovered from bioreactor coupons by different methods. β diversity between flushed and reactor biofilms was compared to β diversity between (i) biofilms harvested from different reactors and (ii) biofilms harvested by different methods from the same reactor. These analyses showed that average diversity between flushed and bioreactor biofilms was double the diversity between biofilms from different reactors operated in parallel. The diversity between bioreactors was larger than the diversity associated with different biofilm recovery methods. Compared to other experimental variables, the method used to recover biofilms had a negligible impact on the outcome of water biofilm analyses based on 16S amplicon sequencing. Results from this study show that biofilms grown in reactors over 9 to 11 weeks are not representative models of the microbial populations flushed from a distribution system. Furthermore, the bacterial population profiles of biofilms grown in replicate reactors from the same flushed water are likely to diverge. However, four common sampling protocols, which differ with respect to disruption of bacterial cells, provide similar information with respect to the 16S rRNA population profile of the biofilm community.
Klein-Júnior, Luiz C; Viaene, Johan; Salton, Juliana; Koetz, Mariana; Gasper, André L; Henriques, Amélia T; Vander Heyden, Yvan
2016-09-09
Evaluation of extraction methods for accessing the plant metabolome is usually performed visually, lacking a reliable method of data handling. In the present study, the major aim was to develop reliable, time- and solvent-saving extraction and fractionation methods to access the alkaloid profile of Psychotria nemorosa leaves. Ultrasound-assisted extraction was selected as the extraction method. In a Fractional Factorial Design (FFD) approach, yield, sum of peak areas, and peak numbers proved to be rather meaningless responses. However, Euclidean distance calculations between the UPLC-DAD metabolic profiles and the blank injection showed that the extracts are highly diverse. Coupled with the calculation and plotting of effects per time point, it was possible to indicate thermolabile peaks. After screening, time and temperature were selected for optimization, while the plant:solvent ratio was set at 1:50 (m/v), the number of extractions at one, and the particle size at ≤180 μm. From Central Composite Design (CCD) results modeling the heights of important peaks, previously indicated by the FFD metabolic profile analysis, the time was set at 65 min and the temperature at 45 °C, thus avoiding degradation. For the fractionation step, a solid phase extraction method was optimized by a Box-Behnken Design (BBD) approach using the sum of peak areas as response. Sample concentration was consequently set at 150 mg/mL, % acetonitrile in dichloromethane at 40% as eluting solvent, and eluting volume at 30 mL. In summary, the Euclidean distance and the metabolite profiles provided significant responses for accessing P. nemorosa alkaloids, allowing the development of reliable extraction and fractionation methods, avoiding degradation and decreasing the required time and solvent volume. Copyright © 2016 Elsevier B.V. All rights reserved.
Systems and methods for process and user driven dynamic voltage and frequency scaling
Mallik, Arindam [Evanston, IL; Lin, Bin [Hillsboro, OR; Memik, Gokhan [Evanston, IL; Dinda, Peter [Evanston, IL; Dick, Robert [Evanston, IL
2011-03-22
Certain embodiments of the present invention provide a method for power management including determining at least one of an operating frequency and an operating voltage for a processor and configuring the processor based on the determined at least one of the operating frequency and the operating voltage. The operating frequency is determined based at least in part on direct user input. The operating voltage is determined based at least in part on an individual profile for the processor.
Inversion of residual stress profiles from ultrasonic Rayleigh wave dispersion data
NASA Astrophysics Data System (ADS)
Mora, P.; Spies, M.
2018-05-01
We investigate theoretically and with synthetic data the performance of several inversion methods for inferring a residual stress state from ultrasonic surface wave dispersion data. We show that this particular problem may reveal, in relevant materials, undesired behaviors of some methods that could otherwise be reliably applied to infer other properties. We focus on two methods, one based on a Taylor expansion and another based on a piecewise linear expansion regularized by a singular value decomposition. We explain the instabilities of the Taylor-based method by highlighting singularities in the series of coefficients. At the same time, we show that the other method can successfully provide performance that depends only weakly on the material.
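A minimal sketch of the second approach, a piecewise profile expansion regularized by a truncated singular value decomposition, is given below under a generic linear forward model d = G m. The sensitivity kernel G, the grids and the noise level are illustrative assumptions, not the paper's model.

```python
import numpy as np

def tsvd_inversion(G, d, n_kept):
    """Invert d = G @ m for a piecewise profile m using a truncated singular
    value decomposition: small singular values, which amplify noise, are
    discarded (the regularization step)."""
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    s_inv = np.zeros_like(s)
    s_inv[:n_kept] = 1.0 / s[:n_kept]          # keep only the n_kept largest singular values
    return Vt.T @ (s_inv * (U.T @ d))

# Hypothetical forward model: each dispersion datum is a weighted average of the
# stress profile over depth (true sensitivity kernels depend on the material).
depths = np.linspace(0.0, 1.0, 20)
freqs = np.linspace(1.0, 10.0, 15)
G = np.array([np.exp(-f * depths) / np.sum(np.exp(-f * depths)) for f in freqs])
m_true = 100.0 * np.exp(-((depths - 0.3) / 0.2) ** 2)      # residual stress profile (MPa)
d = G @ m_true + 0.05 * np.random.randn(freqs.size)
m_est = tsvd_inversion(G, d, n_kept=5)
```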
Paper-Based MicroRNA Expression Profiling from Plasma and Circulating Tumor Cells.
Leong, Sai Mun; Tan, Karen Mei-Ling; Chua, Hui Wen; Huang, Mo-Chao; Cheong, Wai Chye; Li, Mo-Huang; Tucker, Steven; Koay, Evelyn Siew-Chuan
2017-03-01
Molecular characterization of circulating tumor cells (CTCs) holds great promise for monitoring metastatic progression and characterizing metastatic disease. However, leukocyte and red blood cell contamination of routinely isolated CTCs makes CTC-specific molecular characterization extremely challenging. Here we report the use of a paper-based medium for efficient extraction of microRNAs (miRNAs) from limited amounts of biological samples such as rare CTCs harvested from cancer patient blood. Specifically, we devised a workflow involving the use of Flinders Technology Associates (FTA) ® Elute Card with a digital PCR-inspired "partitioning" method to extract and purify miRNAs from plasma and CTCs. We demonstrated the sensitivity of this method to detect miRNA expression from as few as 3 cancer cells spiked into human blood. Using this method, background miRNA expression was excluded from contaminating blood cells, and CTC-specific miRNA expression profiles were derived from breast and colorectal cancer patients. Plasma separated out during purification of CTCs could likewise be processed using the same paper-based method for miRNA detection, thereby maximizing the amount of patient-specific information that can be derived from a single blood draw. Overall, this paper-based extraction method enables an efficient, cost-effective workflow for maximized recovery of small RNAs from limited biological samples for downstream molecular analyses. © 2016 American Association for Clinical Chemistry.
Pre-analytical method for NMR-based grape metabolic fingerprinting and chemometrics.
Ali, Kashif; Maltese, Federica; Fortes, Ana Margarida; Pais, Maria Salomé; Verpoorte, Robert; Choi, Young Hae
2011-10-10
Although metabolomics aims at profiling all the metabolites in organisms, data quality is quite dependent on the pre-analytical methods employed. In order to evaluate current methods, different pre-analytical methods were compared and used for the metabolic profiling of grapevine as a model plant. Five grape cultivars from Portugal were analyzed in this study in combination with chemometrics. A common extraction method with deuterated water and methanol was found effective in the case of amino acids, organic acids, and sugars. For secondary metabolites like phenolics, solid phase extraction with C-18 cartridges showed good results. Principal component analysis, in combination with NMR spectroscopy, was applied and showed clear distinction among the cultivars. Primary metabolites such as choline, sucrose, and leucine were found discriminating for 'Alvarinho', while elevated levels of alanine, valine, and acetate were found in 'Arinto' (white varieties). Among the red cultivars, higher signals for citrate and GABA in 'Touriga Nacional', succinate and fumarate in 'Aragonês', and malate, ascorbate, fructose and glucose in 'Trincadeira', were observed. Based on the phenolic profile, 'Arinto' was found to have higher levels of phenolics compared to 'Alvarinho'. 'Trincadeira' showed the lowest phenolics content, while higher levels of flavonoids and phenylpropanoids were found in 'Aragonês' and 'Touriga Nacional', respectively. It is shown that the metabolite composition of the extract is highly affected by the extraction procedure, and this consideration has to be taken into account for metabolomics studies. Copyright © 2011 Elsevier B.V. All rights reserved.
Zhang, Jianguo; Zhang, Kai; Yang, Yuanyuan; Sun, Jianyong; Ling, Tonghui; Wang, Mingqing; Bak, Peter
2015-01-01
IHE XDS-I profile proposes an architecture model for cross-enterprise medical image sharing, but there are only a few clinical implementations reported. Here, we investigate three pilot studies based on the IHE XDS-I profile to see whether we can use this architecture as a foundation for image sharing solutions in a variety of health-care settings. The first pilot study was image sharing for cross-enterprise health care with federated integration, which was implemented in Huadong Hospital and Shanghai Sixth People's Hospital within the Shanghai Shen-Kang Hospital Management Center; the second pilot study was an XDS-I-based patient-controlled image sharing solution, which was implemented by the Radiological Society of North America (RSNA) team in the USA; and the third pilot study was collaborative imaging diagnosis with electronic health-care record integration in regional health care, which was implemented in two districts in Shanghai. In order to support these pilot studies, we designed and developed new image access methods, components, and data models such as RAD-69/WADO hybrid image retrieval, RSNA clearinghouse, and extension of metadata definitions in both the submission set and the cross-enterprise document sharing (XDS) registry. We identified several key issues that impact the implementation of XDS-I in practical applications, and conclude that the IHE XDS-I profile is a theoretically good architecture and a useful foundation for medical image sharing solutions across multiple regional health-care providers. PMID:26835497
Zhang, Jianguo; Zhang, Kai; Yang, Yuanyuan; Sun, Jianyong; Ling, Tonghui; Wang, Mingqing; Bak, Peter
2015-10-01
IHE XDS-I profile proposes an architecture model for cross-enterprise medical image sharing, but there are only a few clinical implementations reported. Here, we investigate three pilot studies based on the IHE XDS-I profile to see whether we can use this architecture as a foundation for image sharing solutions in a variety of health-care settings. The first pilot study was image sharing for cross-enterprise health care with federated integration, which was implemented in Huadong Hospital and Shanghai Sixth People's Hospital within the Shanghai Shen-Kang Hospital Management Center; the second pilot study was XDS-I-based patient-controlled image sharing solution, which was implemented by the Radiological Society of North America (RSNA) team in the USA; and the third pilot study was collaborative imaging diagnosis with electronic health-care record integration in regional health care, which was implemented in two districts in Shanghai. In order to support these pilot studies, we designed and developed new image access methods, components, and data models such as RAD-69/WADO hybrid image retrieval, RSNA clearinghouse, and extension of metadata definitions in both the submission set and the cross-enterprise document sharing (XDS) registry. We identified several key issues that impact the implementation of XDS-I in practical applications, and conclude that the IHE XDS-I profile is a theoretically good architecture and a useful foundation for medical image sharing solutions across multiple regional health-care providers.
Ultrasonic Method for Measuring Internal Temperature Profile in Heated Materials
NASA Astrophysics Data System (ADS)
Ihara, I.; Takahashi, M.
2008-02-01
A new ultrasonic method for internal temperature measurement is presented. The principle of the method is based on the temperature dependence of the velocity of the ultrasonic wave propagating through the material. An inverse analysis to determine the temperature profile in a heated material is developed, and an experiment is carried out to verify the validity of the developed method. A single side of a silicone rubber plate of 30 mm thickness is heated, and ultrasonic pulse-echo measurements are then performed during heating. A change in the transit time of the ultrasonic wave in the heated rubber plate is monitored and used to determine the transient variation in the internal temperature distribution of the rubber. The internal temperature distribution determined ultrasonically agrees well with that obtained using commercial thermocouples installed in the rubber and with that estimated theoretically.
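The principle can be illustrated with a simple forward model in which the one-way transit time through the plate is the integral of 1/v(T(z)) over depth; assuming a linear velocity-temperature relation and a linear temperature profile, a measured transit time can be inverted for the heated-surface temperature. All coefficients and the profile shape below are illustrative assumptions; the paper's inverse analysis is more general.

```python
import numpy as np
from scipy.optimize import brentq

# Assumed linear dependence of sound velocity on temperature (illustrative numbers,
# not the silicone-rubber values from the paper).
def velocity(temp_c, v0=1000.0, dv_dT=-2.0):
    return v0 + dv_dT * (temp_c - 20.0)          # m/s

def transit_time(temps, thickness=0.030):
    """One-way transit time through a plate discretized into len(temps) layers."""
    dz = thickness / len(temps)
    return np.sum(dz / velocity(temps))

def linear_profile(t_hot, t_cold=20.0, n=30):
    """Assumed linear temperature distribution between the heated and free surfaces."""
    return np.linspace(t_hot, t_cold, n)

def infer_surface_temperature(measured_time, thickness=0.030):
    """Invert the measured transit time for the heated-surface temperature,
    assuming the linear profile shape above."""
    f = lambda t_hot: transit_time(linear_profile(t_hot), thickness) - measured_time
    return brentq(f, 20.0, 300.0)

t_meas = transit_time(linear_profile(120.0))     # synthetic "measurement"
print(infer_surface_temperature(t_meas))         # recovers roughly 120 degrees C
```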
Modeling Bivariate Longitudinal Hormone Profiles by Hierarchical State Space Models
Liu, Ziyue; Cappola, Anne R.; Crofford, Leslie J.; Guo, Wensheng
2013-01-01
The hypothalamic-pituitary-adrenal (HPA) axis is crucial in coping with stress and maintaining homeostasis. Hormones produced by the HPA axis exhibit both complex univariate longitudinal profiles and complex relationships among different hormones. Consequently, modeling these multivariate longitudinal hormone profiles is a challenging task. In this paper, we propose a bivariate hierarchical state space model, in which each hormone profile is modeled by a hierarchical state space model, with both population-average and subject-specific components. The bivariate model is constructed by concatenating the univariate models based on the hypothesized relationship. Because of the flexible framework of state space form, the resultant models not only can handle complex individual profiles, but also can incorporate complex relationships between two hormones, including both concurrent and feedback relationship. Estimation and inference are based on marginal likelihood and posterior means and variances. Computationally efficient Kalman filtering and smoothing algorithms are used for implementation. Application of the proposed method to a study of chronic fatigue syndrome and fibromyalgia reveals that the relationships between adrenocorticotropic hormone and cortisol in the patient group are weaker than in healthy controls. PMID:24729646
Modeling Bivariate Longitudinal Hormone Profiles by Hierarchical State Space Models.
Liu, Ziyue; Cappola, Anne R; Crofford, Leslie J; Guo, Wensheng
2014-01-01
The hypothalamic-pituitary-adrenal (HPA) axis is crucial in coping with stress and maintaining homeostasis. Hormones produced by the HPA axis exhibit both complex univariate longitudinal profiles and complex relationships among different hormones. Consequently, modeling these multivariate longitudinal hormone profiles is a challenging task. In this paper, we propose a bivariate hierarchical state space model, in which each hormone profile is modeled by a hierarchical state space model, with both population-average and subject-specific components. The bivariate model is constructed by concatenating the univariate models based on the hypothesized relationship. Because of the flexible framework of state space form, the resultant models not only can handle complex individual profiles, but also can incorporate complex relationships between two hormones, including both concurrent and feedback relationship. Estimation and inference are based on marginal likelihood and posterior means and variances. Computationally efficient Kalman filtering and smoothing algorithms are used for implementation. Application of the proposed method to a study of chronic fatigue syndrome and fibromyalgia reveals that the relationships between adrenocorticotropic hormone and cortisol in the patient group are weaker than in healthy controls.
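Estimation in such models relies on Kalman filtering and smoothing recursions. The sketch below implements a minimal Kalman filter for a univariate local-level model as an illustration of those recursions; it is not the authors' bivariate hierarchical model, and the dimensions, noise variances and simulated series are assumptions.

```python
import numpy as np

def kalman_filter(y, T, Z, Q, H, a0, P0):
    """Minimal Kalman filter for the linear Gaussian state space model
        state:        alpha_{t+1} = T @ alpha_t + eta_t,   eta_t ~ N(0, Q)
        observation:  y_t         = Z @ alpha_t + eps_t,   eps_t ~ N(0, H)
    Returns filtered state means and covariances."""
    n, m = len(y), len(a0)
    a, P = np.array(a0, float), np.array(P0, float)
    means, covs = np.zeros((n, m)), np.zeros((n, m, m))
    for t in range(n):
        # Prediction step
        a_pred, P_pred = T @ a, T @ P @ T.T + Q
        # Update step with the observation y[t]
        F = Z @ P_pred @ Z.T + H                 # innovation variance
        K = P_pred @ Z.T @ np.linalg.inv(F)      # Kalman gain
        v = y[t] - Z @ a_pred                    # innovation
        a = a_pred + K @ v
        P = P_pred - K @ Z @ P_pred
        means[t], covs[t] = a, P
    return means, covs

# Hypothetical single-hormone series modeled as a random walk plus noise
rng = np.random.default_rng(1)
level = np.cumsum(rng.normal(0, 0.1, 200)) + 5.0
y = (level + rng.normal(0, 0.5, 200)).reshape(-1, 1)
means, covs = kalman_filter(
    y, T=np.eye(1), Z=np.eye(1), Q=np.array([[0.01]]), H=np.array([[0.25]]),
    a0=np.array([5.0]), P0=np.eye(1),
)
```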
Kronewitter, Scott R; An, Hyun Joo; de Leoz, Maria Lorna; Lebrilla, Carlito B; Miyamoto, Suzanne; Leiserowitz, Gary S
2009-06-01
Annotation of the human serum N-linked glycome is a formidable challenge but is necessary for disease marker discovery. A new theoretical glycan library was constructed and proposed to provide all possible glycan compositions in serum. It was developed based on established glycobiology and retrosynthetic state-transition networks. We find that at least 331 compositions are possible in the serum N-linked glycome. By pairing the theoretical glycan mass library with a high mass accuracy and high-resolution MS, human serum glycans were effectively profiled. Correct isotopic envelope deconvolution to monoisotopic masses and the high mass accuracy instruments drastically reduced the amount of false composition assignments. The high throughput capacity enabled by this library permitted the rapid glycan profiling of large control populations. With the use of the library, a human serum glycan mass profile was developed from 46 healthy individuals. This paper presents a theoretical N-linked glycan mass library that was used for accurate high-throughput human serum glycan profiling. Rapid methods for evaluating a patient's glycome are instrumental for studying glycan-based markers.
NASA Astrophysics Data System (ADS)
Arel, Ersin
2012-06-01
The infamous soils of Adapazari, Turkey, that failed extensively during the 46-s-long magnitude 7.4 earthquake in 1999 have since been the subject of a research program. Boreholes, piezocone soundings and voluminous laboratory testing have enabled researchers to apply sophisticated methods to determine the soil profiles in the city using the existing database. This paper describes the use of an artificial neural network (ANN) model to predict the complex soil profiles of Adapazari, based on cone penetration test (CPT) results. More than 3236 field CPT readings have been collected from 117 soundings spread over an area of 26 km². An attempt has been made to develop the ANN model using multilayer perceptrons trained with a feed-forward back-propagation algorithm. The results show that the ANN model is fairly accurate in predicting complex soil profiles. Soil identification using CPT test results has principally been based on the Robertson charts. Applying neural network systems using the chart offers a powerful and rapid route to reliable prediction of the soil profiles.
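A minimal sketch of the kind of model described above is given below: a multilayer perceptron trained by back-propagation that maps CPT-derived features to soil classes. The features, labels, network size and toy data are assumptions, not the Adapazari database or the paper's trained model.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical CPT training data: columns = depth (m), cone resistance qc (MPa),
# friction ratio Rf (%); labels = simplified soil classes (0=clay, 1=silt, 2=sand).
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(0, 30, 1000),        # depth
    rng.uniform(0.2, 30, 1000),      # qc
    rng.uniform(0.1, 8, 1000),       # Rf
])
y = np.select([X[:, 2] > 4, X[:, 1] > 10], [0, 2], default=1)   # toy labelling rule

# Multilayer perceptron trained with back-propagation, as in the ANN model described above
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(10, 10), max_iter=2000, random_state=0),
)
model.fit(X, y)
print(model.predict([[5.0, 12.0, 1.0]]))   # predicted soil class at 5 m depth
```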
An Academic Library's Experience with Fee-Based Services.
ERIC Educational Resources Information Center
Hornbeck, Julia W.
1983-01-01
Profile of fee-based information services offered by the Information Exchange Center of Georgia Institute of Technology notes history and background, document delivery to commercial clients and on-campus faculty, online and manual literature searching, staff, cost analysis, fee schedule, operating methods, client relations, marketing, and current…
Accurate read-based metagenome characterization using a hierarchical suite of unique signatures
Freitas, Tracey Allen K.; Li, Po-E; Scholz, Matthew B.; Chain, Patrick S. G.
2015-01-01
A major challenge in the field of shotgun metagenomics is the accurate identification of organisms present within a microbial community, based on classification of short sequence reads. Though existing microbial community profiling methods have attempted to rapidly classify the millions of reads output from modern sequencers, the combination of incomplete databases, similarity among otherwise divergent genomes, errors and biases in sequencing technologies, and the large volumes of sequencing data required for metagenome sequencing has led to unacceptably high false discovery rates (FDR). Here, we present the application of a novel, gene-independent and signature-based metagenomic taxonomic profiling method with significantly and consistently smaller FDR than any other available method. Our algorithm circumvents false positives using a series of non-redundant signature databases and examines Genomic Origins Through Taxonomic CHAllenge (GOTTCHA). GOTTCHA was tested and validated on 20 synthetic and mock datasets ranging in community composition and complexity, was applied successfully to data generated from spiked environmental and clinical samples, and robustly demonstrates superior performance compared with other available tools. PMID:25765641
Global Profiling of Reactive Oxygen and Nitrogen Species in Biological Systems
Zielonka, Jacek; Zielonka, Monika; Sikora, Adam; Adamus, Jan; Joseph, Joy; Hardy, Micael; Ouari, Olivier; Dranka, Brian P.; Kalyanaraman, Balaraman
2012-01-01
Herein we describe a high-throughput fluorescence and HPLC-based methodology for global profiling of reactive oxygen and nitrogen species (ROS/RNS) in biological systems. The combined use of HPLC and fluorescence detection is key to successful implementation and validation of this methodology. Included here are methods to specifically detect and quantitate the products formed from interaction between the ROS/RNS species and the fluorogenic probes, as follows: superoxide using hydroethidine, peroxynitrite using boronate-based probes, nitric oxide-derived nitrosating species with 4,5-diaminofluorescein, and hydrogen peroxide and other oxidants using 10-acetyl-3,7-dihydroxyphenoxazine (Amplex® Red) with and without horseradish peroxidase, respectively. In this study, we demonstrate real-time monitoring of ROS/RNS in activated macrophages using high-throughput fluorescence and HPLC methods. This global profiling approach, simultaneous detection of multiple ROS/RNS products of fluorescent probes, developed in this study will be useful in unraveling the complex role of ROS/RNS in redox regulation, cell signaling, and cellular oxidative processes and in high-throughput screening of anti-inflammatory antioxidants. PMID:22139901
Analysis of Scattering from Archival Pulsar Data using a CLEAN-based Method
NASA Astrophysics Data System (ADS)
Tsai, -Wei, Jr.; Simonetti, John H.; Kavic, Michael
2017-02-01
In this work, we adopted a CLEAN-based method to determine the scatter time, τ, from archived pulsar profiles under both the thin screen and uniform medium scattering models, and to calculate the scatter time frequency scaling index α, where τ ∝ ν^α. The value of α is -4.4 if a Kolmogorov spectrum of the interstellar medium turbulence is assumed. We deconvolved 1342 profiles from 347 pulsars over a broad range of frequencies and dispersion measures. In our survey, in the majority of cases the scattering effect was not significant compared to pulse profile widths. For a subset of 21 pulsars, scattering at the lowest frequencies was large enough to be measured. Because reliable scatter time measurements were determined only for the lowest frequency, we were limited to using upper limits on scatter times at higher frequencies for the purpose of our scatter time frequency slope estimation. We scaled the deconvolved scatter time to 1 GHz assuming α = -4.4 and considered our results in the context of other observations, which yielded a broad relation between scatter time and dispersion measure.
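The frequency scaling used above is simple to apply once τ has been measured at one frequency. A small sketch with illustrative numbers:

```python
def scale_scatter_time(tau_ms, freq_mhz, alpha=-4.4):
    """Scale a scatter time measured at freq_mhz to 1 GHz assuming tau ∝ nu**alpha
    (alpha = -4.4 for a Kolmogorov spectrum of interstellar turbulence)."""
    return tau_ms * (1000.0 / freq_mhz) ** alpha

# Example: a 12 ms scatter time measured at 150 MHz corresponds to a much
# shorter scatter time at 1 GHz (illustrative numbers only).
print(scale_scatter_time(12.0, 150.0))
```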
A portable intra-oral scanner based on sinusoidal pattern of fast phase-shifting
NASA Astrophysics Data System (ADS)
Jan, Chia-Ming; Lin, Ying-Chieh
2016-03-01
This paper presents our current research on the intra-oral scanner developed by MIRDC. Utilizing a sinusoidal pattern and a fast phase-shifting technique for 3D digitization of the human dental surface profile, the pseudo-phase-shifting digital projection readily achieves full-field scanning instead of the common laser line scanning technique. Based on the traditional Moiré method, we project fringes and retrieve the phase, followed by phase unwrapping. The phase difference between the reference plane and the object can be calculated exactly from the acquired fringe images, and the surface profile of the object can be reconstructed directly from this phase-difference information. Using our algorithm for space mapping between the projection and capture orientations of the intra-oral scanning configuration, the system we built is shown to achieve the required accuracy of ±10 μm for intra-oral scanning on the basis of the active triangulation method. The final goal is the scanning of object surface profiles of approximately 10×10×10 mm³.
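For reference, the standard four-step phase-shifting relations that such a system relies on are sketched below; the unwrapping step and the phase-to-height factor are placeholders, and the details of the MIRDC implementation may differ.

```python
import numpy as np

def wrapped_phase_four_step(i1, i2, i3, i4):
    """Standard four-step phase-shifting: fringe images captured with phase
    shifts of 0, pi/2, pi and 3*pi/2 give the wrapped phase directly."""
    return np.arctan2(i4 - i2, i1 - i3)

def unwrap_rows(phase):
    """Simple 1-D unwrapping along each row; robust 2-D unwrapping is usually
    preferred for real dental scans."""
    return np.unwrap(phase, axis=1)

# The height (surface profile) is proportional to the phase difference between
# the object and a reference plane, with a factor obtained from the
# triangulation geometry and system calibration.
def height_from_phase(phase_obj, phase_ref, phase_to_height=1.0):
    return phase_to_height * (phase_obj - phase_ref)
```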
NASA Technical Reports Server (NTRS)
Holmes, Thomas; Owe, Manfred; deJeu, Richard
2007-01-01
Two data sets of experimental field observations with a range of meteorological conditions are used to investigate the possibility of modeling near-surface soil temperature profiles in a bare soil. It is shown that commonly used heat flow methods that assume a constant ground heat flux cannot be used to model the extreme variations in temperature that occur near the surface. This paper proposes a simple approach for modeling the surface soil temperature profile from a single depth observation. This approach consists of two parts: 1) modeling an instantaneous ground heat flux profile based on net radiation and the ground heat flux at 5 cm depth; 2) using this ground heat flux profile to extrapolate a single temperature observation to a continuous near-surface temperature profile. The new model is validated with an independent data set from a different soil and under a range of meteorological conditions.
One- and two-dimensional dopant/carrier profiling for ULSI
NASA Astrophysics Data System (ADS)
Vandervorst, W.; Clarysse, T.; De Wolf, P.; Trenkler, T.; Hantschel, T.; Stephenson, R.; Janssens, T.
1998-11-01
Dopant/carrier profiles constitute the basis of the operation of a semiconductor device and thus play a decisive role in the performance of a transistor. They are subject to the same scaling laws as the other constituents of a modern semiconductor device and continuously evolve towards shallower and more complex configurations. This evolution has increased the demands on the profiling techniques, in particular in terms of resolution and quantification, such that a constant reevaluation and improvement of the tools is required. As no single technique provides all the necessary information (dopant distribution, electrical activation, ...) with the requested spatial and depth resolution, the present paper attempts to provide an assessment of those tools which can be considered the main metrology technologies for ULSI applications. For 1D dopant profiling, secondary ion mass spectrometry (SIMS) has progressed towards a generally accepted tool meeting the requirements. For 1D carrier profiling, spreading resistance profiling and microwave surface impedance profiling are envisaged as the best choices, but extra developments are required to promote them to routinely applicable methods. As no main metrology tool exists for 2D dopant profiling, the main emphasis is on 2D carrier profiling tools based on scanning probe microscopy. Scanning spreading resistance microscopy (SSRM) and scanning capacitance microscopy (SCM) are the preferred methods, although neither of them yet meets all the requirements. Complementary information can be extracted from nanopotentiometry, which samples the device operation in more detail. Concurrent use of carrier profiling tools, nanopotentiometry, analysis of device characteristics and simulations is required to provide a complete characterization of deep submicron devices.
Forensic discrimination of copper wire using trace element concentrations.
Dettman, Joshua R; Cassabaum, Alyssa A; Saunders, Christopher P; Snyder, Deanna L; Buscaglia, JoAnn
2014-08-19
Copper may be recovered as evidence in high-profile cases such as thefts and improvised explosive device incidents; comparison of copper samples from the crime scene and those associated with the subject of an investigation can provide probative associative evidence and investigative support. A solution-based inductively coupled plasma mass spectrometry method for measuring trace element concentrations in high-purity copper was developed using standard reference materials. The method was evaluated for its ability to use trace element profiles to statistically discriminate between copper samples considering the precision of the measurement and manufacturing processes. The discriminating power was estimated by comparing samples chosen on the basis of the copper refining and production process to represent the within-source (samples expected to be similar) and between-source (samples expected to be different) variability using multivariate parametric- and empirical-based data simulation models with bootstrap resampling. If the false exclusion rate is set to 5%, >90% of the copper samples can be correctly determined to originate from different sources using a parametric-based model and >87% with an empirical-based approach. These results demonstrate the potential utility of the developed method for the comparison of copper samples encountered as forensic evidence.
Determination of defect content and defect profile in semiconductor heterostructures
NASA Astrophysics Data System (ADS)
Zubiaga, A.; Garcia, J. A.; Plazaola, F.; Zúñiga-Pérez, J.; Muñoz-Sanjosé, V.
2011-01-01
In this article we present an overview of a technique, based on positron annihilation spectroscopy, to obtain the defect depth profile and the width of a deposited layer or multilayer. In particular, we apply the method to ZnO and ZnO/ZnCdO layers deposited on sapphire substrates. After introducing some terminology, we first calculate the trend that the W/S parameters of the Doppler broadening measurements must follow, in both a qualitative and a quantitative way. From this point we extend the results to calculate the width and defect profiles in deposited layer samples.
Feder, Idit; Duadi, Hamootal; Dreifuss, Tamar; Fixler, Dror
2016-10-01
Optical methods for detecting physiological state based on light-tissue interaction are noninvasive, inexpensive, simple, and thus very useful. The blood vessels in human tissue are the main cause of light absorption and scattering. Therefore, the effect of blood vessels on light-tissue interactions is essential for optically detecting physiological tissue state, such as oxygen saturation, blood perfusion and blood pressure. We have previously suggested a new theoretical and experimental method for measuring the full scattering profile, which is the angular distribution of light intensity, of cylindrical tissues. In this work we present experimental measurements of the full scattering profile of heterogenic cylindrical phantoms that include blood vessels. We show, for the first time, that the vessel diameter influences the full scattering profile, and found higher reflection intensity for larger vessel diameters, in accordance with the shielding effect. For an increase of 60% in the vessel diameter, the light intensity in the full scattering profile above 90° is between 9% and 40% higher, depending on the angle. Based on these results, we claim that during respiration, when the blood-vessel diameter changes, it is essential to consider the blood-vessel diameter distribution in order to determine the optical path in tissues. A CT scan of the measured silicon-based phantoms is shown; the phantoms contain the same blood volume in different blood-vessel diameters. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ground-based microwave radiometric remote sensing of the tropical atmosphere
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Yong.
1992-01-01
A partially developed 9-channel ground-based microwave radiometer for the Department of Meteorology at Penn State was completed and tested. Complementary units were added, corrections to both hardware and software were made, and the system software was corrected and upgraded. Measurements from this radiometer were used to infer tropospheric temperature, water vapor and cloud liquid water. The various weighting functions at each of the 9 channels were calculated and analyzed to estimate the sensitivities of the brightness temperature to the desired atmospheric variables. The mathematical inversion problem, in a linear form, was viewed in terms of the theory of linear algebra. Several methods for solving the inversion problem were reviewed. Radiometric observations were conducted during the 1990 Tropical Cyclone Motion Experiment. The radiometer was installed on the island of Saipan, in a tropical region. The radiometer was calibrated using tipping curve and radiosonde data as well as measurements of the radiation from a blackbody absorber. A linear statistical method was applied for the data inversion. The inversion coefficients in the equation were obtained using a large number of radiosonde profiles from Guam and a radiative transfer model. Retrievals were compared with those from local (Saipan) radiosonde measurements. Water vapor profiles, integrated water vapor, and integrated liquid water were retrieved successfully. For temperature profile retrievals, however, the radiometric measurements with experimental noise added no more profile information to the inversion than what was determined mainly by the surface pressure measurements. A method was developed to derive the integrated water vapor and liquid water from combined radiometer and ceilometer measurements. Significant improvement in radiometric measurements of the integrated liquid water can be gained with this method.
Crawford, Megan R.; Chirinos, Diana A.; Iurcotta, Toni; Edinger, Jack D.; Wyatt, James K.; Manber, Rachel; Ong, Jason C.
2017-01-01
Study Objectives: This study examined empirically derived symptom cluster profiles among patients who present with insomnia using clinical data and polysomnography. Methods: Latent profile analysis was used to identify symptom cluster profiles of 175 individuals (63% female) with insomnia disorder based on total scores on validated self-report instruments of daytime and nighttime symptoms (Insomnia Severity Index, Glasgow Sleep Effort Scale, Fatigue Severity Scale, Beliefs and Attitudes about Sleep, Epworth Sleepiness Scale, Pre-Sleep Arousal Scale), mean values from a 7-day sleep diary (sleep onset latency, wake after sleep onset, and sleep efficiency), and total sleep time derived from an in-laboratory PSG. Results: The best-fitting model had three symptom cluster profiles: “High Subjective Wakefulness” (HSW), “Mild Insomnia” (MI) and “Insomnia-Related Distress” (IRD). The HSW symptom cluster profile (26.3% of the sample) reported high wake after sleep onset, high sleep onset latency, and low sleep efficiency. Despite relatively comparable PSG-derived total sleep time, they reported greater levels of daytime sleepiness. The MI symptom cluster profile (45.1%) reported the least disturbance in the sleep diary and questionnaires and had the highest sleep efficiency. The IRD symptom cluster profile (28.6%) reported the highest mean scores on the insomnia-related distress measures (eg, sleep effort and arousal) and waking correlates (fatigue). Covariates associated with symptom cluster membership were older age for the HSW profile, greater obstructive sleep apnea severity for the MI profile, and, when adjusting for obstructive sleep apnea severity, being overweight/obese for the IRD profile. Conclusions: The heterogeneous nature of insomnia disorder is captured by this data-driven approach to identify symptom cluster profiles. The adaptation of a symptom cluster-based approach could guide tailored patient-centered management of patients presenting with insomnia, and enhance patient care. Citation: Crawford MR, Chirinos DA, Iurcotta T, Edinger JD, Wyatt JK, Manber R, Ong JC. Characterization of patients who present with insomnia: is there room for a symptom cluster-based approach? J Clin Sleep Med. 2017;13(7):911–921. PMID:28633722
An algorithm to diagnose ball bearing faults in servomotors running arbitrary motion profiles
NASA Astrophysics Data System (ADS)
Cocconcelli, Marco; Bassi, Luca; Secchi, Cristian; Fantuzzi, Cesare; Rubini, Riccardo
2012-02-01
This paper describes a procedure to extend the scope of classical methods to detect ball bearing faults (based on envelope analysis and fault frequencies identification) beyond their usual area of application. The objective of this procedure is to allow condition-based monitoring of such bearings in servomotor applications, where typically the motor in its normal mode of operation has to follow a non-constant angular velocity profile that may contain motion inversions. After describing and analyzing the algorithm from a theoretical point of view, experimental results obtained on a real industrial application are presented and commented.
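As context for the extension described above, classical envelope analysis demodulates the vibration signal and inspects the envelope spectrum for peaks at the bearing fault frequencies; the paper adapts this to non-constant speed profiles with motion inversions. A minimal constant-speed sketch, assuming a sampled vibration signal and its sampling rate, might look like:

```python
import numpy as np
from scipy.signal import hilbert

def envelope_spectrum(vibration, fs):
    """Classical envelope analysis: demodulate the vibration signal and return the
    spectrum of its envelope, where bearing fault frequencies (BPFO, BPFI, ...)
    show up as discrete peaks."""
    analytic = hilbert(vibration)            # analytic signal via Hilbert transform
    envelope = np.abs(analytic)              # amplitude envelope
    envelope -= envelope.mean()              # remove DC component before the FFT
    spectrum = np.abs(np.fft.rfft(envelope))
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
    return freqs, spectrum
```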
RF Pulse Design using Nonlinear Gradient Magnetic Fields
Kopanoglu, Emre; Constable, R. Todd
2014-01-01
Purpose An iterative k-space trajectory and radio-frequency (RF) pulse design method is proposed for Excitation using Nonlinear Gradient Magnetic fields (ENiGMa). Theory and Methods The spatial encoding functions (SEFs) generated by nonlinear gradient fields (NLGFs) are linearly dependent in Cartesian-coordinates. Left uncorrected, this may lead to flip-angle variations in excitation profiles. In the proposed method, SEFs (k-space samples) are selected using a Matching-Pursuit algorithm, and the RF pulse is designed using a Conjugate-Gradient algorithm. Three variants of the proposed approach are given: the full-algorithm, a computationally-cheaper version, and a third version for designing spoke-based trajectories. The method is demonstrated for various target excitation profiles using simulations and phantom experiments. Results The method is compared to other iterative (Matching-Pursuit and Conjugate Gradient) and non-iterative (coordinate-transformation and Jacobian-based) pulse design methods as well as uniform density spiral and EPI trajectories. The results show that the proposed method can increase excitation fidelity significantly. Conclusion An iterative method for designing k-space trajectories and RF pulses using nonlinear gradient fields is proposed. The method can either be used for selecting the SEFs individually to guide trajectory design, or can be adapted to design and optimize specific trajectories of interest. PMID:25203286
NASA Astrophysics Data System (ADS)
Gu, Myojeong; Enell, Carl-Fredrik; Hendrick, François; Pukite, Janis; Van Roozendael, Michel; Platt, Ulrich; Raffalski, Uwe; Wagner, Thomas
2014-05-01
Stratospheric NO2 destroys ozone and acts as a buffer against halogen-catalyzed ozone loss through the formation of reservoir species (ClONO2, BrONO2). Since the importance of both mechanisms depends on altitude, the investigation of the stratospheric NO2 vertical distribution can provide more insight into the role of nitrogen compounds in the destruction of ozone. Here we present stratospheric NO2 vertical profiles retrieved from twilight ground-based zenith-sky DOAS observations at Kiruna, Sweden (68.84°N, 20.41°E) covering the period 1997-2013. This instrument observes zenith-scattered sunlight. The sensitivity for stratospheric trace gases is highest during twilight due to the maximum altitude of the scattering profile and the light path through the stratosphere, which vary with the solar zenith angle. The profiling algorithm, based on the Optimal Estimation Method, has been developed by IASB-BIRA and successfully applied at other stations (Hendrick et al., 2004). The basic principle behind this profiling approach is that during twilight, the mean Rayleigh scattering altitude scans the stratosphere rapidly, providing height-resolved information on the absorption by stratospheric NO2. In this study, the long-term evolution of the stratospheric NO2 profile at polar latitudes will be investigated. Hendrick, F., B. Barret, M. Van Roozendael, H. Boesch, A. Butz, M. De Mazière, F. Goutail, C. Hermans, J.-C. Lambert, K. Pfeilsticker, and J.-P. Pommereau, Retrieval of nitrogen dioxide stratospheric profiles from ground-based zenith-sky UV-visible observations: Validation of the technique through correlative comparisons, Atmospheric Chemistry and Physics, 4, 2091-2106, 2004
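The Optimal Estimation Method referred to above combines the twilight measurements with an a priori profile, weighted by their respective covariances. A minimal linear sketch of one such retrieval step (a generic Rodgers-style formulation, not the IASB-BIRA code; all variable names are placeholders) is:

```python
import numpy as np

def optimal_estimation_step(y, K, x_a, S_a, S_e):
    """One linear optimal-estimation retrieval step.

    y   : measurement vector (e.g. slant columns at several solar zenith angles)
    K   : Jacobian / weighting-function matrix, shape (n_meas, n_levels)
    x_a : a priori profile;  S_a : a priori covariance;  S_e : measurement covariance
    """
    Se_inv = np.linalg.inv(S_e)
    Sa_inv = np.linalg.inv(S_a)
    S_hat = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv)     # retrieval covariance
    x_hat = x_a + S_hat @ K.T @ Se_inv @ (y - K @ x_a)   # retrieved profile
    A = S_hat @ K.T @ Se_inv @ K                         # averaging kernel matrix
    return x_hat, S_hat, A
```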
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krčo, Marko; Goldsmith, Paul F., E-mail: marko@astro.cornell.edu
2016-05-01
We present a geometry-independent method for determining the shapes of radial volume density profiles of astronomical objects whose geometries are unknown, based on a single column density map. Such profiles are often critical to understand the physics and chemistry of molecular cloud cores, in which star formation takes place. The method presented here does not assume any geometry for the object being studied, thus removing a significant source of bias. Instead, it exploits contour self-similarity in column density maps, which appears to be common in data for astronomical objects. Our method may be applied to many types of astronomical objects and observable quantities so long as they satisfy a limited set of conditions, which we describe in detail. We derive the method analytically, test it numerically, and illustrate its utility using 2MASS-derived dust extinction in molecular cloud cores. While not having made an extensive comparison of different density profiles, we find that the overall radial density distribution within molecular cloud cores is adequately described by an attenuated power law.
Application of Classification Methods for Forecasting Mid-Term Power Load Patterns
NASA Astrophysics Data System (ADS)
Piao, Minghao; Lee, Heon Gyu; Park, Jin Hyoung; Ryu, Keun Ho
An automated methodology based on data mining techniques is presented for the prediction of customer load patterns from long-duration load profiles. The proposed approach consists of three stages: (i) data preprocessing: noise and outliers are removed and the continuous attribute-valued features are transformed to discrete values; (ii) cluster analysis: k-means clustering is used to create load pattern classes and the representative load profiles for each class; and (iii) classification: several supervised learning methods are evaluated in order to select a suitable prediction method. According to the proposed methodology, power load measured by an AMR (automatic meter reading) system, as well as customer indexes, were used as inputs for clustering. The output of clustering was the classification of representative load profiles (or classes). To evaluate the forecasting of load patterns, several classification methods were applied to a set of high-voltage customers of the Korean power system; the class labels derived from clustering, together with other features, were used as inputs to build the classifiers. Lastly, the results of our experiments are presented.
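A compact sketch of stages (ii) and (iii) is given below: k-means forms the representative load classes, and a classifier is then trained to predict the class from customer features. The random forest here is only an illustrative stand-in for the several supervised learners the paper evaluated, and the variable shapes are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

def build_load_pattern_model(load_profiles, customer_features, n_classes=8):
    """Cluster daily load profiles into representative classes, then train a
    classifier that predicts the class label from customer index features.

    load_profiles:     (n_customers, n_time_steps) measured load curves
    customer_features: (n_customers, n_features) customer index data
    """
    km = KMeans(n_clusters=n_classes, n_init=10, random_state=0)
    labels = km.fit_predict(load_profiles)          # representative load classes
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(customer_features, labels)              # predict class from customer data
    return km, clf
```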
Ruffner, J.A.
1999-06-15
A method for coating (flat or non-flat) optical substrates with high-reflectivity multi-layer coatings for use at Deep Ultra-Violet (DUV) and Extreme Ultra-Violet (EUV) wavelengths. The method results in a product with minimum feature sizes of less than 0.10 µm for the shortest wavelength (13.4 nm). The present invention employs a computer-based modeling and deposition method to enable lateral and vertical thickness control by scanning the position of the substrate with respect to the sputter target during deposition. The thickness profile of the sputter targets is modeled before deposition and then an appropriate scanning algorithm is implemented to produce any desired, radially-symmetric thickness profile. The present invention offers the ability to predict and achieve a wide range of thickness profiles on flat or figured substrates, i.e., account for the 1/R² factor in a model, and the ability to predict and accommodate changes in deposition rate as a result of plasma geometry, i.e., over figured substrates. 15 figs.
NASA Technical Reports Server (NTRS)
Chapman, Dean R
1952-01-01
A theoretical investigation is made of the airfoil profile for minimum pressure drag at zero lift in supersonic flow. In the first part of the report a general method is developed for calculating the profile having the least pressure drag for a given auxiliary condition, such as a given structural requirement or a given thickness ratio. The various structural requirements considered include bending strength, bending stiffness, torsional strength, and torsional stiffness. No assumption is made regarding the trailing-edge thickness; the optimum value is determined in the calculations as a function of the base pressure. To illustrate the general method, the optimum airfoil, defined as the airfoil having minimum pressure drag for a given auxiliary condition, is calculated in a second part of the report using the equations of linearized supersonic flow.
Multiple Interests of Users in Collaborative Tagging Systems
NASA Astrophysics Data System (ADS)
Au Yeung, Ching-Man; Gibbins, Nicholas; Shadbolt, Nigel
Performance of recommender systems depends on whether the user profiles contain accurate information about the interests of the users, and this in turn relies on whether enough information about their interests can be collected. Collaborative tagging systems allow users to use their own words to describe their favourite resources, resulting in some user-generated categorisation schemes commonly known as folksonomies. Folksonomies thus contain rich information about the interests of the users, which can be used to support various recommender systems. Our analysis of the folksonomy in Delicious reveals that the interests of a single user can be very diverse. Traditional methods for representing interests of users are usually not able to reflect such diversity. We propose a method to construct user profiles of multiple interests from folksonomies based on a network clustering technique. Our evaluation shows that the proposed method is able to generate user profiles which reflect the diversity of user interests and can be used as a basis of providing more focused recommendation to the users.
Aprea, Eugenio; Gika, Helen; Carlin, Silvia; Theodoridis, Georgios; Vrhovsek, Urska; Mattivi, Fulvio
2011-07-15
A headspace SPME GC-TOF-MS method was developed for the acquisition of metabolite profiles of apple volatiles. As a first step, an experimental design was applied to find out the most appropriate conditions for the extraction of apple volatile compounds by SPME. The selected SPME method was applied in profiling of four different apple varieties by GC-EI-TOF-MS. Full scan GC-MS data were processed by MarkerLynx software for peak picking, normalisation, alignment and feature extraction. Advanced chemometric/statistical techniques (PCA and PLS-DA) were used to explore data and extract useful information. Characteristic markers of each variety were successively identified using the NIST library thus providing useful information for variety classification. The developed HS-SPME sampling method is fully automated and proved useful in obtaining the fingerprint of the volatile content of the fruit. The described analytical protocol can aid in further studies of the apple metabolome. Copyright © 2011 Elsevier B.V. All rights reserved.
Protein profiling in potato (Solanum tuberosum L.) leaf tissues by differential centrifugation.
Lim, Sanghyun; Chisholm, Kenneth; Coffin, Robert H; Peters, Rick D; Al-Mughrabi, Khalil I; Wang-Pruski, Gefu; Pinto, Devanand M
2012-04-06
Foliar diseases, such as late blight, result in serious threats to potato production. As such, potato leaf tissue becomes an important substrate to study biological processes, such as plant defense responses to infection. Nonetheless, the potato leaf proteome remains poorly characterized. Here, we report protein profiling of potato leaf tissues using a modified differential centrifugation approach to separate the leaf tissues into cell wall and cytoplasmic fractions. This method helps to increase the number of identified proteins, including targeted putative cell wall proteins. The method allowed for the identification of 1484 nonredundant potato leaf proteins, of which 364 and 447 were reproducibly identified proteins in the cell wall and cytoplasmic fractions, respectively. Reproducibly identified proteins corresponded to over 70% of proteins identified in each replicate. A diverse range of proteins was identified based on their theoretical pI values, molecular masses, functional classification, and biological processes. Such a protein extraction method is effective for the establishment of a high-quality proteome profile.
Bhowmick, P P; Khushiramani, R; Raghunath, P; Karunasagar, I; Karunasagar, I
2008-02-01
The aim was to evaluate protein profiling for typing Vibrio parahaemolyticus using 71 strains isolated from different seafoods and to compare it with other molecular typing techniques such as random amplified polymorphic DNA analysis (RAPD) and enterobacterial repetitive intergenic consensus sequence (ERIC)-PCR. Three molecular typing methods were used for the typing of 71 V. parahaemolyticus isolates from seafood. RAPD had a discriminatory index (DI) of 0.95, while ERIC-PCR showed a DI of 0.94. Though protein profiling had less discriminatory power, use of this method can be helpful in identifying new proteins which might have a role in establishment in the host or virulence of the organism. The use of protein profiling in combination with other established typing methods such as RAPD and ERIC-PCR generates useful information in the case of V. parahaemolyticus associated with seafood. The study demonstrates the usefulness of nucleic acid and protein-based studies in understanding the relationship between various isolates from seafood.
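Discriminatory index (DI) values like those quoted above are conventionally computed with the Simpson/Hunter-Gaston formula, i.e. the probability that two unrelated isolates are placed in different types. A small sketch, assuming `type_assignments` is the list of type labels assigned to the 71 isolates:

```python
from collections import Counter

def discriminatory_index(type_assignments):
    """Simpson / Hunter-Gaston discriminatory index of a typing method:
    probability that two unrelated strains drawn at random from the test
    population fall into different types."""
    n = len(type_assignments)
    counts = Counter(type_assignments).values()
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# e.g. discriminatory_index(rapd_types) would reproduce a value such as 0.95
```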
Wingert, Nathalie R; Dos Santos, Natália O; Campanharo, Sarah C; Simon, Elisa S; Volpato, Nadia M; Steppe, Martin
2018-05-01
This study aimed to develop and validate an in vitro dissolution method based on in silico-in vivo data to determine whether an in vitro-in vivo relationship could be established for rivaroxaban (RIV) in immediate-release tablets. Oral drugs with high permeability but poor solubility in aqueous media, such as the anticoagulant rivaroxaban, have a major potential to reach a high level of in vitro-in vivo relationship. Currently, there is no study in the scientific literature addressing the development of a RIV dissolution profile based on its in vivo performance. Drug plasma concentration values were modeled using computer simulation with adjustment of pharmacokinetic properties. Those values were converted into drug fractions absorbed by the Wagner-Nelson deconvolution approach. Gradual and continuous dissolution of RIV tablets was obtained with a 30 rpm basket in 50 mM sodium acetate + 0.2% SDS, pH 6.5 medium. Dissolution was conducted for up to 180 min. The fraction absorbed was plotted against the drug fraction dissolved, and a linear point-to-point regression (R² = 0.9961) was obtained. The designed in vitro dissolution method provided a more convenient dissolution profile of RIV tablets and suggests a better relationship with in vivo performance.
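The Wagner-Nelson deconvolution mentioned above converts plasma concentrations into a fraction-absorbed curve, which is then plotted against the fraction dissolved to obtain the point-to-point relationship. A minimal sketch (trapezoidal AUC with terminal extrapolation; variable names are illustrative) is:

```python
import numpy as np

def wagner_nelson(t, conc, ke):
    """Fraction of dose absorbed versus time by the Wagner-Nelson method.

    t, conc : sampling times and plasma concentrations (1-D arrays)
    ke      : terminal elimination rate constant
    """
    t, conc = np.asarray(t, float), np.asarray(conc, float)
    auc_t = np.concatenate([[0.0],
                            np.cumsum(np.diff(t) * (conc[1:] + conc[:-1]) / 2.0)])
    auc_inf = auc_t[-1] + conc[-1] / ke            # extrapolate AUC to infinity
    return (conc + ke * auc_t) / (ke * auc_inf)    # fraction absorbed at each time

# Plotting wagner_nelson(t, conc, ke) against the measured fraction dissolved at the
# same time points gives the point-to-point in vitro-in vivo regression.
```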
Al Asmari, Abdulrahman; Manthiri, Rajamohammed Abbas; Khan, Haseeb Ahmad
2014-11-01
Identification of snake species is important for various reasons including the emergency treatment of snake bite victims. We present a simple method for identification of six snake species using the gel filtration chromatographic profiles of their venoms. The venoms of Echis coloratus, Echis pyramidum, Cerastes gasperettii, Bitis arietans, Naja arabica, and Walterinnesia aegyptia were milked, lyophilized, diluted and centrifuged to separate the mucus from the venom. The clear supernatants were filtered and chromatographed on fast protein liquid chromatography (FPLC). We obtained the 16S rRNA gene sequences of the above species and performed phylogenetic analysis using the neighbor-joining method. The chromatograms of venoms from different snake species showed peculiar patterns based on the number and location of peaks. The dendrograms generated from similarity matrix based on the presence/absence of particular chromatographic peaks clearly differentiated Elapids from Viperids. Molecular cladistics using 16S rRNA gene sequences resulted in jumping clades while separating the members of these two families. These findings suggest that chromatographic profiles of snake venoms may provide a simple and reproducible chemical fingerprinting method for quick identification of snake species. However, the validation of this methodology requires further studies on large number of specimens from within and across species.
Yukinawa, Naoto; Oba, Shigeyuki; Kato, Kikuya; Ishii, Shin
2009-01-01
Multiclass classification is one of the fundamental tasks in bioinformatics and typically arises in cancer diagnosis studies by gene expression profiling. There have been many studies of aggregating binary classifiers to construct a multiclass classifier based on one-versus-the-rest (1R), one-versus-one (11), or other coding strategies, as well as some comparison studies between them. However, the studies found that the best coding depends on each situation. Therefore, a new problem, which we call the "optimal coding problem," has arisen: how can we determine which coding is the optimal one in each situation? To approach this optimal coding problem, we propose a novel framework for constructing a multiclass classifier, in which each binary classifier to be aggregated has a weight value to be optimally tuned based on the observed data. Although there is no a priori answer to the optimal coding problem, our weight tuning method can be a consistent answer to the problem. We apply this method to various classification problems including a synthesized data set and some cancer diagnosis data sets from gene expression profiling. The results demonstrate that, in most situations, our method can improve classification accuracy over simple voting heuristics and is better than or comparable to state-of-the-art multiclass predictors.
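The aggregation idea can be illustrated with a small sketch: train one binary classifier per class pair (the one-versus-one coding) and combine their outputs with per-classifier weights, which in the paper's framework would be tuned on the observed data. The SVC base learner and the dictionary-of-weights interface below are assumptions for illustration only.

```python
import numpy as np
from itertools import combinations
from sklearn.svm import SVC

def fit_pairwise_classifiers(X, y):
    """Train one binary classifier per class pair (one-versus-one coding)."""
    models = {}
    for a, b in combinations(np.unique(y), 2):
        mask = np.isin(y, [a, b])
        models[(a, b)] = SVC(probability=True).fit(X[mask], y[mask])
    return models

def weighted_vote(models, weights, x):
    """Aggregate pairwise decisions with per-classifier weights (assumed to have been
    tuned on held-out data, echoing the paper's weight-optimization idea)."""
    scores = {}
    for pair, m in models.items():
        probs = m.predict_proba(x.reshape(1, -1))[0]
        for cls, p in zip(m.classes_, probs):
            scores[cls] = scores.get(cls, 0.0) + weights[pair] * p
    return max(scores, key=scores.get)
```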
Muscle activation patterns in acceleration-based phases during reach-to-grasp movement.
Tokuda, Keisuke; Lee, Bumsuk; Shiihara, Yasufumi; Takahashi, Kazuhiro; Wada, Naoki; Shirakura, Kenji; Watanabe, Hideomi
2016-11-01
[Purpose] An earlier study divided reaching activity into characteristic phases based on hand velocity profiles. By synchronizing muscle activities and the acceleration profile, a phasing approach for reaching movement, based on hand acceleration profiles, was attempted in order to elucidate the roles of individual muscle activities in the different phases of the acceleration profile in reaching movements. [Subjects and Methods] Ten healthy volunteer subjects participated in this study. The aim was to electromyographically evaluate muscles around the shoulder, the upper trapezius, the anterior deltoid, the biceps brachii, and the triceps brachii, most of which have been used to evaluate arm motion, as well as the acceleration of the upper limb during simple reaching movement in the reach-to-grasp task. [Results] Analysis showed the kinematic trajectories of the acceleration during a simple biphasic profile of the reaching movement could be divided into four phases: increasing acceleration (IA), decreasing acceleration (DA), increasing deceleration (ID), and decreasing deceleration (DD). Muscles around the shoulder showed different activity patterns, which were closely associated with these acceleration phases. [Conclusion] These results suggest the important role of the four phases, derived from the acceleration trajectory, in the elucidation of the muscular mechanisms which regulate and coordinate the muscles around the shoulder in reaching movements.
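One plausible operationalization of the four acceleration-based phases is to split the acceleration trace by its sign and by the sign of its time derivative, as sketched below; this is an illustrative reading of the phase definitions, not the authors' exact processing pipeline.

```python
import numpy as np

def label_acceleration_phases(acc):
    """Label each sample of a biphasic hand-acceleration trace as one of the four
    phases used in the study: IA (increasing acceleration), DA (decreasing
    acceleration), ID (increasing deceleration), DD (decreasing deceleration)."""
    dacc = np.gradient(acc)
    phases = np.empty(len(acc), dtype=object)
    phases[(acc >= 0) & (dacc >= 0)] = "IA"
    phases[(acc >= 0) & (dacc < 0)] = "DA"
    phases[(acc < 0) & (dacc < 0)] = "ID"   # deceleration still growing in magnitude
    phases[(acc < 0) & (dacc >= 0)] = "DD"  # deceleration easing off
    return phases
```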
Yuan, Yinyin; Failmezger, Henrik; Rueda, Oscar M; Ali, H Raza; Gräf, Stefan; Chin, Suet-Feung; Schwarz, Roland F; Curtis, Christina; Dunning, Mark J; Bardwell, Helen; Johnson, Nicola; Doyle, Sarah; Turashvili, Gulisa; Provenzano, Elena; Aparicio, Sam; Caldas, Carlos; Markowetz, Florian
2012-10-24
Solid tumors are heterogeneous tissues composed of a mixture of cancer and normal cells, which complicates the interpretation of their molecular profiles. Furthermore, tissue architecture is generally not reflected in molecular assays, rendering this rich information underused. To address these challenges, we developed a computational approach based on standard hematoxylin and eosin-stained tissue sections and demonstrated its power in a discovery and validation cohort of 323 and 241 breast tumors, respectively. To deconvolute cellular heterogeneity and detect subtle genomic aberrations, we introduced an algorithm based on tumor cellularity to increase the comparability of copy number profiles between samples. We next devised a predictor for survival in estrogen receptor-negative breast cancer that integrated both image-based and gene expression analyses and significantly outperformed classifiers that use single data types, such as microarray expression signatures. Image processing also allowed us to describe and validate an independent prognostic factor based on quantitative analysis of spatial patterns between stromal cells, which are not detectable by molecular assays. Our quantitative, image-based method could benefit any large-scale cancer study by refining and complementing molecular assays of tumor samples.
Determination of wall shear stress from mean velocity and Reynolds shear stress profiles
NASA Astrophysics Data System (ADS)
Volino, Ralph J.; Schultz, Michael P.
2018-03-01
An analytical method is presented for determining the Reynolds shear stress profile in steady, two-dimensional wall-bounded flows using the mean streamwise velocity. The method is then utilized with experimental data to determine the local wall shear stress. The procedure is applicable to flows on smooth and rough surfaces with arbitrary pressure gradients. It is based on the streamwise component of the boundary layer momentum equation, which is transformed into inner coordinates. The method requires velocity profiles from at least two streamwise locations, but the formulation of the momentum equation reduces the dependence on streamwise gradients. The method is verified through application to laminar flow solutions and turbulent DNS results from both zero and nonzero pressure gradient boundary layers. With strong favorable pressure gradients, the method is shown to be accurate for finding the wall shear stress in cases where the Clauser fit technique loses accuracy. The method is then applied to experimental data from the literature from zero pressure gradient studies on smooth and rough walls, and favorable and adverse pressure gradient cases on smooth walls. Data from very near the wall are not required for determination of the wall shear stress. Wall friction velocities obtained using the present method agree with those determined in the original studies, typically to within 2%.
NASA Astrophysics Data System (ADS)
Zhou, Yunfei; Cai, Hongzhi; Zhong, Liyun; Qiu, Xiang; Tian, Jindong; Lu, Xiaoxu
2017-05-01
In white light scanning interferometry (WLSI), the accuracy of profile measurement achieved with the conventional zero optical path difference (ZOPD) position locating method is closely related to the shape of the interference signal envelope (ISE), which is mainly determined by the spectral distribution of the illumination source. For a broadband light with a Gaussian spectral distribution, the shape of the ISE is symmetric, so the accurate ZOPD position can be found easily. However, if the spectral distribution of the source is irregular, the ISE becomes asymmetric or develops a complex multi-peak shape, and WLSI cannot work well with the ZOPD position locating method. To address this problem, we propose a time-delay estimation (TDE) based WLSI method, in which the surface profile information is obtained from the relative displacement of the interference signal between different pixels instead of from the conventional ZOPD position locating method. Because all the spectral information of the interference signal (envelope and phase) is utilized, the proposed method not only offers high accuracy but also achieves accurate profile measurement in cases where the shape of the ISE is irregular and the ZOPD position locating method fails. That is to say, the proposed method can effectively eliminate the influence of the source spectrum.
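The core of the TDE idea is estimating the relative shift of the interference signal recorded at each pixel with respect to a reference pixel; that shift, multiplied by the axial scan step, gives the relative height. A minimal cross-correlation sketch is shown below; the paper's estimator may differ in detail (e.g. sub-frame interpolation), so this is only a generic illustration.

```python
import numpy as np

def relative_delay(sig_ref, sig_px):
    """Estimate the displacement (in scan frames) of the interference signal at one
    pixel relative to a reference pixel by cross-correlation.  Multiplying the lag
    by the axial scan step converts it into a height difference."""
    ref = sig_ref - sig_ref.mean()
    px = sig_px - sig_px.mean()
    corr = np.correlate(px, ref, mode="full")
    return np.argmax(corr) - (len(ref) - 1)   # positive lag: px signal arrives later
```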
Integration of mask and silicon metrology in DFM
NASA Astrophysics Data System (ADS)
Matsuoka, Ryoichi; Mito, Hiroaki; Sugiyama, Akiyuki; Toyoda, Yasutaka
2009-03-01
We have developed a highly integrated method of mask and silicon metrology. The method adopts a metrology management system based on DBM (Design Based Metrology), i.e., highly accurate contouring created by the edge detection algorithm used in mask CD-SEM and silicon CD-SEM. We verified the high accuracy, stability and reproducibility of the integration in experiments; the accuracy is comparable with that of mask and silicon CD-SEM metrology. In this report, we introduce the experimental results and the application. As shrinkage of the design rule for semiconductor devices advances, OPC (Optical Proximity Correction) becomes aggressively dense in RET (Resolution Enhancement Technology). However, from the viewpoint of DFM (Design for Manufacturability), the cost of data processing for advanced MDP (Mask Data Preparation) and mask production is a problem. Such a trade-off between RET and mask production is a big issue in the semiconductor market, especially in the mask business. Looking at the silicon device production process, information sharing is not well organized between the design and production sections. Design data created with OPC and MDP should be linked to process control in production, but design data and process control data are optimized independently. Thus, we provide a DFM solution: advanced integration of mask metrology and silicon metrology. The system we propose here is composed of the following. 1) Design based recipe creation: patterns on the design data are specified for metrology. This step is fully automated since it is interfaced with hot spot coordinate information detected by various verification methods. 2) Design based image acquisition: images of mask and silicon are acquired automatically by a CD-SEM recipe based on the pattern design. It is a robust automated step because a wide range of design data is used for the image acquisition. 3) Contour profiling and GDS data generation: an image profiling process is applied to the acquired image based on the profiling method of the field-proven CD metrology algorithm. The detected edges are then converted to GDSII format, which is a standard format for design data, and utilized for various DFM systems such as simulation. In other words, by integrating the pattern shapes of mask and silicon formed during the manufacturing process into GDSII format, highly accurate pattern profile information can be bridged over to the design field of various EDA systems. These steps are fully integrated with the design data and automated. Bi-directional cross-probing between mask data and process control data is enabled by linking them. This method is a solution for total optimization that covers design, MDP, mask production and silicon device production, and can therefore be regarded as a strategic DFM approach in semiconductor metrology.
Gabor-based kernel PCA with fractional power polynomial models for face recognition.
Liu, Chengjun
2004-05-01
This paper presents a novel Gabor-based kernel Principal Component Analysis (PCA) method by integrating the Gabor wavelet representation of face images and the kernel PCA method for face recognition. Gabor wavelets first derive desirable facial features characterized by spatial frequency, spatial locality, and orientation selectivity to cope with the variations due to illumination and facial expression changes. The kernel PCA method is then extended to include fractional power polynomial models for enhanced face recognition performance. A fractional power polynomial, however, does not necessarily define a kernel function, as it might not define a positive semidefinite Gram matrix. Note that the sigmoid kernels, one of the three classes of widely used kernel functions (polynomial kernels, Gaussian kernels, and sigmoid kernels), do not actually define a positive semidefinite Gram matrix either. Nevertheless, the sigmoid kernels have been successfully used in practice, such as in building support vector machines. In order to derive real kernel PCA features, we apply only those kernel PCA eigenvectors that are associated with positive eigenvalues. The feasibility of the Gabor-based kernel PCA method with fractional power polynomial models has been successfully tested on both frontal and pose-angled face recognition, using two data sets from the FERET database and the CMU PIE database, respectively. The FERET data set contains 600 frontal face images of 200 subjects, while the PIE data set consists of 680 images across five poses (left and right profiles, left and right half profiles, and frontal view) with two different facial expressions (neutral and smiling) of 68 subjects. The effectiveness of the Gabor-based kernel PCA method with fractional power polynomial models is shown in terms of both absolute performance indices and comparative performance against the PCA method, the kernel PCA method with polynomial kernels, the kernel PCA method with fractional power polynomial models, the Gabor wavelet-based PCA method, and the Gabor wavelet-based kernel PCA method with polynomial kernels.
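A compact sketch of the kernel PCA stage with a fractional power polynomial model is shown below, keeping only components with positive eigenvalues as the paper does. The rows of `X` are assumed to be (Gabor-derived) feature vectors, and the absolute value inside the kernel is an added safeguard against negative dot products, not something taken from the paper.

```python
import numpy as np

def fractional_poly_kernel_pca(X, degree=0.8, n_components=50):
    """Kernel PCA with a fractional power polynomial 'kernel' k(x, y) = (x . y)^d,
    0 < d < 1.  Such a function need not yield a positive semidefinite Gram matrix,
    so only eigenvectors associated with positive eigenvalues are retained."""
    G = np.abs(X @ X.T) ** degree               # Gram matrix (abs() as a safeguard)
    n = G.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    Gc = J @ G @ J                              # centring in feature space
    w, V = np.linalg.eigh(Gc)                   # eigenvalues in ascending order
    keep = w > 1e-10
    w, V = w[keep][::-1], V[:, keep][:, ::-1]   # positive eigenvalues, descending
    w, V = w[:n_components], V[:, :n_components]
    return Gc @ V / np.sqrt(w)                  # kernel PCA features of the training set
```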
System and method for charging a plug-in electric vehicle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bassham, Marjorie A.; Spigno, Jr., Ciro A.; Muller, Brett T.
2017-05-02
A charging system and method that may be used to automatically apply customized charging settings to a plug-in electric vehicle, where application of the settings is based on the vehicle's location. According to an exemplary embodiment, a user may establish and save a separate charging profile with certain customized charging settings for each geographic location where they plan to charge their plug-in electric vehicle. Whenever the plug-in electric vehicle enters a new geographic area, the charging method may automatically apply the charging profile that corresponds to that area. Thus, the user does not have to manually change or manipulate the charging settings every time they charge the plug-in electric vehicle in a new location.
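A toy sketch of the location-based selection logic is given below: saved profiles keyed by location, and a lookup that applies whichever profile the vehicle is near. All profile fields, coordinates, and the distance threshold are hypothetical illustrations, not details taken from the patent.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical saved profiles; the field names are illustrative only.
CHARGING_PROFILES = {
    "home": {"lat": 42.33, "lon": -83.05, "max_rate_kw": 3.3, "depart_time": "07:00"},
    "work": {"lat": 42.38, "lon": -83.10, "max_rate_kw": 6.6, "depart_time": "17:30"},
}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def select_profile(vehicle_lat, vehicle_lon, radius_km=0.5):
    """Return the saved charging profile whose stored location the vehicle is within,
    or None so that default charging settings apply."""
    for name, prof in CHARGING_PROFILES.items():
        if haversine_km(vehicle_lat, vehicle_lon, prof["lat"], prof["lon"]) <= radius_km:
            return name, prof
    return None
```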
Local atomic and electronic structure of oxide/GaAs and SiO2/Si interfaces using high-resolution XPS
NASA Technical Reports Server (NTRS)
Grunthaner, F. J.; Grunthaner, P. J.; Vasquez, R. P.; Lewis, B. F.; Maserjian, J.; Madhukar, A.
1979-01-01
The chemical structures of thin SiO2 films, thin native oxides of GaAs (20-30 A), and the respective oxide-semiconductor interfaces, have been investigated using high-resolution X-ray photoelectron spectroscopy. Depth profiles of these structures have been obtained using argon ion bombardment and wet chemical etching techniques. The chemical destruction induced by the ion profiling method is shown by direct comparison of these methods for identical samples. Fourier transform data-reduction methods based on linear prediction with maximum entropy constraints are used to analyze the discrete structure in oxides and substrates. This discrete structure is interpreted by means of a structure-induced charge-transfer model.
Niccoli Asabella, A; Antonica, F; Renna, M A; Rubini, D; Notaristefano, A; Nicoletti, A; Rubini, G
2013-12-01
The aim was to develop a method to fuse lymphoscintigraphic images with an adaptable anatomical vector profile and to evaluate its role in clinical practice. We used Adobe Illustrator CS6 to create different vector profiles and Adobe Photoshop CS6 to fuse those profiles with the patient's lymphoscintigraphic images. We processed 197 lymphoscintigraphies performed in patients with cutaneous melanomas, breast cancer or delayed lymph drainage. Our models can be adapted to any patient attitude or position and contain different levels of anatomical detail, ranging from external body profiles to internal anatomical structures such as bones, muscles, vessels, and lymph nodes. If needed, new anatomical details can be added and embedded in the profile without redrawing, saving a lot of time. Details can also easily be hidden, allowing the physician to view only relevant information and structures. Fusion takes about 85 s. The diagnostic confidence of the observers increased significantly. The validation process showed a slight shift (mean 4.9 mm). We have created a new, practical, inexpensive digital technique based on commercial software for fusing lymphoscintigraphic images with built-in anatomical reference profiles. It is easily reproducible and does not alter the original scintigraphic image. Our method allows a more meaningful interpretation of lymphoscintigraphies, easier recognition of the anatomical site and better lymph node dissection planning.
NASA Astrophysics Data System (ADS)
Cui, Ning; Liang, Renrong; Wang, Jing; Xu, Jun
2012-06-01
Choosing novel materials and structures is important for enhancing the on-state current in tunnel field-effect transistors (TFETs). In this paper, we reveal that the on-state performance of TFETs is mainly determined by the energy band profile of the channel. According to this interpretation, we present a new concept of energy band profile modulation (BPM) achieved with gate structure engineering. It is believed that this approach can be used to suppress the ambipolar effect. Based on this method, a Si TFET device with a symmetrical tri-material-gate (TMG) structure is proposed. Two-dimensional numerical simulations demonstrated that the special band profile in this device can boost on-state performance, and it also suppresses the off-state current induced by the ambipolar effect. These unique advantages are maintained over a wide range of gate lengths and supply voltages. The BPM concept can serve as a guideline for improving the performance of nanoscale TFET devices.
Hall, Judith A; Back, Mitja D; Nestler, Steffen; Frauendorfer, Denise; Schmid Mast, Marianne; Ruben, Mollie A
2018-04-01
This research compares two different approaches that are commonly used to measure accuracy of personality judgment: the trait accuracy approach wherein participants discriminate among targets on a given trait, thus making intertarget comparisons, and the profile accuracy approach wherein participants discriminate between traits for a given target, thus making intratarget comparisons. We examined correlations between these methods as well as correlations among accuracies for judging specific traits. The present article documents relations among these approaches based on meta-analysis of five studies of zero-acquaintance impressions of the Big Five traits. Trait accuracies correlated only weakly with overall and normative profile accuracy. Substantial convergence between the trait and profile accuracy methods was only found when an aggregate of all five trait accuracies was correlated with distinctive profile accuracy. Importantly, however, correlations between the trait and profile accuracy approaches were reduced to negligibility when statistical overlap was corrected by removing the respective trait from the profile correlations. Moreover, correlations of the separate trait accuracies with each other were very weak. Different ways of measuring individual differences in personality judgment accuracy are not conceptually and empirically the same, but rather represent distinct abilities that rely on different judgment processes. © 2017 Wiley Periodicals, Inc.
Distributed Method to Optimal Profile Descent
NASA Astrophysics Data System (ADS)
Kim, Geun I.
Current ground automation tools for Optimal Profile Descent (OPD) procedures utilize path stretching and speed profile change to maintain proper merging and spacing requirements in high-traffic terminal areas. However, low predictability of an aircraft's vertical profile and path deviation during descent add uncertainty to the computation of the estimated time of arrival, key information that enables the ground control center to manage airspace traffic effectively. This paper uses an OPD procedure that is based on a constant flight path angle to increase the predictability of the vertical profile and defines an OPD optimization problem that uses both path stretching and speed profile change while largely maintaining the original OPD procedure. This problem minimizes the cumulative cost of performing OPD procedures for a group of aircraft by assigning a time cost function to each aircraft and a separation cost function to each pair of aircraft. The OPD optimization problem is then solved in a decentralized manner using dual decomposition techniques over an inter-aircraft ADS-B mechanism. This method divides the optimization problem into more manageable sub-problems which are then distributed to the group of aircraft. Each aircraft solves its assigned sub-problem and communicates the solution to the other aircraft in an iterative process until an optimal solution is achieved, thus decentralizing the computation of the optimization problem.
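Dual decomposition splits the joint arrival-scheduling problem into per-aircraft subproblems coupled only through multipliers on the separation constraints, which are updated iteratively. The toy sketch below uses quadratic deviation costs and a fixed subgradient step purely for illustration; it is not the dissertation's actual formulation.

```python
import numpy as np

def distributed_schedule(pref, sep=2.0, step=0.2, iters=200):
    """Toy dual decomposition for merging/spacing: aircraft i prefers arrival time
    pref[i] (cost (t_i - pref_i)^2) subject to separation t_{i+1} - t_i >= sep.
    The coupling constraints are relaxed with multipliers lam; each aircraft solves
    its own subproblem and lam is updated by projected subgradient ascent,
    mimicking the iterative exchange of solutions over ADS-B."""
    pref = np.asarray(pref, dtype=float)
    n = len(pref)
    lam = np.zeros(n - 1)                        # one multiplier per separation constraint
    for _ in range(iters):
        lam_pad = np.concatenate([[0.0], lam, [0.0]])
        # local subproblem of aircraft i: argmin (t_i - p_i)^2 + (lam_i - lam_{i-1}) t_i
        t = pref - (lam_pad[1:] - lam_pad[:-1]) / 2.0
        # dual update: the constraint violation is the subgradient
        lam = np.maximum(0.0, lam + step * (sep - np.diff(t)))
    return t

# e.g. distributed_schedule([0.0, 1.0, 1.5, 5.0]) spreads the first three arrivals apart.
```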
Timm, Collin M; Lloyd, Evan P; Egan, Amanda; Mariner, Ray; Karig, David
2018-01-01
Bacterially produced volatile organic compounds (VOCs) can modify growth patterns of eukaryotic hosts and competing/cohabiting microbes. These compounds have been implicated in skin disorders and attraction of biting pests. Current methods to detect and characterize VOCs from microbial cultures can be laborious and low-throughput, making it difficult to understand the behavior of microbial populations. In this work we present an efficient method employing gas chromatography/mass spectrometry with autosampling to characterize VOC profiles from solid-phase bacterial cultures. We compare this method to complementary plate-based assays and measure the effects of growth media and incubation temperature on the VOC profiles from a well-studied Pseudomonas aeruginosa PAO1 system. We observe that P. aeruginosa produces longer chain VOCs, such as 2-undecanone and 2-undecanol in higher amounts at 37°C than 30°C. We demonstrate the throughput of this method by studying VOC profiles from a representative collection of skin bacterial isolates under three parallel growth conditions. We observe differential production of various aldehydes and ketones depending on bacterial strain. This generalizable method will support screening of bacterial populations in a variety of research areas.
Gao, Xiaoli; Zhang, Qibin; Meng, Da; Issac, Giorgis; Zhao, Rui; Fillmore, Thomas L.; Chu, Rosey K.; Zhou, Jianying; Tang, Keqi; Hu, Zeping; Moore, Ronald J.; Smith, Richard D.; Katze, Michael G.; Metz, Thomas O.
2012-01-01
Lipidomics is a critical part of metabolomics and aims to study all the lipids within a living system. We present here the development and evaluation of a sensitive capillary UPLC-MS method for comprehensive top-down/bottom-up lipid profiling. Three different stationary phases were evaluated in terms of peak capacity, linearity, reproducibility, and limit of quantification (LOQ) using a mixture of lipid standards representative of the lipidome. The relative standard deviations of the retention times and peak abundances of the lipid standards were 0.29% and 7.7%, respectively, when using the optimized method. The linearity was acceptable at >0.99 over 3 orders of magnitude, and the LOQs were sub-fmol. To demonstrate the performance of the method in the analysis of complex samples, we analyzed lipids extracted from a human cell line, rat plasma, and a model human skin tissue, identifying 446, 444, and 370 unique lipids, respectively. Overall, the method provided either higher coverage of the lipidome, greater measurement sensitivity, or both, when compared to other approaches of global, untargeted lipid profiling based on chromatography coupled with MS. PMID:22354571
Probabilistic drug connectivity mapping
2014-01-01
Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351
Kadota, Koji; Konishi, Tomokazu; Shimizu, Kentaro
2007-05-01
Large-scale expression profiling using DNA microarrays enables identification of tissue-selective genes for which expression is considerably higher and/or lower in some tissues than in others. Among numerous possible methods, only two outlier-detection-based methods (an AIC-based method and Sprent's non-parametric method) can treat equally various types of selective patterns, but they produce substantially different results. We investigated the performance of these two methods for different parameter settings and for a reduced number of samples. We focused on their ability to detect selective expression patterns robustly. We applied them to public microarray data collected from 36 normal human tissue samples and analyzed the effects of both changing the parameter settings and reducing the number of samples. The AIC-based method was more robust in both cases. The findings confirm that the use of the AIC-based method in the recently proposed ROKU method for detecting tissue-selective expression patterns is correct and that Sprent's method is not suitable for ROKU.
NASA Astrophysics Data System (ADS)
Massaro, G.; Stiperski, I.; Pospichal, B.; Rotach, M. W.
2015-03-01
Within the Innsbruck Box project, a ground-based microwave radiometer (RPG-HATPRO) was operated in the Inn Valley (Austria), in very complex terrain, between September 2012 and May 2013 to obtain temperature and humidity vertical profiles of the full troposphere with a specific focus on the valley boundary layer. The profiles obtained by the radiometer with different retrieval algorithms, based on different climatologies, are compared to local radiosonde data. A retrieval that is improved with respect to the one provided by the manufacturer, based on better-resolved data, shows a significantly smaller root mean square error (RMSE) for both the temperature and humidity profiles. The improvement is particularly substantial at heights close to the mountaintop level and in the upper troposphere. Lower-level inversions, common in an alpine valley, are resolved to a satisfactory degree. On the other hand, upper-level inversions (above 1200 m) still pose a significant challenge for retrieval. For this purpose, specialized retrieval algorithms were developed by classifying the radiosonde climatologies into specialized categories according to different criteria (seasons, daytime, nighttime) and using additional regressors (e.g., measurements from mountain stations). The training and testing on the radiosonde data for these specialized categories suggests that a classification of profiles that reproduces meaningful physical characteristics can yield improved targeted specialized retrievals. A new and very promising way of improving profile retrieval in a mountain region is to add further information to the retrieval, such as the surface temperature at fixed levels along a topographic slope or from nearby mountain tops.
Binary similarity measures for fingerprint analysis of qualitative metabolomic profiles.
Rácz, Anita; Andrić, Filip; Bajusz, Dávid; Héberger, Károly
2018-01-01
Contemporary metabolomic fingerprinting is based on multiple spectrometric and chromatographic signals, used either alone or combined with structural and chemical information of metabolic markers at the qualitative and semiquantitative level. However, signal shifting, convolution, and matrix effects may compromise metabolomic patterns. Recent increase in the use of qualitative metabolomic data, described by the presence (1) or absence (0) of particular metabolites, demonstrates great potential in the field of metabolomic profiling and fingerprint analysis. The aim of this study is a comprehensive evaluation of binary similarity measures for the elucidation of patterns among samples of different botanical origin and various metabolomic profiles. Nine qualitative metabolomic data sets covering a wide range of natural products and metabolomic profiles were applied to assess 44 binary similarity measures for the fingerprinting of plant extracts and natural products. The measures were analyzed by the novel sum of ranking differences method (SRD), searching for the most promising candidates. Baroni-Urbani-Buser (BUB) and Hawkins-Dotson (HD) similarity coefficients were selected as the best measures by SRD and analysis of variance (ANOVA), while Dice (Di1), Yule, Russel-Rao, and Consonni-Todeschini 3 ranked the worst. ANOVA revealed that concordantly and intermediately symmetric similarity coefficients are better candidates for metabolomic fingerprinting than the asymmetric and correlation based ones. The fingerprint analysis based on the BUB and HD coefficients and qualitative metabolomic data performed equally well as the quantitative metabolomic profile analysis. Fingerprint analysis based on the qualitative metabolomic profiles and binary similarity measures proved to be a reliable way in finding the same/similar patterns in metabolomic data as that extracted from quantitative data.
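For reference, the two best-performing coefficients can be computed from the 2x2 contingency counts of two binary fingerprints (a: present in both, b and c: mismatches, d: absent in both). The sketch below uses the forms of the Baroni-Urbani-Buser and Hawkins-Dotson coefficients as they are commonly given in the similarity-measure literature; readers should check them against the definitions used in the paper.

```python
import numpy as np

def binary_counts(x, y):
    """a: present in both, b/c: mismatches, d: absent in both (x, y are 0/1 vectors)."""
    x, y = np.asarray(x, bool), np.asarray(y, bool)
    return np.sum(x & y), np.sum(x & ~y), np.sum(~x & y), np.sum(~x & ~y)

def baroni_urbani_buser(x, y):
    a, b, c, d = binary_counts(x, y)
    root = np.sqrt(a * d)
    return (root + a) / (root + a + b + c)

def hawkins_dotson(x, y):
    a, b, c, d = binary_counts(x, y)
    return 0.5 * (a / (a + b + c) + d / (d + b + c))
```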
Fricano, Meagan M; Ditewig, Amy C; Jung, Paul M; Liguori, Michael J; Blomme, Eric A G; Yang, Yi
2011-01-01
Blood is an ideal tissue for the identification of novel genomic biomarkers for toxicity or efficacy. However, using blood for transcriptomic profiling presents significant technical challenges due to the transcriptomic changes induced by ex vivo handling and the interference of highly abundant globin mRNA. Most whole blood RNA stabilization and isolation methods also require significant volumes of blood, limiting their effective use in small animal species, such as rodents. To overcome these challenges, a QIAzol-based RNA stabilization and isolation method (QSI) was developed to isolate sufficient amounts of high quality total RNA from 25 to 500 μL of rat whole blood. The method was compared to the standard PAXgene Blood RNA System using blood collected from rats exposed to saline or lipopolysaccharide (LPS). The QSI method yielded an average of 54 ng total RNA per μL of rat whole blood with an average RNA Integrity Number (RIN) of 9, a performance comparable with the standard PAXgene method. Total RNA samples were further processed using the NuGEN Ovation Whole Blood Solution system and cDNA was hybridized to Affymetrix Rat Genome 230 2.0 Arrays. The microarray QC parameters using RNA isolated with the QSI method were within the acceptable range for microarray analysis. The transcriptomic profiles were highly correlated with those using RNA isolated with the PAXgene method and were consistent with expected LPS-induced inflammatory responses. The present study demonstrated that the QSI method coupled with NuGEN Ovation Whole Blood Solution system is cost-effective and particularly suitable for transcriptomic profiling of minimal volumes of whole blood, typical of those obtained with small animal species.
Li, Yong; Ruan, Qiang; Li, Yanli; Ye, Guozhu; Lu, Xin; Lin, Xiaohui; Xu, Guowang
2012-09-14
Non-targeted metabolic profiling is the most widely used method for metabolomics. In this paper, a novel approach was established to transform a non-targeted metabolic profiling method into a pseudo-targeted method using retention time locking gas chromatography/mass spectrometry-selected ion monitoring (RTL-GC/MS-SIM). To achieve this transformation, an algorithm based on the automated mass spectral deconvolution and identification system (AMDIS), GC/MS raw data and a bi-Gaussian chromatographic peak model was developed. The established GC/MS-SIM method was compared with GC/MS-full scan (total ion current and extracted ion current, TIC and EIC) methods; for a typical tobacco leaf extract, 93% of components had relative standard deviations (RSDs) of relative peak areas of less than 20% with the SIM method, compared with 88% for the EIC method and 81% for the TIC method. 47.3% of components had a linear correlation coefficient higher than 0.99, compared with 5.0% for the EIC and 6.2% for the TIC method. Multivariate analysis showed that the pooled quality control samples clustered more tightly using the developed method than using the GC/MS-full scan methods, indicating better data quality. In an analysis of variance of tobacco samples from three different planting regions, 167 differential components (p<0.05) were screened out using the RTL-GC/MS-SIM method, compared with 151 and 131 by the EIC and TIC methods, respectively. The results show that the developed method not only has higher sensitivity, better linearity and better data quality, but also does not need complicated peak alignment among different samples. It is especially suitable for the screening of differential components in metabolic profiling investigations. Copyright © 2012 Elsevier B.V. All rights reserved.
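The bi-Gaussian peak model mentioned above describes a chromatographic peak with different widths on its leading and tailing sides; fitting it to each deconvolved component yields the apex time and widths needed to place the SIM acquisition windows. A small illustrative sketch (the initial guesses and time units are assumptions) is:

```python
import numpy as np
from scipy.optimize import curve_fit

def bi_gaussian(t, height, t_r, sigma_left, sigma_right):
    """Bi-Gaussian chromatographic peak: a Gaussian with different widths on the
    leading (t <= t_r) and tailing (t > t_r) side of the apex."""
    sigma = np.where(t <= t_r, sigma_left, sigma_right)
    return height * np.exp(-((t - t_r) ** 2) / (2.0 * sigma ** 2))

def fit_peak(t, intensity, t_apex_guess):
    """Fit a bi-Gaussian to one deconvolved component; the fitted apex and widths can
    then be used to set the SIM window for that component."""
    p0 = [intensity.max(), t_apex_guess, 0.05, 0.08]   # guesses; tune to your time axis
    popt, _ = curve_fit(bi_gaussian, t, intensity, p0=p0)
    return popt   # height, retention time, left width, right width
```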
Derivative component analysis for mass spectral serum proteomic profiles.
Han, Henry
2014-01-01
As a promising way to transform medicine, mass spectrometry based proteomics technologies have made great progress in identifying disease biomarkers for clinical diagnosis and prognosis. However, there is a lack of effective feature selection methods that are able to capture essential data behaviors to achieve clinical-level disease diagnosis. Moreover, the field faces a challenge from data reproducibility: no two independent studies have been found to produce the same proteomic patterns. This reproducibility issue causes the identified biomarker patterns to lose repeatability and prevents them from real clinical usage. In this work, we propose a novel machine-learning algorithm, derivative component analysis (DCA), for high-dimensional mass spectral proteomic profiles. As an implicit feature selection algorithm, derivative component analysis examines input proteomics data in a multi-resolution approach by seeking its derivatives to capture latent data characteristics and conduct de-noising. We further demonstrate DCA's advantages in disease diagnosis by viewing input proteomics data as a profile biomarker and integrating it with support vector machines to tackle the reproducibility issue, and by comparing it with state-of-the-art peers. Our results show that high-dimensional proteomics data are actually linearly separable under the proposed derivative component analysis. As a novel multi-resolution feature selection algorithm, DCA not only overcomes the weakness of traditional methods in subtle data behavior discovery, but also suggests an effective resolution to proteomics data's reproducibility problem and provides new techniques and insights in translational bioinformatics and machine learning. The DCA-based profile biomarker diagnosis makes clinical-level diagnostic performance reproducible across different proteomic data, which is more robust and systematic than existing biomarker-discovery-based diagnosis. Our findings demonstrate the feasibility and power of the proposed DCA-based profile biomarker diagnosis in achieving high sensitivity and conquering the data reproducibility issue in serum proteomics. Furthermore, our proposed derivative component analysis suggests that gleaning subtle data characteristics and de-noising are essential in separating true signals from red herrings in high-dimensional proteomic profiles, and can be more important than conventional feature selection or dimension reduction. In particular, our profile biomarker diagnosis can be generalized to other omics data, owing to DCA's nature as a generic data analysis method.
Electronic properties of Laves phase ZrFe₂ using Compton spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhatt, Samir, E-mail: sameerbhatto11@gmail.com; Kumar, Kishor; Ahuja, B. L.
First-ever experimental Compton profile of Laves phase ZrFe₂, using an indigenous 20 Ci ¹³⁷Cs Compton spectrometer, is presented. To analyze the experimental electron momentum density, we have deduced the theoretical Compton profiles using density functional theory (DFT) and hybridization of DFT and Hartree-Fock scheme within the linear combination of atomic orbitals (LCAO) method. The energy bands and density of states are also calculated using the LCAO prescription. The theoretical profile based on the local density approximation gives a better agreement with the experimental profile than other reported schemes. The present investigations validate the inclusion of the correlation potential of Perdew-Zunger in predicting the electronic properties of ZrFe₂.
Motor potential profile and a robust method for extracting it from time series of motor positions.
Wang, Hongyun
2006-10-21
Molecular motors are small, and, as a result, motor operation is dominated by high-viscous friction and large thermal fluctuations from the surrounding fluid environment. The small size has hindered, in many ways, the studies of physical mechanisms of molecular motors. For a macroscopic motor, it is possible to observe/record experimentally the internal operation details of the motor. This is not yet possible for molecular motors. The chemical reaction in a molecular motor has many occupancy states, each having a different effect on the motor motion. The overall effect of the chemical reaction on the motor motion can be characterized by the motor potential profile. The potential profile reveals how the motor force changes with position in a motor step, which may lead to insights into how the chemical reaction is coupled to force generation. In this article, we propose a mathematical formulation and a robust method for constructing motor potential profiles from time series of motor positions measured in single molecule experiments. Numerical examples based on simulated data are shown to demonstrate the method. Interestingly, it is the small size of molecular motors (negligible inertia) that makes it possible to recover the potential profile from time series of motor positions. For a macroscopic motor, the variation of driving force within a cycle is smoothed out by the large inertia.
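As a point of reference for the general idea (though not the paper's specific robust formulation), a basic way to estimate a potential profile from an overdamped position time series is to bin the positions, estimate the local drift <Δx | x>/Δt as force divided by the drag coefficient, and integrate it over position. A sketch under those simplifying assumptions:

```python
import numpy as np

def potential_from_trajectory(x, dt, gamma, n_bins=40):
    """Estimate a potential profile (up to an additive constant) from the position
    time series of an overdamped particle/motor: the binned mean step <dx | x>/dt
    estimates F(x)/gamma, and -integral of F(x) dx gives the potential."""
    x = np.asarray(x, float)
    dx = np.diff(x)
    bins = np.linspace(x.min(), x.max(), n_bins + 1)
    centers = 0.5 * (bins[:-1] + bins[1:])
    idx = np.digitize(x[:-1], bins) - 1
    drift = np.array([dx[idx == k].mean() if np.any(idx == k) else 0.0
                      for k in range(n_bins)]) / dt
    force = gamma * drift
    potential = -np.cumsum(force) * (centers[1] - centers[0])
    return centers, potential - potential.min()
```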
Padró, Juan M; Osorio-Grisales, Jaiver; Arancibia, Juan A; Olivieri, Alejandro C; Castells, Cecilia B
2015-07-01
Valuable quantitative information can be obtained from strongly overlapped chromatographic profiles of two enantiomers by using appropriate chemometric methods. Completely separated profiles, with fully resolved peaks, are difficult to achieve in chiral separation methods, and this becomes a particularly severe problem when the analyst needs to measure chiral purity, i.e., when one of the enantiomers is present in the sample at very low concentration. In this report, we explore the scope of a multivariate chemometric technique based on unfolded partial least-squares regression as a mathematical tool to solve this frequent difficulty. The technique was applied to obtain quantitative results from partially overlapped chromatographic profiles of R- and S-ketoprofen, with different enantioresolution factors (from 0.81 down to less than 0.2 resolution units) and at several S:R enantiomeric ratios. Enantiomeric purity below 1% was determined with excellent precision even from almost completely overlapped signals. All assays were tested under the most demanding condition, i.e., when the minor peak elutes immediately after the main peak. The results were validated using univariate calibration of completely resolved profiles, and the method was applied to the determination of the enantiomeric purity of commercial pharmaceuticals. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
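A minimal sketch of the general approach, not the published calibration: chromatograms with two strongly overlapped Gaussian peaks are used to train a PLS regression (scikit-learn's PLSRegression standing in for the unfolded PLS model), and the minor-enantiomer level of a test sample is predicted. Peak shapes, retention times and concentration levels are illustrative.

```python
# Illustrative sketch: quantify a minor enantiomer from overlapped peaks with PLS regression.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

t = np.linspace(0, 10, 400)
peak = lambda c, w: np.exp(-0.5 * ((t - c) / w) ** 2)

rng = np.random.default_rng(2)
def chromatogram(major, minor):
    # major enantiomer elutes first, minor elutes immediately after (the worst case in the abstract)
    return major * peak(5.0, 0.35) + minor * peak(5.3, 0.35) + rng.normal(0, 0.002, t.size)

# calibration set: fixed major concentration, minor varying at low levels (0 - 2 %)
minor_levels = np.linspace(0.0, 0.02, 15)
X_cal = np.array([chromatogram(1.0, m) for m in minor_levels])
pls = PLSRegression(n_components=3).fit(X_cal, minor_levels)

# test sample with 0.8 % of the minor enantiomer
x_test = chromatogram(1.0, 0.008)
print("predicted minor fraction:", pls.predict(x_test[None, :]).ravel()[0])
```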
Real-time and accurate rail wear measurement method and experimental analysis.
Liu, Zhen; Li, Fengjiao; Huang, Bangkui; Zhang, Guangjun
2014-08-01
When a train runs on uneven or curved rails, it generates violent vibrations in the rails. As a result, the light plane of a single-line structured-light vision sensor is not vertical, causing errors in rail wear measurements (referred to as vibration errors in this paper). To avoid vibration errors, a novel rail wear measurement method is introduced in this paper, which involves the following main steps. First, a multi-line structured-light vision sensor (which has at least two linear laser projectors) projects stripe-shaped light onto the inside of the rail. Second, the central points of the light stripes in the image are extracted quickly, and the three-dimensional profile of the rail is obtained based on the mathematical model of the structured-light vision sensor. Then, the obtained rail profile is transformed from the measurement coordinate frame (MCF) to the standard rail coordinate frame (RCF) by taking the three-dimensional profile of the measured rail waist as the datum. Finally, rail wear constraint points are adopted to simplify the location of the rail wear points, and the profile composed of the rail wear points is compared with the standard rail profile in the RCF to determine the rail wear. Both real-data experiments and simulation experiments show that the vibration errors are eliminated when the proposed method is used.
Evaluation of deconvolution modelling applied to numerical combustion
NASA Astrophysics Data System (ADS)
Mehl, Cédric; Idier, Jérôme; Fiorina, Benoît
2018-01-01
A possible modelling approach in the large eddy simulation (LES) of reactive flows is to deconvolve resolved scalars. Indeed, by inverting the LES filter, scalars such as mass fractions are reconstructed. This information can be used to close budget terms of filtered species balance equations, such as the filtered reaction rate. Being ill-posed in the mathematical sense, the problem is very sensitive to any numerical perturbation. The objective of the present study is to assess the ability of this kind of methodology to capture the chemical structure of premixed flames. For that purpose, three deconvolution methods are tested on a one-dimensional filtered laminar premixed flame configuration: the approximate deconvolution method based on Van Cittert iterative deconvolution, a Taylor decomposition-based method, and the regularised deconvolution method based on the minimisation of a quadratic criterion. These methods are then extended to the reconstruction of subgrid-scale profiles. Two methodologies are proposed: the first relies on subgrid-scale interpolation of deconvolved profiles and the second uses parametric functions to describe the small scales. The tests conducted analyse the ability of the methods to capture the filtered chemical flame structure and the front propagation speed. Results show that the deconvolution model should include information about small scales in order to regularise the filter inversion. A priori and a posteriori tests showed that the filtered flame propagation speed and structure cannot be captured if the filter size is too large.
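Of the three deconvolution methods compared, Van Cittert iteration is simple enough to sketch. Below it is applied to a synthetic 1-D filtered flame-like profile under the assumption of a Gaussian LES filter; the filter width, iteration count and profile shape are illustrative, not the study's configuration.

```python
# Minimal sketch of Van Cittert iterative deconvolution on a 1-D filtered profile.
import numpy as np
from scipy.ndimage import gaussian_filter1d

x = np.linspace(-1, 1, 400)
c_true = 0.5 * (1 + np.tanh(x / 0.02))          # sharp progress-variable-like profile
sigma = 10.0                                    # assumed Gaussian LES-filter width in grid points
c_filt = gaussian_filter1d(c_true, sigma)       # resolved (filtered) field

def van_cittert(g, sigma, n_iter=20, relax=1.0):
    """f_{k+1} = f_k + relax * (g - H f_k), with H a Gaussian filter."""
    f = g.copy()
    for _ in range(n_iter):
        f = f + relax * (g - gaussian_filter1d(f, sigma))
    return f

c_rec = van_cittert(c_filt, sigma)
print("max reconstruction error:", np.abs(np.clip(c_rec, 0, 1) - c_true).max())
```

As the abstract notes, the inversion is ill-posed: with noisy input or too many iterations the reconstruction amplifies perturbations, which is why the regularised variant exists.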
Derivative based sensitivity analysis of gamma index
Sarkar, Biplab; Pradhan, Anirudh; Ganesh, T.
2015-01-01
Originally developed as a tool for patient-specific quality assurance in advanced treatment delivery methods, to compare measured and calculated dose distributions, the gamma index (γ) concept was later extended to compare any two dose distributions. It takes into account both the dose difference (DD) and the distance-to-agreement (DTA) in the comparison. Its strength lies in its capability to give a quantitative value for the analysis, unlike other methods. For every point on the reference curve, if there is at least one point in the evaluated curve that satisfies the pass criteria (e.g., δDD = 1%, δDTA = 1 mm), the point is scored as "pass." Gamma analysis does not account for the gradient of the evaluated curve: it looks only at the minimum gamma value, and if it is <1, the point passes regardless of the gradient of the evaluated curve. In this work, an attempt has been made to present a derivative-based method for the identification of dose gradient. A mathematically derived reference profile (RP), representing the penumbral region of a 6 MV 10 cm × 10 cm field, was generated from an error function. A general test profile (GTP) was created from this RP by introducing a 1 mm distance error and a 1% dose error at each point. This was considered the first of the two evaluated curves. By its nature, this curve is smooth and satisfies the pass criteria at all points. The second evaluated profile was generated as a sawtooth test profile (STTP), which again satisfies the pass criteria for every point on the RP. However, being a sawtooth curve, it is not smooth and is obviously poor when compared with the smooth profile. Considering the smooth GTP an acceptable profile once it passed the gamma pass criteria (1% DD and 1 mm DTA) against the RP, the first- and second-order derivatives of the DDs (δD', δD") between these two curves were derived and used as boundary values for evaluating the STTP against the RP. Even though the STTP passed the simple gamma pass criteria, it was found to fail at many locations when the derivatives were used as boundary values. The proposed derivative-based method can identify a noisy curve and can prove to be a useful tool for improving the sensitivity of the gamma index. PMID:26865761
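A hedged sketch of the two ingredients discussed above: a brute-force 1-D gamma evaluation, and a derivative-based check that uses first and second differences of the dose difference as boundary values. The error-function reference, the shifted smooth profile and the 0.5% sawtooth below are illustrative stand-ins for the RP, GTP and STTP, not the published profiles.

```python
# Hedged sketch: 1-D gamma index plus a derivative-based boundary-value check.
import numpy as np
from scipy.special import erf

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.01, dta=1.0):
    """For each reference point, the minimum generalized distance to the evaluated curve."""
    g = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        g[i] = np.sqrt(((d_eval - dr) / (dd * d_ref.max())) ** 2
                       + ((x_eval - xr) / dta) ** 2).min()
    return g

x = np.linspace(-10.0, 10.0, 201)                               # position (mm)
ref = 0.5 * (1.0 - erf(x / 3.0))                                # penumbra-like reference profile
saw = ref + 0.005 * np.where(np.arange(x.size) % 2, 1.0, -1.0)  # sawtooth evaluated profile (0.5% dose error)
smooth = 0.5 * (1.0 - erf((x - 0.5) / 3.0))                     # smooth evaluated profile with a 0.5 mm shift

print("sawtooth gamma pass rate:", np.mean(gamma_1d(x, ref, x, saw) < 1.0))   # passes plain gamma

# derivative boundary values taken from the acceptable smooth profile, then applied to the sawtooth
bound1 = np.abs(np.diff(smooth - ref)).max()
bound2 = np.abs(np.diff(smooth - ref, n=2)).max()
fails = (np.abs(np.diff(saw - ref))[:-1] > bound1) | (np.abs(np.diff(saw - ref, n=2)) > bound2)
print("points failing the derivative check:", int(fails.sum()))
```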
Guthke, Reinhard; Möller, Ulrich; Hoffmann, Martin; Thies, Frank; Töpfer, Susanne
2005-04-15
The immune response to bacterial infection represents a complex network of dynamic gene and protein interactions. We present an optimized reverse engineering strategy aimed at the reconstruction of such interaction networks. The proposed approach is based on both microarray data and available biological knowledge. The main kinetics of the immune response were identified by fuzzy clustering of gene expression profiles (time series). The number of clusters was optimized using various evaluation criteria. For each cluster, a representative gene with a high fuzzy membership was chosen in accordance with available physiological knowledge. Then hypothetical network structures were identified by seeking systems of ordinary differential equations whose simulated kinetics could fit the gene expression profiles of the cluster-representative genes. For the construction of hypothetical network structures, singular value decomposition (SVD) based methods and a newly introduced heuristic Network Generation Method were compared. It turned out that the proposed novel method could find sparser networks and gave better fits to the experimental data. Reinhard.Guthke@hki-jena.de.
NASA Astrophysics Data System (ADS)
Teffahi, Hanane; Yao, Hongxun; Belabid, Nasreddine; Chaib, Souleyman
2018-02-01
Satellite images with very high spatial resolution have recently been widely used in image classification, which has become a challenging task in remote sensing. Because of limitations such as feature redundancy and the high dimensionality of the data, different classification methods have been proposed for remote sensing images, particularly methods using feature extraction techniques. This paper proposes a simple, efficient method that exploits extended multi-attribute profiles (EMAP) together with a sparse autoencoder (SAE) for remote sensing image classification. The proposed method classifies various remote sensing data sets, including hyperspectral and multispectral images, by extracting spatial and spectral features from the combination of EMAP and SAE and feeding them to a kernel support vector machine (SVM) for classification. Experiments on the hyperspectral "Houston" data and the multispectral "Washington DC" data show that this new scheme achieves better feature learning than the primitive features, traditional classifiers and an ordinary autoencoder, and has great potential to achieve higher classification accuracy with short running times.
On the Methods of Determining the Radio Emission Geometry in Pulsar Magnetospheres
NASA Technical Reports Server (NTRS)
Dyks, J.; Rudak, B.; Harding, Alice K.
2004-01-01
We present a modification of the relativistic phase shift method of determining the radio emission geometry from pulsar magnetospheres proposed by Gangadhara & Gupta (2001). Our modification provides a method of determining radio emission altitudes which does not depend on the viewing geometry and does not require polarization measurements. We suggest application of the method to the outer edges of averaged radio pulse profiles to identify magnetic field lines associated with the edges of the pulse and, thereby, to test the geometric method based on the measurement of the pulse width at the lowest intensity level. We show that another relativistic method proposed by Blaskiewicz et al. (1991) provides upper limits for emission altitudes associated with the outer edges of pulse profiles. A comparison of these limits with the altitudes determined with the geometric method may be used to probe the importance of rotational distortions of magnetic field and refraction effects in the pulsar magnetosphere. We provide a comprehensive discussion of the assumptions used in the relativistic methods.
Multisensor Retrieval of Atmospheric Properties.
NASA Astrophysics Data System (ADS)
Boba Stankov, B.
1998-09-01
A new method, Multisensor Retrieval of Atmospheric Properties (MRAP), is presented for deriving vertical profiles of atmospheric parameters throughout the troposphere. MRAP integrates measurements from multiple, diverse remote sensing and in situ instruments, the combination of which provides better capabilities than any instrument alone. Since remote sensors can deliver measurements automatically and continuously with high time resolution, MRAP provides better coverage than traditional rawinsondes. MRAP's design is flexible, being capable of incorporating measurements from different instruments in order to take advantage of new or developing advanced sensor technology. Furthermore, new or alternative atmospheric parameters for a variety of applications may be easily added as products of MRAP. A combination of passive radiometric, active radar, and in situ observations provides the best temperature and humidity profile measurements. Therefore, MRAP starts with a traditional, radiometer-based, physical retrieval algorithm provided by the International TOVS (TIROS-N Operational Vertical Sounder) Processing Package (ITPP) that constrains the retrieved profiles to agree with brightness temperature measurements. The first-guess profiles required by the ITPP's iterative retrieval algorithm are obtained by using a statistical inversion technique and ground-based remote sensing measurements. Because the individual ground-based remote sensing measurements are usually of sufficiently high quality, the first-guess profiles by themselves provide a satisfactory solution to establish the atmospheric water vapor and temperature state; the TOVS data are included to provide better accuracy at higher levels. MRAP thus provides a physically consistent mechanism for combining the ground- and space-based humidity and temperature profiles. Data that have been used successfully to retrieve humidity and temperature profiles with MRAP are the following: temperature profiles in the lower troposphere from the ground-based Radio Acoustic Sounding System (RASS); total water vapor measurements from the Global Positioning System; specific humidity gradient profiles from the wind-profiling radar/RASS system; surface meteorological observations from standard instruments; cloud-base heights from a lidar ceilometer; temperature from the Aeronautical Radio, Incorporated Communication, Addressing and Reporting System aboard commercial airlines; and brightness temperature observations from TOVS. Data from the experiment conducted in the late summer of 1995 at Point Loma, California, were used for comparisons of MRAP results and 20 nearby rawinsonde releases to assess the statistical error estimates of MRAP. The temperature profiles had a bias of -0.27°C and a standard deviation of 1.56°C for the entire troposphere. Dewpoint profile retrievals did not have an overall accuracy as high as that of the temperature profiles, but they exhibited a markedly improved standard deviation and bias in the lower atmosphere when the wind profiler/RASS specific humidity gradient information was available as a further constraint on the process. The European Centre for Medium-Range Weather Forecasts (ECMWF) model profiles of humidity and temperature for the grid point nearest to the Point Loma site were also used for comparison with the rawinsonde soundings to establish the usefulness of MRAP profiles to the weather forecasting community.
The comparison showed that the vertical resolution of the ECMWF model profiles within the planetary boundary layer is not capable of detecting sharp gradients.
A porewater-based stable isotope approach for the investigation of subsurface hydrological processes
NASA Astrophysics Data System (ADS)
Garvelmann, J.; Külls, C.; Weiler, M.
2012-02-01
Predicting and understanding subsurface flowpaths is still a crucial issue in hydrological research. We present an experimental approach to reveal present and past subsurface flowpaths of water in the unsaturated and saturated zone. Two hillslopes in a humid mountainous catchment have been investigated. The H2O(liquid) - H2O(vapor) equilibration laser spectroscopy method was used to obtain high-resolution δ2H vertical depth profiles of pore water at various points along two fall lines of a pasture hillslope in the southern Black Forest, Germany. The Porewater-based Stable Isotope Profile (PSIP) approach was developed to use the integrated information of several vertical depth profiles of deuterium along transects at the hillslope. Different shapes of depth profiles were observed in relation to hillslope position. The statistical variability (inter-quartile range and standard deviation) of each profile was used to characterize different types of depth profiles. The profiles upslope, or with a weak affinity for saturation as indicated by a low topographic wetness index, preserve the isotopic input signal from precipitation with a distinct seasonal variability. These observations indicate mainly vertical movement of soil water in the upper part of the hillslope before sampling. The profiles downslope, or at locations with a strong affinity for saturation, do not show a similar seasonal isotopic signal. The input signal is erased in the foothills, and a large proportion of pore water samples are close to the δ2H values of streamwater during base flow conditions, indicating the importance of the groundwater component in the catchment. Near the stream, indications of efficient mixing of water from lateral subsurface flow paths with vertical percolation are found.
NASA Astrophysics Data System (ADS)
Kreutzer, Sebastian; Meszner, Sascha; Faust, Dominik; Fuchs, Markus
2014-05-01
Interpreting past landscape evolution requires understanding the processes that shaped the landforms, which means deciphering complex systems. For reconstructing terrestrial Quaternary environments from loess archives, this can be regarded as at least a three-step process: (1) identifying valuable records in appropriate morphological positions in a previously defined research area, (2) analysing the profiles by field work and laboratory methods, and finally (3) linking the previously considered pseudo-isolated systems to assemble a comprehensive picture. The first and last steps in particular harbour pitfalls, as it is tempting to treat single records as pseudo-isolated, closed systems. They may be, with regard to their preservation in their specific morphological position, but in fact they are part of a complex, open system. Between 2008 and 2013, Late-Pleistocene loess archives in Saxony were intensively investigated by field and laboratory methods. By linking pedo- and luminescence-dating-based chronostratigraphies, a composite profile for the entire Saxonian Loess Region has been established. With this at least two-fold approach, we tried to avoid misinterpretations that can arise when focusing on a single standard profile in an open morphological system. Our contribution focuses on this multi-proxy approach to deciphering the Late-Pleistocene landscape evolution in the Saxonian Loess Region. Highlighting the challenges and advantages of combining different methods, we believe that (1) this multi-proxy approach is without alternative, and (2) the combination of different profiles may simplify the more complex reality, but it is a useful generalisation for understanding and revealing the stratigraphical significance of landscape evolution in this region.
Permeability profiles in granular aquifers using flowmeters in direct-push wells
Paradis, D.; Lefebvre, R.; Morin, R.H.; Gloaguen, E.
2010-01-01
Numerical hydrogeological models should ideally be based on the spatial distribution of hydraulic conductivity (K), a property rarely defined on the basis of sufficient data due to the lack of efficient characterization methods. Electromagnetic borehole flowmeter measurements during pumping in uncased wells can effectively provide a continuous vertical distribution of K in consolidated rocks. However, relatively few studies have used the flowmeter in screened wells penetrating unconsolidated aquifers, and tests conducted in gravel-packed wells have shown that flowmeter data may yield misleading results. This paper describes the practical application of flowmeter profiles in direct-push wells to measure K and delineate hydrofacies in heterogeneous unconsolidated aquifers having low-to-moderate K (10^-6 to 10^-4 m/s). The effect of direct-push well installation on K measurements in unconsolidated deposits is first assessed based on previous work indicating that such installations minimize disturbance to the aquifer fabric. The installation and development of long-screen wells are then used in a case study validating K profiles from flowmeter tests at high-resolution intervals (15 cm) against K profiles derived from multilevel slug tests between packers at identical intervals. For 119 intervals tested in five different wells, the difference in log K values obtained from the two methods is consistently below 10%. Finally, a graphical approach to the interpretation of flowmeter profiles is proposed to delineate intervals corresponding to distinct hydrofacies, thus providing a method whereby both the scale and magnitude of K contrasts in heterogeneous unconsolidated aquifers may be represented.
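As a hedged sketch of the standard flowmeter interpretation (not necessarily the authors' exact workflow), the snippet below converts incremental flows measured across 15 cm intervals into a relative K profile and scales it with a bulk K estimate; all numbers are synthetic.

```python
# Hedged sketch: interval K from flowmeter data, K_i / K_avg = (dQ_i/dz_i) / (Q_pump/B).
import numpy as np

z = np.arange(0.0, 3.0 + 1e-9, 0.15)                 # interval boundaries (m), 15 cm spacing as in the study
z_mid = 0.5 * (z[:-1] + z[1:])
Q_pump, K_avg = 1.2e-4, 5e-5                         # pumping rate (m^3/s) and bulk K (m/s) from a conventional test

# synthetic incremental inflows: a more conductive zone between 1 m and 2 m depth
inflow = np.where((z_mid > 1.0) & (z_mid < 2.0), 3.0, 1.0)
dQ = inflow / inflow.sum() * Q_pump                  # flow gained across each interval (sums to Q_pump)

dz = np.diff(z)
B = z[-1] - z[0]                                     # total screen length (m)
K = K_avg * (dQ / dz) / (Q_pump / B)                 # relative profile scaled by the bulk K estimate

for zc, k in zip(z_mid, K):
    print(f"{zc:4.2f} m  K = {k:.2e} m/s")
```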
A high-throughput microRNA expression profiling system.
Guo, Yanwen; Mastriano, Stephen; Lu, Jun
2014-01-01
As small noncoding RNAs, microRNAs (miRNAs) regulate diverse biological functions, including physiological and pathological processes. The expression and deregulation of miRNA levels contain rich information with diagnostic and prognostic relevance and can reflect pharmacological responses. The increasing interest in miRNA-related research demands global miRNA expression profiling on large numbers of samples. We describe here a robust protocol that supports high-throughput sample labeling and detection on hundreds of samples simultaneously. This method employs 96-well-based miRNA capturing from total RNA samples and on-site biochemical reactions, coupled with bead-based detection in 96-well format for hundreds of miRNAs per sample. With low-cost, high-throughput, high detection specificity, and flexibility to profile both small and large numbers of samples, this protocol can be adapted in a wide range of laboratory settings.
NASA Astrophysics Data System (ADS)
Arabanian, Atoosa Sadat; Najafi, Somayeh; Ajami, Aliasghar; Husinsky, Wolfgang; Massudi, Reza
2018-02-01
We have realized a method to control the spatial distribution of the optical retardation produced by nanogratings induced in bulk fused silica by ultrashort laser pulses. A colorimetry-based retardation measurement (CBRM), relying on the Michel-Levy interference color chart and a polarization microscope, is used to determine the profiles of the optical retardation. The effects of the spatial overlap of written regions, as well as of the energy and polarization of the writing pulses, on the induced retardations are studied. It has been found that the spatial overlap of lines written by pulse trains with different energies and polarizations can result in an adjustment of the induced birefringence in the overlap region. This approach offers the possibility of designing polarization-sensitive components with a desired birefringence profile.
Design and characteristic analysis of shaping optics for optical trepanning
NASA Astrophysics Data System (ADS)
Zeng, D.; Latham, W. P.; Kar, A.
2005-08-01
Optical trepanning is a new laser drilling method that uses an annular beam. Annular beams allow numerous irradiance profiles for delivering laser energy to the workpiece and thus provide more flexibility in controlling hole quality than a traditional circular laser beam. A refractive axicon system has been designed to generate a collimated annular beam. In this article, the intensity distributions produced by this refractive system are calculated by evaluating the Kirchhoff-Fresnel diffraction integral. It is shown that the refractive system is able to transform a Gaussian beam into a full Gaussian annular beam. The base angle of the axicon lens, the input laser beam diameter and the intensity profiles are found to be important factors for the axicon refractive system. Their effects on the annular beam profiles are analyzed based on the numerical solutions of the diffraction patterns.
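The diffraction calculation can be sketched numerically: below, the scalar Kirchhoff-Fresnel integral (in its radially symmetric Fresnel/Hankel form) is evaluated for a Gaussian beam carrying an axicon phase, and the radius of the resulting annulus is compared with the geometric estimate (n-1)αz. The wavelength, base angle, beam radius and propagation distance are illustrative, not the design values of the article.

```python
# Hedged numerical sketch: Fresnel (Hankel) diffraction of a Gaussian beam after a refractive axicon.
import numpy as np
from scipy.special import j0

lam = 1.06e-6                                    # wavelength (m), illustrative
k = 2 * np.pi / lam
w0, alpha, n_lens = 5e-3, np.deg2rad(1.0), 1.5   # beam radius, axicon base angle, refractive index
z = 0.5                                          # propagation distance (m)

rho = np.linspace(0.0, 15e-3, 20000)             # radial coordinate at the axicon plane
U0 = np.exp(-(rho / w0) ** 2) * np.exp(-1j * k * (n_lens - 1) * alpha * rho)  # Gaussian * axicon phase

def field(r):
    """Radially symmetric Fresnel integral; constant phase factors dropped (intensity only)."""
    integrand = U0 * np.exp(1j * k * rho**2 / (2 * z)) * j0(k * r * rho / z) * rho
    return np.sum(integrand) * (rho[1] - rho[0]) * k / z

r_obs = np.linspace(0.0, 10e-3, 161)
I = np.array([abs(field(r)) ** 2 for r in r_obs])
print("annulus radius (m):", r_obs[np.argmax(I)], " geometric estimate:", (n_lens - 1) * alpha * z)
```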
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santi, A.; Piacentini, G.; Zanichelli, M.
2014-05-12
A method for reconstructing the spatial profile of the electric field along the thickness of a generic bulk solid-state photodetector is proposed. Furthermore, the mobility and lifetime of both electrons and holes can be evaluated contextually. The method is based on a procedure of minimization built up from current transient profiles induced by laser pulses in a planar detector at different applied voltages. The procedure was tested in CdTe planar detectors for X- and Gamma rays. The devices were measured in a single-carrier transport configuration by impinging laser light on the sample cathode. This method could be suitable for many other devices provided that they are made of materials with sufficiently high resistivity, i.e., with a sufficiently low density of intrinsic carriers.
Applications of Some Artificial Intelligence Methods to Satellite Soundings
NASA Technical Reports Server (NTRS)
Munteanu, M. J.; Jakubowicz, O.
1985-01-01
Hard clustering of temperature profiles and regression temperature retrievals were refined using the probabilities of membership of each pattern vector in each of the clusters, derived with discriminant analysis. In hard clustering, the maximum probability is taken and the corresponding cluster is considered the correct one, discarding the rest of the probabilities. In fuzzy partitioned clustering these probabilities are kept, and the final retrieval is a weighted regression retrieval over several clusters. This method was used in the clustering of brightness temperatures, where the purpose was to predict tropopause height. A further refinement is the division of temperature profiles into three major regions for classification purposes. The results are summarized in tables in which total r.m.s. errors are displayed. An approach based on fuzzy logic, which is intimately related to artificial intelligence methods, is recommended.
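A hedged sketch of the fuzzy-partitioned retrieval idea described above (not the original implementation): cluster memberships computed from distances to cluster centers weight per-cluster regression retrievals, instead of committing to the single most probable cluster. The data, cluster count and regression model below are illustrative stand-ins.

```python
# Hedged sketch: membership-weighted combination of per-cluster regression retrievals.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

rng = np.random.default_rng(6)
X = rng.normal(size=(300, 12))                                  # stand-in for brightness-temperature pattern vectors
y = X[:, 0] * 2.0 + np.sin(X[:, 1]) + rng.normal(0, 0.1, 300)   # stand-in for the retrieved quantity

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
regs = [Ridge().fit(X[km.labels_ == k], y[km.labels_ == k]) for k in range(3)]

def soft_memberships(X_new, centers, m=2.0):
    """Fuzzy c-means style memberships computed from distances to fixed cluster centers."""
    d = np.linalg.norm(X_new[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    w = d ** (-2.0 / (m - 1.0))
    return w / w.sum(axis=1, keepdims=True)

def weighted_retrieval(X_new):
    """Weight each cluster's regression retrieval by the membership probability."""
    u = soft_memberships(X_new, km.cluster_centers_)
    preds = np.column_stack([r.predict(X_new) for r in regs])
    return (u * preds).sum(axis=1)

print(weighted_retrieval(rng.normal(size=(5, 12))))
```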
Extracting physicochemical features to predict protein secondary structure.
Huang, Yin-Fu; Chen, Shu-Ying
2013-01-01
We propose a protein secondary structure prediction method based on position-specific scoring matrix (PSSM) profiles and four physicochemical features: conformation parameters, net charge, hydrophobicity, and side chain mass. First, the SVM with the optimal window size and the optimal kernel parameters is found. Then, we train the SVM using the PSSM profiles generated from PSI-BLAST and the physicochemical features extracted from the CB513 data set. Finally, we use a filter to refine the predicted results from the trained SVM. For all the performance measures of our method, Q3 reaches 79.52, SOV94 reaches 86.10, and SOV99 reaches 74.60; all measures are higher than those of the SVMpsi and SVMfreq methods. This validates that considering these physicochemical features in predicting protein secondary structure yields better performance.
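The feature construction can be sketched as follows, with random arrays standing in for the PSI-BLAST PSSM, the four physicochemical descriptors and the H/E/C labels; the window size and SVM settings are illustrative rather than the tuned values reported above.

```python
# Hedged sketch: sliding-window PSSM + physicochemical features fed to an SVM.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)
L, win = 120, 13                                   # protein length and window size (illustrative)
pssm = rng.normal(size=(L, 20))                    # stand-in for a PSI-BLAST PSSM (L x 20)
physchem = rng.normal(size=(L, 4))                 # conformation params, net charge, hydrophobicity, side-chain mass
labels = rng.integers(0, 3, size=L)                # H / E / C states (toy labels)

half = win // 2
padded = np.vstack([np.zeros((half, 24)), np.hstack([pssm, physchem]), np.zeros((half, 24))])
X = np.array([padded[i:i + win].ravel() for i in range(L)])   # one window per central residue

clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, labels)
print("training Q3 on toy data:", clf.score(X, labels))
```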
Cross correlation of chemical profiles in minerals: Technical issues and numerical methods
NASA Astrophysics Data System (ADS)
Probst, Line; Caricchi, Luca; Gander, Martin; Wallace, Glen; Sheldrake, Tom
2017-04-01
Crystals grow in magma reservoirs and develop chemical zoning because of the lack of re-equilibration when thermodynamic conditions change. The study of chemical zoning in minerals therefore offers the opportunity to reconstruct the pre-eruptive conditions and the temporal evolution of magma reservoirs. We are building a quantitative method that allows the comparison of zonation patterns between minerals. The aim of this method is to identify whether similar crystals have partially similar zonation patterns and thus shared part of their growth history. Our method is based on the correlation method first developed by G. Wallace and G. Bergantz (2004). Here we present some technical issues linked to the use of a numerical method to compare crystals within their textural context in thin sections. The first issue is related to the acquisition of chemical profiles from images of thin sections (e.g. BSE or cathodoluminescence images). We present a new procedure that significantly improves both image and profile processing. A second issue is related to the random orientation of crystals in a thin section. The software we are building takes into account the different orientations of crystals by applying different stretching factors to the chemical profiles, so the automated selection of the best stretching factor is crucial for the rest of the procedure. The last point is the significance level, the threshold above which the correlation between two profiles is considered real (rather than random). This threshold must also be carefully defined and justified. All these points were studied with statistical analysis, and we present results leading to a more reliable and robust method. [1] Wallace, G.S. and Bergantz, G.W., 2004. Constraints on mingling of crystal populations from off-center zoning profiles: A statistical approach. American Mineralogist, vol. 89 (1), pp. 64-73. [2] Wallace, G.S. and Bergantz, G.W., 2004. Reconciling heterogeneity in crystal zoning data: An application of shared characteristic diagrams at Chaos Crags, Lassen Volcanic Center, California. Contributions to Mineralogy and Petrology, vol. 149, pp. 98-112.
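The stretch-and-correlate step can be sketched as below: one zoning profile is resampled under a range of stretching factors and the factor maximizing the Pearson correlation is retained. The synthetic profiles and the stretch range are illustrative, and the significance-threshold step discussed above is omitted here.

```python
# Hedged sketch: compare two zoning profiles under a range of stretching factors.
import numpy as np

def best_stretch_correlation(p1, p2, stretches=np.linspace(0.5, 2.0, 61)):
    """Stretch p2 along its length axis and keep the factor maximizing the correlation with p1."""
    x1 = np.linspace(0.0, 1.0, p1.size)
    best = (-np.inf, None)
    for s in stretches:
        x2 = np.linspace(0.0, s, p2.size)                 # p2 stretched to cover [0, s] of p1's axis
        common = x1[x1 <= min(1.0, s)]                    # overlapping portion of the two profiles
        if common.size < 10:
            continue
        r = np.corrcoef(np.interp(common, x1, p1), np.interp(common, x2, p2))[0, 1]
        if r > best[0]:
            best = (r, s)
    return best

rng = np.random.default_rng(4)
core = np.cumsum(rng.normal(size=300))                    # synthetic core-to-rim zoning signal
profile_a = core + rng.normal(0, 0.2, 300)
partial = np.interp(np.linspace(0, 209, 240), np.arange(300), core)   # same history, inner 70 %, different sampling
profile_b = partial + rng.normal(0, 0.2, 240)
r, s = best_stretch_correlation(profile_a, profile_b)
print(f"best correlation {r:.2f} at stretch factor {s:.2f} (expected ~0.70)")
```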
Voxel-Based 3-D Tree Modeling from Lidar Images for Extracting Tree Structural Information
NASA Astrophysics Data System (ADS)
Hosoi, F.
2014-12-01
Recently, lidar (light detection and ranging) has been used to extract tree structural information. Portable scanning lidar systems can capture the complex shape of individual trees as a 3-D point-cloud image. 3-D tree models reproduced from the lidar-derived 3-D image can be used to estimate tree structural parameters. We have proposed voxel-based 3-D modeling for extracting tree structural parameters. One of the tree parameters derived from the voxel modeling is leaf area density (LAD). We refer to the method as the voxel-based canopy profiling (VCP) method. In this method, several measurement points surrounding the canopy and optimally inclined laser beams are adopted for full laser beam illumination of the whole canopy down to its interior. From the obtained lidar image, the 3-D information is reproduced as voxel attributes in a 3-D voxel array. Based on the voxel attributes, the contact frequency of laser beams on leaves is computed and the LAD in each horizontal layer is obtained. This method offered accurate LAD estimation for individual trees and woody canopy trees. For more accurate LAD estimation, the voxel model was constructed by combining airborne and portable ground-based lidar data. The profiles obtained by the two types of lidar complemented each other, thus eliminating blind regions and yielding more accurate LAD profiles than could be obtained by using either type of lidar alone. Based on the estimation results, we proposed an index named the laser beam coverage index, Ω, which relates to the lidar's laser beam settings and a laser beam attenuation factor. This index can be used for adjusting the measurement set-up of lidar systems and also for explaining the LAD estimation error obtained with different types of lidar systems. Moreover, we proposed a method to estimate woody material volume as another application of the voxel tree modeling. In this method, a voxel solid model of a target tree is produced from the lidar image, composed of consecutive voxels that fill the outer surface and the interior of the stem and large branches. From the model, the woody material volume of any part of the target tree can be calculated directly by counting the number of corresponding voxels and multiplying by the per-voxel volume.
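A much-simplified sketch of the voxel idea (not the full VCP method, which tracks laser beam directions and attenuation): a point cloud is binned into voxels and a per-layer LAD is derived from the fraction of occupied voxels, with an assumed correction coefficient. The crown shape, voxel size and coefficient are illustrative.

```python
# Hedged, simplified sketch: voxelize a point cloud and derive a per-layer leaf area density.
import numpy as np

rng = np.random.default_rng(5)
cand = rng.uniform([-2, -2, 2], [2, 2, 6], size=(40_000, 3))
points = cand[np.linalg.norm(cand - np.array([0.0, 0.0, 4.0]), axis=1) < 2.0]   # toy spherical crown (m)

voxel = 0.25                                                      # voxel edge length (m)
ijk = np.floor(points / voxel).astype(int)
z0 = ijk[:, 2].min() * voxel                                      # height of the lowest occupied layer
ijk -= ijk.min(axis=0)
nx, ny, nz = ijk.max(axis=0) + 1
occupied = np.zeros((nx, ny, nz), dtype=bool)
occupied[ijk[:, 0], ijk[:, 1], ijk[:, 2]] = True

# fraction of occupied voxels per horizontal layer, converted to an LAD-like quantity
contact = occupied.sum(axis=(0, 1)) / (nx * ny)
alpha = 1.1                                                       # assumed correction coefficient (leaf inclination, beam settings)
lad = alpha * contact / voxel                                     # m^2 / m^3, illustrative conversion
for layer, value in enumerate(lad):
    print(f"z = {z0 + layer * voxel:4.2f}-{z0 + (layer + 1) * voxel:4.2f} m  LAD ≈ {value:.3f}")
```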
A new momentum integral method for approximating bed shear stress
NASA Astrophysics Data System (ADS)
Wengrove, M. E.; Foster, D. L.
2016-02-01
In nearshore environments, accurate estimation of bed stress is critical for estimating morphologic evolution and benthic mass transfer fluxes. However, bed shear stress over mobile boundaries in wave environments is notoriously difficult to estimate due to the non-equilibrium boundary layer. Approximating the friction velocity with a traditional logarithmic velocity profile model is common, but an unsteady, non-uniform flow field violates critical assumptions of equilibrium boundary layer theory. There have been several recent developments involving stress partitioning through an examination of the momentum transfer contributions that lead to improved estimates of the bed stress. For the case of single vertical profile observations, Mehdi et al. (2014) developed a full momentum integral-based method for steady, unidirectional flow that integrates the streamwise Navier-Stokes equation three times to an arbitrary position within the boundary layer. For the case of two-dimensional velocity observations, Rodriguez-Abudo and Foster (2014) were able to examine the momentum contributions from waves, turbulence and the bedform in a spatial and temporal averaging approach to the Navier-Stokes equations. In this effort, the above methods are combined to resolve the bed shear stress in both short- and long-wave-dominated environments with a highly mobile bed. The resulting method is an integral-based approach for determining bed shear stress that makes no a priori assumptions about the boundary layer shape and uses just a single velocity profile time series for both the phase-dependent case (under waves) and the unsteady case (under solitary waves). The developed method is applied to experimental observations obtained in a full-scale laboratory investigation (Oregon State's Large Wave Flume) of the near-bed velocity field over a rippled sediment bed in oscillatory flow, using both particle image velocimetry and a profiling acoustic Doppler velocimeter. This method is particularly relevant for small-scale field observations and laboratory observations.
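For contrast with the integral approach, the traditional log-law fit mentioned above can be sketched in a few lines: the friction velocity u* is obtained from a least-squares fit of u(z) = (u*/κ) ln(z/z0), and the bed shear stress follows as τ_b = ρ u*². The velocity profile values below are illustrative, not the flume data.

```python
# Hedged sketch of the traditional log-law estimate of bed shear stress from a velocity profile.
import numpy as np

kappa, rho = 0.41, 1000.0                      # von Karman constant, water density (kg/m^3)
z = np.array([0.02, 0.04, 0.08, 0.16, 0.32])   # elevations above the bed (m)
u = np.array([0.18, 0.23, 0.28, 0.33, 0.38])   # mean streamwise velocities (m/s), illustrative

# u(z) = (u*/kappa) * ln(z/z0)  ->  linear in ln(z): slope = u*/kappa, intercept = -(u*/kappa) ln(z0)
slope, intercept = np.polyfit(np.log(z), u, 1)
u_star = kappa * slope
z0 = np.exp(-intercept / slope)
tau_b = rho * u_star ** 2                      # bed shear stress (Pa)
print(f"u* = {u_star:.3f} m/s, z0 = {z0:.4f} m, tau_b = {tau_b:.2f} Pa")
```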