ERIC Educational Resources Information Center
Hampden-Thompson, Gillian; Lubben, Fred; Bennett, Judith
2011-01-01
Quantitative secondary analysis of large-scale data can be combined with in-depth qualitative methods. In this paper, we discuss the role of this combined-methods approach in examining the uptake of physics and chemistry in post-compulsory schooling for students in England. The secondary data analysis of the National Pupil Database (NPD) served…
NASA Astrophysics Data System (ADS)
Xu, Jing; Liu, Xiaofei; Wang, Yutian
2016-08-01
Parallel factor analysis is a widely used method to extract qualitative and quantitative information about an analyte of interest from a fluorescence emission-excitation matrix containing unknown components. Scattering of large amplitude will influence the results of parallel factor analysis, and many methods of eliminating scattering have been proposed, each with its own advantages and disadvantages. The combination of symmetrical subtraction and interpolated values is discussed here; the combination refers both to the combination of results and to the combination of methods. Nine methods were used for comparison. The results show that combining results yields better concentration predictions for all the components.
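As a rough illustration of the interpolation idea mentioned above, the sketch below replaces the first-order Rayleigh scatter band of a toy excitation-emission matrix with linearly interpolated values before any PARAFAC decomposition; the band width, wavelength grids, and data are illustrative assumptions, not values from the paper.

```python
import numpy as np

def interpolate_scatter(eem, ex_wls, em_wls, width=15.0):
    """Replace first-order Rayleigh scatter (emission near excitation) in an
    excitation-emission matrix (rows = excitation) with interpolated values."""
    out = eem.copy()
    for i, ex in enumerate(ex_wls):
        band = np.abs(em_wls - ex) < width          # emission points in the band
        good = ~band
        if band.any() and good.sum() >= 2:
            out[i, band] = np.interp(em_wls[band], em_wls[good], out[i, good])
    return out

rng = np.random.default_rng(0)
ex = np.array([300.0, 320.0, 340.0])
em = np.array([290.0, 310.0, 330.0, 350.0, 370.0])
eem = rng.random((3, 5)) + 10.0 * (np.abs(em[None, :] - ex[:, None]) < 15)
print(interpolate_scatter(eem, ex, em))
```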
Using Robust Standard Errors to Combine Multiple Regression Estimates with Meta-Analysis
ERIC Educational Resources Information Center
Williams, Ryan T.
2012-01-01
Combining multiple regression estimates with meta-analysis has continued to be a difficult task. A variety of methods have been proposed and used to combine multiple regression slope estimates with meta-analysis; however, most of these methods have serious methodological and practical limitations. The purpose of this study was to explore the use…
Failure mode effect analysis and fault tree analysis as a combined methodology in risk management
NASA Astrophysics Data System (ADS)
Wessiani, N. A.; Yoshio, F.
2018-04-01
Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most of these studies choose only one of the two methods for their risk management methodology, even though combining the two reduces the drawbacks each method has when implemented separately. This paper aims to combine the methodologies of FMEA and FTA in assessing risk. A case study in a metal company illustrates how this methodology can be implemented. In the case study, the combined methodology assesses the internal risks that occur in the production process; those internal risks should then be mitigated based on their level of risk.
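To make the combined workflow concrete, here is a minimal Python sketch: failure modes are first ranked by an FMEA risk priority number (severity x occurrence x detection), and their probabilities then feed a small fault tree for a top event. The failure modes, ratings, gate structure, and probabilities are invented for illustration and do not come from the case study.

```python
# Illustrative FMEA ratings (severity, occurrence, detection) and basic-event
# probabilities -- invented numbers, not data from the paper's case study.
failure_modes = {
    "pump seal leak":   (8, 4, 3, 0.02),
    "sensor drift":     (5, 6, 7, 0.05),
    "valve stuck open": (9, 2, 4, 0.01),
}

# FMEA step: rank failure modes by risk priority number (RPN = S x O x D)
rpn = {m: s * o * d for m, (s, o, d, _) in failure_modes.items()}
for mode in sorted(rpn, key=rpn.get, reverse=True):
    print(f"{mode}: RPN = {rpn[mode]}")

# FTA step: top event = (seal leak OR valve stuck open) AND sensor drift,
# assuming independent basic events.
p = {m: prob for m, (_, _, _, prob) in failure_modes.items()}
p_or = 1 - (1 - p["pump seal leak"]) * (1 - p["valve stuck open"])
p_top = p_or * p["sensor drift"]
print(f"top event probability: {p_top:.4f}")
```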
[Combined fat products: methodological possibilities for their identification].
Viktorova, E V; Kulakova, S N; Mikhaĭlov, N A
2006-01-01
Falsification of milk fat is currently a highly topical problem. A number of methods for verifying the authenticity of milk fat, and for distinguishing it from combined fat products, were considered. Analysis of modern approaches to assessing milk fat authenticity showed that the main method for determining the nature of a fat is gas chromatography. A computer-based method for the express identification of fat products is proposed for quickly determining whether an examined fat is natural milk fat or a combined fat product.
Accuracy Evaluation of the Unified P-Value from Combining Correlated P-Values
Alves, Gelio; Yu, Yi-Kuo
2014-01-01
Meta-analysis methods that combine P-values into a single unified P-value are frequently employed to improve confidence in hypothesis testing. An assumption made by most meta-analysis methods is that the P-values to be combined are independent, which may not always be true. To investigate the accuracy of the unified P-value from combining correlated P-values, we have evaluated a family of statistical methods that combine: independent, weighted independent, correlated, and weighted correlated P-values. Statistical accuracy evaluation by combining simulated correlated P-values showed that correlation among P-values can have a significant effect on the accuracy of the combined P-value obtained. Among the statistical methods evaluated, those that weight P-values compute more accurate combined P-values than those that do not. Also, statistical methods that utilize the correlation information have the best performance, producing significantly more accurate combined P-values. In our study we have demonstrated that statistical methods that combine P-values based on the assumption of independence can produce inaccurate P-values when combining correlated P-values, even when the P-values are only weakly correlated. Therefore, to prevent drawing false conclusions during hypothesis testing, our study advises caution when interpreting the P-value obtained from combining P-values of unknown correlation. However, when the correlation information is available, the weighting-capable statistical method, first introduced by Brown and recently modified by Hou, seems to perform the best amongst the methods investigated. PMID:24663491
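A small sketch of the two ends of this spectrum: Fisher's method, which assumes independence, next to Brown's method using the polynomial covariance approximation popularized by Kost and McDermott. The example P-values and the uniform 0.5 correlation are made up, and Hou's weighted modification is not shown.

```python
import numpy as np
from scipy import stats

def fisher_combine(pvals):
    """Fisher's method: assumes independent P-values."""
    x = -2.0 * np.sum(np.log(pvals))
    return stats.chi2.sf(x, df=2 * len(pvals))

def brown_combine(pvals, rho):
    """Brown's method for correlated P-values; the covariance of the
    -2*log(p) terms uses the Kost-McDermott polynomial approximation."""
    pvals = np.asarray(pvals)
    k = len(pvals)
    cov = 3.263 * rho + 0.710 * rho**2 + 0.027 * rho**3
    mean = 2.0 * k
    var = 4.0 * k + 2.0 * cov[np.triu_indices(k, 1)].sum()
    c, df = var / (2.0 * mean), 2.0 * mean**2 / var
    x = -2.0 * np.sum(np.log(pvals))
    return stats.chi2.sf(x / c, df=df)

p = np.array([0.01, 0.04, 0.03])
rho = np.full((3, 3), 0.5)       # illustrative pairwise correlation matrix
np.fill_diagonal(rho, 1.0)
print(fisher_combine(p), brown_combine(p, rho))
```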
Doytchev, Doytchin E; Szwillus, Gerd
2009-11-01
Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand human behaviour in incident occurrence, we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter analyses the way people perform tasks in their work environment and how they interact with machines or colleagues. These methods were complemented with the Human Error Identification in System Tools (HEIST) methodology and the concept of Performance Shaping Factors (PSF) to deepen the insight into the error modes of an operator's behaviour. HEIST shows the external error modes that caused the human error and the factors that prompted the human to err. To show the validity of the approach, a case study at a Bulgarian hydro power plant was carried out. An incident - the flooding of the plant's basement - was analysed by combining the aforementioned methods. The case study shows that Task Analysis in combination with other methods can be applied successfully to human error analysis, revealing details about erroneous actions in a realistic situation.
Zou, Ling; Chen, Shuyue; Sun, Yuqiang; Ma, Zhenghua
2010-08-01
In this paper we present a new method that combines Independent Component Analysis (ICA) and a wavelet de-noising algorithm to extract event-related potentials (ERPs). First, the extended Infomax-ICA algorithm is used to analyze the EEG signals and obtain the independent components (ICs); then, the WaveShrink (WS) method is applied to the demixed ICs as an intermediate step; the EEG data are rebuilt using the inverse ICA on the new ICs; and the ERPs are extracted from the de-noised EEG data after averaging over several trials. The experimental results showed that both the combined method and the ICA method could remove eye and muscle artifacts mixed into the ERPs, while the combined method could also retain the brain neural activity mixed into the noisy ICs and extract weak ERPs efficiently from strong background artifacts.
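The pipeline reads as: unmix with ICA, shrink the wavelet coefficients of each component, then remix. A minimal sketch with scikit-learn and PyWavelets follows; random data stands in for real EEG, and the extended Infomax algorithm used by the authors is replaced by FastICA for brevity.

```python
import numpy as np
import pywt
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
eeg = rng.standard_normal((1000, 8))      # stand-in for samples x channels EEG

ica = FastICA(n_components=8, random_state=0)
ics = ica.fit_transform(eeg)              # unmix into independent components

den = np.empty_like(ics)
for j in range(ics.shape[1]):
    coeffs = pywt.wavedec(ics[:, j], 'db4', level=4)
    # universal soft threshold, WaveShrink-style
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(ics.shape[0]))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
    den[:, j] = pywt.waverec(coeffs, 'db4')[:ics.shape[0]]

cleaned = ica.inverse_transform(den)      # rebuild EEG from denoised components
print(cleaned.shape)
```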
ERIC Educational Resources Information Center
Yamaguchi, Yusuke; Sakamoto, Wataru; Goto, Masashi; Staessen, Jan A.; Wang, Jiguang; Gueyffier, Francois; Riley, Richard D.
2014-01-01
When some trials provide individual patient data (IPD) and the others provide only aggregate data (AD), meta-analysis methods for combining IPD and AD are required. We propose a method that reconstructs the missing IPD for AD trials by a Bayesian sampling procedure and then applies an IPD meta-analysis model to the mixture of simulated IPD and…
Selecting supplier combination based on fuzzy multicriteria analysis
NASA Astrophysics Data System (ADS)
Han, Zhi-Qiu; Luo, Xin-Xing; Chen, Xiao-Hong; Yang, Wu-E.
2015-07-01
Existing multicriteria analysis (MCA) methods are probably ineffective in selecting a supplier combination. Thus, an MCA-based fuzzy 0-1 programming method is introduced. The programming relates to a simple MCA matrix that is used to select a single supplier. By solving the programming, the most feasible combination of suppliers is selected. Importantly, this result differs from selecting suppliers one by one according to a single-selection order, which is used to rank sole suppliers in existing MCA methods. An example highlights such difference and illustrates the proposed method.
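A brute-force stand-in for the 0-1 programming step might look like the following: enumerate candidate supplier subsets, discard combinations whose total capacity misses demand, and keep the subset with the best weighted multicriteria score. The score matrix, weights, capacities, demand, and additive objective are all illustrative assumptions, with any fuzzy defuzzification assumed to have happened upstream.

```python
import numpy as np
from itertools import combinations

# scores[i, j]: defuzzified performance of supplier i on criterion j;
# weights, capacities, and demand are illustrative assumptions.
scores = np.array([[0.7, 0.5, 0.9],
                   [0.6, 0.8, 0.4],
                   [0.9, 0.3, 0.6],
                   [0.5, 0.7, 0.7]])
weights = np.array([0.5, 0.3, 0.2])
capacity = np.array([60, 50, 70, 40])
demand = 100

best = None
for r in range(1, len(scores) + 1):
    for combo in combinations(range(len(scores)), r):
        idx = list(combo)
        if capacity[idx].sum() < demand:
            continue                                  # infeasible: cannot meet demand
        value = scores[idx].dot(weights).sum()        # toy combined weighted score
        if best is None or value > best[1]:
            best = (combo, value)

print("selected combination of suppliers:", best)
```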
USDA-ARS?s Scientific Manuscript database
Visible/near-infrared (Vis/NIR) spectroscopy, with a wavelength range between 400 and 2500 nm, combined with a factor analysis method was tested to predict quality attributes of chicken breast fillets. Quality attributes, including color (L*, a*, b*), pH, and drip loss, were analyzed using factor analysis ...
Combined magnetic and gravity analysis
NASA Technical Reports Server (NTRS)
Hinze, W. J.; Braile, L. W.; Chandler, V. W.; Mazella, F. E.
1975-01-01
Efforts are made to identify methods of decreasing magnetic interpretation ambiguity by combined gravity and magnetic analysis, to evaluate these techniques in a preliminary manner, to consider the geologic and geophysical implications of correlation, and to recommend a course of action to evaluate methods of correlating gravity and magnetic anomalies. The major thrust of the study was a search and review of the literature. The literature of geophysics, geology, geography, and statistics was searched for articles dealing with spatial correlation of independent variables. An annotated bibliography referencing the germane articles and books is presented. The methods of combined gravity and magnetic analysis techniques are identified and reviewed. A more comprehensive evaluation of two types of techniques is presented. Internal correspondence of anomaly amplitudes is examined and a combined analysis is done utilizing Poisson's theorem. The geologic and geophysical implications of gravity and magnetic correlation based on both theoretical and empirical relationships are discussed.
Buckling analysis for anisotropic laminated plates under combined inplane loads
NASA Technical Reports Server (NTRS)
Viswanathan, A. V.; Tamekuni, M.; Baker, L. L.
1974-01-01
The buckling analysis presented considers rectangular flat or curved general laminates subjected to combined inplane normal and shear loads. Linear theory is used in the analysis. All prebuckling deformations and any initial imperfections are ignored. The analysis method can be readily extended to longitudinally stiffened structures subjected to combined inplane normal and shear loads.
ERIC Educational Resources Information Center
Abrams, Neal M.
2012-01-01
A cloud network system is combined with standard computing applications and a course management system to provide a robust method for sharing data among students. This system provides a unique method to improve data analysis by easily increasing the amount of sampled data available for analysis. The data can be shared within one course as well as…
NASA Astrophysics Data System (ADS)
Hu, Zhan; Zheng, Gangtie
2016-08-01
A combined analysis method is developed in the present paper for studying the dynamic properties of a type of geometrically nonlinear vibration isolator composed of push-pull configuration rings. This method combines the geometrically nonlinear theory of curved beams and the Harmonic Balance Method to overcome the difficulty in calculating the vibration and vibration transmissibility under large deformations of the ring structure. Using the proposed method, nonlinear dynamic behaviors of this isolator, such as locking due to Coulomb damping and the usual jump resulting from the nonlinear stiffness, can be investigated. Numerical solutions based on the primary harmonic balance are first verified by direct integration results. Then, the whole procedure of this combined analysis method is demonstrated and validated by slow sinusoidal sweep experiments with different amplitudes of the base excitation. Both numerical and experimental results indicate that this type of isolator behaves as a hardening spring with increasing amplitude of the base excitation, which makes it suitable for isolating both steady-state vibrations and transient shocks.
A computer program for the design and analysis of low-speed airfoils
NASA Technical Reports Server (NTRS)
Eppler, R.; Somers, D. M.
1980-01-01
A conformal mapping method for the design of airfoils with prescribed velocity distribution characteristics, a panel method for the analysis of the potential flow about given airfoils, and a boundary layer method have been combined. With this combined method, airfoils with prescribed boundary layer characteristics can be designed and airfoils with prescribed shapes can be analyzed. All three methods are described briefly. The program and its input options are described. A complete listing is given as an appendix.
ERIC Educational Resources Information Center
Nivens, Delana A.; Padgett, Clifford W.; Chase, Jeffery M.; Verges, Katie J.; Jamieson, Deborah S.
2010-01-01
Case studies and current literature are combined with spectroscopic analysis to provide a unique chemistry experience for art history students and to provide a unique inquiry-based laboratory experiment for analytical chemistry students. The XRF analysis method was used to demonstrate to nonscience majors (art history students) a powerful…
Hao, Yong; Sun, Xu-Dong; Yang, Qiang
2012-12-01
A variable selection strategy combined with locally linear embedding (LLE) was introduced for the analysis of complex samples by near infrared spectroscopy (NIRS). Three methods, Monte Carlo uninformative variable elimination (MCUVE), the successive projections algorithm (SPA), and MCUVE coupled with SPA, were used to eliminate redundant spectral variables. Partial least squares regression (PLSR) and LLE-PLSR were used to model the complex samples. The results showed that MCUVE can both extract effective informative variables and improve the precision of the models. Compared with PLSR models, LLE-PLSR models achieve more accurate analysis results. MCUVE combined with LLE-PLSR is an effective modeling method for NIRS quantitative analysis.
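A rough MCUVE-style sketch: repeatedly fit PLS regressions on random subsamples, score each wavelength by the stability of its regression coefficient (mean over standard deviation), and refit on the most stable variables. The data, run counts, and cutoff are illustrative, and the LLE step is omitted.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.standard_normal((80, 50))                 # stand-in NIR spectra
y = X[:, :5].sum(axis=1) + 0.1 * rng.standard_normal(80)

n_runs, n_keep = 200, 60
coefs = []
for _ in range(n_runs):
    idx = rng.choice(len(X), n_keep, replace=False)    # Monte Carlo subsample
    pls = PLSRegression(n_components=5).fit(X[idx], y[idx])
    coefs.append(pls.coef_.ravel())
coefs = np.array(coefs)

stability = coefs.mean(axis=0) / coefs.std(axis=0)     # MCUVE reliability index
selected = np.argsort(-np.abs(stability))[:10]         # keep most stable variables
final = PLSRegression(n_components=5).fit(X[:, selected], y)
print("selected variables:", np.sort(selected))
```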
Estimation of Handgrip Force from SEMG Based on Wavelet Scale Selection.
Wang, Kai; Zhang, Xianmin; Ota, Jun; Huang, Yanjiang
2018-02-24
This paper proposes a nonlinear correlation-based wavelet scale selection technology to select the effective wavelet scales for the estimation of handgrip force from surface electromyograms (SEMG). The SEMG signal corresponding to gripping force was collected from extensor and flexor forearm muscles during the force-varying analysis task. We performed a computational sensitivity analysis on the initial nonlinear SEMG-handgrip force model. To explore the nonlinear correlation between ten wavelet scales and handgrip force, a large-scale iteration based on the Monte Carlo simulation was conducted. To choose a suitable combination of scales, we proposed a rule to combine wavelet scales based on the sensitivity of each scale and selected the appropriate combination of wavelet scales based on sequence combination analysis (SCA). The results of SCA indicated that the scale combination VI is suitable for estimating force from the extensors and the combination V is suitable for the flexors. The proposed method was compared to two former methods through prolonged static and force-varying contraction tasks. The experiment results showed that the root mean square errors derived by the proposed method for both static and force-varying contraction tasks were less than 20%. The accuracy and robustness of the handgrip force derived by the proposed method is better than that obtained by the former methods.
Bruno, C; Patin, F; Bocca, C; Nadal-Desbarats, L; Bonnier, F; Reynier, P; Emond, P; Vourc'h, P; Joseph-Delafont, K; Corcia, P; Andres, C R; Blasco, H
2018-01-30
Metabolomics is an emerging science based on diverse high-throughput methods that are rapidly evolving to improve the metabolic coverage of biological fluids and tissues. Technical progress has led researchers to combine several analytical methods without reporting the impact of such a strategy on metabolic coverage. The objective of our study was to develop and validate several analytical techniques (mass spectrometry coupled to gas or liquid chromatography, and nuclear magnetic resonance) for the metabolomic analysis of small muscle samples and to evaluate the impact of combining methods on the exhaustiveness of metabolite coverage. We evaluated the muscle metabolome from the same pool of mouse muscle samples after 2 metabolite extraction protocols. Four analytical methods were used: targeted flow injection analysis coupled with mass spectrometry (FIA-MS/MS), gas chromatography coupled with mass spectrometry (GC-MS), liquid chromatography coupled with high-resolution mass spectrometry (LC-HRMS), and nuclear magnetic resonance (NMR) analysis. We evaluated the global variability of each compound, i.e., analytical variability (from quality controls) and extraction variability (from muscle extracts). We determined the best extraction method and we report the common and distinct metabolites identified, based on the number and identity of the compounds detected with low analytical variability (coefficient of variation < 30%) for each method. Finally, we assessed the coverage of muscle metabolic pathways obtained. Methanol/chloroform/water and water/methanol were the best extraction solvents for muscle metabolome analysis by NMR and MS, respectively. We identified 38 metabolites by nuclear magnetic resonance, 37 by FIA-MS/MS, 18 by GC-MS, and 80 by LC-HRMS. The combination led us to identify a total of 132 metabolites with low variability, partitioned into 58 metabolic pathways, such as amino acid, nitrogen, purine, and pyrimidine metabolism, and the citric acid cycle. This combination also showed that the contribution of GC-MS was low when used in combination with other mass spectrometry methods and nuclear magnetic resonance to explore muscle samples. This study reports the validation of several analytical methods, based on nuclear magnetic resonance and several mass spectrometry methods, to explore the muscle metabolome from a small amount of tissue, comparable to that obtained during a clinical trial. The combination of several techniques may be relevant for the exploration of muscle metabolism, with acceptable analytical variability and overlap between methods. However, the difficult and time-consuming data pre-processing, processing, and statistical analysis steps do not justify systematically combining analytical methods. Copyright © 2017 Elsevier B.V. All rights reserved.
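The variability screen described above can be expressed compactly: compute each feature's coefficient of variation across QC runs per platform, keep those under 30%, and take the union across platforms. The toy QC matrices below are placeholders for real per-platform intensity tables.

```python
import numpy as np

rng = np.random.default_rng(2)

def low_variability(qc, cv_cutoff=0.30):
    """Keep features whose coefficient of variation across QC runs
    (rows) is below the cutoff."""
    cv = qc.std(axis=0, ddof=1) / qc.mean(axis=0)
    return cv < cv_cutoff

# toy QC matrices for two hypothetical platforms (6 QC runs x 20 features)
platforms = {
    "NMR":     np.abs(rng.normal(100, 20, (6, 20))),
    "LC-HRMS": np.abs(rng.normal(100, 10, (6, 20))),
}
kept = {name: np.flatnonzero(low_variability(qc)) for name, qc in platforms.items()}
union = set().union(*kept.values())        # combined coverage across methods
for name, idx in kept.items():
    print(f"{name}: {len(idx)} features pass CV < 30%")
print("union across platforms:", len(union))
```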
Effects of Problem-Based Learning on Attitude: A Meta-Analysis Study
ERIC Educational Resources Information Center
Demirel, Melek; Dagyar, Miray
2016-01-01
To date, researchers have frequently investigated students' attitudes toward courses supported by problem-based learning. There are several studies with different results in the literature. It is necessary to combine and interpret the findings of these studies through a meta-analysis method. This method aims to combine different results of similar…
Yang, James J; Li, Jia; Williams, L Keoki; Buu, Anne
2016-01-05
In genome-wide association studies (GWAS) for complex diseases, the association between a SNP and each phenotype is usually weak. Combining multiple related phenotypic traits can increase the power of gene search and thus is a practically important area that requires methodological work. This study provides a comprehensive review of existing methods for conducting GWAS on complex diseases with multiple phenotypes, including multivariate analysis of variance (MANOVA), principal component analysis (PCA), generalized estimating equations (GEE), the trait-based association test involving the extended Simes procedure (TATES), and the classical Fisher combination test. We propose a new method that relaxes the unrealistic independence assumption of the classical Fisher combination test and is computationally efficient. To demonstrate applications of the proposed method, we also present the results of statistical analysis on the Study of Addiction: Genetics and Environment (SAGE) data. Our simulation study shows that the proposed method has higher power than existing methods while controlling the type I error rate. The GEE and the classical Fisher combination test, on the other hand, do not control the type I error rate and thus are not recommended. In general, the power of the competing methods decreases as the correlation between phenotypes increases. All the methods tend to have lower power when the multivariate phenotypes come from long-tailed distributions. The real data analysis also demonstrates that the proposed method allows us to compare the marginal results with the multivariate results and to specify which SNPs are specific to a particular phenotype or contribute to the common construct. The proposed method outperforms existing methods in most settings and also has great applications in GWAS on complex diseases with multiple phenotypes such as substance abuse disorders.
A study on quantifying COPD severity by combining pulmonary function tests and CT image analysis
NASA Astrophysics Data System (ADS)
Nimura, Yukitaka; Kitasaka, Takayuki; Honma, Hirotoshi; Takabatake, Hirotsugu; Mori, Masaki; Natori, Hiroshi; Mori, Kensaku
2011-03-01
This paper describes a novel method that can evaluate chronic obstructive pulmonary disease (COPD) severity by combining measurements from pulmonary function tests with measurements obtained from CT image analysis. There is no cure for COPD; however, with regular medical care and consistent patient compliance with treatments and lifestyle changes, the symptoms of COPD can be minimized and progression of the disease can be slowed. Many diagnosis methods based on CT image analysis have therefore been proposed for quantifying COPD. Most of these methods extract lesions as low-attenuation areas (LAA) by thresholding and evaluate COPD severity by calculating the proportion of LAA in the lung (LAA%). However, COPD is usually the result of a combination of two conditions, emphysema and chronic obstructive bronchitis, so previous methods based only on LAA% do not work well. The proposed method utilizes both the measurements of pulmonary function tests and the results of chest CT image analysis to evaluate COPD severity. In this paper, we utilize multi-class AdaBoost to combine both types of information and classify the COPD severity into five stages automatically. The experimental results revealed that the accuracy rate of the proposed method was 88.9% (resubstitution scheme) and 64.4% (leave-one-out scheme).
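A skeletal version of the classification step: compute LAA% from CT attenuation values, stack it with pulmonary function measurements, and train a multi-class AdaBoost. The feature choices and random toy labels are illustrative only, so the printed accuracy is meaningless except as a smoke test.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(3)

def laa_percent(hu_values, threshold=-950):
    """Fraction of lung voxels below a low-attenuation threshold (in HU);
    -950 HU is a commonly used emphysema cutoff."""
    return float(np.mean(hu_values < threshold))

hu = rng.normal(-800, 120, 100_000)        # toy lung-voxel attenuations
print("LAA%:", laa_percent(hu))

# toy per-patient features: [LAA%, FEV1 %predicted, FEV1/FVC], stage 0-4 labels
n = 200
X = np.column_stack([rng.uniform(0.0, 0.5, n),
                     rng.uniform(20.0, 110.0, n),
                     rng.uniform(0.3, 0.9, n)])
y = rng.integers(0, 5, n)

clf = AdaBoostClassifier(n_estimators=100).fit(X, y)   # multi-class boosting
print("resubstitution accuracy (random labels, smoke test only):", clf.score(X, y))
```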
Ruzik, L; Obarski, N; Papierz, A; Mojski, M
2015-06-01
High-performance liquid chromatography (HPLC) with UV/VIS spectrophotometric detection combined with the chemometric method of cluster analysis (CA) was used to assess the repeatability of composition of nine types of perfumed waters. In addition, the chromatographic method of separating components of the perfumed waters under analysis was subjected to an optimization procedure. The chromatograms thus obtained were used as sources of data for the chemometric method of cluster analysis (CA). The result was a classification of a set comprising 39 perfumed water samples with a similar composition at a specified level of probability (level of agglomeration). A comparison of the classification with the manufacturer's declarations reveals a good degree of consistency and demonstrates similarity between samples in different classes. A combination of the chromatographic method with cluster analysis (HPLC UV/VIS - CA) makes it possible to quickly assess the repeatability of composition of perfumed waters at selected levels of probability. © 2014 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
Mohammadi, Amrollah; Ahmadian, Alireza; Rabbani, Shahram; Fattahi, Ehsan; Shirani, Shapour
2017-12-01
Finite element models for estimation of intraoperative brain shift suffer from huge computational cost. In these models, image registration and finite element analysis are two time-consuming processes. The proposed method is an improved version of our previously developed Finite Element Drift (FED) registration algorithm, in which the registration process is combined with the finite element analysis. In the Combined FED (CFED), the deformation of the whole-brain mesh is iteratively calculated by geometrical extension of a local load vector computed by FED. While the processing time of the FED-based method, including registration and finite element analysis, was about 70 s, the computation time of the CFED was about 3.2 s. The computational cost of CFED is almost 50% less than that of similar state-of-the-art brain shift estimators based on finite element models. The proposed combination of registration and structural analysis can make the calculation of brain deformation much faster. Copyright © 2016 John Wiley & Sons, Ltd.
Combined Raman spectroscopy and autofluorescence imaging method for in vivo skin tumor diagnosis
NASA Astrophysics Data System (ADS)
Zakharov, V. P.; Bratchenko, I. A.; Myakinin, O. O.; Artemyev, D. N.; Khristoforova, Y. A.; Kozlov, S. V.; Moryatov, A. A.
2014-09-01
A combined fluorescence and Raman spectroscopy (RS) method for in vivo detection of malignant human skin tumors was demonstrated. Fluorescence analysis was used for detecting abnormalities during fast scanning of large tissue areas. In suspected cases of malignancy, Raman spectral analysis of the biological tissue was performed to determine the type of neoplasm. A special RS phase method was proposed for in vivo identification of skin tumors. Quadratic discriminant analysis was used for tumor type classification on phase planes. It was shown that application of the phase method provides a diagnosis of malignant melanoma with a sensitivity of 89% and a specificity of 87%.
Juárez, M; Polvillo, O; Contò, M; Ficco, A; Ballico, S; Failla, S
2008-05-09
Four different extraction-derivatization methods commonly used for fatty acid analysis in meat (the in situ or one-step method, the saponification method, the classic method, and a combination of classic extraction with saponification derivatization) were tested. The in situ method had low recovery and variation. The saponification method showed the best balance between recovery, precision, repeatability and reproducibility. The classic method had high recovery and acceptable variation values, except for the polyunsaturated fatty acids, which showed higher variation than with the former methods. The combination of extraction and methylation steps had high recovery values, but its precision, repeatability and reproducibility were not acceptable. Therefore, the saponification method would be more convenient for polyunsaturated fatty acid analysis, whereas the in situ method would be an alternative for fast analysis. However, the classic method would be the method of choice for the determination of the different lipid classes.
An Adaptive Cross-Architecture Combination Method for Graph Traversal
DOE Office of Scientific and Technical Information (OSTI.GOV)
You, Yang; Song, Shuaiwen; Kerbyson, Darren J.
2014-06-18
Breadth-First Search (BFS) is widely used in many real-world applications including computational biology, social networks, and electronic design automation. The combination method, using both top-down and bottom-up techniques, is the most effective BFS approach. However, current combination methods rely on trial-and-error and exhaustive search to locate the optimal switching point, which may cause significant runtime overhead. To solve this problem, we design an adaptive method based on regression analysis to predict an optimal switching point for the combination method at runtime within less than 0.1% of the BFS execution time.
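For reference, the underlying top-down/bottom-up combination (in the spirit of direction-optimizing BFS) can be sketched with a fixed frontier-size threshold, as below; the adaptive method in the paper replaces this hand-tuned switch with a regression-based prediction, and the constant alpha here is an arbitrary stand-in.

```python
def hybrid_bfs(adj, source, alpha=4):
    """Direction-optimizing BFS sketch: top-down for small frontiers,
    bottom-up for large ones. The fixed threshold alpha is a hand-tuned
    stand-in for the paper's regression-predicted switching point."""
    dist = {v: -1 for v in adj}
    dist[source] = 0
    frontier, level = {source}, 0
    while frontier:
        if len(frontier) * alpha > len(adj):   # large frontier: bottom-up
            nxt = {v for v in adj if dist[v] == -1
                   and any(u in frontier for u in adj[v])}
        else:                                  # small frontier: top-down
            nxt = {w for v in frontier for w in adj[v] if dist[w] == -1}
        level += 1
        for v in nxt:
            dist[v] = level
        frontier = nxt
    return dist

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print(hybrid_bfs(adj, 0))   # {0: 0, 1: 1, 2: 1, 3: 2}
```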
Combining 1D and 2D linear discriminant analysis for palmprint recognition
NASA Astrophysics Data System (ADS)
Zhang, Jian; Ji, Hongbing; Wang, Lei; Lin, Lin
2011-11-01
In this paper, a novel feature extraction method for palmprint recognition termed Two-dimensional Combined Discriminant Analysis (2DCDA) is proposed. By connecting the adjacent rows of an image sequentially, the new covariance matrices obtained contain useful information about local geometric structures in the image, which is eliminated by 2DLDA. In this way, 2DCDA combines LDA and 2DLDA for promising recognition accuracy, while the number of coefficients in its projection matrix is lower than that of other two-dimensional methods. Experimental results on the CASIA palmprint database demonstrate the effectiveness of the proposed method.
Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang
2017-05-01
A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Zakharov, V. P.; Bratchenko, I. A.; Artemyev, D. N.; Myakinin, O. O.; Khristoforova, Y. A.; Kozlov, S. V.; Moryatov, A. A.
2015-07-01
The combined application of Raman and autofluorescence spectroscopy in the visible and near infrared regions for the analysis of malignant neoplasms of human skin was demonstrated. Ex vivo experiments were performed on 130 skin tissue samples: 28 malignant melanomas, 19 basal cell carcinomas, 15 benign tumors, 9 nevi and 59 normal tissues. The proposed method of Raman spectral analysis allows malignant melanoma to be differentiated from other skin tissues with an accuracy of 84% (sensitivity of 97%, specificity of 72%). Autofluorescence analysis in the near infrared and visible regions increased the diagnostic accuracy by 5-10%. Registration of near infrared autofluorescence is realized in one optical unit with the Raman spectroscopy. Thus, the proposed combined method makes possible simultaneous study of large skin areas by autofluorescence spectral analysis and precise determination of neoplasm type by Raman spectroscopy.
NASA Astrophysics Data System (ADS)
YangDai, Tianyi; Zhang, Li
2016-02-01
Energy dispersive X-ray diffraction (EDXRD) combined with hybrid discriminant analysis (HDA) has been utilized for classifying liquid materials for the first time. The XRD spectra of 37 kinds of liquid contrabands and daily supplies were obtained using an EDXRD test-bed facility. The distinct spectra of different samples reveal XRD's capability to distinguish liquid contrabands from daily supplies. To create a system that detects liquid contrabands, the diffraction spectra were subjected to HDA, a combination of principal component analysis (PCA) and linear discriminant analysis (LDA). Experiments based on the leave-one-out method demonstrate that HDA is a practical method with higher classification accuracy and lower noise sensitivity than the other methods in this application. The study shows the great capability and potential of the combination of XRD and HDA for liquid contraband classification.
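Since HDA here is PCA followed by LDA, a minimal equivalent pipeline with leave-one-out evaluation might look like this; the random matrix stands in for the 37 EDXRD spectra, and the component count and class labels are guesses.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(4)
# toy stand-in for EDXRD spectra: 37 samples x 128 energy channels, 2 classes
X = rng.standard_normal((37, 128))
y = rng.integers(0, 2, 37)

# HDA as PCA dimensionality reduction followed by LDA classification
hda = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
acc = cross_val_score(hda, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.2f}")
```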
Bai, Ou; Lin, Peter; Vorbach, Sherry; Li, Jiang; Furlani, Steve; Hallett, Mark
2007-12-01
To explore effective combinations of computational methods for the prediction of movement intention preceding the production of self-paced right and left hand movements from single trial scalp electroencephalogram (EEG). Twelve naïve subjects performed self-paced movements consisting of three key strokes with either hand. EEG was recorded from 128 channels. The exploration was performed offline on single trial EEG data. We proposed that a successful computational procedure for classification would consist of spatial filtering, temporal filtering, feature selection, and pattern classification. A systematic investigation was performed with combinations of spatial filtering using principal component analysis (PCA), independent component analysis (ICA), common spatial patterns analysis (CSP), and surface Laplacian derivation (SLD); temporal filtering using power spectral density estimation (PSD) and discrete wavelet transform (DWT); pattern classification using linear Mahalanobis distance classifier (LMD), quadratic Mahalanobis distance classifier (QMD), Bayesian classifier (BSC), multi-layer perceptron neural network (MLP), probabilistic neural network (PNN), and support vector machine (SVM). A robust multivariate feature selection strategy using a genetic algorithm was employed. The combinations of spatial filtering using ICA and SLD, temporal filtering using PSD and DWT, and classification methods using LMD, QMD, BSC and SVM provided higher performance than those of other combinations. Utilizing one of the better combinations of ICA, PSD and SVM, the discrimination accuracy was as high as 75%. Further feature analysis showed that beta band EEG activity of the channels over right sensorimotor cortex was most appropriate for discrimination of right and left hand movement intention. Effective combinations of computational methods provide possible classification of human movement intention from single trial EEG. Such a method could be the basis for a potential brain-computer interface based on human natural movement, which might reduce the requirement of long-term training. Effective combinations of computational methods can classify human movement intention from single trial EEG with reasonable accuracy.
Risk Analysis Methods for Deepwater Port Oil Transfer Systems
DOT National Transportation Integrated Search
1976-06-01
This report deals with the risk analysis methodology for oil spills from the oil transfer systems in deepwater ports. Failure mode and effect analysis in combination with fault tree analysis are identified as the methods best suited for the assessmen...
ERIC Educational Resources Information Center
Lal, Shalini; Suto, Melinda; Ungar, Michael
2012-01-01
Increasingly, qualitative researchers are combining methods, processes, and principles from two or more methodologies over the course of a research study. Critics charge that researchers adopting combined approaches place too little attention on the historical, epistemological, and theoretical aspects of the research design. Rather than…
Application of the pulsed fast/thermal neutron method for soil elemental analysis
USDA-ARS?s Scientific Manuscript database
Soil science is a research field where physics concepts and experimental methods are widely used, particularly in agro-chemistry and soil elemental analysis. Different methods of analysis are currently available. The evolution of nuclear physics (methodology and instrumentation) combined with the ava...
Hanriot, Lucie; Keime, Céline; Gay, Nadine; Faure, Claudine; Dossat, Carole; Wincker, Patrick; Scoté-Blachon, Céline; Peyron, Christelle; Gandrillon, Olivier
2008-01-01
Background "Open" transcriptome analysis methods allow to study gene expression without a priori knowledge of the transcript sequences. As of now, SAGE (Serial Analysis of Gene Expression), LongSAGE and MPSS (Massively Parallel Signature Sequencing) are the mostly used methods for "open" transcriptome analysis. Both LongSAGE and MPSS rely on the isolation of 21 pb tag sequences from each transcript. In contrast to LongSAGE, the high throughput sequencing method used in MPSS enables the rapid sequencing of very large libraries containing several millions of tags, allowing deep transcriptome analysis. However, a bias in the complexity of the transcriptome representation obtained by MPSS was recently uncovered. Results In order to make a deep analysis of mouse hypothalamus transcriptome avoiding the limitation introduced by MPSS, we combined LongSAGE with the Solexa sequencing technology and obtained a library of more than 11 millions of tags. We then compared it to a LongSAGE library of mouse hypothalamus sequenced with the Sanger method. Conclusion We found that Solexa sequencing technology combined with LongSAGE is perfectly suited for deep transcriptome analysis. In contrast to MPSS, it gives a complex representation of transcriptome as reliable as a LongSAGE library sequenced by the Sanger method. PMID:18796152
NASA Astrophysics Data System (ADS)
He, Shixuan; Xie, Wanyi; Zhang, Wei; Zhang, Liqun; Wang, Yunxia; Liu, Xiaoling; Liu, Yulong; Du, Chunlei
2015-02-01
A novel strategy which combines an iteratively cubic spline fitting baseline correction method with discriminant partial least squares qualitative analysis is employed to analyze the surface enhanced Raman scattering (SERS) spectroscopy of banned food additives, such as Sudan I dye and Rhodamine B in food, and Malachite green residues in aquaculture fish. Multivariate qualitative analysis methods, combining spectral preprocessing by iteratively cubic spline fitting (ICSF) baseline correction with principal component analysis (PCA) and discriminant partial least squares (DPLS) classification respectively, are applied to investigate the effectiveness of SERS spectroscopy for predicting the class assignments of unknown banned food additives. PCA cannot be used to predict the class assignments of unknown samples. However, DPLS classification can discriminate the class assignment of unknown banned additives using the information on differences in relative intensities. The results demonstrate that SERS spectroscopy combined with the ICSF baseline correction method and the exploratory analysis methodology of DPLS classification can potentially be used for distinguishing banned food additives in the field of food safety.
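One plausible reading of the iterative cubic spline fitting idea, sketched with SciPy: fit a smoothing spline, clip the working signal to the fit so peaks stop pulling the spline up, and iterate until the spline tracks the baseline. The iteration count and smoothing factor are assumptions, and the authors' exact ICSF update rule may differ.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def icsf_baseline(x, y, n_iter=20, smooth=5.0):
    """Iterative cubic-spline baseline: fit a smoothing spline, clip the
    working signal to the fit so peaks stop pulling it up, and repeat."""
    work = np.asarray(y, dtype=float).copy()
    for _ in range(n_iter):
        fit = UnivariateSpline(x, work, k=3, s=smooth)(x)
        work = np.minimum(work, fit)
    return fit

x = np.linspace(0, 100, 500)
signal = 0.02 * x + np.exp(-0.5 * ((x - 40.0) / 2.0) ** 2)   # baseline + one peak
corrected = signal - icsf_baseline(x, signal)
print(corrected.max())
```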
ERIC Educational Resources Information Center
Eckes, Thomas
2017-01-01
This paper presents an approach to standard setting that combines the prototype group method (PGM; Eckes, 2012) with a receiver operating characteristic (ROC) analysis. The combined PGM-ROC approach is applied to setting cut scores on a placement test of English as a foreign language (EFL). To implement the PGM, experts first named learners whom…
Uncertainty of quantitative microbiological methods of pharmaceutical analysis.
Gunar, O V; Sakhno, N G
2015-12-30
The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimated microbial count, while highly significant effects of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) were established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
Regularized Generalized Canonical Correlation Analysis
ERIC Educational Resources Information Center
Tenenhaus, Arthur; Tenenhaus, Michel
2011-01-01
Regularized generalized canonical correlation analysis (RGCCA) is a generalization of regularized canonical correlation analysis to three or more sets of variables. It constitutes a general framework for many multi-block data analysis methods. It combines the power of multi-block data analysis methods (maximization of well identified criteria) and…
Combined slope ratio analysis and linear-subtraction: An extension of the Pearce ratio method
NASA Astrophysics Data System (ADS)
De Waal, Sybrand A.
1996-07-01
A new technique, called combined slope ratio analysis, has been developed by extending the Pearce element ratio or conserved-denominator method (Pearce, 1968) to its logical conclusions. If two stoichiometric substances are mixed and certain chemical components are uniquely contained in either one of the two mixing substances, then by treating these unique components as conserved, the composition of the substance not containing the relevant component can be accurately calculated within the limits allowed by analytical and geological error. The calculated composition can then be subjected to rigorous statistical testing using the linear-subtraction method recently advanced by Woronow (1994). Application of combined slope ratio analysis to the rocks of the Uwekahuna Laccolith, Hawaii, USA, and the lavas of the 1959-summit eruption of Kilauea Volcano, Hawaii, USA, yields results that are consistent with field observations.
Experimental design and statistical analysis for three-drug combination studies.
Fang, Hong-Bin; Chen, Xuerong; Pei, Xin-Yan; Grant, Steven; Tan, Ming
2017-06-01
Drug combination is a critically important therapeutic approach for complex diseases such as cancer and HIV due to its potential for efficacy at lower, less toxic doses and the need to move new therapies rapidly into clinical trials. One of the key issues is to identify which combinations are additive, synergistic, or antagonistic. While the value of multidrug combinations has been well recognized in the cancer research community, to our best knowledge, all existing experimental studies rely on fixing the dose of one drug to reduce the dimensionality, e.g. looking at pairwise two-drug combinations, a suboptimal design. Hence, there is an urgent need to develop experimental design and analysis methods for studying multidrug combinations directly. Because the complexity of the problem increases exponentially with the number of constituent drugs, there has been little progress in the development of methods for the design and analysis of high-dimensional drug combinations. In fact, contrary to common mathematical reasoning, the case of three-drug combinations is fundamentally more difficult than that of two-drug combinations. Finding the doses of the combination, the number of combinations, and the replicates needed to detect departures from additivity depends on the dose-response shapes of the individual constituent drugs. Thus, different classes of drugs with different dose-response shapes need to be treated as separate cases. Our application and case studies develop dose-finding and sample-size methods for detecting departures from additivity with several common (linear and log-linear) classes of single dose-response curves. Furthermore, utilizing the geometric features of the interaction index, we propose a nonparametric model to estimate the interaction index surface by B-spline approximation and derive its asymptotic properties. Utilizing the method, we designed and analyzed a combination study of three anticancer drugs, PD184, HA14-1, and CEP3891, inhibiting the myeloma H929 cell line. To our best knowledge, this is the first three-drug combination study performed based on the original 4D dose-response surface formed by the dose ranges of the three drugs.
Time Transfer from Combined Analysis of GPS and TWSTFT Data
2008-12-01
This paper presents the time transfer results obtained from the combination of GPS data and TWSTFT data. Two different methods… view, constrained by TWSTFT data. Using the Vondrak-Cepek algorithm, the second approach (named PPP+TW) combines the TWSTFT time transfer data with…
Uchikoga, Nobuyuki; Hirokawa, Takatsugu
2010-05-11
Protein-protein docking for proteins with large conformational changes was analyzed using interaction fingerprints, one of the measures of similarity among complex structures, utilized especially for searching for near-native protein-ligand or protein-protein complex structures. Here, we propose a combined method for analyzing protein-protein docking that takes large conformational changes into consideration. This combined method consists of ensemble soft docking with multiple protein structures, refinement of complexes, and cluster analysis using interaction fingerprints and energy profiles. To test the applicability of this combined method, various CaM-ligand complexes were reconstructed from the NMR structures of unbound CaM. For the reconstruction, we used three known CaM ligands, namely the CaM-binding peptides of cyclic nucleotide gateway (CNG), CaM kinase kinase (CaMKK) and the plasma membrane Ca2+ ATPase pump (PMCA), and thirty-one structurally diverse CaM conformations. For each ligand, 62,000 CaM-ligand complexes were generated in the docking step, and the relationship between their energy profiles and structural similarities to the native complex was analyzed using the interaction fingerprint and RMSD. Near-native clusters were obtained in the cases of CNG and CaMKK. The interaction fingerprint method discriminated near-native structures better than the RMSD method in cluster analysis. We showed that a combined method that includes the interaction fingerprint is very useful for protein-protein docking analysis in certain cases.
Association analysis of multiple traits by an approach of combining P values.
Chen, Lili; Wang, Yong; Zhou, Yajing
2018-03-01
Increasing evidence shows that one variant can affect multiple traits, a widespread phenomenon in complex diseases. Joint analysis of multiple traits can increase the statistical power of association analysis and uncover the underlying genetic mechanism. Although there are many statistical methods to analyse multiple traits, most are suitable mainly for detecting common variants associated with multiple traits. However, because of the low minor allele frequencies of rare variants, these methods are not optimal for rare variant association analysis. In this paper, we extend an adaptive combination of P-values method (termed ADA) for a single trait to test association between multiple traits and rare variants in a given region. For a given region, we use a reverse regression model to test each rare variant for association with multiple traits and obtain the P-value of the single-variant test. We then take the weighted combination of these P-values as the test statistic. Extensive simulation studies show that our approach is more powerful than several other comparison methods in most cases and is robust to the inclusion of a high proportion of neutral variants and to different directions of effects of causal variants.
Li, Tao; Su, Chen
2018-06-02
Rhodiola is an increasingly widely used traditional Tibetan and traditional Chinese medicine in China. The composition profiles of its bioactive compounds vary considerably among species, which makes it crucial to identify authentic Rhodiola species accurately so as to ensure the clinical application of Rhodiola. In this paper, a nondestructive, rapid, and efficient method for the classification of Rhodiola was developed using Fourier transform near-infrared (FT-NIR) spectroscopy combined with chemometric analysis. A total of 160 batches of raw spectra were obtained from four different species of Rhodiola by FT-NIR: Rhodiola crenulata, Rhodiola fastigiata, Rhodiola kirilowii, and Rhodiola brevipetiolata. After excluding outliers, the performances of 3 sample-dividing methods, 12 spectral preprocessing methods, 2 wavelength selection methods, and 2 modeling evaluation methods were compared. The results indicated that one combination was superior to the others in the authenticity identification analysis: FT-NIR combined with sample set partitioning based on joint x-y distances (SPXY), standard normal variate transformation (SNV) + Norris-Williams (NW) + 2nd derivative, competitive adaptive reweighted sampling (CARS), and kernel extreme learning machine (KELM). The accuracy (ACCU), sensitivity (SENS), and specificity (SPEC) of the optimal model were all 1, which showed that this combination of FT-NIR and chemometric methods had the best authenticity identification performance. The classification performance of the partial least squares discriminant analysis (PLS-DA) model was slightly lower than that of the KELM model, with ACCU = 0.97, SENS = 0.93, and SPEC = 0.98. It can be concluded that FT-NIR combined with chemometric analysis has great potential for the authenticity identification and classification of Rhodiola, which can provide a valuable reference for the safety and effectiveness of its clinical application. Copyright © 2018 Elsevier B.V. All rights reserved.
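Two of the preprocessing steps named above are easy to show in a few lines: SNV centers and scales each spectrum, and a second derivative can be taken with a Savitzky-Golay filter (used here as a common stand-in for the Norris-Williams smoothing-plus-derivative step, which differs in detail). The spectra below are random placeholders.

```python
import numpy as np
from scipy.signal import savgol_filter

def snv(spectra):
    """Standard normal variate: center and scale each spectrum (row)."""
    mu = spectra.mean(axis=1, keepdims=True)
    sd = spectra.std(axis=1, keepdims=True)
    return (spectra - mu) / sd

rng = np.random.default_rng(5)
raw = rng.random((160, 700)) + np.linspace(0, 1, 700)   # stand-in FT-NIR spectra

pre = snv(raw)
# 2nd derivative via Savitzky-Golay; window and polynomial order are guesses
pre = savgol_filter(pre, window_length=11, polyorder=2, deriv=2, axis=1)
print(pre.shape)
```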
Robust inference for group sequential trials.
Ganju, Jitendra; Lin, Yunzhi; Zhou, Kefei
2017-03-01
For ethical reasons, group sequential trials were introduced to allow trials to stop early in the event of extreme results. Endpoints in such trials are usually mortality or irreversible morbidity. For a given endpoint, the norm is to use a single test statistic and to use that same statistic for each analysis. This approach is risky because the test statistic has to be specified before the study is unblinded, and there is loss in power if the assumptions that ensure optimality for each analysis are not met. To minimize the risk of moderate to substantial loss in power due to a suboptimal choice of a statistic, a robust method was developed for nonsequential trials. The concept is analogous to diversification of financial investments to minimize risk. The method is based on combining P values from multiple test statistics for formal inference while controlling the type I error rate at its designated value. This article evaluates the performance of two P value combining methods for group sequential trials. The emphasis is on time-to-event trials, although results from less complex trials are also included. The gain or loss in power with the combination method relative to a single statistic is asymmetric in its favor. Depending on the power of each individual test, the combination method can give more power than any single test or give power that is closer to the test with the most power. The versatility of the method is that it can combine P values from different test statistics for analysis at different times. The robustness of the results suggests that inference from group sequential trials can be strengthened with the use of combined tests. Copyright © 2017 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Adams, Gaynor J.; Dugan, Duane W.
1952-01-01
A method of analysis based on slender-wing theory is developed to investigate the characteristics in roll of slender cruciform wings and wing-body combinations. The method makes use of the conformal mapping processes of classical hydrodynamics which transform the region outside a circle and the region outside an arbitrary arrangement of line segments intersecting at the origin. The method of analysis may be utilized to solve other slender cruciform wing-body problems involving arbitrarily assigned boundary conditions. (author)
Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.
Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing
2017-01-01
Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
NASA Astrophysics Data System (ADS)
Qin, Le; Xie, HuiMin; Zhu, RongHua; Wu, Dan; Che, ZhiGang; Zou, ShiKun
2014-04-01
This paper investigates the effect of the location of the testing area in residual stress measurement by Moiré interferometry combined with the hole-drilling method. The selection of the location of the testing area is analyzed both theoretically and experimentally. In the theoretical study, the factors which affect the surface released radial strain ε_r were analyzed on the basis of the formulae of the hole-drilling method, and the relations between those factors and ε_r were established. By combining Moiré interferometry with the hole-drilling method, the residual stress of an interference-fit specimen was measured to verify the theoretical analysis. According to the analysis results, the testing area for minimizing the error of strain measurement is determined. Moreover, if the orientation of the maximum principal stress is known, the strain can be measured with higher precision by the Moiré interferometry method.
Advancing the detection of steady-state visual evoked potentials in brain-computer interfaces.
Abu-Alqumsan, Mohammad; Peer, Angelika
2016-06-01
Spatial filtering has proved to be a powerful pre-processing step in the detection of steady-state visual evoked potentials (SSVEPs) and has boosted typical detection rates both in offline analysis and in online SSVEP-based brain-computer interface applications. State-of-the-art detection methods and the spatial filters they use share many common foundations, as they all build upon the second-order statistics of the acquired electroencephalographic (EEG) data, that is, its spatial autocovariance and its cross-covariance with what is assumed to be a pure SSVEP response. The present study aims at highlighting the similarities and differences between these methods. We consider the canonical correlation analysis (CCA) method as a basis for the theoretical and empirical (with real EEG data) analysis of the state-of-the-art detection methods and the spatial filters they use. We build upon the findings of this analysis and prior research and propose a new detection method (CVARS) that combines the power of the canonical variates and that of autoregressive spectral analysis in estimating the signal and noise power levels. We found that the multivariate synchronization index method and the maximum contrast combination method are variations of the CCA method. All three methods were found to provide relatively unreliable detections in low signal-to-noise ratio (SNR) regimes. CVARS and the minimum energy combination method were found to provide better estimates for different SNR levels. Our theoretical and empirical results demonstrate that the proposed CVARS method outperforms other state-of-the-art detection methods when used in an unsupervised fashion. Furthermore, when used in a supervised fashion, a linear classifier learned from a short training session is able to estimate the hidden user intention, including the idle state (when the user is not attending to any stimulus), rapidly, accurately and reliably.
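The CCA baseline that the study builds on is short enough to sketch: correlate the multichannel EEG against sine/cosine references at each candidate stimulus frequency and its harmonics, and pick the frequency with the largest canonical correlation. The sampling rate, channel count, candidate frequencies, and embedded 10 Hz target below are all synthetic.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def cca_ssvep_score(eeg, freq, fs, n_harmonics=2):
    """Canonical correlation between multichannel EEG (samples x channels)
    and sin/cos references at a stimulus frequency and its harmonics."""
    t = np.arange(eeg.shape[0]) / fs
    ref = np.column_stack([f(2 * np.pi * h * freq * t)
                           for h in range(1, n_harmonics + 1)
                           for f in (np.sin, np.cos)])
    u, v = CCA(n_components=1).fit_transform(eeg, ref)
    return np.corrcoef(u.ravel(), v.ravel())[0, 1]

fs, dur = 250, 2.0
t = np.arange(int(fs * dur)) / fs
rng = np.random.default_rng(6)
eeg = rng.standard_normal((len(t), 8))
eeg[:, 0] += np.sin(2 * np.pi * 10 * t)              # embed a 10 Hz SSVEP

scores = {f: cca_ssvep_score(eeg, f, fs) for f in (8, 10, 12, 15)}
print(max(scores, key=scores.get), scores)            # detected frequency
```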
Advanced microscopic methods for the detection of adhesion barriers in immunology in medical imaging
NASA Astrophysics Data System (ADS)
Lawrence, Shane
2017-07-01
Advanced methods of microscopy, and the advanced analysis techniques stemming from them, have developed greatly in the past few years. The use of single discrete methods has given way to the combination of methods, which means an increase in data for processing in order to progress to the analysis and diagnosis of ailments and diseases that can be viewed by each and any method. This presentation shows the combination of such methods, gives examples of the data which arise from each individual method and from the combined methodology, and suggests how such data can be streamlined to enable conclusions to be drawn about the particular biological and biochemical considerations that arise. In this particular project, the subject of the methodology was human lactoferrin (hLf) and the relation of the adhesion properties of hLf to the overcoming of barriers to adhesion, mainly on the perimeter of the cellular unit, and how this affects the process of immunity in any particular case.
He, Shixuan; Xie, Wanyi; Zhang, Wei; Zhang, Liqun; Wang, Yunxia; Liu, Xiaoling; Liu, Yulong; Du, Chunlei
2015-02-25
A novel strategy which combines an iterative cubic spline fitting baseline correction method with discriminant partial least squares qualitative analysis is employed to analyze the surface-enhanced Raman scattering (SERS) spectra of banned food additives, such as Sudan I dye and Rhodamine B in food and malachite green residues in aquaculture fish. Multivariate qualitative analysis methods, using the combination of iterative cubic spline fitting (ICSF) baseline correction spectral preprocessing with principal component analysis (PCA) and discriminant partial least squares (DPLS) classification respectively, are applied to investigate the effectiveness of SERS spectroscopy for predicting the class assignments of unknown banned food additives. PCA cannot be used to predict the class assignments of unknown samples. However, DPLS classification can discriminate the class assignments of unknown banned additives using the information of differences in relative intensities. The results demonstrate that SERS spectroscopy combined with the ICSF baseline correction method and the exploratory DPLS classification methodology can potentially be used for distinguishing banned food additives in the field of food safety. Copyright © 2014 Elsevier B.V. All rights reserved.
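A hedged sketch of the iterative-spline baseline idea (the knot placement, smoothing factor, and stopping rule here differ from the ICSF method proper): fit a smooth spline, clip the spectrum to the fit wherever it exceeds it, and refit until the baseline settles under the peaks.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

x = np.linspace(0, 1, 500)
spectrum = 2 * x + np.exp(-((x - 0.5) / 0.01) ** 2)  # baseline + one peak
work = spectrum.copy()
for _ in range(20):
    baseline = UnivariateSpline(x, work, s=5)(x)  # smooth spline fit
    work = np.minimum(work, baseline)             # suppress peaks, keep baseline
print((spectrum - baseline).max())                # corrected peak height ~1
```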
NASA Technical Reports Server (NTRS)
Reddy, C. J.; Deshpande, Manohar D.; Cockrell, C. R.; Beck, F. B.
1995-01-01
A combined finite element method/method of moments (FEM/MoM) approach is used to analyze the electromagnetic scattering properties of a three-dimensional-cavity-backed aperture in an infinite ground plane. The FEM is used to formulate the fields inside the cavity, and the MoM (with subdomain bases) in both spectral and spatial domains is used to formulate the fields above the ground plane. Fields in the aperture and the cavity are solved using a system of equations resulting from the combination of the FEM and the MoM. By virtue of the FEM, this combined approach is applicable to all arbitrarily shaped cavities with inhomogeneous material fillings, and because of the subdomain bases used in the MoM, the apertures can be of any arbitrary shape. This approach leads to a partly sparse and partly full symmetric matrix, which is efficiently solved using a biconjugate gradient algorithm. Numerical results are presented to validate the analysis.
Generalized Structured Component Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun; Takane, Yoshio
2004-01-01
We propose an alternative method to partial least squares for path analysis with components, called generalized structured component analysis. The proposed method replaces factors by exact linear combinations of observed variables. It employs a well-defined least squares criterion to estimate model parameters. As a result, the proposed method…
The colorimetric analysis of anti-tuberculosis fixed-dose combination tablets and capsules.
Ellard, G A
1999-11-01
The study addressed the perceived need to demonstrate whether or not the actual amounts of rifampicin, isoniazid and pyrazinamide in fixed-dose combination tablets or capsules correspond to their stated drug contents. The aim was to adapt specific, robust and simple colorimetric methods, previously applied to measuring plasma and urinary rifampicin, isoniazid, pyrazinamide and ethambutol concentrations, to estimate tablet and capsule drug contents. The methods were applied to the analysis of 14 commercially manufactured fixed-dose combinations: two capsule and three tablet formulations containing rifampicin and isoniazid; seven tablet formulations containing rifampicin, isoniazid and pyrazinamide; and two tablet formulations containing rifampicin, isoniazid, pyrazinamide and ethambutol. All the combined formulations contained close to their stated drug contents. Replicate analyses confirmed the excellent precision of the drug analyses. Such methods are not only rapid to perform but should be practical in many Third World settings with relatively modest laboratory facilities.
NASA Astrophysics Data System (ADS)
Araújo, Iván Gómez; Sánchez, Jesús Antonio García; Andersen, Palle
2018-05-01
Transmissibility-based operational modal analysis is a recent and alternative approach used to identify the modal parameters of structures under operational conditions. This approach is advantageous compared with traditional operational modal analysis because it does not make any assumptions about the excitation spectrum (i.e., white noise with a flat spectrum). However, common methodologies do not include a procedure to extract closely spaced modes with low signal-to-noise ratios. This issue is relevant when considering that engineering structures generally have closely spaced modes and that their measured responses present high levels of noise. Therefore, to overcome these problems, a new combined method for modal parameter identification is proposed in this work. The proposed method combines blind source separation (BSS) techniques and transmissibility-based methods. Here, BSS techniques were used to recover source signals, and transmissibility-based methods were applied to estimate modal information from the recovered source signals. To achieve this combination, a new method to define a transmissibility function was proposed. The suggested transmissibility function is based on the relationship between the power spectral density (PSD) of mixed signals and the PSD of signals from a single source. The numerical responses of a truss structure with high levels of added noise and very closely spaced modes were processed using the proposed combined method to evaluate its ability to identify modal parameters in these conditions. Colored and white noise excitations were used for the numerical example. The proposed combined method was also used to evaluate the modal parameters of an experimental test on a structure containing closely spaced modes. The results showed that the proposed combined method is capable of identifying very closely spaced modes in the presence of noise and, thus, may be potentially applied to improve the identification of damping ratios.
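For orientation, a common PSD-based transmissibility estimate between two output channels looks like the H1-type ratio below; the paper's single-source PSD construction is more elaborate and is not reproduced here, and the signals are synthetic.

```python
import numpy as np
from scipy.signal import csd, welch

def transmissibility(x_i, x_j, fs, nperseg=1024):
    f, S_ij = csd(x_i, x_j, fs=fs, nperseg=nperseg)   # cross PSD
    _, S_jj = welch(x_j, fs=fs, nperseg=nperseg)      # reference auto PSD
    return f, S_ij / S_jj

fs = 512
rng = np.random.default_rng(1)
t = np.arange(fs * 8) / fs
source = np.sin(2 * np.pi * 7.3 * t) + rng.normal(0, 0.5, t.size)
x1 = 1.0 * source + rng.normal(0, 0.2, t.size)
x2 = 0.6 * source
f, T = transmissibility(x1, x2, fs)
print(abs(T[np.argmin(abs(f - 7.3))]))   # ~1/0.6 near the excited band
```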
Combining Heterogeneous Correlation Matrices: Simulation Analysis of Fixed-Effects Methods
ERIC Educational Resources Information Center
Hafdahl, Adam R.
2008-01-01
Monte Carlo studies of several fixed-effects methods for combining and comparing correlation matrices have shown that two refinements improve estimation and inference substantially. With rare exception, however, these simulations have involved homogeneous data analyzed using conditional meta-analytic procedures. The present study builds on…
Method for combined biometric and chemical analysis of human fingerprints.
Staymates, Jessica L; Orandi, Shahram; Staymates, Matthew E; Gillen, Greg
This paper describes a method for combining direct chemical analysis of latent fingerprints with subsequent biometric analysis within a single sample. The method described here uses ion mobility spectrometry (IMS) as a chemical detection method for explosives and narcotics trace contamination. A collection swab coated with a high-temperature adhesive has been developed to lift latent fingerprints from various surfaces. The swab is then directly inserted into an IMS instrument for a quick chemical analysis. After the IMS analysis, the lifted print remains intact for subsequent biometric scanning and analysis using matching algorithms. Several samples of explosive-laden fingerprints were successfully lifted and the explosives detected with IMS. Following explosive detection, the lifted fingerprints remained of sufficient quality for positive match scores using a prepared gallery consisting of 60 fingerprints. Based on our results (n = 1200), there was no significant decrease in the quality of the lifted print post IMS analysis. In fact, for a small subset of lifted prints, the quality was improved after IMS analysis. The described method can be readily applied to domestic criminal investigations, transportation security, terrorist and bombing threats, and military in-theatre settings.
Relating interesting quantitative time series patterns with text events and text features
NASA Astrophysics Data System (ADS)
Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.
2013-12-01
In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent, high-frequency quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a priori method. First, based on heuristics, we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a priori method supports the discovery of such sequential temporal patterns. Then, various text features, like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc., are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis workflow. The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber physical systems or the security of critical infrastructure, where the data consists of a combination of quantitative and textual time series data.
Wang, Wanping; Liu, Mingyue; Wang, Jing; Tian, Rui; Dong, Junqiang; Liu, Qi; Zhao, Xianping; Wang, Yuanfang
2014-01-01
Screening indexes of tumor serum markers for benign and malignant solitary pulmonary nodules (SPNs) were analyzed to find the optimum method for diagnosis. Enzyme-linked immunosorbent assays, an automatic immune analyzer and radioimmunoassay methods were used to examine the levels of 8 serum markers in 164 SPN patients, and the sensitivity for differential diagnosis of malignant versus benign SPN was compared between detection using a single plasma marker and detection using a combination of markers. Serological indicators closely related to benign and malignant SPNs were screened using Fisher discriminant analysis and a non-conditional logistic regression analysis method, respectively, and the results were then verified by the k-means clustering analysis method. The sensitivity when using a combination of serum markers to detect SPN was higher than that when using a single marker. By Fisher discriminant analysis, cytokeratin 19 fragments (CYFRA21-1), carbohydrate antigen 125 (CA125), squamous cell carcinoma antigen (SCC) and breast cancer antigen (CA153), which relate to benign and malignant SPNs, were screened. Through non-conditional logistic regression analysis, CYFRA21-1, SCC and CA153 were obtained. Using the k-means clustering analysis, the cophenetic correlation coefficient (0.940) obtained with the Fisher discriminant analysis was higher than that obtained with logistic regression analysis (0.875). This study indicated that the Fisher discriminant analysis functioned better in screening out serum markers to recognize benign and malignant SPN. The combined detection of CYFRA21-1, CA125, SCC and CA153 is an effective way to distinguish benign and malignant SPN, and will find important clinical application in the early diagnosis of SPN. © 2014 S. Karger GmbH, Freiburg.
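The screening pipeline can be sketched with scikit-learn on synthetic data; the marker names follow the abstract, but the numbers below are simulated, not the study's.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
markers = ["CYFRA21-1", "CA125", "SCC", "CA153"]
X = np.vstack([rng.normal(0, 1, (80, 4)), rng.normal(1, 1, (84, 4))])
y = np.r_[np.zeros(80), np.ones(84)]                 # benign vs malignant

lda = LinearDiscriminantAnalysis().fit(X, y)         # Fisher discriminant
logit = LogisticRegression().fit(X, y)               # logistic regression
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

for name, w_lda, w_log in zip(markers, lda.coef_[0], logit.coef_[0]):
    print(f"{name}: LDA weight {w_lda:+.2f}, logistic coef {w_log:+.2f}")
# k-means as a label-free cross-check of the class structure
print("cluster/label agreement:",
      max(np.mean(clusters == y), np.mean(clusters != y)))
```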
Microalgae Compositional Analysis Laboratory Procedures | Bioenergy | NREL
These methods build on years of research in algal biomass analysis. By combining the appropriate laboratory analytical procedures (LAPs), the composition of algal biomass can be determined; for example, the "Moisture and Ash in Algal Biomass" procedure describes the methods used to determine the amount of moisture and ash.
Hardisty, Frank; Robinson, Anthony C.
2010-01-01
In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
Ménard, Richard; Deshaies-Jacques, Martin; Gasset, Nicolas
2016-09-01
An objective analysis is one of the main components of data assimilation. By combining observations with the output of a predictive model we combine the best features of each source of information: the complete spatial and temporal coverage provided by models, with a close representation of the truth provided by observations. The process of combining observations with a model output is called an analysis. To produce an analysis requires knowledge of the observation and model error variances, as well as the spatial correlation of the model error. This paper is devoted to the development of methods of estimation of these error variances and the characteristic length-scale of the model error correlation for operational use in the Canadian objective analysis system. We first argue in favor of using compactly supported correlation functions, and then introduce three estimation methods: the Hollingsworth-Lönnberg (HL) method in local and global form, the maximum likelihood method (ML), and the χ² diagnostic method. We perform one-dimensional (1D) simulation studies where the error variance and true correlation length are known, and perform an estimation of both error variances and correlation length where both are non-uniform. We show that a local version of the HL method can capture accurately the error variances and correlation length at each observation site, provided that spatial variability is not too strong. However, the operational objective analysis requires only a single and globally valid correlation length. We examine whether any statistics of the local HL correlation lengths could be a useful estimate, or whether other global estimation methods such as the global HL, ML, or χ² methods should be used. We found in both 1D simulation and using real data that the ML method is able to capture physically significant aspects of the correlation length, while most other estimates give unphysical and larger length-scale values. This paper describes a proposed improvement of the objective analysis of surface pollutants at Environment and Climate Change Canada (formerly known as Environment Canada). Objective analyses are essentially surface maps of air pollutants that are obtained by combining observations with an air quality model output, and are thought to provide a complete and more accurate representation of the air quality. The highlight of this study is an analysis of methods to estimate the model (or background) error correlation length-scale. The error statistics are an important and critical component of the analysis scheme.
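The HL idea can be sketched as a curve fit of binned innovation (observation-minus-model) covariances against separation distance: extrapolating the fitted background model to zero separation splits the total variance at r = 0 into background and observation error parts, and L is the length-scale. The Gaussian form and all numbers below are illustrative only; the paper argues for compactly supported correlation functions.

```python
import numpy as np
from scipy.optimize import curve_fit

def background_cov(r, sigma_b2, L):
    # Illustrative correlation model for the background error covariance.
    return sigma_b2 * np.exp(-r**2 / (2 * L**2))

r_bins = np.linspace(10, 500, 25)           # km; r > 0 excludes obs error
cov = background_cov(r_bins, 4.0, 120.0)    # synthetic binned covariances
cov += np.random.default_rng(0).normal(0, 0.05, r_bins.size)

(sigma_b2, L), _ = curve_fit(background_cov, r_bins, cov, p0=(1.0, 100.0))
total_var_at_zero = 5.5                     # binned variance at r = 0
sigma_o2 = total_var_at_zero - sigma_b2     # leftover = observation error
print(f"sigma_b^2={sigma_b2:.2f}, L={L:.0f} km, sigma_o^2={sigma_o2:.2f}")
```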
Cautions on the Use of Investigative Case Studies in Meta-Evaluation.
ERIC Educational Resources Information Center
Smith, Nick L.
1990-01-01
A meta-analysis combining expert evaluation with naturalistic case study methods indicates that such investigations must use special methods to render evaluative judgments of worth. It is demonstrated that descriptive, interpretive, and evaluative aspects of such a study must be combined to yield justifiable conclusions. (TJH)
USDA-ARS?s Scientific Manuscript database
New, faster methods have been developed for analysis of vitamin D and triacylglycerols that eliminate hours of wet chemistry and preparative chromatography, while providing more information than classical methods for analysis. Unprecedented detail is provided by combining liquid chromatography with ...
NASA Astrophysics Data System (ADS)
Yoshida, Takashi
A combined-levitation-and-propulsion single-sided linear induction motor (SLIM) vehicle can be levitated without any additional levitation system. When the vehicle runs, the attractive-normal force varies depending on the phase of the primary current because of the short primary end effect. The ripple of the attractive-normal force causes vertical vibration of the vehicle. In this paper, the instantaneous attractive-normal force is analyzed using the space harmonic analysis method, and based on this analysis, a vertical vibration control is proposed. The validity of the proposed control method is verified by numerical simulation.
Chang, Lun-Ching; Lin, Hui-Min; Sibille, Etienne; Tseng, George C
2013-12-21
As high-throughput genomic technologies become accurate and affordable, an increasing number of data sets have been accumulated in the public domain, and genomic information integration and meta-analysis have become routine in biomedical research. In this paper, we focus on microarray meta-analysis, where multiple microarray studies with relevant biological hypotheses are combined in order to improve candidate marker detection. Many methods have been developed and applied in the literature, but their performance and properties have only been minimally investigated. There is currently no clear conclusion or guideline as to the proper choice of a meta-analysis method given an application; the decision essentially requires both statistical and biological considerations. We applied 12 microarray meta-analysis methods for combining multiple simulated expression profiles, and such methods can be categorized by hypothesis setting: (1) HS(A): DE genes with non-zero effect sizes in all studies, (2) HS(B): DE genes with non-zero effect sizes in one or more studies and (3) HS(r): DE genes with non-zero effect in the "majority" of studies. We then performed a comprehensive comparative analysis through six large-scale real applications using four quantitative statistical evaluation criteria: detection capability, biological association, stability and robustness. We elucidated the hypothesis settings behind the methods and further applied multi-dimensional scaling (MDS) and an entropy measure to characterize the meta-analysis methods and data structure, respectively. The aggregated results from the simulation study categorized the 12 methods into three hypothesis settings (HS(A), HS(B), and HS(r)). Evaluation in real data and results from MDS and entropy analyses provided an insightful and practical guideline to the choice of the most suitable method in a given application. All source files for simulation and real data are available on the author's publication website.
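The abstract does not list the 12 methods by name; as a hedged illustration, two classical p-value combination rules often included in such comparisons (Fisher's and Stouffer's) can be run directly with scipy:

```python
import numpy as np
from scipy.stats import combine_pvalues

p_per_study = np.array([0.04, 0.20, 0.01, 0.33])  # one gene, four studies
for method in ("fisher", "stouffer"):
    stat, p = combine_pvalues(p_per_study, method=method)
    print(method, round(stat, 2), round(p, 4))
```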
Vessel extraction in retinal images using automatic thresholding and Gabor Wavelet.
Ali, Aziah; Hussain, Aini; Wan Zaki, Wan Mimi Diyana
2017-07-01
Retinal image analysis has been widely used for early detection and diagnosis of multiple systemic diseases. Accurate vessel extraction in retinal images is a crucial step towards a fully automated diagnosis system. This work presents an efficient unsupervised method for extracting blood vessels from retinal images by combining the existing Gabor Wavelet (GW) method with automatic thresholding. The green channel image is extracted from the color retinal image and used to produce a Gabor feature image using the GW. Both the green channel image and the Gabor feature image undergo a vessel-enhancement step in order to highlight blood vessels. Next, the two vessel-enhanced images are transformed to binary images using automatic thresholding before being combined to produce the final vessel output. Combining the images results in a significant improvement in blood vessel extraction performance compared to using either image individually. The effectiveness of the proposed method was proven via comparative analysis with existing methods, validated using the publicly available DRIVE database.
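A simplified sketch of the combination step (the paper's vessel-enhancement step is omitted, and the image here is random noise rather than a fundus image): both the green channel and a Gabor-filtered version are binarized with Otsu's automatic threshold and OR-combined.

```python
import numpy as np
from skimage.filters import gabor, threshold_otsu

rng = np.random.default_rng(0)
green = rng.random((128, 128))               # stand-in for the green channel
gabor_real, _ = gabor(green, frequency=0.1)  # Gabor feature image

masks = []
for img in (green, np.abs(gabor_real)):
    masks.append(img > threshold_otsu(img))  # automatic thresholding
vessels = masks[0] | masks[1]                # combined binary vessel map
print(vessels.mean())
```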
Two Strategies for Qualitative Content Analysis: An Intramethod Approach to Triangulation.
Renz, Susan M; Carrington, Jane M; Badger, Terry A
2018-04-01
The overarching aim of qualitative research is to gain an understanding of certain social phenomena. Qualitative research involves the studied use and collection of empirical materials, all to describe moments and meanings in individuals' lives. Data derived from these various materials require a form of analysis of the content, focusing on written or spoken language as communication, to provide context and understanding of the message. Qualitative research often involves the collection of data through extensive interviews, note taking, and tape recording. These methods are time- and labor-intensive. With the advances in computerized text analysis software, the practice of combining methods to analyze qualitative data can assist the researcher in making large data sets more manageable and enhance the trustworthiness of the results. This article describes a novel process of combining two methods of qualitative data analysis, or intramethod triangulation, as a means to provide a deeper analysis of text.
Quad-Tree Visual-Calculus Analysis of Satellite Coverage
NASA Technical Reports Server (NTRS)
Lo, Martin W.; Hockney, George; Kwan, Bruce
2003-01-01
An improved method of analysis of coverage of areas of the Earth by a constellation of radio-communication or scientific-observation satellites has been developed. This method is intended to supplant an older method in which the global-coverage-analysis problem is solved from a ground-to-satellite perspective. The present method provides for rapid and efficient analysis. This method is derived from a satellite-to-ground perspective and involves a unique combination of two techniques for multiresolution representation of map features on the surface of a sphere.
Tran, Liem T; Knight, C Gregory; O'Neill, Robert V; Smith, Elizabeth R; Riitters, Kurt H; Wickham, James
2002-06-01
A fuzzy decision analysis method for integrating ecological indicators was developed. This was a combination of a fuzzy ranking method and the analytic hierarchy process (AHP). The method was capable of ranking ecosystems in terms of environmental conditions and suggesting cumulative impacts across a large region. Using data on land cover, population, roads, streams, air pollution, and topography of the Mid-Atlantic region, we were able to point out areas that were in relatively poor condition and/or vulnerable to future deterioration. The method offered an easy and comprehensive way to combine the strengths of fuzzy set theory and the AHP for ecological assessment. Furthermore, the suggested method can serve as a building block for the evaluation of environmental policies.
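The AHP half of such a combination reduces to a principal-eigenvector computation on a pairwise comparison matrix; the judgments below are invented for illustration, and the fuzzy ranking half is not shown.

```python
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])        # indicator-vs-indicator judgments
w_val, w_vec = np.linalg.eig(A)
k = np.argmax(w_val.real)              # principal eigenvalue
weights = np.abs(w_vec[:, k].real)
weights /= weights.sum()               # normalized priority vector
CI = (w_val[k].real - len(A)) / (len(A) - 1)   # consistency index
print(weights.round(3), round(CI, 3))
```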
Huang, Shi; MacKinnon, David P.; Perrino, Tatiana; Gallo, Carlos; Cruden, Gracelyn; Brown, C Hendricks
2016-01-01
Mediation analysis often requires larger sample sizes than main effect analysis to achieve the same statistical power. Combining results across similar trials may be the only practical option for increasing statistical power for mediation analysis in some situations. In this paper, we propose a method to estimate: 1) marginal means for mediation path a, the relation of the independent variable to the mediator; 2) marginal means for path b, the relation of the mediator to the outcome, across multiple trials; and 3) the between-trial level variance-covariance matrix based on a bivariate normal distribution. We present the statistical theory and an R computer program to combine regression coefficients from multiple trials to estimate a combined mediated effect and confidence interval under a random effects model. Values of coefficients a and b, along with their standard errors from each trial are the input for the method. This marginal likelihood based approach with Monte Carlo confidence intervals provides more accurate inference than the standard meta-analytic approach. We discuss computational issues, apply the method to two real-data examples and make recommendations for the use of the method in different settings. PMID:28239330
Development of a probabilistic analysis methodology for structural reliability estimation
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.
1991-01-01
The novel probabilistic analysis method presented here for assessing structural reliability combines fast convolution with an efficient structural reliability analysis: after identifying the most important point of a limit state, it establishes a quadratic performance function, transforms the quadratic function into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.
Meta‐analysis of test accuracy studies using imputation for partial reporting of multiple thresholds
Deeks, J.J.; Martin, E.C.; Riley, R.D.
2017-01-01
Introduction: For tests reporting continuous results, primary studies usually provide test performance at multiple but often different thresholds. This creates missing data when performing a meta‐analysis at each threshold. A standard meta‐analysis (no imputation [NI]) ignores such missing data. A single imputation (SI) approach was recently proposed to recover missing threshold results. Here, we propose a new method that performs multiple imputation of the missing threshold results using discrete combinations (MIDC). Methods: The new MIDC method imputes missing threshold results by randomly selecting from the set of all possible discrete combinations which lie between the results for 2 known bounding thresholds. Imputed and observed results are then synthesised at each threshold. This is repeated multiple times, and the multiple pooled results at each threshold are combined using Rubin's rules to give final estimates. We compared the NI, SI, and MIDC approaches via simulation. Results: Both imputation methods outperform the NI method in simulations. There was generally little difference in the SI and MIDC methods, but the latter was noticeably better in terms of estimating the between‐study variances and generally gave better coverage, due to slightly larger standard errors of pooled estimates. Given selective reporting of thresholds, the imputation methods also reduced bias in the summary receiver operating characteristic curve. Simulations demonstrate the imputation methods rely on an equal threshold spacing assumption. A real example is presented. Conclusions: The SI and, in particular, MIDC methods can be used to examine the impact of missing threshold results in meta‐analysis of test accuracy studies. PMID:29052347
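The Rubin's-rules pooling used after the multiple imputations follows the standard within-plus-between variance decomposition; a minimal sketch with made-up imputed sensitivities at one threshold:

```python
import numpy as np

est = np.array([0.82, 0.79, 0.84, 0.80, 0.83])       # m imputed estimates
var = np.array([0.004, 0.005, 0.004, 0.006, 0.005])  # their squared SEs
m = len(est)
pooled = est.mean()
W = var.mean()                  # within-imputation variance
B = est.var(ddof=1)             # between-imputation variance
T = W + (1 + 1/m) * B           # Rubin's total variance
print(pooled, np.sqrt(T))       # pooled estimate and its standard error
```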
NASA Technical Reports Server (NTRS)
Phillips, J. R.
1996-01-01
In this paper we derive error bounds for a collocation-grid-projection scheme tuned for use in multilevel methods for solving boundary-element discretizations of potential integral equations. The grid-projection scheme is then combined with a precorrected FFT style multilevel method for solving potential integral equations with 1/r and e^(ikr)/r kernels. A complexity analysis of this combined method is given to show that for homogeneous problems, the method is order n log n, nearly independent of the kernel. In addition, it is shown analytically and experimentally that for an inhomogeneity generated by a very finely discretized surface, the combined method slows to order n^(4/3). Finally, examples are given to show that the collocation-based grid-projection plus precorrected-FFT scheme is competitive with fast-multipole algorithms when considering realistic problems and 1/r kernels, but can be used over a range of spatial frequencies with only a small performance penalty.
Midulla, Marco; Moreno, Ramiro; Baali, Adil; Chau, Ming; Negre-Salvayre, Anne; Nicoud, Franck; Pruvo, Jean-Pierre; Haulon, Stephan; Rousseau, Hervé
2012-10-01
In the last decade, there has been increasing interest in finding imaging techniques able to provide functional vascular imaging of the thoracic aorta. The purpose of this paper is to present an imaging method combining magnetic resonance imaging (MRI) and computational fluid dynamics (CFD) to obtain a patient-specific haemodynamic analysis of patients treated by thoracic endovascular aortic repair (TEVAR). MRI was used to obtain boundary conditions. MR angiography (MRA) was followed by cardiac-gated cine sequences which covered the whole thoracic aorta. Phase contrast imaging provided the inlet and outlet flow profiles. A CFD mesh generator was used to model the arterial morphology, and wall movements were imposed according to the cine imaging. CFD runs were processed using the finite volume (FV) method, assuming blood to be a homogeneous Newtonian fluid. Twenty patients (14 men; mean age 62.2 years) with different aortic lesions were evaluated. Four-dimensional maps of velocity and wall shear stress were obtained, depicting different patterns of flow (laminar, turbulent, stenosis-like) and local alterations of parietal stress in-stent and along the native aorta. A computational method using a combined approach with MRI appears feasible and seems promising for providing detailed functional analysis of the thoracic aorta after stent-graft implantation. • Functional vascular imaging of the thoracic aorta offers new diagnostic opportunities • CFD can model vascular haemodynamics for clinical aortic problems • Combining CFD with MRI offers a patient-specific method of aortic analysis • Haemodynamic analysis of stent-grafts could improve clinical management and follow-up.
Challenges in combining different data sets during analysis when using grounded theory.
Rintala, Tuula-Maria; Paavilainen, Eija; Astedt-Kurki, Päivi
2014-05-01
To describe the challenges in combining two data sets during grounded theory analysis. The use of grounded theory in nursing research is common. It is a suitable method for studying human action and interaction. It is recommended that many alternative sources of data are collected to create as rich a dataset as possible. Data from interviews with people with diabetes (n=19) and their family members (n=19). Combining two data sets. When using grounded theory, there are numerous challenges in collecting and managing data, especially for the novice researcher. One challenge is to combine different data sets during the analysis. There are many methodological textbooks about grounded theory but there is little written in the literature about combining different data sets. Discussion is needed on the management of data and the challenges of grounded theory. This article provides a means for combining different data sets in the grounded theory analysis process.
NASA Astrophysics Data System (ADS)
Yusoh, R.; Saad, R.; Saidin, M.; Muhammad, S. B.; Anda, S. T.
2018-04-01
Both electrical resistivity and seismic refraction profiling have become common methods in pre-investigations for visualizing subsurface structure. The motivation for using them together is that combining both methods can reduce the ambiguity inherent in using either method alone. Each method has its own software package for data inversion, but the potential to combine certain geophysical methods is restricted; however, research algorithms with this functionality exist and were evaluated individually. The interpretation of the subsurface was improved by combining the inversion data from both methods, letting each influence the other's model through closure coupling, so that the two methods support each other and improve the subsurface interpretation. The methods were applied to a field dataset from a pre-investigation for archaeology aimed at locating the material deposits of an impact crater. There were no major changes in the inverted model from combining the data inversions in this case, probably owing to the complex geology. The combined data analysis shows that the deposit material extends from the ground surface to a depth of 20 m, with the class separation clearly delineating the deposit material.
NASA Astrophysics Data System (ADS)
He, Shixuan; Xie, Wanyi; Zhang, Ping; Fang, Shaoxi; Li, Zhe; Tang, Peng; Gao, Xia; Guo, Jinsong; Tlili, Chaker; Wang, Deqiang
2018-02-01
The analysis of algae and of the dominant alga plays an important role in ecological and environmental fields, since it can be used to forecast water blooms and control their potential deleterious effects. Herein, we combine in vivo confocal resonance Raman spectroscopy with multivariate analysis methods for preliminary identification of three algal genera in water blooms at the unicellular scale. Statistical analysis of characteristic Raman peaks demonstrates that certain shifts and different normalized intensities, resulting from the composition of different carotenoids, exist in the Raman spectra of the three algal cells. Principal component analysis (PCA) scores and the corresponding loading weights show some differences arising from Raman spectral characteristics which are caused by vibrations of carotenoids in unicellular algae. Then, the discriminant partial least squares (DPLS) classification method is used to verify the effectiveness of algal identification with confocal resonance Raman spectroscopy. Our results show that confocal resonance Raman spectroscopy combined with PCA and DPLS can handle the preliminary identification of the dominant alga for forecasting and controlling water blooms.
Improvement of the Earth's gravity field from terrestrial and satellite data
NASA Technical Reports Server (NTRS)
Rapp, Richard H.
1992-01-01
The determination of the Earth's gravitational potential can be done through the analysis of satellite perturbations, the analysis of surface gravity data, or both. The combination of the two data types yields a solution that combines the strength of each method: the longer wavelength strength in the satellite analysis with the better high frequency information from surface gravity data. Since 1972, Ohio State has carried out activities that have provided surface gravity data to a number of organizations who have developed combination potential coefficient models that describe the Earth's gravitational potential.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers the uncertainty contributions of individual model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, with the different uncertainty components represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in the grouping strategies for uncertainty components. The variance-based sensitivity analysis is thus improved to investigate the importance of an extended range of uncertainty sources: scenario, model, and other combinations of uncertainty components which can represent key model system processes (e.g., the groundwater recharge process, the flow reactive transport process). For test and demonstration purposes, the developed methodology was applied to a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measures for any uncertainty sources formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and for decision-makers to formulate policies and strategies.
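A minimal sketch of the underlying variance-based (Sobol') computation, using the SALib package and a stand-in model (the package choice and the model are assumptions; the hierarchical scenario/model/parametric grouping described above is not shown):

```python
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["recharge", "conductivity", "porosity"],  # illustrative names
    "bounds": [[0.0, 1.0]] * 3,
}
X = saltelli.sample(problem, 1024)        # Saltelli sampling design
Y = X[:, 0] + 2 * X[:, 1] * X[:, 2]       # stand-in for the groundwater model
Si = sobol.analyze(problem, Y)            # first-order and total indices
print(dict(zip(problem["names"], Si["S1"].round(2))))
print(dict(zip(problem["names"], Si["ST"].round(2))))
```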
Gerwin, Philip M; Arbona, Rodolfo J Ricart; Riedel, Elyn R; Lepherd, Michelle L; Henderson, Ken S; Lipman, Neil S
2017-01-01
There is no consensus regarding the best practice for detecting murine pinworm infections. Initially, we evaluated 7 fecal concentration methods by using feces containing Aspiculuris tetraptera (AT) eggs (n = 20 samples per method). Sodium nitrate flotation, sodium nitrate centrifugation, Sheather sugar centrifugation, and zinc sulfate centrifugation detected eggs in 100% of samples; zinc sulfate flotation and water sedimentation detected eggs in 90%. All had better detection rates than Sheather sugar flotation (50%). To determine optimal detection methods, Swiss Webster mice were exposed to Syphacia obvelata (SO; n = 60) or AT (n = 60). We compared the following methods at days 0, 30, and 90, beginning 21 or 28 d after SO and AT exposure, respectively: fecal concentration (AT only), anal tape test (SO only), direct examination of intestinal contents (cecum and colon), Swiss roll histology (cecum and colon), and PCR analysis (pooled fur swab and feces). Detection rates for SO-exposed mice were: PCR analysis, 45%; Swiss roll histology, 30%; intestinal content exam, 27%; and tape test, 27%. The SO detection rate for PCR analysis was significantly greater than that for the tape test. Detection rates for AT-exposed mice were: intestinal content exam, 53%; PCR analysis, 33%; fecal flotation, 22%; and Swiss roll histology, 17%. The AT detection rate of PCR analysis combined with intestinal content examination was greater than for PCR analysis only and the AT detection rate of intestinal content examination was greater than for Swiss roll histology. Combining PCR analysis with intestinal content examination detected 100% of infected animals. No single test detected all positive animals. We recommend combining PCR analysis with intestinal content examination for optimal pinworm detection. PMID:28905712
A Review of Classical Methods of Item Analysis.
ERIC Educational Resources Information Center
French, Christine L.
Item analysis is a very important consideration in the test development process. It is a statistical procedure to analyze test items that combines methods used to evaluate the important characteristics of test items, such as difficulty, discrimination, and distractibility of the items in a test. This paper reviews some of the classical methods for…
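A minimal sketch of the two core classical statistics on a 0/1 score matrix: difficulty as the proportion correct, and discrimination as an item-total point-biserial correlation (often corrected in practice by excluding the item from the total).

```python
import numpy as np

# rows = examinees, columns = items; 1 = correct, 0 = incorrect (toy data)
scores = np.array([[1, 0, 1], [1, 1, 1], [0, 0, 1], [1, 0, 0], [0, 1, 1]])
difficulty = scores.mean(axis=0)          # p-value per item
total = scores.sum(axis=1)                # total test score per examinee
discrimination = [np.corrcoef(scores[:, i], total)[0, 1]
                  for i in range(scores.shape[1])]
print(difficulty, np.round(discrimination, 2))
```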
Ren, Jingzheng
2018-01-01
The objective of this study is to develop a generic multi-attribute decision analysis framework for ranking technologies for ballast water treatment and determining their grades. An evaluation criteria system consisting of eight criteria in four categories was used to evaluate the technologies for ballast water treatment. The Best-Worst method, a subjective weighting method, and the criteria importance through inter-criteria correlation (CRITIC) method, an objective weighting method, were combined to determine the weights of the evaluation criteria. Extension theory was employed to prioritize the technologies for ballast water treatment and determine their grades. An illustrative case including four technologies for ballast water treatment, i.e. Alfa Laval (T1), Hyde (T2), Unitor (T3), and NaOH (T4), was studied by the proposed method, and Hyde (T2) was recognized as the best technology. Sensitivity analysis was also carried out to investigate the effects of the combined coefficients and the weights of the evaluation criteria on the final priority order of the four technologies for ballast water treatment. The sum weighted method and TOPSIS were also employed to rank the four technologies, and the results determined by these two methods are consistent with those determined by the proposed method in this study. Copyright © 2017 Elsevier Ltd. All rights reserved.
Liang, Xianrui; Ma, Meiling; Su, Weike
2013-01-01
Background: A method for chemical fingerprint analysis of Hibiscus mutabilis L. leaves was developed based on ultra performance liquid chromatography with photodiode array detection (UPLC-PAD) combined with similarity analysis (SA) and hierarchical clustering analysis (HCA). Materials and Methods: 10 batches of Hibiscus mutabilis L. leaf samples were collected from different regions of China, and UPLC-PAD was employed to collect chemical fingerprints of the leaves. Results: The relative standard deviations (RSDs) of the relative retention times (RRT) and relative peak areas (RPA) of 10 characteristic peaks (one of them identified as rutin) in precision, repeatability and stability tests were less than 3%, and the fingerprint analysis method was validated as suitable for Hibiscus mutabilis L. leaves. Conclusions: The chromatographic fingerprints showed abundant qualitative diversity of chemical constituents across the 10 batches of samples from different locations, by similarity analysis based on the correlation coefficients between each pair of fingerprints. Moreover, the HCA method clustered the samples into four classes, and the HCA dendrogram showed the close or distant relations among the 10 samples, which was consistent with the SA result to some extent. PMID:23930008
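The SA + HCA combination can be sketched as pairwise correlation coefficients between fingerprint vectors followed by hierarchical clustering on the correlation distances; the peak-area vectors below are synthetic, not the study's chromatograms.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
fingerprints = rng.random((10, 10))  # 10 batches x 10 characteristic peaks
S = np.corrcoef(fingerprints)        # similarity: pairwise correlations
D = squareform(1 - S, checks=False)  # condensed correlation-distance matrix
Z = linkage(D, method="average")     # hierarchical clustering
print(S.round(2))
print(fcluster(Z, t=4, criterion="maxclust"))  # four classes, as in the study
```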
Optimization of bone drilling parameters using Taguchi method based on finite element analysis
NASA Astrophysics Data System (ADS)
Rosidi, Ayip; Lenggo Ginta, Turnad; Rani, Ahmad Majdi Bin Abdul
2017-05-01
Thermal necrosis results in fracture problems and implant failure if the temperature exceeds 47 °C for one minute during bone drilling. To address this problem, this work studied a new thermal model using three drilling parameters: drill diameter, feed rate and spindle speed, and examined the effects of those parameters on heat generation. The drill diameters were 4 mm, 6 mm and 6 mm; the feed rates were 80 mm/min, 100 mm/min and 120 mm/min; and the spindle speeds were 400 rpm, 500 rpm and 600 rpm. An optimization was then carried out by the Taguchi method to determine which parameter combinations can be used to prevent thermal necrosis during bone drilling. The results showed that all parameter combinations produced temperatures below 47 °C, and that finite element analysis combined with the Taguchi method can be used to predict temperature generation and to optimize bone drilling parameters prior to clinical bone drilling. Any of the parameter combinations can be used by surgeons to achieve sustainable orthopaedic surgery.
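The Taguchi step reduces to computing a smaller-is-better signal-to-noise ratio per parameter combination and picking the levels that maximize it; the parameter levels below follow the abstract, while the temperatures are invented.

```python
import numpy as np

def sn_smaller_is_better(y):
    # Taguchi smaller-is-better S/N ratio: -10 log10(mean(y^2))
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(y**2))

# (diameter mm, feed mm/min, speed rpm) -> replicate temperatures in °C
runs = {(4, 80, 400): [38.2, 39.0],
        (4, 100, 500): [40.8, 41.5],
        (6, 120, 600): [44.1, 45.0]}
for combo, temps in runs.items():
    print(combo, round(sn_smaller_is_better(temps), 2), "dB")
```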
NASA Astrophysics Data System (ADS)
Wada, Kodai; Tomita, Koji; Takashiri, Masayuki
2018-06-01
The thermoelectric properties of bismuth telluride (Bi2Te3) nanoplate thin films were estimated using combined infrared spectroscopy and first-principles calculation, followed by comparing the estimated properties with those obtained using the standard electrical probing method. Hexagonal single-crystalline Bi2Te3 nanoplates were first prepared using solvothermal synthesis, followed by preparing Bi2Te3 nanoplate thin films using the drop-casting technique. The nanoplates were joined by thermally annealing them at 250 °C in Ar (95%)–H2 (5%) gas (atmospheric pressure). The electronic transport properties were estimated by infrared spectroscopy using the Drude model, with the effective mass being determined from the band structure using first-principles calculations based on the density functional theory. The electrical conductivity and Seebeck coefficient obtained using the combined analysis were higher than those obtained using the standard electrical probing method, probably because the contact resistance between the nanoplates was excluded from the estimation procedure of the combined analysis method.
Developing tools for digital radar image data evaluation
NASA Technical Reports Server (NTRS)
Domik, G.; Leberl, F.; Raggam, J.
1986-01-01
The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated through satellite radar are combined with standard image processing techniques to create a user environment to manipulate and analyze airborne and satellite radar images. One aim is to create radar products for the user from the original data to enhance the ease of understanding the contents. The results are called secondary image products and derive from the original digital images. Another aim is to support interactive SAR image analysis. Software methods permit use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps or color combinations of different component products. Efforts are ongoing to integrate individual tools into a combined hardware/software environment for interactive radar image analysis.
Wang, Jinjia; Zhang, Yanna
2015-02-01
Brain-computer interface (BCI) systems identify brain signals through extracting features from them. In view of the limitations of the autoregressive-model feature extraction method and of traditional principal component analysis in dealing with multichannel signals, this paper presents a multichannel feature extraction method that combines a multivariate autoregressive (MVAR) model with multilinear principal component analysis (MPCA), used for recognition of magnetoencephalography (MEG) and electroencephalography (EEG) signals. Firstly, we calculated the MVAR model coefficient matrix of the MEG/EEG signals using this method, and then reduced its dimensionality using MPCA. Finally, we recognized the brain signals with a Bayes classifier. The key innovation of our investigation is that we extended the traditional single-channel feature extraction method to the multichannel case. We then carried out experiments using the data groups IV-III and IV-I. The experimental results proved that the method proposed in this paper is feasible.
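The MVAR front end can be sketched with statsmodels: fit a vector autoregressive model to the multichannel signal and flatten the coefficient matrices into a feature vector; the MPCA reduction and Bayes classification steps are omitted, and the data here are random.

```python
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
eeg = rng.normal(size=(500, 4))      # samples x channels (toy data)
res = VAR(eeg).fit(maxlags=3)        # MVAR model of order 3
features = res.coefs.ravel()         # 3 x 4 x 4 coefficient tensor
print(features.shape)                # (48,) feature vector per trial
```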
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gill, M.; Vallada, H.; Collier, D.
1996-02-16
Several groups have reported weak evidence for linkage between schizophrenia and genetic markers located on chromosome 22q using the lod score method of analysis. However these findings involved different genetic markers and methods of analysis, and so were not directly comparable. To resolve this issue we have performed a combined analysis of genotypic data from the marker D22S278 in multiply affected schizophrenic families derived from 11 independent research groups worldwide. This marker was chosen because it showed maximum evidence for linkage in three independent datasets. Using the affected sib-pair method as implemented by the program ESPA, the combined dataset showed 252 alleles shared compared with 188 alleles not shared (chi-square 9.31, 1 df, P = 0.001) where parental genotype data was completely known. When sib-pairs for whom parental data was assigned according to probability were included, the number of alleles shared was 514.1 compared with 437.8 not shared (chi-square 6.12, 1 df, P = 0.006). Similar results were obtained when a likelihood ratio method for sib-pair analysis was used. These results indicate that there may be a susceptibility locus for schizophrenia at 22q12. 27 refs., 3 tabs.
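The reported statistics are consistent with the usual 1-df test of equal allele sharing, chi² = (shared − not shared)² / (shared + not shared); both published values are reproduced below.

```python
for shared, not_shared in [(252, 188), (514.1, 437.8)]:
    chi2 = (shared - not_shared) ** 2 / (shared + not_shared)
    print(round(chi2, 2))   # prints 9.31 and 6.12
```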
Hohmann, Monika; Monakhova, Yulia; Erich, Sarah; Christoph, Norbert; Wachter, Helmut; Holzgrabe, Ulrike
2015-11-04
Because the basic suitability of proton nuclear magnetic resonance spectroscopy ((1)H NMR) to differentiate organic versus conventional tomatoes was recently proven, the approach of optimizing (1)H NMR classification models (comprising a total of 205 authentic tomato samples) by including additional data from isotope ratio mass spectrometry (IRMS, δ(13)C, δ(15)N, and δ(18)O) and mid-infrared (MIR) spectroscopy was assessed. Both individual and combined analytical methods ((1)H NMR + MIR, (1)H NMR + IRMS, MIR + IRMS, and (1)H NMR + MIR + IRMS) were examined using principal component analysis (PCA), partial least squares discriminant analysis (PLS-DA), linear discriminant analysis (LDA), and common components and specific weight analysis (ComDim). With regard to classification abilities, fused data of (1)H NMR + MIR + IRMS yielded better validation results (ranging between 95.0 and 100.0%) than individual methods ((1)H NMR, 91.3-100%; MIR, 75.6-91.7%), suggesting that the combined examination of analytical profiles enhances authentication of organically produced tomatoes.
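Low-level data fusion of this kind can be sketched by standardizing and concatenating the (1)H NMR, MIR, and IRMS feature blocks and running PLS-DA. Block sizes and data below are invented, and sklearn's PLSRegression on a 0/1 target is one common way to implement PLS-DA, not necessarily the authors' software.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n = 60
nmr = rng.random((n, 200))   # (1)H NMR block (invented width)
mir = rng.random((n, 120))   # MIR block
irms = rng.random((n, 3))    # delta 13C, 15N, 18O
y = rng.integers(0, 2, n)    # 0 conventional, 1 organic

X = np.hstack([StandardScaler().fit_transform(b) for b in (nmr, mir, irms)])
pls = PLSRegression(n_components=5).fit(X, y)
y_hat = (pls.predict(X).ravel() > 0.5).astype(int)
print("training accuracy:", (y_hat == y).mean())
```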
NASA Astrophysics Data System (ADS)
Kaysheva, A. L.; Pleshakova, T. O.; Kopylov, A. T.; Shumov, I. D.; Iourov, I. Y.; Vorsanova, S. G.; Yurov, Y. B.; Ziborov, V. S.; Archakov, A. I.; Ivanov, Y. D.
2017-10-01
The possibility of detecting target proteins associated with the development of autistic disorders in children using a combined atomic force microscopy and mass spectrometry (AFM/MS) method is demonstrated. The proposed method is based on the combination of affinity enrichment of proteins from biological samples with visualization of these proteins by AFM and MS analysis, with quantitative detection of the target proteins.
INTEGRATED ENVIRONMENTAL ASSESSMENT OF THE MID-ATLANTIC REGION WITH ANALYTICAL NETWORK PROCESS
A decision analysis method for integrating environmental indicators was developed. This was a combination of Principal Component Analysis (PCA) and the Analytic Network Process (ANP). Being able to take into account interdependency among variables, the method was capable of ran...
Du, Wei; Sun, Min; Guo, Pengqi; Chang, Chun; Fu, Qiang
2018-09-01
Nowadays, the abuse of antibiotics in aquaculture has generated considerable problems for food safety, so it is imperative to develop a simple and selective method for monitoring the illegal use of antibiotics in aquatic products. In this study, a method combining molecularly imprinted membrane (MIM) extraction and liquid chromatography was developed for the selective analysis of cloxacillin in shrimp samples. The MIM was synthesized by UV photopolymerization and characterized by scanning electron microscopy, Fourier transform infrared spectroscopy, thermogravimetric analysis and swelling tests. The results showed that the MIM exhibited excellent permselectivity, high adsorption capacity and a fast adsorption rate for cloxacillin. Finally, the method was used to determine cloxacillin in shrimp samples, with good accuracy and acceptable relative standard deviation values for precision. The proposed method is a promising alternative for the selective analysis of cloxacillin in shrimp samples, owing to its easy operation and excellent selectivity. Copyright © 2018. Published by Elsevier Ltd.
Liem T. Tran; C. Gregory Knight; Robert V. O' Neill; Elizabeth R. Smith; Kurt H. Riitters; James D. Wickham
2002-01-01
A fuzzy decision analysis method for integrating ecological indicators was developed. This was a combination of a fuzzy ranking method and the analytic hierarchy process (AHP). The method was capable of ranking ecosystems in terms of environmental conditions and suggesting cumulative impacts across a large region. Using data on land cover, population, roads, streams,...
A Mixed Methods Content Analysis of the Research Literature in Science Education
ERIC Educational Resources Information Center
Schram, Asta B.
2014-01-01
In recent years, more and more researchers in science education have been turning to the practice of combining qualitative and quantitative methods in the same study. This approach of using mixed methods creates possibilities to study the various issues that science educators encounter in more depth. In this content analysis, I evaluated 18…
Comparative study of signalling methods for high-speed backplane transceiver
NASA Astrophysics Data System (ADS)
Wu, Kejun
2017-11-01
A combined analysis of transient simulation and statistical methods is proposed for a comparative study of signalling methods applied to high-speed backplane transceivers. This method enables fast and accurate signal-to-noise ratio and symbol error rate estimation of a serial link over a four-dimensional design space comprising channel characteristics, noise scenarios, equalisation schemes, and signalling methods. The proposed combined analysis chooses an efficient sampling size for performance evaluation. A comparative study of non-return-to-zero (NRZ), PAM-4, and four-phase shifted sinusoid symbol (PSS-4) signalling using parameterised behaviour-level simulation shows that PAM-4 and PSS-4 have substantial advantages over conventional NRZ in most cases. A comparison between PAM-4 and PSS-4 shows that PAM-4 suffers significant bit error rate degradation as the noise level increases.
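The statistical half of such a combined analysis typically maps the noise level to a symbol error rate analytically before simulation refines it. A hedged sketch for M-ary PAM on an additive white Gaussian noise channel, illustrating the SNR penalty of PAM-4 relative to NRZ (2-PAM); the amplitude and noise values are illustrative, not from the paper:

```python
# Symbol error rate of M-ary PAM over AWGN via the Gaussian Q-function:
# SER = 2 * (1 - 1/M) * Q(d / sigma), with d half the distance between levels.
import math

def q_func(x: float) -> float:
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def pam_ser(levels: int, peak_amp: float, noise_sigma: float) -> float:
    d = peak_amp / (levels - 1)  # half the spacing between adjacent levels
    return 2 * (1 - 1 / levels) * q_func(d / noise_sigma)

for m in (2, 4):  # NRZ is 2-PAM
    print(m, pam_ser(m, peak_amp=1.0, noise_sigma=0.1))
```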
Combined proportional and additive residual error models in population pharmacokinetic modelling.
Proost, Johannes H
2017-11-15
In pharmacokinetic modelling, a combined proportional and additive residual error model is often preferred over a proportional or additive residual error model. Different approaches have been proposed, but a comparison between approaches is still lacking. The theoretical background of the methods is described. Method VAR assumes that the variance of the residual error is the sum of the statistically independent proportional and additive components; this method can be coded in three ways. Method SD assumes that the standard deviation of the residual error is the sum of the proportional and additive components. Using datasets from literature and simulations based on these datasets, the methods are compared using NONMEM. The different coding of methods VAR yield identical results. Using method SD, the values of the parameters describing residual error are lower than for method VAR, but the values of the structural parameters and their inter-individual variability are hardly affected by the choice of the method. Both methods are valid approaches in combined proportional and additive residual error modelling, and selection may be based on OFV. When the result of an analysis is used for simulation purposes, it is essential that the simulation tool uses the same method as used during analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
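A minimal sketch of the two parameterizations compared, where f denotes the model prediction and the sigma values are illustrative:

```python
# Method VAR: variance of the residual error is the sum of the independent
# proportional and additive variance components.
# Method SD: standard deviation is the sum of the two components.
import numpy as np

def sd_method_var(f, sigma_prop, sigma_add):
    # SD from Var = (sigma_prop * f)**2 + sigma_add**2
    return np.sqrt((sigma_prop * f) ** 2 + sigma_add ** 2)

def sd_method_sd(f, sigma_prop, sigma_add):
    # SD = sigma_prop * f + sigma_add
    return sigma_prop * f + sigma_add

f = np.array([0.1, 1.0, 10.0])         # illustrative predicted concentrations
print(sd_method_var(f, 0.2, 0.05))
print(sd_method_sd(f, 0.2, 0.05))      # SD-method values are always >= VAR-method
```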
A peaking-regulation-balance-based method for wind & PV power integrated accommodation
NASA Astrophysics Data System (ADS)
Zhang, Jinfang; Li, Nan; Liu, Jun
2018-02-01
The rapid current and future development of China's new energy sector should focus on the cooperation of wind and PV power. Based on an analysis of system peaking balance, combined with the statistical features of wind and PV power output characteristics, a method for comprehensive integrated accommodation analysis of wind and PV power is put forward. From the electric power balance during the night peak load period of a typical day, the wind power installed capacity is determined first; the PV power installed capacity can then be figured out from the midday peak load hours. This effectively resolves the uncertainty that arises when traditional methods try to determine the wind and solar capacity combination simultaneously. The simulation results validate the effectiveness of the proposed method.
Hercegová, Andrea; Dömötörová, Milena; Kruzlicová, Dása; Matisová, Eva
2006-05-01
Four sample preparation techniques were compared for the ultratrace analysis of pesticide residues in baby food: (a) a modified Schenck's method based on ACN extraction with SPE clean-up; (b) the quick, easy, cheap, effective, rugged, and safe (QuEChERS) method based on ACN extraction and dispersive SPE; (c) a modified QuEChERS method which utilizes column-based SPE instead of dispersive SPE; and (d) matrix solid phase dispersion (MSPD). The methods were combined with fast gas chromatographic-mass spectrometric analysis. The effectiveness of clean-up of the final extract was determined by comparison of the chromatograms obtained. Time consumption, laboriousness, demands on glassware and working place, and consumption of chemicals, especially solvents, increase in the following order: QuEChERS < modified QuEChERS < MSPD < modified Schenck's method. All methods offer satisfactory analytical characteristics at the concentration levels of 5, 10, and 100 microg/kg in terms of recoveries and repeatability. Recoveries obtained for the modified QuEChERS method were lower than for the original QuEChERS. In general the best LOQs were obtained for the modified Schenck's method. The modified QuEChERS method provides 21-72% better LOQs than the original method.
NASA Technical Reports Server (NTRS)
Heldenfels, Richard R
1951-01-01
A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.
High-pressure liquid chromatography analysis of antibiotic susceptibility disks.
Hagel, R B; Waysek, E H; Cort, W M
1979-01-01
The analysis of antibiotic susceptibility disks by high-pressure liquid chromatography (HPLC) was investigated. Methods are presented for the potency determination of mecillinam, ampicillin, carbenicillin, and cephalothin alone and in various combinations. Good agreement between HPLC and microbiological data is observed for potency determinations with recoveries of greater than 95%. Relative standard deviations of lower than 2% are recorded for each HPLC method. HPLC methods offer improved accuracy and greater precision when compared to the standard microbiological methods of analysis for susceptibility disks. PMID:507793
A method for the analysis of nonlinearities in aircraft dynamic response to atmospheric turbulence
NASA Technical Reports Server (NTRS)
Sidwell, K.
1976-01-01
An analytical method is developed which combines the equivalent linearization technique for the analysis of the response of nonlinear dynamic systems with the amplitude modulated random process (Press model) for atmospheric turbulence. The method is initially applied to a bilinear spring system. The analysis of the response shows good agreement with exact results obtained by the Fokker-Planck equation. The method is then applied to an example of control-surface displacement limiting in an aircraft with a pitch-hold autopilot.
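The equivalent linearization step for the bilinear spring can be sketched as follows: under a zero-mean Gaussian response assumption, the equivalent stiffness that minimizes mean-square error is k_eq = E[x f(x)] / E[x²]. A hedged numerical sketch with illustrative spring parameters, not values from the report:

```python
# Equivalent (statistical) linearization of a bilinear spring under a
# zero-mean Gaussian response assumption, evaluated by numerical integration.
import numpy as np

def bilinear_force(x, k=1.0, alpha=0.3, x_y=1.0):
    """Stiffness k up to yield displacement x_y, alpha*k beyond it."""
    core = np.clip(x, -x_y, x_y)
    return k * core + alpha * k * (x - core)

def k_equivalent(sigma_x, n=200_001):
    x = np.linspace(-8 * sigma_x, 8 * sigma_x, n)
    pdf = np.exp(-0.5 * (x / sigma_x) ** 2) / (sigma_x * np.sqrt(2 * np.pi))
    num = np.trapz(x * bilinear_force(x) * pdf, x)   # E[x f(x)]
    den = np.trapz(x ** 2 * pdf, x)                  # E[x^2]
    return num / den

for s in (0.5, 1.0, 2.0):
    print(s, k_equivalent(s))  # stiffness softens as response grows past yield
```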
NASA Astrophysics Data System (ADS)
Kislov, E. V.; Kulikov, A. A.; Kulikova, A. B.
1989-10-01
Samples of basic-ultrabasic (mafic-ultramafic) rocks and Ni-Cu ores of the Ioko-Dovyren and Chaya massifs were analysed by SRXFA and a chemical-spectral method. SRXFA fully meets the requirements of quantitative noble-metal analysis of ore-free rocks. The combination of SRXFA and chemical-spectral analysis has good prospects: after a large number of samples has been analysed by SRXFA, the samples showing the minimal and maximal results can be selected for the chemical-spectral method.
Face recognition using slow feature analysis and contourlet transform
NASA Astrophysics Data System (ADS)
Wang, Yuehao; Peng, Lingling; Zhe, Fuchuan
2018-04-01
In this paper we propose a novel face recognition approach based on slow feature analysis (SFA) in the contourlet transform domain. This method first uses the contourlet transform to decompose the face image into low-frequency and high-frequency parts, and then exploits slow feature analysis for facial feature extraction. We name the new method combining slow feature analysis and the contourlet transform CT-SFA. Experimental results on international standard face databases demonstrate that the new face recognition method is effective and competitive.
Systems and methods for detection of blowout precursors in combustors
Lieuwen, Tim C.; Nair, Suraj
2006-08-15
The present invention comprises systems and methods for detecting flame blowout precursors in combustors. The blowout precursor detection system comprises a combustor, a pressure measuring device, and blowout precursor detection unit. A combustion controller may also be used to control combustor parameters. The methods of the present invention comprise receiving pressure data measured by an acoustic pressure measuring device, performing one or a combination of spectral analysis, statistical analysis, and wavelet analysis on received pressure data, and determining the existence of a blowout precursor based on such analyses. The spectral analysis, statistical analysis, and wavelet analysis further comprise their respective sub-methods to determine the existence of blowout precursors.
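A hedged sketch of how the statistical and spectral analyses named in the claims might be combined on a pressure trace; the frequency band, thresholds and synthetic signal are placeholders, not values from the patent:

```python
# Flag a blowout precursor when either the band-limited spectral power
# (spectral analysis) or the kurtosis of the pressure signal (statistical
# analysis) crosses an illustrative threshold.
import numpy as np
from scipy.signal import welch
from scipy.stats import kurtosis

def blowout_precursor(p, fs, band=(10.0, 100.0), power_thr=1e-3, kurt_thr=1.0):
    f, psd = welch(p, fs=fs, nperseg=1024)        # spectral analysis
    mask = (f >= band[0]) & (f <= band[1])
    band_power = np.trapz(psd[mask], f[mask])
    k = kurtosis(p)                               # statistical analysis
    return band_power > power_thr or k > kurt_thr

fs = 10_000.0
t = np.arange(0, 1, 1 / fs)
p = 0.01 * np.random.default_rng(1).normal(size=t.size)  # stand-in pressure trace
print(blowout_precursor(p, fs))
```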
A combined method for correlative 3D imaging of biological samples from macro to nano scale
NASA Astrophysics Data System (ADS)
Kellner, Manuela; Heidrich, Marko; Lorbeer, Raoul-Amadeus; Antonopoulos, Georgios C.; Knudsen, Lars; Wrede, Christoph; Izykowski, Nicole; Grothausmann, Roman; Jonigk, Danny; Ochs, Matthias; Ripken, Tammo; Kühnel, Mark P.; Meyer, Heiko
2016-10-01
Correlative analysis requires examination of a specimen from macro to nano scale as well as applicability of analytical methods ranging from morphological to molecular. Accomplishing this with one and the same sample is laborious at best, due to deformation and biodegradation during measurements or intermediary preparation steps. Furthermore, data alignment using differing imaging techniques turns out to be a complex task, which considerably complicates the interconnection of results. We present correlative imaging of the accessory rat lung lobe by combining a modified Scanning Laser Optical Tomography (SLOT) setup with a specially developed sample preparation method (CRISTAL). CRISTAL is a resin-based embedding method that optically clears the specimen while allowing sectioning and preventing degradation. We applied and correlated SLOT with Multi Photon Microscopy, histological and immunofluorescence analysis as well as Transmission Electron Microscopy, all in the same sample. Thus, combining CRISTAL with SLOT enables the correlative utilization of a vast variety of imaging techniques.
Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven
2017-01-01
Motivation: Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi-capillary column ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention of human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process. Method: We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting. We determine which specific algorithmic combinations lead to high AUC values in disease classifications across the different medical application scenarios. Results: The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high-throughput use of the technology. PMID:28910313
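A hedged sketch of the last two stages of the winning pipeline as named in the abstract, DBSCAN clustering of detected peak coordinates followed by a Random Forest on the resulting per-measurement feature matrix; the peak lists and labels are synthetic placeholders:

```python
# Cluster peaks pooled across runs, then classify runs by their cluster counts.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
peaks = rng.normal(size=(400, 2))        # (retention time, inverse mobility)
run_id = rng.integers(0, 60, size=400)   # which measurement each peak came from

labels = DBSCAN(eps=0.3, min_samples=5).fit_predict(peaks)
n_clusters = labels.max() + 1

# feature matrix: peak count of each run in each cluster
X = np.zeros((60, n_clusters))
for r, c in zip(run_id, labels):
    if c >= 0:                           # ignore DBSCAN noise points (-1)
        X[r, c] += 1

y = rng.integers(0, 2, size=60)          # disease vs. control (synthetic)
print(cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean())
```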
Guo, Xiuhan; Cai, Rui; Wang, Shisheng; Tang, Bo; Li, Yueqing; Zhao, Weijie
2018-01-01
Sea cucumber is the major tonic seafood worldwide, and geographical origin traceability is an important part of its quality and safety control. In this work, a non-destructive method for origin traceability of sea cucumber (Apostichopus japonicus) from northern China Sea and East China Sea using near infrared spectroscopy (NIRS) and multivariate analysis methods was proposed. Total fat contents of 189 fresh sea cucumber samples were determined and partial least-squares (PLS) regression was used to establish the quantitative NIRS model. The ordered predictor selection algorithm was performed to select feasible wavelength regions for the construction of PLS and identification models. The identification model was developed by principal component analysis combined with Mahalanobis distance and scaling to the first range algorithms. In the test set of the optimum PLS models, the root mean square error of prediction was 0.45, and correlation coefficient was 0.90. The correct classification rates of 100% were obtained in both identification calibration model and test model. The overall results indicated that NIRS method combined with chemometric analysis was a suitable tool for origin traceability and identification of fresh sea cucumber samples from nine origins in China.
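A hedged sketch of the quantitative half of the workflow, a PLS model of total fat from NIR spectra evaluated by RMSEP and correlation on a held-out set; the spectra and fat values are synthetic stand-ins for the 189 samples:

```python
# PLS regression of a constituent (total fat) from NIR spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(189, 500))       # NIR absorbance spectra (placeholder)
y = rng.normal(loc=5.0, size=189)     # total fat content (placeholder)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=8).fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()

rmsep = np.sqrt(np.mean((y_te - y_hat) ** 2))
r = np.corrcoef(y_te, y_hat)[0, 1]
print(f"RMSEP = {rmsep:.2f}, r = {r:.2f}")  # the paper reports 0.45 and 0.90
```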
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.
Liu, Ming; Zhao, Jing; Lu, XiaoZuo; Li, Gang; Wu, Taixia; Zhang, LiFu
2018-05-10
With spectral methods, noninvasive in vivo determination of blood hyperviscosity is very promising and meaningful in clinical diagnosis. In this study, 67 male subjects participated (41 healthy and 26 with hyperviscosity according to blood sample analysis). Reflectance spectra of the subjects' tongue tips were measured, and a classification method based on principal component analysis combined with an artificial neural network model was built to identify hyperviscosity. Hold-out and leave-one-out methods were used to avoid significant bias and lessen the overfitting problem; both are widely accepted in model validation. To measure the performance of the classification, sensitivity, specificity, accuracy and F-measure were calculated, respectively. The accuracies with 100 repetitions of the hold-out method and with the 67-fold leave-one-out method were 88.05% and 97.01%, respectively. Experimental results indicate that the built classification model has practical value and prove the feasibility of using spectroscopy to identify hyperviscosity noninvasively.
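A hedged sketch of the described scheme, PCA features feeding a small neural network validated leave-one-out, with the four reported metrics; spectra and labels are synthetic placeholders:

```python
# PCA + ANN classification with leave-one-out validation and
# sensitivity / specificity / accuracy / F-measure.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import confusion_matrix, f1_score

rng = np.random.default_rng(4)
X = rng.normal(size=(67, 200))                    # tongue-tip reflectance spectra
y = np.r_[np.zeros(41), np.ones(26)].astype(int)  # healthy vs. hyperviscosity

clf = make_pipeline(
    PCA(n_components=10),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
y_hat = cross_val_predict(clf, X, y, cv=LeaveOneOut())

tn, fp, fn, tp = confusion_matrix(y, y_hat).ravel()
print("sensitivity", tp / (tp + fn), "specificity", tn / (tn + fp))
print("accuracy", (tp + tn) / len(y), "F-measure", f1_score(y, y_hat))
```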
ERIC Educational Resources Information Center
Beach, Derek; Rohlfing, Ingo
2018-01-01
In recent years, there has been increasing interest in the combination of two methods on the basis of set theory. In our introduction and this special issue, we focus on two variants of cross-case set-theoretic methods--"qualitative comparative analysis" (QCA) and typological theory (TT)--and their combination with process tracing (PT).…
Spatial analysis of sunshine duration by combination of satellite and station data
NASA Astrophysics Data System (ADS)
Frei, C.; Stöckli, R.; Dürr, B.
2009-09-01
Sunshine duration can exhibit rich fine-scale patterns associated with special meteorological phenomena, such as fog layers and topographically triggered clouds. Networks of climate stations are mostly too coarse and poorly representative to resolve these patterns explicitly. We present a method which combines station observations with satellite-derived cloud-cover data to produce km-scale fields of sunshine duration. The method does not rely on contemporaneous satellite information, hence it can be applied over climatological time scales. We apply and evaluate the combination method over the territory of Switzerland. The combination method is based on Universal Kriging. First, the satellite data (a Heliosat clear-sky index from MSG, extending over a 5-year period) are subjected to an S-mode Principal Component (PC) Analysis. Second, a set of leading PC loadings (seasonally stratified) is introduced as external drift covariates and their optimal linear combination is estimated from the station data (70 stations). Finally, the stochastic component is an autocorrelated field with an exponential variogram, estimated climatologically for each calendar month. For Switzerland the leading PCs of the clear-sky index depict familiar patterns of cloud variability which are inherited in the combination process. The resulting sunshine duration fields exhibit fine-scale structures that are physically plausible, linked to the topography and characteristic of the regional climate. These patterns could not be inferred from station data and/or topographic predictors alone. A cross-validation reveals that the combination method explains 80-90% of the spatial variance in winter and autumn months. In spring and summer the relative performance is lower (60-75% explained spatial variance) but absolute errors are smaller. Our presentation will also discuss some results from a climatology of the derived sunshine duration fields.
Thematic Analysis of the Children's Drawings on Museum Visit: Adaptation of the Kuhn's Method
ERIC Educational Resources Information Center
Kisovar-Ivanda, Tamara
2014-01-01
Researchers are using techniques that allow children to express their perspectives. In 2003, Kuhn developed a method of data collection and analysis which combined thematic drawing and a focused, episodic interview. In this article Kuhn's method is adjusted using the draw-and-write technique as a research methodology. Reflections on the…
ERIC Educational Resources Information Center
Verde, Pablo E.; Ohmann, Christian
2015-01-01
Researchers may have multiple motivations for combining disparate pieces of evidence in a meta-analysis, such as generalizing experimental results or increasing the power to detect an effect that a single study is not able to detect. However, while in meta-analysis, the main question may be simple, the structure of evidence available to answer it…
Highly Efficient Design-of-Experiments Methods for Combining CFD Analysis and Experimental Data
NASA Technical Reports Server (NTRS)
Anderson, Bernhard H.; Haller, Harold S.
2009-01-01
It is the purpose of this study to examine the impact of "highly efficient" Design-of-Experiments (DOE) methods for combining sets of CFD-generated analysis data with smaller sets of experimental test data in order to accurately predict performance results where experimental test data were not obtained. The study examines the impact of micro-ramp flow control on the shock wave boundary layer (SWBL) interaction, where a complete paired set of data exists from both CFD analysis and experimental measurements. By combining the complete set of CFD analysis data, composed of fifteen (15) cases, with a smaller subset of experimental test data containing four or five (4/5) cases, compound data sets (CFD/EXP) were generated which allow the prediction of the complete set of experimental results. No statistical difference was found to exist between the combined (CFD/EXP) generated data sets and the complete experimental data set composed of fifteen (15) cases. The same optimal micro-ramp configuration was obtained using the (CFD/EXP) generated data as with the complete set of experimental data, and the DOE response surfaces generated by the two data sets were also not statistically different.
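The core DOE idea can be sketched as fitting one response surface to the pooled CFD and experimental cases and predicting untested settings from it. A hedged sketch using a quadratic surface in two illustrative design factors (the factor values and responses are placeholders):

```python
# Fit a quadratic response surface to pooled CFD + experimental cases by
# least squares, then predict performance at an untested configuration.
import numpy as np

def quad_design(x1, x2):
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

rng = np.random.default_rng(5)
# 15 CFD cases plus 4 experimental cases on the same factors (placeholders)
x1, x2 = rng.uniform(-1, 1, 19), rng.uniform(-1, 1, 19)
response = 1.0 - 0.3 * x1 ** 2 - 0.2 * x2 ** 2 + 0.05 * rng.normal(size=19)

beta, *_ = np.linalg.lstsq(quad_design(x1, x2), response, rcond=None)

# predict at a new, untested micro-ramp configuration
print(quad_design(np.array([0.2]), np.array([-0.4])) @ beta)
```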
Discussion of the Method to Determine the Ultimate Bearing Capacity of Soil Foundation
NASA Astrophysics Data System (ADS)
Du, Peng; Liu, Xiaoling; Zhang, Yangfu
2017-12-01
Drawing on examples from the literature, this paper carries out a comparative analysis of the theoretical formula method and the finite element method for the ultimate bearing capacity of foundations, to verify the rationality and superiority of the incremental load method in the finite element package ABAQUS for solving the bearing capacity of foundation soil. The study can provide a reference for practical engineering calculation and analysis of foundation bearing capacity.
Chuang, Li-Yeh; Moi, Sin-Hua; Lin, Yu-Da; Yang, Cheng-Hong
2016-10-01
Evolutionary algorithms can overcome the computational limitations of statistical evaluation of large datasets for high-order single nucleotide polymorphism (SNP) barcodes. Previous studies have proposed several chaotic particle swarm optimization (CPSO) methods to detect SNP barcodes for disease analysis (e.g., for breast cancer and chronic diseases). This work evaluated additional chaotic maps combined with the particle swarm optimization (PSO) method to detect SNP barcodes using a high-dimensional dataset. Nine chaotic maps were used to improve the results of the PSO method, and the searching ability of all CPSO methods was compared. The XOR and ZZ disease models were used to compare all chaotic maps combined with the PSO method. Efficacy evaluations of the CPSO methods were based on statistical values from the chi-square test (χ²). The results showed that chaotic maps could improve the searching ability of the PSO method when the population is trapped in a local optimum. The minor allele frequency (MAF) indicated that, amongst all CPSO methods, the largest numbers of SNPs and sample sizes and the highest χ² values in all datasets were found with the Sinai chaotic map combined with the PSO method. We used simple linear regression on the gbest values over all generations to compare the methods. The Sinai chaotic map combined with the PSO method provided the highest β values (β ≥ 0.32 in the XOR disease model and β ≥ 0.04 in the ZZ disease model) and significant p-values (p < 0.001 in both the XOR and ZZ disease models). The Sinai chaotic map was found to effectively enhance the fitness values (χ²) of the PSO method, indicating that the Sinai chaotic map combined with the PSO method is more effective at detecting potential SNP barcodes in both the XOR and ZZ disease models. Copyright © 2016 Elsevier B.V. All rights reserved.
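A hedged sketch of a chaotic-map-driven PSO update, in which a chaotic sequence replaces the uniform random draws so the swarm can escape local optima. The logistic map stands in for brevity (the paper's best results used the Sinai map), and the toy fitness stands in for the chi-square SNP-barcode score:

```python
# PSO with chaotic draws replacing the usual uniform random factors.
import numpy as np

def logistic_map(z):
    return 4.0 * z * (1.0 - z)  # stand-in chaotic map (paper favored Sinai)

def chaotic_pso(fitness, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(6)
    x = rng.uniform(-1, 1, (n, dim))
    v = np.zeros((n, dim))
    z1 = rng.uniform(0.1, 0.9, (n, dim))
    z2 = rng.uniform(0.1, 0.9, (n, dim))
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[pbest_f.argmax()]
    for _ in range(iters):
        z1, z2 = logistic_map(z1), logistic_map(z2)  # chaotic sequences
        v = w * v + c1 * z1 * (pbest - x) + c2 * z2 * (gbest - x)
        x = x + v
        f = np.array([fitness(p) for p in x])
        better = f > pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmax()]
    return gbest, pbest_f.max()

print(chaotic_pso(lambda p: -np.sum(p ** 2), dim=5)[1])  # toy maximization
```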
Liu, Fei; Wang, Yuan-zhong; Yang, Chun-yan; Jin, Hang
2015-01-01
The genuineness and producing area of Panax notoginseng were studied based on infrared spectroscopy combined with discriminant analysis. The infrared spectra of 136 taproots of P. notoginseng from 13 planting points in 11 counties were collected and the second-derivative spectra were calculated with Omnic 8.0 software. The infrared spectra and their second-derivative spectra in the range 1800-700 cm⁻¹ were used to build models by stepwise discriminant analysis, in order to discriminate the genuineness of P. notoginseng. The model built on the second-derivative spectra showed the better recognition effect for the genuineness of P. notoginseng: the correct rate of returned classification reached 100%, and the prediction accuracy was 93.4%. The stability of the model was tested by cross validation and the method was subjected to extrapolation validation. The second-derivative spectra combined with the same discriminant analysis method were used to distinguish the producing area of P. notoginseng. The recognition effects of models built on different spectral ranges and different numbers of samples were compared; when the model was built by collecting 8 samples from each planting point as training samples and using the spectrum in the range 1500-1200 cm⁻¹, the recognition effect was better, with the correct rate of returned classification reaching 99.0% and a prediction accuracy of 76.5%. The results indicate that infrared spectroscopy combined with discriminant analysis gives a good recognition effect for the genuineness of P. notoginseng. The method may become a useful new method for identification of the genuineness of P. notoginseng in practice, and can recognize the producing area of P. notoginseng to some extent, offering a new approach for identification of the producing area of P. notoginseng.
Hand, Carri; Huot, Suzanne; Laliberte Rudman, Debbie; Wijekoon, Sachindri
2017-06-01
Research exploring how places shape and interact with the lives of aging adults must be grounded in the places where aging adults live and participate. Combined participatory geospatial and qualitative methods have the potential to illuminate the complex processes enacted between person and place to create much-needed knowledge in this area. The purpose of this scoping review was to identify methods that can be used to study person-place relationships among aging adults and their neighborhoods by determining the extent and nature of research with aging adults that combines qualitative methods with participatory geospatial methods. A systematic search of nine databases identified 1,965 articles published from 1995 to late 2015. We extracted data and assessed whether the geospatial and qualitative methods were supported by a specified methodology, the methods of data analysis, and the extent of integration of geospatial and qualitative methods. Fifteen studies were included and used the photovoice method, global positioning system tracking plus interview, or go-along interviews. Most included articles provided sufficient detail about data collection methods, yet limited detail about methodologies supporting the study designs and/or data analysis. Approaches that combine participatory geospatial and qualitative methods are beginning to emerge in the aging literature. By more explicitly grounding studies in a methodology, better integrating different types of data during analysis, and reflecting on methods as they are applied, these methods can be further developed and utilized to provide crucial place-based knowledge that can support aging adults' health, well-being, engagement, and participation. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Immunoaffinity chromatography: an introduction to applications and recent developments
Moser, Annette C
2010-01-01
Immunoaffinity chromatography (IAC) combines the use of LC with the specific binding of antibodies or related agents. The resulting method can be used in assays for a particular target or for purification and concentration of analytes prior to further examination by another technique. This review discusses the history and principles of IAC and the various formats that can be used with this method. An overview is given of the general properties of antibodies and of antibody-production methods. The supports and immobilization methods used with antibodies in IAC and the selection of application and elution conditions for IAC are also discussed. Several applications of IAC are considered, including its use in purification, immunodepletion, direct sample analysis, chromatographic immunoassays and combined analysis methods. Recent developments include the use of IAC with CE or MS, ultrafast immunoextraction methods and the use of immunoaffinity columns in microanalytical systems. PMID:20640220
Kahan, Brennan C; Harhay, Michael O
2015-12-01
Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had less than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials which adjusted for center using a method of analysis which requires a large number of events per center, 6% had less than 1 event per center-treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.
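A hedged sketch of one analysis the authors recommend for trials with few events per center, a GEE logistic model with exchangeable within-center correlation via statsmodels; the trial data frame is simulated for illustration:

```python
# Center-adjusted treatment effect via GEE with exchangeable correlation,
# which tolerates few events per center better than fixed-effects models.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 600
df = pd.DataFrame({
    "center": rng.integers(0, 60, n),   # many centers, few events each
    "treat": rng.integers(0, 2, n),
})
df["y"] = rng.binomial(1, 0.05 + 0.03 * df["treat"])

model = sm.GEE.from_formula(
    "y ~ treat", groups="center", data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary().tables[1])
```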
Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol
2011-02-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.
2018-01-01
Background and Objective. Needle electromyography can be used to detect the number of changes and morphological changes in motor unit potentials of patients with axonal neuropathy. General mathematical methods of pattern recognition and signal analysis were applied to recognize neuropathic changes. This study validates the possibility of extending and refining turns-amplitude analysis using permutation entropy and signal energy. Methods. In this study, we examined needle electromyography in 40 neuropathic individuals and 40 controls. The number of turns, the amplitude between turns, the signal energy, and the permutation entropy were used as features for support vector machine classification. Results. The obtained results proved the superior classification performance of the combination of all of the above-mentioned features compared to combinations of fewer features. Of the tested combinations of features, peak-ratio analysis had the lowest accuracy. Conclusion. Using the combination of permutation entropy with signal energy, number of turns and mean amplitude in SVM classification can refine the diagnosis of polyneuropathies examined by needle electromyography. PMID:29606959
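A hedged sketch of the named feature set, turns count, mean inter-turn amplitude, signal energy and permutation entropy, fed to an SVM; the turn threshold and the synthetic signals are illustrative only:

```python
# Turns-amplitude features plus permutation entropy and energy, SVM classifier.
import numpy as np
from itertools import permutations
from sklearn.svm import SVC

def turns_and_amplitude(sig, thr=0.1):
    d = np.diff(sig)
    idx = np.where(d[:-1] * d[1:] < 0)[0] + 1   # indices of local extrema
    amps = np.abs(np.diff(sig[idx]))            # amplitudes between extrema
    keep = amps >= thr                          # count only turns >= threshold
    return keep.sum(), (amps[keep].mean() if keep.any() else 0.0)

def permutation_entropy(sig, order=3):
    pats = {p: 0 for p in permutations(range(order))}
    for i in range(len(sig) - order + 1):
        pats[tuple(np.argsort(sig[i:i + order]))] += 1
    p = np.array(list(pats.values()), dtype=float)
    p = p[p > 0] / p.sum()
    return -(p * np.log(p)).sum()

def features(sig):
    n_turns, mean_amp = turns_and_amplitude(sig)
    return [n_turns, mean_amp, np.sum(sig ** 2), permutation_entropy(sig)]

rng = np.random.default_rng(8)
X = np.array([features(rng.normal(size=500)) for _ in range(80)])
y = np.r_[np.zeros(40), np.ones(40)]            # control vs. neuropathy (synthetic)
print(SVC(kernel="rbf").fit(X, y).score(X, y))
```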
Nicolás, Paula; Lassalle, Verónica L; Ferreira, María L
2017-02-01
The aim of this manuscript was to study the application of a new method of protein quantification to Candida antarctica lipase B commercial solutions. Error sources associated with the traditional Bradford technique were demonstrated. Eight biocatalysts based on C. antarctica lipase B (CALB) immobilized onto magnetite nanoparticles were used. The magnetite nanoparticles were coated with chitosan (CHIT) and modified with glutaraldehyde (GLUT) and aminopropyltriethoxysilane (APTS); CALB was then adsorbed on the modified support. The proposed novel protein quantification method included the determination of sulfur (from protein in the CALB solution) by means of atomic emission by inductively coupled plasma (AE-ICP). Four different protocols were applied combining AE-ICP and classical Bradford assays, besides carbon, hydrogen and nitrogen (CHN) analysis. The calculated error in protein content using the "classic" Bradford method with bovine serum albumin as standard ranged from 400 to 1200% when protein in the CALB solution was quantified. These errors were calculated taking the amount of immobilized protein obtained with the improved method as the "true" protein content. The optimum quantification procedure involved the combination of the Bradford method, ICP and CHN analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Hai-Yan; Song, Chao; Sha, Min; Liu, Jun; Li, Li-Ping; Zhang, Zheng-Yong
2018-05-01
Raman spectra and ultraviolet-visible absorption spectra of four different geographic origins of Radix Astragali were collected. These data were analyzed using kernel principal component analysis combined with sparse representation classification. The results showed that the recognition rate reached 70.44% using Raman spectra for data input and 90.34% using ultraviolet-visible absorption spectra for data input. A new fusion method based on Raman combined with ultraviolet-visible data was investigated and the recognition rate was increased to 96.43%. The experimental results suggested that the proposed data fusion method effectively improved the utilization rate of the original data.
Spoofing detection on facial images recognition using LBP and GLCM combination
NASA Astrophysics Data System (ADS)
Sthevanie, F.; Ramadhani, K. N.
2018-03-01
The challenge for facial-image-based security systems is how to detect facial image falsification such as facial image spoofing. Spoofing occurs when someone tries to pass as a registered user to obtain illegal access and gain advantage from the protected system. This research implements a facial image spoofing detection method based on analyzing image texture. The proposed method for texture analysis combines the Local Binary Pattern (LBP) and Gray Level Co-occurrence Matrix (GLCM) methods. The experimental results show that spoofing detection using the LBP and GLCM combination achieves a high detection rate compared to using only LBP features or GLCM features.
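A hedged sketch of the fused texture descriptor using scikit-image (recent versions spell the GLCM helpers graycomatrix/graycoprops): an LBP histogram concatenated with GLCM properties. A real spoofing detector would train a classifier on these features; the input image here is random:

```python
# Fused LBP + GLCM texture features for a grayscale face image.
import numpy as np
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops

def lbp_glcm_features(gray):
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    glcm = graycomatrix(gray, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = [graycoprops(glcm, p).ravel()
             for p in ("contrast", "homogeneity", "energy", "correlation")]
    return np.concatenate([hist] + props)

face = np.random.default_rng(9).integers(0, 256, (64, 64)).astype(np.uint8)
print(lbp_glcm_features(face).shape)   # combined LBP + GLCM feature vector
```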
USDA-ARS?s Scientific Manuscript database
A new method of sample preparation was developed and is reported for the first time. The approach combines in-vial filtration with dispersive solid-phase extraction (d-SPE) in a fast and convenient cleanup of QuEChERS (quick, easy, cheap, effective, rugged, and safe) extracts. The method was appli...
The combination of the error correction methods of GAFCHROMIC EBT3 film
Li, Yinghui; Chen, Lixin; Zhu, Jinhan; Liu, Xiaowei
2017-01-01
Purpose: The aim of this study was to combine a set of methods for radiochromic film dosimetry, including calibration, correction for lateral effects and a proposed triple-channel analysis. These methods can be applied to GAFCHROMIC EBT3 film dosimetry for radiation field analysis and verification of IMRT plans. Methods: A single-film exposure was used to achieve dose calibration, and the accuracy was verified by comparison with the square-field calibration method. Before performing the dose analysis, the lateral effects on pixel values were corrected. The position dependence of the lateral effect was fitted by a parabolic function, and the curvature factors at different dose levels were obtained using a quadratic formula. After lateral effect correction, a triple-channel analysis was used to reduce disturbances and convert scanned images from films into dose maps. The dose profiles of open fields were measured using EBT3 films and compared with the data obtained using an ionization chamber. Eighteen IMRT plans with different field sizes were measured and verified with EBT3 films, applying our methods, and compared to TPS dose maps to check correct implementation of the film dosimetry proposed here. Results: The uncertainty due to lateral effects can be reduced to ±1 cGy. Compared with the results of Micke A et al., the residual disturbances of the proposed triple-channel method at 48, 176 and 415 cGy are 5.3%, 20.9% and 31.4% smaller, respectively. Compared with the ionization chamber results, the differences in the off-axis ratio and percentage depth dose are within 1% and 2%, respectively. For IMRT verification, there was no difference between the two triple-channel methods. Compared with correction by the triple-channel method alone, the IMRT results of the combined method (including lateral effect correction and our present triple-channel method) show a 2% improvement for large IMRT fields with the criteria 3%/3 mm. PMID:28750023
NASA Astrophysics Data System (ADS)
Norros, Veera; Laine, Marko; Lignell, Risto; Thingstad, Frede
2017-10-01
Methods for extracting empirically and theoretically sound parameter values are urgently needed in aquatic ecosystem modelling to describe key flows and their variation in the system. Here, we compare three Bayesian formulations for mechanistic model parameterization that differ in their assumptions about the variation in parameter values between various datasets: 1) global analysis - no variation, 2) separate analysis - independent variation and 3) hierarchical analysis - variation arising from a shared distribution defined by hyperparameters. We tested these methods, using computer-generated and empirical data, coupled with simplified and reasonably realistic plankton food web models, respectively. While all methods were adequate, the simulated example demonstrated that a well-designed hierarchical analysis can result in the most accurate and precise parameter estimates and predictions, due to its ability to combine information across datasets. However, our results also highlighted sensitivity to hyperparameter prior distributions as an important caveat of hierarchical analysis. In the more complex empirical example, hierarchical analysis was able to combine precise identification of parameter values with reasonably good predictive performance, although the ranking of the methods was less straightforward. We conclude that hierarchical Bayesian analysis is a promising tool for identifying key ecosystem-functioning parameters and their variation from empirical datasets.
Some New Mathematical Methods for Variational Objective Analysis
NASA Technical Reports Server (NTRS)
Wahba, G.; Johnson, D. R.
1984-01-01
New and/or improved variational methods for simultaneously combining forecast, heterogeneous observational data, a priori climatology, and physics to obtain improved estimates of the initial state of the atmosphere for the purpose of numerical weather prediction are developed. Cross validated spline methods are applied to atmospheric data for the purpose of improved description and analysis of atmospheric phenomena such as the tropopause and frontal boundary surfaces.
Cappella, Annalisa; Gibelli, Daniele; Muccino, Enrico; Scarpulla, Valentina; Cerutti, Elisa; Caruso, Valentina; Sguazza, Emanuela; Mazzarelli, Debora; Cattaneo, Cristina
2015-01-27
When estimating post-mortem interval (PMI) in forensic anthropology, the only method able to give an unambiguous result is the analysis of C-14, although the procedure is expensive. Other methods, such as luminol tests and histological analysis, can be performed as preliminary investigations and may allow the operators to gain a preliminary indication concerning PMI, but they lack scientific verification, although luminol testing has been somewhat more accredited in the past few years. Such methods in fact may provide some help as they are inexpensive and can give a fast response, especially in the phase of preliminary investigations. In this study, 20 court cases of human skeletonized remains were dated by the C-14 method. For two cases, results were chronologically set after the 1950s; for one case, the analysis was not possible technically. The remaining 17 cases showed an archaeological or historical collocation. The same bone samples were also screened with histological examination and with the luminol test. Results showed that only four cases gave a positivity to luminol and a high Oxford Histology Index (OHI) score at the same time: among these, two cases were dated as recent by the radiocarbon analysis. Thus, only two false-positive results were given by the combination of these methods and no false negatives. Thus, the combination of two qualitative methods (luminol test and microscopic analysis) may represent a promising solution to cases where many fragments need to be quickly tested.
A stable systemic risk ranking in China's banking sector: Based on principal component analysis
NASA Astrophysics Data System (ADS)
Fang, Libing; Xiao, Binqing; Yu, Honghai; You, Qixing
2018-02-01
In this paper, we compare five popular systemic risk rankings, and apply principal component analysis (PCA) model to provide a stable systemic risk ranking for the Chinese banking sector. Our empirical results indicate that five methods suggest vastly different systemic risk rankings for the same bank, while the combined systemic risk measure based on PCA provides a reliable ranking. Furthermore, according to factor loadings of the first component, PCA combined ranking is mainly based on fundamentals instead of market price data. We clearly find that price-based rankings are not as practical a method as fundamentals-based ones. This PCA combined ranking directly shows systemic risk contributions of each bank for banking supervision purpose and reminds banks to prevent and cope with the financial crisis in advance.
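A minimal sketch of the PCA-combined ranking: standardize the five individual measures across banks, take the first principal component, and rank banks by its score; the measure matrix is a random placeholder:

```python
# Combine five systemic risk measures into one ranking via the first PC.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(10)
measures = rng.normal(size=(16, 5))      # 16 banks x 5 risk measures (synthetic)

Z = StandardScaler().fit_transform(measures)
pca = PCA(n_components=1).fit(Z)
score = pca.transform(Z).ravel()         # combined systemic risk score

print("loadings:", pca.components_[0])   # which measures drive the ranking
print("ranking:", np.argsort(-score))    # most to least systemically risky
```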
The detection methods of dynamic objects
NASA Astrophysics Data System (ADS)
Knyazev, N. L.; Denisova, L. A.
2018-01-01
The article deals with the application of cluster analysis methods to the task of aircraft detection, based on partitioning selections of navigation parameters into groups (clusters). A modified method of cluster analysis for the search and detection of objects is suggested, with iterative combining into clusters and a subsequent count of their number to increase the accuracy of aircraft detection. The operation of the method and the features of its implementation are considered. In conclusion, the efficiency of the offered method of exact cluster analysis for finding targets is shown.
QCL spectroscopy combined with the least squares method for substance analysis
NASA Astrophysics Data System (ADS)
Samsonov, D. A.; Tabalina, A. S.; Fufurin, I. L.
2017-11-01
The article briefly describes the distinctive features of quantum cascade lasers (QCLs). It also describes an experimental set-up for acquiring mid-infrared absorption spectra using a QCL. The paper presents experimental results in the form of normalized spectra. We tested the application of the least squares method to spectrum analysis, using it for substance identification and extraction of concentration data. We compare the results with more common methods of absorption spectroscopy. Finally, we demonstrate the feasibility of using this simple method for quantitative and qualitative analysis of experimental data acquired with a QCL.
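A hedged sketch of the least squares step: assuming Beer-Lambert linearity, the measured absorption spectrum is modeled as a non-negative combination of reference spectra, so the fitted coefficients identify substances and estimate their concentrations. The reference and mixture spectra here are synthetic:

```python
# Non-negative least squares unmixing of an absorption spectrum.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(11)
n_points = 500
A = np.abs(rng.normal(size=(n_points, 3)))   # reference spectra, one per substance
c_true = np.array([0.5, 0.0, 1.2])           # true concentrations (substance 2 absent)
mixture = A @ c_true + 0.01 * rng.normal(size=n_points)

c_hat, residual = nnls(A, mixture)
print(c_hat)   # nonzero entries identify substances; values estimate concentrations
```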
2013-11-26
Combination with Simple Features," IEE European Workshop on Handwriting Analysis and Recognition, pp. 6/1-6, Brussels, Jul. 1994. Bock, J., et a..., Document Analysis and Recognition, pp. 147-150, Oct. 1993. Starner, T., et al., "On-Line Cursive Handwriting Recognition Using Speech Recognition Methods
Denisova, Galina F; Denisov, Dimitri A; Yeung, Jeffrey; Loeb, Mark B; Diamond, Michael S; Bramson, Jonathan L
2008-11-01
Understanding antibody function is often enhanced by knowledge of the specific binding epitope. Here, we describe a computer algorithm that permits epitope prediction based on a collection of random peptide epitopes (mimotopes) isolated by antibody affinity purification. We applied this methodology to the prediction of epitopes for five monoclonal antibodies against the West Nile virus (WNV) E protein, two of which exhibit therapeutic activity in vivo. This strategy was validated by comparison of our results with existing F(ab)-E protein crystal structures and mutational analysis by yeast surface display. We demonstrate that by combining the results of the mimotope method with our data from mutational analysis, epitopes could be predicted with greater certainty. The two methods displayed great complementarity as the mutational analysis facilitated epitope prediction when the results with the mimotope method were equivocal and the mimotope method revealed a broader number of residues within the epitope than the mutational analysis. Our results demonstrate that the combination of these two prediction strategies provides a robust platform for epitope characterization.
Extending methods: using Bourdieu's field analysis to further investigate taste
NASA Astrophysics Data System (ADS)
Schindel Dimick, Alexandra
2015-06-01
In this commentary on Per Anderhag, Per-Olof Wickman and Karim Hamza's article Signs of taste for science, I consider how their study is situated within the concern for the role of science education in the social and cultural production of inequality. Their article provides a finely detailed methodology for analyzing the constitution of taste within science education classrooms. Nevertheless, because the authors' socially situated methodology draws upon Bourdieu's theories, it seems equally important to extend these methods to consider how and why students make particular distinctions within a relational context—a key aspect of Bourdieu's theory of cultural production. By situating the constitution of taste within Bourdieu's field analysis, researchers can explore the ways in which students' tastes and social positionings are established and transformed through time, space, place, and their ability to navigate the field. I describe the process of field analysis in relation to the authors' paper and suggest that combining the authors' methods with a field analysis can provide a strong methodological and analytical framework in which theory and methods combine to create a detailed understanding of students' interest in relation to their context.
Abdelaleem, Eglal Adelhamid; Abdelwahab, Nada Sayed
2013-01-01
This work is concerned with the development and validation of chromatographic and spectrophotometric methods for the analysis of mebeverine HCl (MEH), diloxanide furoate (DF) and metronidazole (MET) in Dimetrol® tablets: spectrophotometric methods and an RP-HPLC method using UV detection. The developed spectrophotometric methods determine MEH and DF in the combined dosage form using the successive derivative ratio spectra method, which depends on differentiation of the obtained ratio spectra in two steps using methanol as a solvent, measuring MEH at 226.4-232.2 nm (peak to peak) and DF at 260.6-264.8 nm (peak to peak). MET concentrations were determined using the first derivative (¹D) at λ = 327 nm in the same solvent. The chromatographic method depends on HPLC separation on an ODS column with elution by a mobile phase consisting of water:methanol:triethylamine (25:75:0.5, by volume; adjusted with orthophosphoric acid to pH 4), pumped at 0.7 mL min⁻¹ with UV detection at 230 nm. Factors affecting the developed methods were studied and optimized; moreover, the methods have been validated as per the ICH guideline, and the results demonstrated that the suggested methods are reproducible and reliable and can be applied for routine use with a short analysis time. Statistical comparison of the two developed methods using F- and Student's t-tests showed no significant difference.
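A hedged sketch of the ratio-derivative principle behind such spectrophotometric methods: dividing a binary-mixture spectrum by a divisor standard turns the divisor's contribution into a constant, which differentiation removes, leaving a derivative amplitude proportional to the other component's concentration. The Gaussian bands below are synthetic, not the drugs' real spectra:

```python
# Ratio-derivative resolution of a two-component mixture (synthetic bands).
import numpy as np

lam = np.linspace(200, 350, 751)
band = lambda c, w: np.exp(-0.5 * ((lam - c) / w) ** 2)  # synthetic absorption band
A1, A2 = band(230, 12), band(262, 15)                    # unit spectra of the two drugs

mixture = 0.8 * A1 + 0.5 * A2        # concentrations 0.8 and 0.5 (arbitrary units)

ok = A2 > 0.02                       # evaluate only where the divisor is significant
ratio = mixture[ok] / A2[ok]         # = 0.8 * A1/A2 + 0.5 (divisor term constant)
d_ratio = np.gradient(ratio, lam[ok])  # derivative cancels the constant 0.5

print(d_ratio.max() - d_ratio.min())   # peak-to-peak, proportional to 0.8
```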
Liver Resections Combined with Closure of Loop Ileostomies: A Retrospective Analysis
Lordan, Jeffrey T.; Riga, Angela T.; Karanjia, Nariman D.
2008-01-01
Background. The management of patients with colorectal liver metastases and loop ileostomies remains controversial. This study was performed to assess the outcome of combined liver resection and loop ileostomy closure. Methods. An analysis of prospectively collected perioperative data, including morbidity and mortality, of 283 consecutive hepatectomies for colorectal liver metastases was undertaken. Consecutive liver resections were performed from 1996 to 2006 in one centre by a single surgeon (NDK). Fourteen of these patients had combined liver resection and ileostomy closure. A case-matched analysis was undertaken. Results. Six (2.2%) patients died in the hepatectomy-only group and none died in the combined group. There was no difference in operative blood loss between the two groups (P = 0.09). Perioperative morbidity was 36% in the combined group and 23% in the hepatectomy-alone group (P = 0.33). Mean hospital stay was 14 days in the combined group and 11 days in the hepatectomy-only group (P = 0.046). Case-matched analysis showed a significant increase in hospital stay (P = 0.03) and complications (P = 0.049) in the combined group. Conclusion. In patients with CRLM, combined liver resection and closure of ileostomy may be associated with a higher operative morbidity and a prolonged hospital stay. PMID:19096524
Combining conversation analysis and event sequencing to study health communication.
Pecanac, Kristen E
2018-06-01
Good communication is essential in patient-centered care. The purpose of this paper is to describe conversation analysis and event sequencing and explain how integrating these methods strengthened the analysis in a study of communication between clinicians and surrogate decision makers in an intensive care unit. Conversation analysis was first used to determine how clinicians introduced the need for decision-making regarding life-sustaining treatment and how surrogate decision makers responded. Event sequence analysis then was used to determine the transitional probability (probability of one event leading to another in the interaction) that a given type of clinician introduction would lead to surrogate resistance or alignment. Conversation analysis provides a detailed analysis of the interaction between participants in a conversation. When combined with a quantitative analysis of the patterns of communication in an interaction, these data add information on the communication strategies that produce positive outcomes. Researchers can apply this mixed-methods approach to identify beneficial conversational practices and design interventions to improve health communication. © 2018 Wiley Periodicals, Inc.
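A minimal sketch of the event-sequencing step described, estimating transitional probabilities P(next event | current event) from coded sequences; the codes and example sequences are illustrative stand-ins for clinician/surrogate turn codes:

```python
# Transitional probabilities from coded interaction sequences.
from collections import Counter, defaultdict

sequences = [
    ["intro_direct", "resistance", "justification", "alignment"],
    ["intro_conditional", "alignment"],
    ["intro_direct", "resistance", "alignment"],
]

counts = defaultdict(Counter)
for seq in sequences:
    for cur, nxt in zip(seq, seq[1:]):
        counts[cur][nxt] += 1          # tally observed transitions

for cur, nxts in counts.items():
    total = sum(nxts.values())
    for nxt, c in nxts.items():
        print(f"P({nxt} | {cur}) = {c / total:.2f}")
```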
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, AGR-1 test configuration and test procedure, overview of AGR-1 measured data, and overview of physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases, in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulted from the combined analysis are identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters and accounting for possible changes in both physical and thermal conditions and in instrument performance.
Quantitative Determination of Caffeine in Beverages Using a Combined SPME-GC/MS Method
NASA Astrophysics Data System (ADS)
Pawliszyn, Janusz; Yang, Min J.; Orton, Maureen L.
1997-09-01
Solid-phase microextraction (SPME) combined with gas chromatography/mass spectrometry (GC/MS) has been applied to the analysis of various caffeinated beverages. Unlike the current methods, this technique is solvent free and requires no pH adjustments. The simplicity of the SPME-GC/MS method lends itself to a good undergraduate laboratory practice. This publication describes the analytical conditions and presents the data for determination of caffeine in coffee, tea, and coke. Quantitation by isotopic dilution is also illustrated.
Application of USNRC NUREG/CR-6661 and draft DG-1108 to evolutionary and advanced reactor designs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang 'Apollo', Chen
2006-07-01
For the seismic design of evolutionary and advanced nuclear reactor power plants, there are definite financial advantages in the application of USNRC NUREG/CR-6661 and draft Regulatory Guide DG-1108. NUREG/CR-6661, 'Benchmark Program for the Evaluation of Methods to Analyze Non-Classically Damped Coupled Systems', was prepared by Brookhaven National Laboratory (BNL) for the USNRC, and Draft Regulatory Guide DG-1108 is the proposed revision to the current Regulatory Guide (RG) 1.92, Revision 1, 'Combining Modal Responses and Spatial Components in Seismic Response Analysis'. The draft Regulatory Guide DG-1108 is available at http://members.cox.net/apolloconsulting, which also provides a link to the USNRC ADAMS site to search for NUREG/CR-6661 in text file or image file. The draft Regulatory Guide DG-1108 removes unnecessary conservatism in the modal combinations for closely spaced modes in seismic response spectrum analysis. Its application will be very helpful in coupled seismic analysis for structures and heavy equipment to reduce seismic responses and in piping system seismic design. In the NUREG/CR-6661 benchmark program, which investigated coupled seismic analysis of structures and equipment or piping systems with different damping values, three of the four participants applied the complex mode solution method to handle different damping values for structures, equipment, and piping systems. The fourth participant applied the classical normal mode method with equivalent weighted damping values to handle differences in structural, equipment, and piping system damping values. Coupled analysis will reduce the equipment responses when equipment, or piping system and structure are in or close to resonance. However, this reduction in responses occurs only if the realistic DG-1108 modal response combination method is applied, because closely spaced modes will be produced when structure and equipment or piping systems are in or close to resonance. Otherwise, the conservatism in the current Regulatory Guide 1.92, Revision 1, will overshadow the advantage of coupled analysis. All four participants applied the realistic modal combination method of DG-1108. Consequently, more realistic and reduced responses were obtained. (authors)
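A hedged sketch of a CQC-type modal combination of the kind used for closely spaced modes, with the equal-damping Der Kiureghian cross-modal correlation coefficients; the frequencies, damping and modal peaks are illustrative, not values from the guide:

```python
# SRSS vs. a CQC-type combination of peak modal responses. For equal damping
# xi, rho_ij = 8*xi^2*(1+r)*r^1.5 / ((1-r^2)^2 + 4*xi^2*r*(1+r)^2), r = wj/wi <= 1.
import numpy as np

def cqc(peaks, freqs, xi=0.05):
    r = np.minimum.outer(freqs, freqs) / np.maximum.outer(freqs, freqs)
    rho = (8 * xi**2 * (1 + r) * r**1.5) / ((1 - r**2) ** 2 + 4 * xi**2 * r * (1 + r) ** 2)
    return np.sqrt(peaks @ rho @ peaks)

peaks = np.array([1.0, 0.8, 0.3])     # peak modal responses (illustrative)
freqs = np.array([5.0, 5.3, 12.0])    # Hz; first two modes closely spaced

print("SRSS:", np.sqrt(np.sum(peaks ** 2)))
print("CQC :", cqc(peaks, freqs))     # larger here, reflecting modal correlation
```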
[Improved euler algorithm for trend forecast model and its application to oil spectrum analysis].
Zheng, Chang-song; Ma, Biao
2009-04-01
The oil atomic spectrometric analysis technology is one of the most important methods for fault diagnosis and state monitoring of large machine equipment, while the gray method is well suited to trend forecasting. Using oil atomic spectrometric analysis results and combining them with gray forecast theory, the present paper established a gray forecast model of the Fe/Cu concentration trend in the power-shift steering transmission. To address a shortcoming of the gray method in trend forecasting, the improved Euler algorithm was put forward for the first time to resolve the problem that the old gray model's forecast value depends on the first test value, which limits its precision. The new method makes the forecast value more precise, as shown in the example. Combined with the threshold value of the oil atomic spectrometric analysis, the new method was applied to the Fe/Cu concentration forecast, and an early warning of fault information was obtained, so that steps can be taken to prevent the fault. The algorithm can be generalized to state monitoring in industry.
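The abstract does not reproduce the model equations, but a minimal sketch of a standard GM(1,1) gray model, whose whitening equation dx1/dt = -a*x1 + b is integrated here with the improved Euler (Heun) scheme on invented concentration data, might look like this:

```python
# GM(1,1) sketch with Heun's (improved Euler) integration of the whitening
# ODE dx1/dt = -a*x1 + b; illustrative data, not the paper's implementation.
import numpy as np

x0 = np.array([2.1, 2.3, 2.7, 3.2, 3.8, 4.5])   # e.g. Fe concentration, ppm
x1 = np.cumsum(x0)                               # accumulated series
z1 = 0.5 * (x1[1:] + x1[:-1])                    # background values
B = np.column_stack([-z1, np.ones_like(z1)])
a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0] # least-squares parameters

f = lambda x: -a * x + b
h, n_steps = 1.0, len(x0) + 3                    # forecast 3 steps ahead
x1_hat = [x0[0]]
for _ in range(n_steps - 1):
    pred = x1_hat[-1] + h * f(x1_hat[-1])                             # predictor
    x1_hat.append(x1_hat[-1] + 0.5 * h * (f(x1_hat[-1]) + f(pred)))   # corrector
x0_hat = np.diff(np.array(x1_hat), prepend=0.0)  # back to original series
print(np.round(x0_hat, 2))
```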
Baldrian, Petr; López-Mondéjar, Rubén
2014-02-01
Molecular methods for the analysis of biomolecules have undergone rapid technological development in the last decade. The advent of next-generation sequencing methods and improvements in instrumental resolution enabled the analysis of complex transcriptome, proteome and metabolome data, as well as a detailed annotation of microbial genomes. The mechanisms of decomposition by model fungi have been described in unprecedented detail by the combination of genome sequencing, transcriptomics and proteomics. The increasing number of available genomes for fungi and bacteria shows that the genetic potential for decomposition of organic matter is widespread among taxonomically diverse microbial taxa, while expression studies document the importance of the regulation of expression in decomposition efficiency. Importantly, high-throughput methods of nucleic acid analysis used for the analysis of metagenomes and metatranscriptomes indicate the high diversity of decomposer communities in natural habitats and their taxonomic composition. Today, the metaproteomics of natural habitats is of interest. In combination with advanced analytical techniques to explore the products of decomposition and the accumulation of information on the genomes of environmentally relevant microorganisms, advanced methods in microbial ecophysiology should increase our understanding of the complex processes of organic matter transformation.
Wang, Li; Wang, Xiaoyi; Jin, Xuebo; Xu, Jiping; Zhang, Huiyan; Yu, Jiabin; Sun, Qian; Gao, Chong; Wang, Lingbin
2017-03-01
Current methods describe the formation process of algae inaccurately and predict water blooms with low precision. In this paper, the chemical mechanism of algae growth is analyzed, and a correlation analysis of chlorophyll-a and algal density is conducted by chemical measurement. Taking into account the influence of multiple factors on algae growth and water blooms, a comprehensive prediction method combining multivariate time series analysis and intelligent models is put forward. Firstly, through the process of photosynthesis, the main factors that affect the reproduction of the algae are analyzed. A compensation prediction method of multivariate time series analysis based on a neural network and a Support Vector Machine is put forward, combined with Kernel Principal Component Analysis for dimension reduction of the influence factors of blooms. Then, a Genetic Algorithm is applied to improve the generalization ability of the BP network and the Least Squares Support Vector Machine. Experimental results show that this method can better compensate the multivariate time series prediction model and is an effective way to improve the description accuracy of algae growth and the prediction precision of water blooms.
Williams, Mary R; Sigman, Michael E; Lewis, Jennifer; Pitan, Kelly McHugh
2012-10-10
A Bayesian soft classification method combined with target factor analysis (TFA) is described and tested for the analysis of fire debris data. The method relies on analysis of the average mass spectrum across the chromatographic profile (i.e., the total ion spectrum, TIS) from multiple samples taken from a single fire scene. A library of TIS from reference ignitable liquids with assigned ASTM classification is used as the target factors in TFA. The class-conditional distributions of correlations between the target and predicted factors for each ASTM class are represented by kernel functions and analyzed by Bayesian decision theory. The soft classification approach assists in assessing the probability that ignitable liquid residue from a specific ASTM E1618 class is present in a set of samples from a single fire scene, even in the presence of unspecified background contributions from pyrolysis products. The method is demonstrated with sample data sets and then tested on laboratory-scale burn data and large-scale field test burns. The overall performance achieved in laboratory and field tests of the method is approximately 80% correct classification of fire debris samples. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
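A hedged sketch of the decision step, with simulated correlation values and placeholder class priors (not the paper's data): class-conditional Gaussian kernel densities of the TFA correlations feed Bayes' rule to yield a soft class probability.

```python
# Class-conditional kernel densities of correlations, combined via Bayes' rule.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
corr_present = rng.beta(8, 2, 300)      # correlations when the class is present
corr_background = rng.beta(2, 5, 300)   # correlations from pyrolysis background
kde_pos, kde_neg = gaussian_kde(corr_present), gaussian_kde(corr_background)

def posterior_positive(r, prior_pos=0.5):
    num = kde_pos(r) * prior_pos
    return num / (num + kde_neg(r) * (1 - prior_pos))

print(posterior_positive(np.array([0.55, 0.80, 0.95])))
```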
Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol
2011-01-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032
Novel scanning procedure enabling the vectorization of entire rhizotron-grown root systems
2013-01-01
This paper presents an original split-and-combine imaging procedure that enables the complete vectorization of complex root systems grown in rhizotrons. The general principle of the method is to (1) separate the root system into a small number of large pieces to reduce root overlap, (2) scan these pieces one by one, (3) analyze the separate images with root tracing software and (4) combine all tracings into a single vectorized root system. This method generates a rich dataset containing morphological, topological and geometrical information of entire root systems grown in rhizotrons. The utility of the method is illustrated with a detailed architectural analysis of a 20-day old maize root system, coupled with a spatial analysis of water uptake patterns. PMID:23286457
Novel scanning procedure enabling the vectorization of entire rhizotron-grown root systems.
Lobet, Guillaume; Draye, Xavier
2013-01-04
This paper presents an original split-and-combine imaging procedure that enables the complete vectorization of complex root systems grown in rhizotrons. The general principle of the method is to (1) separate the root system into a small number of large pieces to reduce root overlap, (2) scan these pieces one by one, (3) analyze the separate images with root tracing software and (4) combine all tracings into a single vectorized root system. This method generates a rich dataset containing morphological, topological and geometrical information of entire root systems grown in rhizotrons. The utility of the method is illustrated with a detailed architectural analysis of a 20-day old maize root system, coupled with a spatial analysis of water uptake patterns.
Ji, Yue; Xu, Mengjie; Li, Xingfei; Wu, Tengfei; Tuo, Weixiao; Wu, Jun; Dong, Jiuzhi
2018-06-13
The magnetohydrodynamic (MHD) angular rate sensor (ARS), with its low noise level over an ultra-wide bandwidth, has been developed for lasing and imaging applications, especially line-of-sight (LOS) systems. A modified MHD ARS combined with the Coriolis effect was studied in this paper to expand the sensor's bandwidth at low frequency (<1 Hz), which is essential for precision LOS pointing and wide-bandwidth LOS jitter suppression. The model and the simulation method were constructed, and a comprehensive solving method based on the magnetic and electric interaction methods was proposed. Numerical results on the Coriolis effect and the frequency response of the modified MHD ARS were detailed. In addition, as the experimental results of the designed sensor were consistent with the simulation results, an error analysis of the model was discussed. Our study provides an error analysis method for an MHD ARS combined with the Coriolis effect and offers a framework for future studies to minimize the error.
NASA Astrophysics Data System (ADS)
Wang, Wei; Zhong, Ming; Cheng, Ling; Jin, Lu; Shen, Si
2018-02-01
Against the background of building a global energy internet, forecasting and analysing the ratio of electric energy to terminal energy consumption has both theoretical and practical significance. This paper first analyses the influencing factors of the ratio of electric energy to terminal energy and then uses a combination method to forecast and analyse the global proportion of electric energy. A cointegration model for the proportion of electric energy is then constructed using influencing factors such as the electricity price index, GDP, economic structure, energy use efficiency and total population. Finally, a prediction of the proportion of electric energy is obtained from a combination-forecasting model based on the multiple linear regression method, the trend analysis method, and the variance-covariance method. The forecast describes the development trend of the proportion of electric energy over 2017-2050, and the proportion of electric energy in 2050 is analysed in detail using scenario analysis.
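As a small illustration of the variance-covariance combination step (component forecasts and error covariance invented for the example), the combining weights that minimize the variance of the combined error are w = S⁻¹1 / (1ᵀS⁻¹1):

```python
# Minimum-variance combination of three component forecasts.
import numpy as np

forecasts = np.array([0.52, 0.49, 0.55])      # regression, trend, third method
S = np.array([[0.010, 0.004, 0.002],
              [0.004, 0.012, 0.003],
              [0.002, 0.003, 0.020]])         # assumed forecast-error covariance
ones = np.ones(3)
w = np.linalg.solve(S, ones)
w /= ones @ w                                 # normalize so weights sum to 1
print("weights:", np.round(w, 3), "combined forecast:", round(w @ forecasts, 3))
```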
Guadalupe, Zenaida; Soldevilla, Alberto; Sáenz-Navajas, María-Pilar; Ayestarán, Belén
2006-04-21
A multiple-step analytical method was developed to improve the analysis of polymeric phenolics in red wines. With a common initial step based on the fractionation of wine phenolics by gel permeation chromatography (GPC), different analytical techniques were used: high-performance liquid chromatography-diode array detection (HPLC-DAD), HPLC-mass spectrometry (MS), capillary zone electrophoresis (CZE) and spectrophotometry. This method proved to be valid for analyzing different families of phenolic compounds, such as monomeric phenolics and their derivatives, polymeric pigments and proanthocyanidins. The analytical characteristics of fractionation by GPC were studied and the method was fully validated, yielding satisfactory statistical results. GPC fractionation substantially improved the analysis of polymeric pigments by CZE, in terms of response, repeatability and reproducibility. It also represented an improvement in the traditional vanillin assay used for proanthocyanidin (PA) quantification. Astringent proanthocyanidins were also analyzed using a simple combined method that allowed these compounds, for which only general indexes were available, to be quantified.
HPLC fingerprint analysis combined with chemometrics for pattern recognition of ginger.
Feng, Xu; Kong, Weijun; Wei, Jianhe; Ou-Yang, Zhen; Yang, Meihua
2014-03-01
Ginger, the fresh rhizome of Zingiber officinale Rosc. (Zingiberaceae), has been used worldwide; however, for a long time there has been no internationally approved standard for its quality control. The objective was to establish an efficacious combined method and pattern recognition technique for quality control of ginger. A simple, accurate and reliable method based on high-performance liquid chromatography with photodiode array (HPLC-PDA) detection was developed for establishing the chemical fingerprints of 10 batches of ginger from different markets in China. The method was validated in terms of precision, reproducibility and stability, and the relative standard deviations were all less than 1.57%. On the basis of this method, the fingerprints of 10 batches of ginger samples were obtained, which showed 16 common peaks. Coupled with similarity evaluation software, the similarities between each fingerprint of the sample and the simulative mean chromatogram were in the range of 0.998-1.000. Then, chemometric techniques including similarity analysis, hierarchical clustering analysis and principal component analysis were applied to classify the ginger samples. Consistent results were obtained showing that the ginger samples could be successfully classified into two groups. This study revealed that the HPLC-PDA method is simple, sensitive and reliable for fingerprint analysis and, moreover, for pattern recognition and quality control of ginger.
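A minimal sketch of the similarity evaluation idea, with random stand-in fingerprints: each batch's peak-area vector is correlated with the simulative mean chromatogram.

```python
# Correlation of each sample fingerprint with the simulative mean chromatogram.
import numpy as np

rng = np.random.default_rng(2)
fingerprints = rng.normal(size=(10, 16)) * 0.05 + np.linspace(1, 2, 16)  # 10 batches x 16 peaks
mean_chrom = fingerprints.mean(axis=0)
similarities = [np.corrcoef(f, mean_chrom)[0, 1] for f in fingerprints]
print(np.round(similarities, 3))
```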
NASA Astrophysics Data System (ADS)
Liu, Yong; Qin, Zhimeng; Hu, Baodan; Feng, Shuai
2018-04-01
Stability analysis is of great significance to landslide hazard prevention, especially analysis of dynamic stability. However, many existing stability analysis methods struggle to analyse continuous landslide stability and its changing regularities under a uniform criterion because of unique landslide geological conditions. Based on the relationship between displacement monitoring data, deformation states and landslide stability, a state fusion entropy method is herein proposed to derive landslide instability through a comprehensive multi-attribute entropy analysis of deformation states, which are defined by a proposed joint clustering method combining K-means and a cloud model. Taking the Xintan landslide as the detailed case study, cumulative state fusion entropy presents an obvious increasing trend after the landslide entered the accelerative deformation stage, and historical maxima match highly with landslide macroscopic deformation behaviours at key time nodes. Reasonable results are also obtained in its application to several other landslides in the Three Gorges Reservoir in China. Combined with field survey, state fusion entropy may serve for assessing landslide stability and judging landslide evolutionary stages.
Forment, Josep V.; Jackson, Stephen P.
2016-01-01
Protein accumulation on chromatin has traditionally been studied using immunofluorescence microscopy or biochemical cellular fractionation followed by western immunoblot analysis. As a way to improve the reproducibility of this kind of analysis, make it easier to quantify and allow a streamlined application in high-throughput screens, we recently combined a classical immunofluorescence microscopy detection technique with flow cytometry [1]. In addition to the features described above, and by combining it with detection of both DNA content and DNA replication, this method allows unequivocal and direct assignment of cell-cycle distribution of protein association to chromatin without the need for cell culture synchronization. Furthermore, it is relatively quick (no more than a working day from sample collection to quantification), requires less starting material compared to standard biochemical fractionation methods and overcomes the need for flat, adherent cell types that are required for immunofluorescence microscopy. PMID:26226461
In vivo stationary flux analysis by 13C labeling experiments.
Wiechert, W; de Graaf, A A
1996-01-01
Stationary flux analysis is an invaluable tool for metabolic engineering. In recent years the metabolite balancing technique has become well established in the bioengineering community. On the other hand, metabolic tracer experiments using 13C isotopes have long been used for intracellular flux determination. Only recently have both techniques been fully combined to form a considerably more powerful flux analysis method. This paper concentrates on modeling and data analysis for the evaluation of such stationary 13C labeling experiments. After reviewing recent experimental developments, the basic equations for modeling carbon labeling in metabolic systems, i.e. metabolite, carbon label and isotopomer balances, are introduced and discussed in some detail. Then the basics of flux estimation from measured extracellular fluxes combined with carbon labeling data are presented and, finally, this method is illustrated by using an example from C. glutamicum. The main emphasis is on the investigation of the extra information that can be obtained with tracer experiments compared with the metabolite balancing technique alone. As a principal result it is shown that the combined flux analysis method can dispense with some rather doubtful assumptions on energy balancing and that the forward and backward flux rates of bidirectional reaction steps can be simultaneously determined in certain situations. Finally, it is demonstrated that the variant of fractional isotopomer measurement is even more powerful than fractional labeling measurement but requires much higher numerical effort to solve the balance equations.
On 3D inelastic analysis methods for hot section components
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Chen, P. C.; Dame, L. T.; Holt, R. V.; Huang, H.; Hartle, M.; Gellin, S.; Allen, D. H.; Haisler, W. E.
1986-01-01
Accomplishments are described for a 2-year program to develop advanced 3-D inelastic structural stress analysis methods and solution strategies for more accurate and cost effective analysis of combustors, turbine blades and vanes. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iterating techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded mid-surface shell element, a nine-noded mid-surface shell element and a twenty-noded isoparametric solid element. A separate computer program was developed for each combination of constitutive model and formulation model. Each program provides a functional stand-alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.
The 3D inelastic analysis methods for hot section components
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Maffeo, R. J.; Tipton, M. T.; Weber, G.
1992-01-01
A two-year program to develop advanced 3D inelastic structural stress analysis methods and solution strategies for more accurate and cost effective analysis of combustors, turbine blades, and vanes is described. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iterating techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded midsurface shell element; a nine-noded midsurface shell element; and a twenty-noded isoparametric solid element. A separate computer program has been developed for each combination of constitutive model-formulation model. Each program provides a functional stand alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.
Combining markers with and without the limit of detection
Dong, Ting; Liu, Catherine Chunling; Petricoin, Emanuel F.; Tang, Liansheng Larry
2014-01-01
In this paper, we consider the combination of markers with and without a limit of detection (LOD). An LOD is often encountered when measuring proteomic markers: because of the limited detection ability of an instrument, it is difficult to measure markers at relatively low levels. Suppose that after some monotonic transformation, the marker values approximately follow multivariate normal distributions. We propose to estimate the distribution parameters while taking the LOD into account, and then combine markers using the results from linear discriminant analysis. Our simulation results show that the ROC curve parameter estimates generated from the proposed method are much closer to the truth than simply using linear discriminant analysis to combine markers without considering the LOD. In addition, we propose a procedure to select and combine a subset of markers when many candidate markers are available. The procedure, based on the correlation among markers, differs from the common understanding that a subset of the most accurate markers should be selected for the combination. The simulation studies show that the accuracy of a combined marker can be largely impacted by the correlation of marker measurements. Our methods are applied to a protein pathway dataset to combine proteomic biomarkers to distinguish cancer patients from non-cancer patients. PMID:24132938
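Once the distribution parameters are estimated, the combination itself reduces to the linear discriminant direction w = Σ⁻¹(μ₁ − μ₀). A brief sketch with invented parameters follows; the LOD-aware estimation step (a censored likelihood) is omitted for brevity.

```python
# Linear discriminant combination of two markers from estimated parameters.
import numpy as np

mu0 = np.array([1.0, 2.0])                  # non-cancer means (after transform)
mu1 = np.array([1.6, 2.9])                  # cancer means
Sigma = np.array([[0.5, 0.2], [0.2, 0.8]])  # common covariance
w = np.linalg.solve(Sigma, mu1 - mu0)       # discriminant direction

rng = np.random.default_rng(3)
markers = rng.multivariate_normal(mu1, Sigma, size=5)
print("combined scores:", np.round(markers @ w, 2))
```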
Factor Analysis via Components Analysis
ERIC Educational Resources Information Center
Bentler, Peter M.; de Leeuw, Jan
2011-01-01
When the factor analysis model holds, component loadings are linear combinations of factor loadings, and vice versa. This interrelation permits us to define new optimization criteria and estimation methods for exploratory factor analysis. Although this article is primarily conceptual in nature, an illustrative example and a small simulation show…
Effects of eye artifact removal methods on single trial P300 detection, a comparative study.
Ghaderi, Foad; Kim, Su Kyoung; Kirchner, Elsa Andrea
2014-01-15
Electroencephalographic signals are commonly contaminated by eye artifacts, even if recorded under controlled conditions. The objective of this work was to quantitatively compare standard artifact removal methods (regression, filtered regression, Infomax, and second order blind identification (SOBI)) and two artifact identification approaches for independent component analysis (ICA) methods, i.e. ADJUST and correlation. To this end, eye artifacts were removed and the cleaned datasets were used for single trial classification of P300 (a type of event related potential elicited using the oddball paradigm). Statistical analysis of the results confirms that the combination of Infomax and ADJUST provides a relatively better performance (0.6% improvement on average over all subjects) while the combination of SOBI and correlation performs the worst. Low-pass filtering the data at lower cutoffs (here 4 Hz) can also improve the classification accuracy. Without requiring any artifact reference channel, the combination of Infomax and ADJUST improves the classification performance more than the other methods for both examined filtering cutoffs, i.e., 4 Hz and 25 Hz. Copyright © 2013 Elsevier B.V. All rights reserved.
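A hedged sketch of the correlation-based identification approach on toy signals (not real EEG, and FastICA rather than Infomax): components that correlate strongly with an ocular reference are zeroed before reconstruction.

```python
# ICA-based eye-artifact removal: flag the component most correlated with a
# blink-like reference, zero it, and reconstruct the channels.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(4)
t = np.linspace(0, 10, 2000)
eog = np.abs(np.sin(0.5 * t)) ** 8                 # blink-like artifact
brain = np.sin(8 * t) + 0.3 * rng.normal(size=t.size)
mixing = np.array([[1.0, 0.8], [0.4, 1.0], [0.9, 0.2]])
eeg = np.column_stack([brain, eog]) @ mixing.T     # 3 contaminated channels

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(eeg)
corrs = [abs(np.corrcoef(s, eog)[0, 1]) for s in sources.T]
sources[:, int(np.argmax(corrs))] = 0.0            # remove artifact component
cleaned = ica.inverse_transform(sources)
print("max |corr| with EOG per component:", np.round(corrs, 2))
```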
Validation of an ultra-fast UPLC-UV method for the separation of antituberculosis tablets.
Nguyen, Dao T-T; Guillarme, Davy; Rudaz, Serge; Veuthey, Jean-Luc
2008-04-01
A simple method using ultra performance LC (UPLC) coupled with UV detection was developed and validated for the determination of antituberculosis drugs in combined dosage form, i.e. isoniazid (ISN), pyrazinamide (PYR) and rifampicin (RIF). Drugs were separated on a short column (2.1 mm x 50 mm) packed with 1.7 μm particles, using an elution gradient procedure. At 30 °C, less than 2 min was necessary for the complete separation of the three antituberculosis drugs, while the original USP method was performed in 15 min. Further improvements were obtained with the combination of UPLC and high temperature (up to 90 °C), namely HT-UPLC, which allows the application of higher mobile phase flow rates. Therefore, the separation of ISN, PYR and RIF was performed in less than 1 min. After validation (selectivity, trueness, precision and accuracy), both methods (UPLC and HT-UPLC) have proven suitable for the routine quality control analysis of antituberculosis drugs in combined dosage form. Additionally, a large number of samples per day can be analysed due to the short analysis times.
Improved score statistics for meta-analysis in single-variant and gene-level association studies.
Yang, Jingjing; Chen, Sai; Abecasis, Gonçalo
2018-06-01
Meta-analysis is now an essential tool for genetic association studies, allowing them to combine large studies and greatly accelerating the pace of genetic discovery. Although the standard meta-analysis methods perform equivalently to the more cumbersome joint analysis under ideal settings, they result in substantial power loss under unbalanced settings with various case-control ratios. Here, we investigate the power loss problem of the standard meta-analysis methods for unbalanced studies, and further propose novel meta-analysis methods performing equivalently to the joint analysis under both balanced and unbalanced settings. We derive improved meta-score-statistics that can accurately approximate the joint-score-statistics with combined individual-level data, for both linear and logistic regression models, with and without covariates. In addition, we propose a novel approach to adjust for population stratification by correcting for known population structures through minor allele frequencies. In simulated gene-level association studies under unbalanced settings, our method recovered up to 85% of the power loss caused by the standard methods. We further showed the power gain of our methods in gene-level tests with 26 unbalanced studies of age-related macular degeneration. In addition, we took the meta-analysis of three unbalanced studies of type 2 diabetes as an example to discuss the challenges of meta-analyzing multi-ethnic samples. In summary, our improved meta-score-statistics with corrections for population stratification can be used to construct both single-variant and gene-level association studies, providing a useful framework for ensuring well-powered, convenient, cross-study analyses. © 2018 WILEY PERIODICALS, INC.
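For orientation only, a sketch of the standard sample-size-weighted z-score meta-analysis that this paper improves upon (toy statistics, not the proposed improved meta-score-statistics):

```python
# Sample-size-weighted z-score meta-analysis across three studies.
import numpy as np
from scipy.stats import norm

z = np.array([1.8, 2.4, -0.3])            # per-study score z-statistics
n = np.array([4000, 12000, 1500])         # effective sample sizes
w = np.sqrt(n / n.sum())                  # weights proportional to sqrt(n)
z_meta = (w * z).sum()
print("meta z:", round(z_meta, 2), "two-sided p:", 2 * norm.sf(abs(z_meta)))
```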
Meta-analysis of pathway enrichment: combining independent and dependent omics data sets.
Kaever, Alexander; Landesfeind, Manuel; Feussner, Kirstin; Morgenstern, Burkhard; Feussner, Ivo; Meinicke, Peter
2014-01-01
A major challenge in current systems biology is the combination and integrative analysis of large data sets obtained from different high-throughput omics platforms, such as mass spectrometry based Metabolomics and Proteomics or DNA microarray or RNA-seq-based Transcriptomics. Especially in the case of non-targeted Metabolomics experiments, where it is often impossible to unambiguously map ion features from mass spectrometry analysis to metabolites, the integration of more reliable omics technologies is highly desirable. A popular method for the knowledge-based interpretation of single data sets is the (Gene) Set Enrichment Analysis. In order to combine the results from different analyses, we introduce a methodical framework for the meta-analysis of p-values obtained from Pathway Enrichment Analysis (Set Enrichment Analysis based on pathways) of multiple dependent or independent data sets from different omics platforms. For dependent data sets, e.g. obtained from the same biological samples, the framework utilizes a covariance estimation procedure based on the nonsignificant pathways in single data set enrichment analysis. The framework is evaluated and applied in the joint analysis of Metabolomics mass spectrometry and Transcriptomics DNA microarray data in the context of plant wounding. In extensive studies of simulated data set dependence, the introduced correlation could be fully reconstructed by means of the covariance estimation based on pathway enrichment. By restricting the range of p-values of pathways considered in the estimation, the overestimation of correlation, which is introduced by the significant pathways, could be reduced. When applying the proposed methods to the real data sets, the meta-analysis was shown not only to be a powerful tool to investigate the correlation between different data sets and summarize the results of multiple analyses but also to distinguish experiment-specific key pathways.
Alkass, Kanar; Buchholz, Bruce A; Ohtani, Susumu; Yamamoto, Toshiharu; Druid, Henrik; Spalding, Kirsty L
2010-05-01
Age determination of unknown human bodies is important in the setting of a crime investigation or a mass disaster because the age at death, birth date, and year of death as well as gender can guide investigators to the correct identity among a large number of possible matches. Traditional morphological methods used by anthropologists to determine age are often imprecise, whereas chemical analysis of tooth dentin, such as aspartic acid racemization, has shown reproducible and more precise results. In this study, we analyzed teeth from Swedish individuals using both aspartic acid racemization and radiocarbon methodologies. The rationale behind using radiocarbon analysis is that aboveground testing of nuclear weapons during the cold war (1955-1963) caused an extreme increase in global levels of carbon-14 ((14)C), which has been carefully recorded over time. Forty-four teeth from 41 individuals were analyzed using aspartic acid racemization analysis of tooth crown dentin or radiocarbon analysis of enamel, and 10 of these were split and subjected to both radiocarbon and racemization analysis. Combined analysis showed that the two methods correlated well (R(2) = 0.66, p < 0.05). Radiocarbon analysis showed an excellent precision with an overall absolute error of 1.0 +/- 0.6 years. Aspartic acid racemization also showed a good precision with an overall absolute error of 5.4 +/- 4.2 years. Whereas radiocarbon analysis gives an estimated year of birth, racemization analysis indicates the chronological age of the individual at the time of death. We show how these methods in combination can also assist in the estimation of date of death of an unidentified victim. This strategy can be of significant assistance in forensic casework involving dead victim identification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alkass, K; Buchholz, B A; Ohtani, S
Age determination of unknown human bodies is important in the setting of a crime investigation or a mass disaster, since the age at death, birth date and year of death, as well as gender, can guide investigators to the correct identity among a large number of possible matches. Traditional morphological methods used by anthropologists to determine age are often imprecise, whereas chemical analysis of tooth dentin, such as aspartic acid racemization, has shown reproducible and more precise results. In this paper we analyze teeth from Swedish individuals using both aspartic acid racemization and radiocarbon methodologies. The rationale behind using radiocarbon analysis is that above-ground testing of nuclear weapons during the cold war (1955-1963) caused an extreme increase in global levels of carbon-14 ((14)C) which have been carefully recorded over time. Forty-four teeth from 41 individuals were analyzed using aspartic acid racemization analysis of tooth crown dentin or radiocarbon analysis of enamel, and ten of these were split and subjected to both radiocarbon and racemization analysis. Combined analysis showed that the two methods correlated well (R2 = 0.66, p < 0.05). Radiocarbon analysis showed an excellent precision with an overall absolute error of 0.6 ± 0.4 years. Aspartic acid racemization also showed a good precision with an overall absolute error of 5.4 ± 4.2 years. Whereas radiocarbon analysis gives an estimated year of birth, racemization analysis indicates the chronological age of the individual at the time of death. We show how these methods in combination can also assist in the estimation of date of death of an unidentified victim. This strategy can be of significant assistance in forensic casework involving dead victim identification.
How bootstrap can help in forecasting time series with more than one seasonal pattern
NASA Astrophysics Data System (ADS)
Cordeiro, Clara; Neves, M. Manuela
2012-09-01
The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still expanding. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing a good performance. The algorithm Boot.EXPOS, using exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For the case of more than one seasonal pattern, the double seasonal Holt-Winters methods and related exponential smoothing methods were developed. A new challenge was to combine these seasonal methods with the bootstrap and carry over a resampling scheme similar to the one used in the Boot.EXPOS procedure. The performance of this partnership is illustrated on some well-known data sets available in software packages.
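A sketch in the spirit of this partnership, assuming an additive Holt-Winters fit whose residuals are resampled to produce bootstrap forecast bands (synthetic monthly data; Boot.EXPOS itself is implemented in R, so this Python rendering is illustrative only):

```python
# Residual-bootstrap forecasting around an exponential smoothing fit.
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(5)
t = np.arange(120)
y = 10 + 0.05 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, 120)

fit = ExponentialSmoothing(y, trend="add", seasonal="add",
                           seasonal_periods=12).fit()
resid = y - fit.fittedvalues
boot_forecasts = np.array([
    ExponentialSmoothing(fit.fittedvalues + rng.choice(resid, size=y.size),
                         trend="add", seasonal="add",
                         seasonal_periods=12).fit().forecast(12)
    for _ in range(50)])                         # 50 bootstrap replicates
band = np.percentile(boot_forecasts, [5, 95], axis=0)
print("12-step 90% forecast band shape:", band.shape)
```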
Wang, Ning; Li, Zhi-Yong; Zheng, Xiao-Li; Li, Qiao; Yang, Xin; Xu, Hui
2018-04-09
Kumu injection (KMI) is a commonly used traditional Chinese medicine (TCM) preparation made from Picrasma quassioides (D. Don) Benn., which is rich in alkaloids. An innovative technique for quality assessment of KMI was developed using high performance liquid chromatography (HPLC) combined with chemometric methods and quantitative analysis of multi-components by single marker (QAMS). Nigakinone (PQ-6, 5-hydroxy-4-methoxycanthin-6-one), one of the most abundant alkaloids responsible for the major pharmacological activities of Kumu, was used as the reference substance. Six alkaloids in KMI were quantified, including 6-hydroxy-β-carboline-1-carboxylic acid (PQ-1), 4,5-dimethoxycanthin-6-one (PQ-2), β-carboline-1-carboxylic acid (PQ-3), β-carboline-1-propanoic acid (PQ-4), 3-methylcanthin-5,6-dione (PQ-5), and PQ-6. Based on the outcomes of twenty batches of KMI samples, the contents of the six alkaloids were used for further chemometric analysis. By hierarchical cluster analysis (HCA), radar plots, and principal component analysis (PCA), all the KMI samples could be categorized into three groups, which were closely related to production date and indicated the crucial influence of the herbal raw material on the end products of KMI. QAMS combined with chemometric analysis could accurately measure and clearly distinguish samples of KMI of different quality. Hence, QAMS is a feasible and promising method for the quality control of KMI.
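A hedged sketch of the QAMS arithmetic under one common convention for the relative correction factor, f_i = (A_s/C_s)/(A_i/C_i); all peak areas and concentrations below are invented:

```python
# QAMS: calibrate a relative correction factor once, then quantify an analyte
# from the single marker's peak in routine runs.
# Calibration with standards of known concentration:
A_s_cal, C_s_cal = 1200.0, 10.0     # marker (PQ-6) peak area, conc. (ug/mL)
A_i_cal, C_i_cal = 450.0, 5.0       # analyte standard, e.g. PQ-1
f_i = (A_s_cal / C_s_cal) / (A_i_cal / C_i_cal)

# Routine analysis: only the marker standard is run alongside the sample.
A_s, C_s = 1150.0, 10.0             # marker in the same run
A_i_sample = 380.0                  # analyte peak area in the sample
C_i_sample = f_i * A_i_sample * C_s / A_s
print(f"f = {f_i:.3f}, estimated concentration = {C_i_sample:.2f} ug/mL")
```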
Method for factor analysis of GC/MS data
Van Benthem, Mark H; Kotula, Paul G; Keenan, Michael R
2012-09-11
The method of the present invention provides a fast, robust, and automated multivariate statistical analysis of gas chromatography/mass spectroscopy (GC/MS) data sets. The method can involve systematic elimination of undesired, saturated peak masses to yield data that follow a linear, additive model. The cleaned data can then be subjected to a combination of PCA and orthogonal factor rotation followed by refinement with MCR-ALS to yield highly interpretable results.
Analysis and optimization methods for centralized processing of chassis.
DOT National Transportation Integrated Search
2017-02-01
The twin ports of Long Beach (POLB) and Los Angeles (POLA), consisting of fourteen individually gated terminals, combine to create the largest container port complex in the US. In 2015, the combined ports handled 15.4 million 20-foot equivalent units...
Dai, Hongying; Wu, Guodong; Wu, Michael; Zhi, Degui
2016-01-01
Next-generation sequencing data pose a severe curse of dimensionality, complicating traditional "single marker-single trait" analysis. We propose a two-stage combined p-value method for pathway analysis. The first stage is at the gene level, where we integrate effects within a gene using the Sequence Kernel Association Test (SKAT). The second stage is at the pathway level, where we perform a correlated Lancaster procedure to detect joint effects from multiple genes within a pathway. We show that the Lancaster procedure is optimal in Bahadur efficiency among all combined p-value methods. The Bahadur efficiency, [Formula: see text], compares sample sizes among different statistical tests when signals become sparse in sequencing data, i.e. ε → 0. The optimal Bahadur efficiency ensures that the Lancaster procedure asymptotically requires a minimal sample size to detect sparse signals ([Formula: see text]). The Lancaster procedure can also be applied to meta-analysis. Extensive empirical assessments of exome sequencing data show that the proposed method outperforms Gene Set Enrichment Analysis (GSEA). We applied the competitive Lancaster procedure to meta-analysis data generated by the Global Lipids Genetics Consortium to identify pathways significantly associated with high-density lipoprotein cholesterol, low-density lipoprotein cholesterol, triglycerides, and total cholesterol.
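A minimal sketch of the uncorrelated Lancaster combination (the paper's correlated version additionally adjusts for dependence between genes): each gene-level p-value is transformed to a chi-square deviate with a gene-specific degree-of-freedom weight, and the sum is referred to a chi-square distribution.

```python
# Lancaster combination of gene-level p-values into a pathway p-value.
import numpy as np
from scipy.stats import chi2

p = np.array([0.01, 0.20, 0.03, 0.50])   # gene-level p-values (e.g. from SKAT)
df = np.array([4.0, 2.0, 6.0, 2.0])      # per-gene weights as degrees of freedom
T = sum(chi2.isf(pi, di) for pi, di in zip(p, df))   # inverse upper-tail transform
p_pathway = chi2.sf(T, df.sum())
print("Lancaster statistic:", round(T, 2), "pathway p:", round(p_pathway, 4))
```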
Comparing multiple imputation methods for systematically missing subject-level data.
Kline, David; Andridge, Rebecca; Kaizar, Eloise
2017-06-01
When conducting research synthesis, the collection of studies that will be combined often do not measure the same set of variables, which creates missing data. When the studies to combine are longitudinal, missing data can occur on the observation-level (time-varying) or the subject-level (non-time-varying). Traditionally, the focus of missing data methods for longitudinal data has been on missing observation-level variables. In this paper, we focus on missing subject-level variables and compare two multiple imputation approaches: a joint modeling approach and a sequential conditional modeling approach. We find the joint modeling approach to be preferable to the sequential conditional approach, except when the covariance structure of the repeated outcome for each individual has homogenous variance and exchangeable correlation. Specifically, the regression coefficient estimates from an analysis incorporating imputed values based on the sequential conditional method are attenuated and less efficient than those from the joint method. Remarkably, the estimates from the sequential conditional method are often less efficient than a complete case analysis, which, in the context of research synthesis, implies that we lose efficiency by combining studies. Copyright © 2015 John Wiley & Sons, Ltd.
Secure method for biometric-based recognition with integrated cryptographic functions.
Chiou, Shin-Yan
2013-01-01
Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, certification in biometric systems need not achieve 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied.
A METHOD FOR DETERMINING THE COMPATIBILITY OF HAZARDOUS WASTES
This report describes a method for determining the compatibility of the binary combinations of hazardous wastes. The method consists of two main parts, namely: (1) the step-by-step compatibility analysis procedures, and (2) the hazardous wastes compatibility chart. The key elemen...
The index-flood and the GRADEX methods combination for flood frequency analysis.
NASA Astrophysics Data System (ADS)
Fuentes, Diana; Di Baldassarre, Giuliano; Quesada, Beatriz; Xu, Chong-Yu; Halldin, Sven; Beven, Keith
2017-04-01
Flood frequency analysis is used in many applications, including flood risk management, design of hydraulic structures, and urban planning. However, such analysis requires long series of observed discharge data, which are often not available in many basins around the world. In this study, we tested the usefulness of combining regional discharge and local precipitation data to estimate the event flood volume frequency curve for 63 catchments in Mexico, Central America and the Caribbean. This was achieved by combining two existing flood frequency analysis methods: the regionalization index-flood approach and the GRADEX method. For return periods of up to 10 years, a similar shape of the scaled flood frequency curve for catchments with similar flood behaviour was assumed from the index-flood approach. For return periods larger than 10 years, the probability distributions of rainfall and discharge volumes were assumed to be asymptotically exponential-type functions with the same scale parameter, following the GRADEX method. Results showed that if the mean annual flood (MAF), used as the index flood, is known, the index-flood approach performed well for return periods up to 10 years, resulting in a 25% mean relative error in prediction. For larger return periods the prediction capability decreased, but it could be improved by the use of the GRADEX method. As the MAF is unknown in ungauged and short-record basins, we tested predicting the MAF using catchment climate-physical characteristics, and discharge statistics, the latter when observations were available for only 8 years. Only the use of discharge statistics resulted in acceptable predictions.
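An illustrative sketch of the hybrid frequency curve under stated assumptions (all parameters invented): a Gumbel growth curve scaled by the MAF up to T = 10 years, extended beyond with a GRADEX-type slope, Q(T) = Q(10) + gradex·ln(T/10).

```python
# Index-flood curve for T <= 10 years, GRADEX extrapolation beyond.
import numpy as np

MAF = 120.0                        # mean annual flood volume (index flood)
growth_u, growth_a = 0.83, 0.30    # dimensionless Gumbel growth-curve params
gradex = 35.0                      # exponential scale of extreme rainfall volumes

def flood_quantile(T):
    y = -np.log(-np.log(1 - 1.0 / T))           # Gumbel reduced variate
    if T <= 10:
        return MAF * (growth_u + growth_a * y)  # regional growth curve
    y10 = -np.log(-np.log(1 - 1 / 10.0))
    q10 = MAF * (growth_u + growth_a * y10)
    return q10 + gradex * np.log(T / 10.0)      # GRADEX tail

for T in (2, 10, 100, 1000):
    print(T, round(flood_quantile(T), 1))
```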
A Unifying Framework for Causal Analysis in Set-Theoretic Multimethod Research
ERIC Educational Resources Information Center
Rohlfing, Ingo; Schneider, Carsten Q.
2018-01-01
The combination of Qualitative Comparative Analysis (QCA) with process tracing, which we call set-theoretic multimethod research (MMR), is steadily becoming more popular in empirical research. Despite the fact that both methods have an elected affinity based on set theory, it is not obvious how a within-case method operating in a single case and a…
Simplified methods for evaluating road prism stability
William J. Elliot; Mark Ballerini; David Hall
2003-01-01
Mass failure is one of the most common failures of low-volume roads in mountainous terrain. Current methods for evaluating stability of these roads require a geotechnical specialist. A stability analysis program, XSTABL, was used to estimate the stability of 3,696 combinations of road geometry, soil, and groundwater conditions. A sensitivity analysis was carried out to...
USDA-ARS?s Scientific Manuscript database
Two simple fingerprinting methods, flow-injection UV spectroscopy (FIUV) and 1H nuclear magnetic resonance (NMR), for discrimination of Aurantii Fructus Immaturus and Fructus Poniciri Trifoliatae Immaturus were described. Both methods were combined with partial least-squares discriminant analysis...
Learn from every mistake! Hierarchical information combination in astronomy
NASA Astrophysics Data System (ADS)
Süveges, Maria; Fotopoulou, Sotiria; Coupon, Jean; Paltani, Stéphane; Eyer, Laurent; Rimoldini, Lorenzo
2017-06-01
Throughout the processing and analysis of survey data, a ubiquitous issue nowadays is that we are spoilt for choice when we need to select a methodology for some of its steps. The alternative methods usually fail and excel in different data regions, and have various advantages and drawbacks, so a combination that unites the strengths of all while suppressing the weaknesses is desirable. We propose to use a two-level hierarchy of learners. Its first level consists of training and applying the possible base methods on the first part of a known set. At the second level, we feed the output probability distributions from all base methods to a second learner trained on the remaining known objects. Using classification of variable stars and photometric redshift estimation as examples, we show that the hierarchical combination is capable of achieving general improvement over averaging-type combination methods, correcting systematics present in all base methods, is easy to train and apply, and thus, it is a promising tool in the astronomical "Big Data" era.
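A compact sketch of such a two-level hierarchy using scikit-learn's stacking utilities, where base learners feed class probabilities to a second-level learner (generic simulated data, standing in for variable-star or redshift features):

```python
# Two-level hierarchy: base classifiers' probability outputs feed a meta-learner.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=600, n_features=12, random_state=0)
stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("knn", KNeighborsClassifier())],
    final_estimator=LogisticRegression(),
    stack_method="predict_proba",   # pass probability distributions upward
    cv=5)                           # out-of-fold predictions train level two
print("stacked accuracy:", cross_val_score(stack, X, y, cv=3).mean())
```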
The Empirical Review of Meta-Analysis Published in Korea
ERIC Educational Resources Information Center
Park, Sunyoung; Hong, Sehee
2016-01-01
Meta-analysis is a statistical method that is increasingly utilized to combine and compare the results of previous primary studies. However, because of the lack of comprehensive guidelines for how to use meta-analysis, many meta-analysis studies have failed to consider important aspects, such as statistical programs, power analysis, publication…
41 CFR 60-2.12 - Job group analysis.
Code of Federal Regulations, 2010 CFR
2010-07-01
(a) Purpose: A job group analysis is a method of combining job titles within the... employed. (b) In the job group analysis, jobs at the establishment with similar content, wage rates, and...
NASA Technical Reports Server (NTRS)
Kradinov, V.; Madenci, E.; Ambur, D. R.
2004-01-01
Although two-dimensional methods provide accurate predictions of contact stresses and bolt load distribution in bolted composite joints with multiple bolts, they fail to capture the effect of thickness on the strength prediction. Typically, the plies close to the interface of laminates are expected to be the most highly loaded, due to bolt deformation, and they are usually the first to fail. This study presents an analysis method to account for the variation of stresses in the thickness direction by augmenting a two-dimensional analysis with a one-dimensional through-the-thickness analysis. The two-dimensional in-plane solution method based on the combined complex potential and variational formulation satisfies the equilibrium equations exactly, and satisfies the boundary conditions and constraints by minimizing the total potential. Under general loading conditions, this method addresses multiple bolt configurations without requiring symmetry conditions while accounting for the contact phenomenon and the interaction among the bolts explicitly. The through-the-thickness analysis is based on a beam-on-elastic-foundation model. The bolt, represented as a short beam while accounting for bending and shear deformations, rests on springs, where the spring coefficients represent the resistance of the composite laminate to bolt deformation. The combined in-plane and through-the-thickness analysis produces the bolt/hole displacement in the thickness direction, as well as the stress state in each ply. The initial ply failure predicted by applying the average stress criterion is followed by a simple progressive failure. Application of the model is demonstrated by considering single- and double-lap joints of metal plates bolted to composite laminates.
NASA Astrophysics Data System (ADS)
Rohaeti, Eti; Rafi, Mohamad; Syafitri, Utami Dyah; Heryanto, Rudi
2015-02-01
Turmeric (Curcuma longa), java turmeric (Curcuma xanthorrhiza) and cassumunar ginger (Zingiber cassumunar) are widely used in traditional Indonesian medicines (jamu). They have similar color for their rhizome and possess some similar uses, so it is possible to substitute one for the other. The identification and discrimination of these closely-related plants is a crucial task to ensure the quality of the raw materials. Therefore, an analytical method which is rapid, simple and accurate for discriminating these species using Fourier transform infrared spectroscopy (FTIR) combined with some chemometrics methods was developed. FTIR spectra were acquired in the mid-IR region (4000-400 cm-1). Standard normal variate, first and second order derivative spectra were compared for the spectral data. Principal component analysis (PCA) and canonical variate analysis (CVA) were used for the classification of the three species. Samples could be discriminated by visual analysis of the FTIR spectra by using their marker bands. Discrimination of the three species was also possible through the combination of the pre-processed FTIR spectra with PCA and CVA, in which CVA gave clearer discrimination. Subsequently, the developed method could be used for the identification and discrimination of the three closely-related plant species.
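A minimal sketch of the preprocessing-plus-PCA pipeline on simulated spectra (three species faked as shifted absorption bands): standard normal variate per spectrum, then PCA scores for visual discrimination.

```python
# SNV preprocessing followed by PCA on simulated mid-IR spectra.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
wavenumbers = np.linspace(4000, 400, 900)

def spectra(center, n):  # crude Gaussian "marker band" per species
    band = np.exp(-((wavenumbers - center) / 40.0) ** 2)
    return band + 0.05 * rng.normal(size=(n, wavenumbers.size))

X = np.vstack([spectra(1600, 10), spectra(1520, 10), spectra(1680, 10)])

snv = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
scores = PCA(n_components=2).fit_transform(snv)
print(scores.shape)   # 30 samples x 2 PCs, ready for a scatter plot
```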
Kepekci Tekkeli, Serife Evrim
2013-01-01
A simple, rapid, and selective HPLC-UV method was developed for the determination of antihypertensive drug substances: amlodipine besilat (AML), olmesartan medoxomil (OLM), valsartan (VAL), and hydrochlorothiazide (HCT) in pharmaceuticals and plasma. These substances are mostly used as combinations. The combinations are found in various forms, especially in current pharmaceuticals as threesome components: OLM, AML, and HCT (combination I) and AML, VAL, and HCT (combination II). The separation was achieved by using an RP-CN column, and acetonitrile-methanol-10 mmol orthophosphoric acid pH 2.5 (7 : 13 : 80, v/v/v) was used as a mobile phase; the detector wavelength was set at 235 nm. The linear ranges were found as 0.1-18.5 μ g/mL, 0.4-25.6 μ g/mL, 0.3-15.5 μ g/mL, and 0.3-22 μ g/mL for AML, OLM, VAL, and HCT, respectively. In order to check the selectivity of the method for pharmaceutical preparations, forced degradation studies were carried out. According to the validation studies, the developed method was found to be reproducible and accurate as shown by RSD ≤6.1%, 5.7%, 6.9%, and 4.6% and relative mean error (RME) ≤10.6%, 5.8%, 6.5%, and 6.8% for AML, OLM, VAL, and HCT, respectively. Consequently, the method was applied to the analysis of tablets and plasma of the patients using drugs including those substances.
Improved Design Formulae for Buckling of Orthotropic Plates under Combined Loading
NASA Technical Reports Server (NTRS)
Weaver, Paul M.; Nemeth, Michael P.
2008-01-01
Simple, accurate buckling interaction formulae suitable for design studies are presented for long orthotropic plates with either simply supported or clamped longitudinal edges under combined loading. The load combinations include 1) uniaxial compression (or tension) and shear, 2) pure inplane bending and shear, and 3) uniaxial compression (or tension) and pure inplane bending. The interaction formulae are the result of a detailed regression analysis of buckling data obtained from a very accurate Rayleigh-Ritz method.
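For orientation, a classic textbook interaction-curve form for a long plate under combined compression and shear is shown below; the paper's fitted formulae generalize this kind of expression and are not reproduced here.

```latex
% Classic compression-shear interaction curve (textbook form, not the
% regression-fitted formulae of this paper):
\[
  R_c + R_s^{2} = 1, \qquad
  R_c = \frac{N_x}{N_x^{\mathrm{cr}}}, \quad
  R_s = \frac{N_{xy}}{N_{xy}^{\mathrm{cr}}},
\]
% where $N_x$ and $N_{xy}$ are the applied compressive and shear stress
% resultants and the superscript cr denotes the critical value of each
% load acting alone.
```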
Zhou, Fei; Zhao, Yajing; Peng, Jiyu; Jiang, Yirong; Li, Maiquan; Jiang, Yuan; Lu, Baiyi
2017-07-01
Osmanthus fragrans flowers are used as folk medicine and as additives for teas, beverages and foods. The metabolites of O. fragrans flowers from different geographical origins are inconsistent to some extent. Chromatography and mass spectrometry combined with multivariable analysis methods provide an approach for discriminating the origin of O. fragrans flowers. The objective was to discriminate Osmanthus fragrans var. thunbergii flowers from different origins using the identified metabolites. GC-MS and UPLC-PDA were conducted to analyse the metabolites in O. fragrans var. thunbergii flowers (150 samples in total). Principal component analysis (PCA), soft independent modelling of class analogy analysis (SIMCA) and random forest (RF) analysis were applied to group the GC-MS and UPLC-PDA data. GC-MS identified 32 compounds common to all samples while UPLC-PDA/QTOF-MS identified 16 common compounds. PCA of the UPLC-PDA data generated a better clustering than PCA of the GC-MS data. Ten metabolites (six from GC-MS and four from UPLC-PDA) were selected as effective compounds for discrimination by PCA loadings. SIMCA and RF analysis were used to build classification models, and the RF model, based on the four effective compounds (a caffeic acid derivative, acteoside, ligustroside and compound 15), yielded better results with a classification rate of 100% in the calibration set and 97.8% in the prediction set. GC-MS and UPLC-PDA combined with multivariable analysis methods can discriminate the origin of Osmanthus fragrans var. thunbergii flowers. Copyright © 2017 John Wiley & Sons, Ltd.
Fault feature analysis of cracked gear based on LOD and analytical-FE method
NASA Astrophysics Data System (ADS)
Wu, Jiateng; Yang, Yu; Yang, Xingkai; Cheng, Junsheng
2018-01-01
At present, there are two main approaches to gear fault diagnosis. One is model-based gear dynamic analysis; the other is signal-based gear vibration diagnosis. In this paper, a method for fault feature analysis of gear cracks is presented, which combines the advantages of dynamic modeling and signal processing. Firstly, a new time-frequency analysis method called local oscillatory-characteristic decomposition (LOD) is proposed, which has the attractive feature of extracting fault characteristics efficiently and accurately. Secondly, an analytical-finite element (analytical-FE) method, called the assist-stress intensity factor (assist-SIF) gear contact model, is put forward to calculate the time-varying mesh stiffness (TVMS) under different crack states. Based on a dynamic model of the gear system with 6 degrees of freedom, the dynamic simulation response was obtained for different tooth crack depths. For the dynamic model, the corresponding relation between the characteristic parameters and the degree of the tooth crack is established under a specific condition. On the basis of the methods mentioned above, a novel gear tooth root crack diagnosis method which combines the LOD with the analytical-FE approach is proposed. Furthermore, empirical mode decomposition (EMD) and ensemble empirical mode decomposition (EEMD) are contrasted with the LOD using gear crack fault vibration signals. The analysis results indicate that the proposed method performs effectively and feasibly for the tooth crack stiffness calculation and gear tooth crack fault diagnosis.
Completely non-destructive elemental analysis of bulky samples by PGAA
NASA Astrophysics Data System (ADS)
Oura, Y.; Nakahara, H.; Sueki, K.; Sato, W.; Saito, A.; Tomizawa, T.; Nishikawa, T.
1999-01-01
NBAA (neutron beam activation analysis), a combination of PGAA and INAA in a single neutron irradiation using an internal monostandard method, is proposed as a very unique and promising method for the elemental analysis of voluminous and invaluable archaeological samples which do not allow even a scrape of the surface. It was applied to chinawares, Sueki ware, and bronze mirrors, and proved to be a very effective method for nondestructive analysis of not only major elements but also some minor elements, such as boron, that help solve archaeological problems concerning the eras and sites of their production.
NASA Astrophysics Data System (ADS)
Cenglin, Yao
For car sales enterprises to continuously boost sales and expand their customer base, an important method is to enhance customer satisfaction. The customer satisfaction of car sales enterprises (4S enterprises) depends on many factors. Using the grey relational analysis method, these various factors can be combined into a single view of customer satisfaction, and through longitudinal comparison car sales enterprises can find the specific factors that will improve customer satisfaction, thereby increasing sales volume and benefits. Grey relational analysis has thus become a good method and means to analyze and evaluate such enterprises.
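A small sketch of the grey relational coefficient computation that underlies this kind of analysis (satisfaction scores invented; ρ is the usual distinguishing coefficient of 0.5):

```python
# Grey relational coefficients of satisfaction factors against an ideal profile.
import numpy as np

reference = np.array([1.0, 1.0, 1.0, 1.0])         # ideal satisfaction profile
factors = np.array([[0.8, 0.9, 0.6, 0.7],          # e.g. service attitude
                    [0.9, 0.7, 0.8, 0.6],          # delivery time
                    [0.6, 0.8, 0.7, 0.9]])         # after-sales support
rho = 0.5                                          # distinguishing coefficient
delta = np.abs(factors - reference)
xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
print("relational grades:", np.round(xi.mean(axis=1), 3))
```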
He, Xiyang; Zhang, Xiaohong; Tang, Long; Liu, Wanke
2015-12-22
Many applications, such as marine navigation and land vehicle location, require real-time precise positioning under medium or long baseline conditions. In this contribution, we develop a model for real-time kinematic decimeter-level positioning with BeiDou Navigation Satellite System (BDS) triple-frequency signals over medium distances. The ambiguities of two extra-wide-lane (EWL) combinations are fixed first, and then a wide-lane (WL) combination is formed from the two EWL combinations for positioning. Theoretical and empirical analyses are given of the ambiguity fixing rate and the positioning accuracy of the presented method. The results indicate that the ambiguity fixing rate can be more than 98% when using BDS medium baseline observations, which is much higher than that of the dual-frequency Hatch-Melbourne-Wübbena (HMW) method. As for positioning accuracy, decimeter-level accuracy can be achieved with this method, which is comparable to that of the carrier-smoothed code differential positioning method. A signal interruption simulation experiment indicates that the proposed method can achieve fast high-precision positioning, whereas the carrier-smoothed code differential positioning method needs several hundred seconds to obtain high precision results. We conclude that a relatively high accuracy and high fixing rate can be achieved with the triple-frequency WL method using single-epoch observations, displaying a significant advantage compared to the traditional carrier-smoothed code differential positioning method.
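The long EWL/WL wavelengths that make single-epoch fixing feasible follow from λ = c/(i·f1 + j·f2 + k·f3); the sketch below evaluates common BDS coefficient choices (shown as an illustration, not necessarily the exact combinations used in the paper):

```python
# Wavelengths of BDS triple-frequency linear combinations.
c = 299792458.0                                    # speed of light, m/s
f1, f2, f3 = 1561.098e6, 1207.140e6, 1268.520e6    # BDS B1, B2, B3 (Hz)

def combo_wavelength(i, j, k):
    return c / (i * f1 + j * f2 + k * f3)

print("EWL (0,-1,1):", round(combo_wavelength(0, -1, 1), 2), "m")   # ~4.88 m
print("EWL (1,-1,0):", round(combo_wavelength(1, -1, 0), 2), "m")   # ~0.85 m
print("WL  (1,0,-1):", round(combo_wavelength(1, 0, -1), 2), "m")   # ~1.02 m
```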
Rice, J P; Saccone, N L; Corbett, J
2001-01-01
The lod score method originated in a seminal article by Newton Morton in 1955. The method is broadly concerned with issues of power and the posterior probability of linkage, ensuring that a reported linkage has a high probability of being a true linkage. In addition, the method is sequential, so that pedigrees or lod curves may be combined from published reports to pool data for analysis. This approach has been remarkably successful for 50 years in identifying disease genes for Mendelian disorders. After discussing these issues, we consider the situation for complex disorders, where the maximum lod score (MLS) statistic shares some of the advantages of the traditional lod score approach but is limited by unknown power and the lack of sharing of the primary data needed to optimally combine analytic results. We may still learn from the lod score method as we explore new methods in molecular biology and genetic analysis to utilize the complete human DNA sequence and the cataloging of all human genes.
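A minimal sketch of the underlying computation: the two-point lod score for phase-known meioses, lod(theta) = log10[L(theta)/L(1/2)], evaluated over a grid of recombination fractions. The counts below are illustrative.

```python
# Two-point lod score with r recombinants observed out of n informative meioses.
import numpy as np

def lod(theta, r, n):
    # log10 likelihood ratio of recombination fraction theta vs. theta = 0.5
    return (r * np.log10(theta) + (n - r) * np.log10(1 - theta)
            - n * np.log10(0.5))

r, n = 2, 20                            # illustrative: 2 recombinants of 20
thetas = np.linspace(0.01, 0.49, 49)
scores = lod(thetas, r, n)
print(f"max lod = {scores.max():.2f} at theta = {thetas[scores.argmax()]:.2f}")
# lod >= 3 is the classical threshold for declaring linkage
```

Because lod scores are log likelihood ratios, curves from independent pedigrees simply add, which is the sequential pooling property the abstract highlights.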
Biomedical discovery acceleration, with applications to craniofacial development.
Leach, Sonia M; Tipney, Hannah; Feng, Weiguo; Baumgartner, William A; Kasliwal, Priyanka; Schuyler, Ronald P; Williams, Trevor; Spritz, Richard A; Hunter, Lawrence
2009-03-01
The profusion of high-throughput instruments and the explosion of new results in the scientific literature, particularly in molecular biomedicine, is both a blessing and a curse to the bench researcher. Even knowledgeable and experienced scientists can benefit from computational tools that help navigate this vast and rapidly evolving terrain. In this paper, we describe a novel computational approach to this challenge, a knowledge-based system that combines reading, reasoning, and reporting methods to facilitate analysis of experimental data. Reading methods extract information from external resources, either by parsing structured data or using biomedical language processing to extract information from unstructured data, and track knowledge provenance. Reasoning methods enrich the knowledge that results from reading by, for example, noting two genes that are annotated to the same ontology term or database entry. Reasoning is also used to combine all sources into a knowledge network that represents the integration of all sorts of relationships between a pair of genes, and to calculate a combined reliability score. Reporting methods combine the knowledge network with a congruent network constructed from experimental data and visualize the combined network in a tool that facilitates the knowledge-based analysis of that data. An implementation of this approach, called the Hanalyzer, is demonstrated on a large-scale gene expression array dataset relevant to craniofacial development. The use of the tool was critical in the creation of hypotheses regarding the roles of four genes never previously characterized as involved in craniofacial development; each of these hypotheses was validated by further experimental work.
Profitability analysis of KINGLONG nearly 5 years
NASA Astrophysics Data System (ADS)
Zhang, Mei; Wen, Jinghua
2017-08-01
Profitability analysis plays an important role in measuring business performance and forecasting a company's prospects. In this paper, King Long Motor is taken as the research instance. On the basis of the fundamental theory of financial management, a combination of theoretical and data analysis methods, together with indicators that measure profitability, is used to carry out a specific analysis of King Long Motor's profitability, to identify the factors constraining it, and to find the motivation for improving it. Recommendations are then made to improve the profitability of the Kinglong car company so that the company can develop better and faster in the future.
NASA Astrophysics Data System (ADS)
Koga, Kusuto; Hayashi, Yuichiro; Hirose, Tomoaki; Oda, Masahiro; Kitasaka, Takayuki; Igami, Tsuyoshi; Nagino, Masato; Mori, Kensaku
2014-03-01
In this paper, we propose an automated biliary tract extraction method from abdominal CT volumes. The biliary tract is the path by which bile is transported from the liver to the duodenum. No method has been reported for the automated extraction of the biliary tract from common contrast-enhanced CT volumes. Our method consists of three steps: (1) extraction of extrahepatic bile duct (EHBD) candidate regions, (2) extraction of intrahepatic bile duct (IHBD) candidate regions, and (3) combination of these candidate regions. The IHBD has linear structures, and intensities of the IHBD are low in CT volumes. We use a dark linear structure enhancement (DLSE) filter, based on a local intensity structure analysis using the eigenvalues of the Hessian matrix, for the IHBD candidate region extraction. The EHBD region is extracted using a thresholding process and a connected component analysis. In the combination process, we connect the IHBD candidate regions to each EHBD candidate region and select a bile duct region from the connected candidate regions. We applied the proposed method to 22 CT volumes. The average Dice coefficient of the extraction results was 66.7%.
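As a rough sketch of the Hessian-eigenvalue idea behind a dark-line filter, the function below computes a simple dark-tube response from the two largest Hessian eigenvalues at a single scale; the response function and scale are simplified stand-ins, not the authors' DLSE filter.

```python
# Single-scale dark tubular structure response from Hessian eigenvalues.
import numpy as np
from scipy import ndimage

def dark_line_response(volume, sigma=1.5):
    axes = range(volume.ndim)
    # second Gaussian derivatives assemble the Hessian at every voxel
    H = np.empty(volume.shape + (volume.ndim, volume.ndim))
    for i in axes:
        for j in axes:
            order = [0] * volume.ndim
            order[i] += 1
            order[j] += 1
            H[..., i, j] = ndimage.gaussian_filter(volume, sigma, order=order)
    eig = np.linalg.eigvalsh(H)          # ascending: lam1 <= lam2 <= lam3
    lam2, lam3 = eig[..., 1], eig[..., 2]
    # a dark tube shows two large positive curvatures across its axis
    return np.where((lam2 > 0) & (lam3 > 0), np.sqrt(lam2 * lam3), 0.0)
```

Thresholding this response would yield the IHBD candidate regions that the combination step then connects to the EHBD candidates.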
Heuristics to Facilitate Understanding of Discriminant Analysis.
ERIC Educational Resources Information Center
Van Epps, Pamela D.
This paper discusses the principles underlying discriminant analysis and constructs a simulated data set to illustrate its methods. Discriminant analysis is a multivariate technique for identifying the best combination of variables to maximally discriminate between groups. Discriminant functions are established on existing groups and used to…
Recent developments of the NESSUS probabilistic structural analysis computer program
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.
1992-01-01
The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
HYPOTHESIS SETTING AND ORDER STATISTIC FOR ROBUST GENOMIC META-ANALYSIS.
Song, Chi; Tseng, George C
2014-01-01
Meta-analysis techniques have been widely developed and applied in genomic applications, especially for combining multiple transcriptomic studies. In this paper, we propose an order statistic of p-values (the r-th ordered p-value, rOP) across combined studies as the test statistic. We illustrate different hypothesis settings that detect gene markers differentially expressed (DE) "in all studies", "in the majority of studies", or "in one or more studies", and specify rOP as a suitable method for detecting DE genes "in the majority of studies". We develop methods to estimate the parameter r in rOP for real applications. Statistical properties such as its asymptotic behavior and a one-sided testing correction for detecting markers of concordant expression changes are explored. Power calculation and simulation show better performance of rOP compared to classical Fisher's method, Stouffer's method, minimum p-value method and maximum p-value method under the focused hypothesis setting. Theoretically, rOP is found connected to the naïve vote counting method and can be viewed as a generalized form of vote counting with better statistical properties. The method is applied to three microarray meta-analysis examples including major depressive disorder, brain cancer and diabetes. The results demonstrate rOP as a more generalizable, robust and sensitive statistical framework to detect disease-related markers.
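A minimal sketch of the rOP statistic: under the null hypothesis, the r-th ordered p-value of k independent uniform p-values follows a Beta(r, k - r + 1) distribution, which gives the combined p-value directly. The study p-values below are invented.

```python
# rOP: test the r-th smallest p-value against its Beta null distribution.
import numpy as np
from scipy import stats

def rop_pvalue(pvals, r):
    k = len(pvals)
    p_r = np.sort(pvals)[r - 1]          # r-th smallest p-value
    return stats.beta.cdf(p_r, r, k - r + 1)

study_pvals = [0.001, 0.004, 0.03, 0.40, 0.52]   # one gene across 5 studies
# r around a majority of studies, e.g. r = 3, targets "DE in most studies"
print(f"rOP p-value (r = 3): {rop_pvalue(study_pvals, r=3):.4f}")
```

Setting r = k recovers the maximum p-value method and r = 1 the minimum p-value method, which is the sense in which rOP generalizes vote counting.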
Polavarapu, Prasad L.; Donahue, Emily A.; Shanmugam, Ganesh; Scalmani, Giovanni; Hawkins, Edward K.; Rizzo, Carmelo; Ibnusaud, Ibrahim; Thomas, Grace; Habel, Deenamma; Sebastian, Dellamol
2013-01-01
Electronic circular dichroism (ECD), optical rotatory dispersion (ORD), and vibrational circular dichroism (VCD) spectra of hibiscus acid dimethyl ester have been measured and analyzed in combination with quantum chemical calculations of corresponding spectra. These results, along with those reported previously for garcinia acid dimethyl ester, reveal that none of these three (ECD, ORD, or VCD) spectroscopic methods, in isolation, can unequivocally establish the absolute configurations of diastereomers. This deficiency is eliminated when a combined spectral analysis of either ECD and VCD or ORD and VCD methods is used. It is also found that the ambiguities in the assignment of absolute configurations of diastereomers may also be overcome when unpolarized vibrational absorption is included in the spectral analysis. PMID:21568330
NASA Astrophysics Data System (ADS)
Mohammadian-Behbahani, Mohammad-Reza; Saramad, Shahyar; Mohammadi, Mohammad
2017-05-01
A combination of Finite Difference Time Domain (FDTD) and Monte Carlo (MC) methods is proposed for simulation and analysis of ZnO microscintillators grown in a polycarbonate membrane. A planar 10 keV X-ray source irradiating the detector is simulated by the MC method, which provides the amount of absorbed X-ray energy in the assembly. The transport of the generated UV scintillation light and its propagation in the detector are studied by the FDTD method. Detector responses for different probable scintillation sites and for X-ray source energies from 10 to 25 keV are reported. Finally, a tapered geometry for the scintillators is proposed, which shows enhanced spatial resolution in comparison to cylindrical geometry for imaging applications.
How to determine spiral bevel gear tooth geometry for finite element analysis
NASA Technical Reports Server (NTRS)
Handschuh, Robert F.; Litvin, Faydor L.
1991-01-01
An analytical method was developed to determine gear tooth surface coordinates of face milled spiral bevel gears. The method combines the basic gear design parameters with the kinematical aspects for spiral bevel gear manufacturing. A computer program was developed to calculate the surface coordinates. From this data a 3-D model for finite element analysis can be determined. Development of the modeling method and an example case are presented.
Antibodies as means for selective mass spectrometry.
Boström, Tove; Takanen, Jenny Ottosson; Hober, Sophia
2016-05-15
For protein analysis of biological samples, two major strategies are used today; mass spectrometry (MS) and antibody-based methods. Each strategy offers advantages and drawbacks. However, combining the two using an immunoenrichment step with MS analysis brings together the benefits of each method resulting in increased sensitivity, faster analysis and possibility of higher degrees of multiplexing. The immunoenrichment can be performed either on protein or peptide level and quantification standards can be added in order to enable determination of the absolute protein concentration in the sample. The combination of immunoenrichment and MS holds great promise for the future in both proteomics and clinical diagnostics. This review describes different setups of immunoenrichment coupled to mass spectrometry and how these can be utilized in various applications. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Smallwood, Jeremy; Swenson, David E.
2011-06-01
Evaluation of the electrostatic performance of footwear and flooring in combination is necessary in applications such as electrostatic discharge (ESD) control in electronics manufacture, evaluation of equipment for avoidance of factory process electrostatic ignition risks, and avoidance of electrostatic shocks to personnel in working environments. Typical standards use a walking test in which the voltage produced on a subject is evaluated by identification and measurement of the magnitude of the 5 highest "peaks" and "valleys" of the recorded voltage waveform. This method does not lend itself to effective analysis of the risk that the voltage will exceed a hazard threshold. This paper shows the advantages of voltage probability analysis and recommends that the method be adopted for use in future standards.
Nonlinear analysis of structures. [within framework of finite element method
NASA Technical Reports Server (NTRS)
Armen, H., Jr.; Levine, H.; Pifko, A.; Levy, A.
1974-01-01
The development of nonlinear analysis techniques within the framework of the finite-element method is reported. Although the emphasis is concerned with those nonlinearities associated with material behavior, a general treatment of geometric nonlinearity, alone or in combination with plasticity is included, and applications presented for a class of problems categorized as axisymmetric shells of revolution. The scope of the nonlinear analysis capabilities includes: (1) a membrane stress analysis, (2) bending and membrane stress analysis, (3) analysis of thick and thin axisymmetric bodies of revolution, (4) a general three dimensional analysis, and (5) analysis of laminated composites. Applications of the methods are made to a number of sample structures. Correlation with available analytic or experimental data range from good to excellent.
Combined linear theory/impact theory method for analysis and design of high speed configurations
NASA Technical Reports Server (NTRS)
Brooke, D.; Vondrasek, D. V.
1980-01-01
Pressure distributions on a wing body at Mach 4.63 are calculated. The combined theory is shown to give improved predictions over either linear theory or impact theory alone. The combined theory is also applied in the inverse design mode to calculate optimum camber slopes at Mach 4.63. Comparisons with optimum camber slopes obtained from unmodified linear theory show large differences. Analysis of the results indicates that the combined theory correctly predicts the effect of thickness on the loading distributions at high Mach numbers, and that finite thickness wings optimized at high Mach numbers using unmodified linear theory will not achieve the minimum drag characteristics for which they are designed.
RooStatsCms: A tool for analysis modelling, combination and statistical studies
NASA Astrophysics Data System (ADS)
Piparo, D.; Schott, G.; Quast, G.
2010-04-01
RooStatsCms is an object oriented statistical framework based on the RooFit technology. Its scope is to allow the modelling, statistical analysis and combination of multiple search channels for new phenomena in High Energy Physics. It provides a variety of methods described in literature implemented as classes, whose design is oriented to the execution of multiple CPU intensive jobs on batch systems or on the Grid.
NASA Technical Reports Server (NTRS)
Brooke, D.; Vondrasek, D. V.
1978-01-01
The aerodynamic influence coefficients calculated using an existing linear theory program were used to modify the pressures calculated using impact theory. Application of the combined approach to several wing-alone configurations shows that the combined approach gives improved predictions of the local pressure and loadings over either linear theory alone or impact theory alone. The approach not only removes most of the shortcomings of the individual methods, as applied in the Mach 4 to 8 range, but also provides the basis for an inverse design procedure applicable to high speed configurations.
Transverse vibrations of non-uniform beams. [combined finite element and Rayleigh-Ritz methods
NASA Technical Reports Server (NTRS)
Klein, L.
1974-01-01
The free vibrations of elastic beams with nonuniform characteristics are investigated theoretically by a new method. The new method is seen to combine the advantages of a finite element approach and of a Rayleigh-Ritz analysis. Comparison with the known analytical results for uniform beams shows good convergence of the method for natural frequencies and modes. For internal shear forces and bending moments, the rate of convergence is less rapid. Results from experiments conducted with a cantilevered helicopter blade with strong nonuniformities and also from alternative theoretical methods, indicate that the theory adequately predicts natural frequencies and mode shapes. General guidelines for efficient use of the method are presented.
NASA Technical Reports Server (NTRS)
Shiau, Jyh-Jen; Wahba, Grace; Johnson, Donald R.
1986-01-01
A new method, based on partial spline models, is developed for including specified discontinuities in otherwise smooth two- and three-dimensional objective analyses. The method is appropriate for including tropopause height information in two- and three-dimensional temperature analyses, using the O'Sullivan-Wahba physical variational method for analysis of satellite radiance data, and may in principle be used in a combined variational analysis of observed, forecast, and climate information. A numerical method for its implementation is described and a prototype two-dimensional analysis based on simulated radiosonde and tropopause height data is shown. The method may also be appropriate for other geophysical problems, such as modeling the ocean thermocline, fronts, discontinuities, etc.
Analysis of random signal combinations for spacecraft pointing stability
NASA Technical Reports Server (NTRS)
Howell, L.
1983-01-01
Methods for obtaining the probability density function of random signal combinations are discussed. These methods provide a realistic criterion for the design of control systems subjected to external noise, with several important applications to aerospace problems.
Representing distributed cognition in complex systems: how a submarine returns to periscope depth.
Stanton, Neville A
2014-01-01
This paper presents the Event Analysis of Systemic Teamwork (EAST) method as a means of modelling distributed cognition in systems. The method comprises three network models (i.e. task, social and information) and their combination. This method was applied to the interactions between the sound room and control room in a submarine, following the activities of returning the submarine to periscope depth. This paper demonstrates three main developments in EAST. First, building the network models directly, without reference to the intervening methods. Second, the application of analysis metrics to all three networks. Third, the combination of the aforementioned networks in different ways to gain a broader understanding of the distributed cognition. Analyses have shown that EAST can be used to gain both qualitative and quantitative insights into distributed cognition. Future research should focus on the analyses of network resilience and modelling alternative versions of a system.
Fu, Hongbo; Wang, Huadong; Jia, Junwei; Ni, Zhibo; Dong, Fengzhong
2018-01-01
Due to the influence of major elements' self-absorption, scarce observable spectral lines of trace elements, and relative efficiency correction of experimental system, accurate quantitative analysis with calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is in fact not easy. In order to overcome these difficulties, standard reference line (SRL) combined with one-point calibration (OPC) is used to analyze six elements in three stainless-steel and five heat-resistant steel samples. The Stark broadening and Saha-Boltzmann plot of Fe are used to calculate the electron density and the plasma temperature, respectively. In the present work, we tested the original SRL method, the SRL with the OPC method, and intercept with the OPC method. The final calculation results show that the latter two methods can effectively improve the overall accuracy of quantitative analysis and the detection limits of trace elements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drukker, Karen, E-mail: kdrukker@uchicago.edu; Giger, Maryellen L.; Li, Hui
2014-03-15
Purpose: To investigate whether biologic image composition of mammographic lesions can improve upon existing mammographic quantitative image analysis (QIA) in estimating the probability of malignancy. Methods: The study population consisted of 45 breast lesions imaged with dual-energy mammography prior to breast biopsy, with final diagnoses resulting in 10 invasive ductal carcinomas, 5 ductal carcinomas in situ, 11 fibroadenomas, and 19 other benign diagnoses. Analysis was threefold: (1) the raw low-energy mammographic images were analyzed with an established in-house QIA method, "QIA alone"; (2) the three-compartment breast (3CB) composition measures of water, lipid, and protein thickness, derived from the dual-energy mammography, were assessed, "3CB alone"; and (3) information from QIA and 3CB was combined, "QIA + 3CB." Analysis was initiated from radiologist-indicated lesion centers and was otherwise fully automated. Steps of the QIA and 3CB methods were lesion segmentation, characterization, and subsequent classification for malignancy in leave-one-case-out cross-validation. Performance assessment included box plots, Bland-Altman plots, and Receiver Operating Characteristic (ROC) analysis. Results: The area under the ROC curve (AUC) for distinguishing between benign and malignant lesions (invasive and DCIS) was 0.81 (standard error 0.07) for the "QIA alone" method, 0.72 (0.07) for the "3CB alone" method, and 0.86 (0.04) for "QIA + 3CB" combined. The difference in AUC was 0.043 between "QIA + 3CB" and "QIA alone" but failed to reach statistical significance (95% confidence interval [-0.17 to +0.26]). Conclusions: In this pilot study analyzing the new 3CB imaging modality, knowledge of the composition of breast lesions and their periphery appeared additive in combination with existing mammographic QIA methods for the distinction between different benign and malignant lesion types.
Atmospheric pollution measurement by optical cross correlation methods - A concept
NASA Technical Reports Server (NTRS)
Fisher, M. J.; Krause, F. R.
1971-01-01
Method combines standard spectroscopy with statistical cross correlation analysis of two narrow light beams for remote sensing to detect foreign matter of given particulate size and consistency. Method is applicable in studies of generation and motion of clouds, nuclear debris, ozone, and radiation belts.
Towards the identification of plant and animal binders on Australian stone knives.
Blee, Alisa J; Walshe, Keryn; Pring, Allan; Quinton, Jamie S; Lenehan, Claire E
2010-07-15
There is limited information regarding the nature of plant and animal residues used as adhesives, fixatives and pigments on Australian Aboriginal artefacts. This paper reports the use of FTIR in combination with the chemometric tools principal component analysis (PCA) and hierarchical clustering (HC) for the analysis and identification of Australian plant and animal fixatives on Australian stone artefacts. Ten different plant and animal residues could be discriminated from one another at the species level by combining FTIR spectroscopy with PCA and HC. Application of this method to residues from three broken stone knives from the collections of the South Australian Museum indicated that two of the knife handles were likely to have contained beeswax as the fixative, whilst Spinifex resin was the probable binder on the third. Copyright 2010 Elsevier B.V. All rights reserved.
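A minimal sketch of this chemometric pipeline, assuming preprocessed absorbance spectra as input; the random matrix below stands in for real FTIR measurements.

```python
# PCA compression of spectra followed by agglomerative (Ward) clustering.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
spectra = rng.normal(size=(10, 1800))     # 10 residues x 1800 wavenumbers

scores = PCA(n_components=3).fit_transform(spectra)   # compress to 3 PCs
Z = linkage(scores, method="ward")                    # hierarchical tree
labels = fcluster(Z, t=2, criterion="maxclust")       # cut into 2 clusters
print(labels)
```

Residues falling into the same cluster would be interpreted as chemically similar binders, which is how the knife-handle fixatives were matched to reference materials.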
NASA Astrophysics Data System (ADS)
Tan, Shanjuan; Feng, Feifei; Wu, Yongjun; Wu, Yiming
The aim of this study was to develop a computer-aided diagnostic scheme using an artificial neural network (ANN) combined with tumor markers as a clinical assistant method for the diagnosis of hepatic carcinoma (HCC). A total of 140 serum samples (50 malignant, 40 benign and 50 normal) were analyzed for α-fetoprotein (AFP), carbohydrate antigen 125 (CA125), carcinoembryonic antigen (CEA), sialic acid (SA) and calcium (Ca). The five tumor marker values were then used as ANN input data. The performance of the ANN was compared with that of discriminant analysis by analysis of the area under the receiver operating characteristic (ROC) curve (AUC). The diagnostic accuracy of the ANN and of discriminant analysis among all samples in the test group was 95.5% and 79.3%, respectively. Analysis of multiple tumor markers based on an ANN may be a better choice than traditional statistical methods for differentiating HCC from benign or normal cases.
Cui, Xueliang; Chen, Hui; Rui, Yunfeng; Niu, Yang; Li, He
2018-01-01
Objectives Two-stage open reduction and internal fixation (ORIF) and limited internal fixation combined with external fixation (LIFEF) are two widely used methods to treat Pilon injury. However, which method is superior to the other remains controversial. This meta-analysis was performed to quantitatively compare two-stage ORIF and LIFEF and clarify which method is better with respect to postoperative complications in the treatment of tibial Pilon fractures. Methods We conducted a meta-analysis to quantitatively compare the postoperative complications between two-stage ORIF and LIFEF. Eight studies involving 360 fractures in 359 patients were included in the meta-analysis. Results The two-stage ORIF group had a significantly lower risk of superficial infection, nonunion, and bone healing problems than the LIFEF group. However, no significant differences in deep infection, delayed union, malunion, arthritis symptoms, or chronic osteomyelitis were found between the two groups. Conclusion Two-stage ORIF was associated with a lower risk of postoperative complications with respect to superficial infection, nonunion, and bone healing problems than LIFEF for tibial Pilon fractures. Level of evidence 2.
Beluga whale, Delphinapterus leucas, vocalizations from the Churchill River, Manitoba, Canada.
Chmelnitsky, Elly G; Ferguson, Steven H
2012-06-01
Classification of animal vocalizations is often done by a human observer using aural and visual analysis but more efficient, automated methods have also been utilized to reduce bias and increase reproducibility. Beluga whale, Delphinapterus leucas, calls were described from recordings collected in the summers of 2006-2008, in the Churchill River, Manitoba. Calls (n=706) were classified based on aural and visual analysis, and call characteristics were measured; calls were separated into 453 whistles (64.2%; 22 types), 183 pulsed∕noisy calls (25.9%; 15 types), and 70 combined calls (9.9%; seven types). Measured parameters varied within each call type but less variation existed in pulsed and noisy call types and some combined call types than in whistles. A more efficient and repeatable hierarchical clustering method was applied to 200 randomly chosen whistles using six call characteristics as variables; twelve groups were identified. Call characteristics varied less in cluster analysis groups than in whistle types described by visual and aural analysis and results were similar to the whistle contours described. This study provided the first description of beluga calls in Hudson Bay and using two methods provides more robust interpretations and an assessment of appropriate methods for future studies.
System of Systems Analytic Workbench - 2017
2017-08-31
…and transitional activities with key collaborators. The tools in the methods of the SoS-AWB include System Operational Dependency Analysis (SODA) and System Developmental Dependency Analysis, … development of standard dependencies with combinations of low-medium-high parameters. Report No. SERC-2017-TR-111.
41 CFR 60-2.12 - Job group analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 41 Public Contracts and Property Management 1 2013-07-01 2013-07-01 false Job group analysis. 60-2... 2-AFFIRMATIVE ACTION PROGRAMS Purpose and Contents of Affirmative Action Programs § 60-2.12 Job group analysis. (a) Purpose: A job group analysis is a method of combining job titles within the...
41 CFR 60-2.12 - Job group analysis.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 41 Public Contracts and Property Management 1 2012-07-01 2009-07-01 true Job group analysis. 60-2... 2-AFFIRMATIVE ACTION PROGRAMS Purpose and Contents of Affirmative Action Programs § 60-2.12 Job group analysis. (a) Purpose: A job group analysis is a method of combining job titles within the...
41 CFR 60-2.12 - Job group analysis.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 41 Public Contracts and Property Management 1 2014-07-01 2014-07-01 false Job group analysis. 60-2... 2-AFFIRMATIVE ACTION PROGRAMS Purpose and Contents of Affirmative Action Programs § 60-2.12 Job group analysis. (a) Purpose: A job group analysis is a method of combining job titles within the...
On-line/on-site analysis of heavy metals in water and soils by laser induced breakdown spectroscopy
NASA Astrophysics Data System (ADS)
Meng, Deshuo; Zhao, Nanjing; Wang, Yuanyuan; Ma, Mingjun; Fang, Li; Gu, Yanhong; Jia, Yao; Liu, Jianguo
2017-11-01
Enrichment methods for heavy metals in water using graphite and aluminum electrodes were studied and combined with a plasma confinement device to improve detection sensitivity and reduce the limits of detection (LODs) of the elements. With aluminum electrode enrichment, the LODs of Cd, Pb and Ni can be as low as several ppb. With graphite enrichment, the measurement time can be less than 3 min. The results showed that the graphite enrichment and aluminum electrode enrichment methods can effectively improve the LIBS detection capability. The graphite enrichment method combined with plasma spatial confinement is more suitable for on-line monitoring of industrial waste water, while the aluminum electrode enrichment method can be used for trace heavy metal detection in water. A LIBS method and device for analysis of heavy metals in soil were also developed, and a mobile LIBS system was tested in the field. The measurement results from LIBS and ICP-MS showed good consistency. These results provide important application support for rapid, on-site monitoring of heavy metals in soil. (Left: the mobile LIBS system for analysis of heavy metals in soils. Top right: the spatial confinement device. Bottom right: automatic graphite enrichment device for on-line analysis of heavy metals in water.)
Enriching Planning through Industry Analysis
ERIC Educational Resources Information Center
Martinez, Mario; Wolverton, Mimi
2009-01-01
Strategic planning is an important tool, but the sole dependence on it across departments and campuses has resulted in the underutilization of equally important methods of analysis. The evolution of higher and postsecondary education necessitates a systemic industry analysis, as the combination of new providers and delivery mechanisms and changing…
Chen, Yun; Yang, Hui
2013-01-01
Heart rate variability (HRV) analysis has emerged as an important research topic for evaluating autonomic cardiac function. However, traditional time- and frequency-domain analyses characterize and quantify only linear and stationary phenomena. In the present investigation, we made a comparative analysis of three alternative approaches (i.e., wavelet multifractal analysis, Lyapunov exponents and multiscale entropy analysis) for quantifying nonlinear dynamics in heart rate time series. Note that these extracted nonlinear features provide information about nonlinear scaling behaviors and the complexity of cardiac systems. To evaluate the performance, we used 24-hour HRV recordings from 54 healthy subjects and 29 heart failure patients, available in PhysioNet. The three nonlinear methods are evaluated not only individually but also in combination using three classification algorithms, i.e., linear discriminant analysis, quadratic discriminant analysis and k-nearest neighbors. Experimental results show that the three nonlinear methods capture nonlinear dynamics from different perspectives and that the combined feature set achieves the best performance, i.e., sensitivity 97.7% and specificity 91.5%. Collectively, nonlinear HRV features show promise for identifying disorders in autonomic cardiovascular function.
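A minimal sketch of the classifier comparison, with synthetic stand-ins for the wavelet-multifractal, Lyapunov-exponent and multiscale-entropy features; the sample sizes mirror the cohort described above.

```python
# Compare LDA, QDA and k-NN on a combined nonlinear-feature matrix.
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(83, 9))           # 83 subjects x 3 features per method
y = np.r_[np.zeros(54), np.ones(29)]   # healthy vs. heart-failure labels

for clf in (LinearDiscriminantAnalysis(),
            QuadraticDiscriminantAnalysis(),
            KNeighborsClassifier(n_neighbors=5)):
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(type(clf).__name__, f"{acc:.2f}")
```

With real features in place of the random matrix, the combined 9-column set is what the study found to outperform any single method's 3-column subset.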
NASA Astrophysics Data System (ADS)
Fujimoto, Kazuhiro J.
2012-07-01
A transition-density-fragment interaction (TDFI) combined with a transfer integral (TI) method is proposed. The TDFI method was previously developed for describing electronic Coulomb interaction, and was applied to excitation-energy transfer (EET) [K. J. Fujimoto and S. Hayashi, J. Am. Chem. Soc. 131, 14152 (2009)] and exciton-coupled circular dichroism spectra [K. J. Fujimoto, J. Chem. Phys. 133, 124101 (2010)]. In the present study, the TDFI method is extended to the exchange interaction, and hence it is combined with the TI method for application to EET via charge-transfer (CT) states. In this scheme, the overlap correction is also taken into account. To check the TDFI-TI accuracy, several test calculations are performed on an ethylene dimer. As a result, the TDFI-TI method gives a much improved description of the electronic coupling, compared with the previous TDFI method. Based on the successful description of the electronic coupling, a decomposition analysis is also performed with the TDFI-TI method. The present analysis clearly shows a large contribution from the Coulomb interaction in most of the cases, and a significant influence of the CT states at small separation. In addition, the exchange interaction is found to be small in this system. The present approach is useful for analyzing and understanding the mechanism of EET.
Proper orthogonal decomposition-based spectral higher-order stochastic estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baars, Woutijn J., E-mail: wbaars@unimelb.edu.au; Tinney, Charles E.
A unique routine, capable of identifying both linear and higher-order coherence in multiple-input/output systems, is presented. The technique combines two well-established methods: Proper Orthogonal Decomposition (POD) and Higher-Order Spectra Analysis. The latter of these is based on known methods for characterizing nonlinear systems by way of Volterra series. In that, both linear and higher-order kernels are formed to quantify the spectral (nonlinear) transfer of energy between the system's input and output. This reduces essentially to spectral Linear Stochastic Estimation when only first-order terms are considered, and is therefore presented in the context of stochastic estimation as spectral Higher-Order Stochastic Estimation (HOSE). The trade-off to seeking higher-order transfer kernels is that the increased complexity restricts the analysis to single-input/output systems. Low-dimensional (POD-based) analysis techniques are inserted to alleviate this void, as POD coefficients represent the dynamics of the spatial structures (modes) of a multi-degree-of-freedom system. The mathematical framework behind this POD-based HOSE method is first described. The method is then tested in the context of jet aeroacoustics by modeling acoustically efficient large-scale instabilities as combinations of wave packets. The growth, saturation, and decay of these spatially convecting wave packets are shown to couple both linearly and nonlinearly in the near-field to produce waveforms that propagate acoustically to the far-field for different frequency combinations.
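As a small illustration of the POD step, the sketch below computes snapshot POD via the SVD; the snapshot matrix is random stand-in data, and the leading-mode time coefficients are what a HOSE-style estimator would ingest.

```python
# Snapshot POD: SVD of a mean-subtracted snapshot matrix.
import numpy as np

rng = np.random.default_rng(2)
snapshots = rng.normal(size=(256, 500))   # 256 spatial points x 500 samples
snapshots -= snapshots.mean(axis=1, keepdims=True)

U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
modes = U[:, :4]                          # leading spatial POD modes
coeffs = s[:4, None] * Vt[:4]             # time coefficients a_k(t)
energy = s**2 / np.sum(s**2)
print("energy captured by 4 modes:", energy[:4].sum())
```

Replacing the full field with a handful of mode coefficients is what makes the single-input/output restriction of the higher-order kernels tolerable.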
Time-frequency analysis of band-limited EEG with BMFLC and Kalman filter for BCI applications
2013-01-01
Background: Time-frequency analysis of the electroencephalogram (EEG) during different mental tasks has received significant attention. As EEG is non-stationary, time-frequency analysis is essential for analyzing brain states during different mental tasks. Further, the time-frequency information of the EEG signal can be used as a feature for classification in brain-computer interface (BCI) applications. Methods: To accurately model the EEG, a band-limited multiple Fourier linear combiner (BMFLC), a linear combination of truncated multiple Fourier series models, is employed. A state-space model for BMFLC in combination with a Kalman filter/smoother is developed to obtain accurate adaptive estimation. By construction, BMFLC with a Kalman filter/smoother provides an accurate time-frequency decomposition of the band-limited signal. Results: The proposed method is computationally fast and is suitable for real-time BCI applications. To evaluate the proposed algorithm, a comparison with the short-time Fourier transform (STFT) and the continuous wavelet transform (CWT) for both synthesized and real EEG data is performed in this paper. The proposed method is applied to BCI Competition data IV for ERD detection in comparison with existing methods. Conclusions: Results show that the proposed algorithm can provide optimal time-frequency resolution as compared to STFT and CWT. For ERD detection, BMFLC-KF outperforms STFT and BMFLC-KS in real-time applicability with low computational requirements. PMID:24274109
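A minimal sketch of the BMFLC-KF idea under assumed noise levels: the state vector holds the Fourier coefficients over a fixed band, the observation row is the sin/cos basis at each sample, and a random-walk Kalman filter adapts the coefficients. The band, rates and noise covariances are illustrative, not the paper's settings.

```python
# BMFLC with a scalar-observation Kalman filter over the alpha band.
import numpy as np

fs, band = 256, np.arange(8.0, 12.5, 0.5)   # sampling rate (Hz), band (Hz)
t = np.arange(0, 4, 1 / fs)
y = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)

n = 2 * band.size
w = np.zeros(n)                  # state: stacked [a_k, b_k] coefficients
P = np.eye(n)
Q, R = 1e-4 * np.eye(n), 0.1     # process / measurement noise (assumed)

est = np.empty_like(y)
for i, ti in enumerate(t):
    h = np.r_[np.sin(2 * np.pi * band * ti), np.cos(2 * np.pi * band * ti)]
    P = P + Q                                 # predict (random-walk state)
    k = P @ h / (h @ P @ h + R)               # Kalman gain
    w = w + k * (y[i] - h @ w)                # update coefficients
    P = (np.eye(n) - np.outer(k, h)) @ P
    est[i] = h @ w                            # reconstructed signal sample

print("per-frequency amplitudes:", np.hypot(w[:band.size], w[band.size:]))
print("RMS tracking error:", np.sqrt(np.mean((y - est) ** 2)))
```

The per-frequency amplitudes traced over time give the band-limited time-frequency map used as the BCI feature.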
Vivas, M; Silveira, S F; Viana, A P; Amaral, A T; Cardoso, D L; Pereira, M G
2014-07-02
Diallel crossing methods provide information regarding the performance of genitors between themselves and in their hybrid combinations. However, with a large number of parents, the number of hybrid combinations that can be obtained and evaluated becomes limiting. One option for handling the number of parents involved is the adoption of circulant diallels. However, information is lacking regarding diallel analysis using mixed models. This study aimed to evaluate the efficacy of the linear mixed model method for estimating, for resistance to foliar fungal diseases, components of general and specific combining ability in a circulant diallel with different s values. Subsequently, 50 diallels were simulated for each s value, and the correlations and estimates of the combining abilities of the different diallel combinations were analyzed. The circulant diallel method using mixed modeling was effective in classifying genitors by their combining abilities relative to complete diallels. The number of crosses in which each genitor appears in the circulant diallel and the estimated heritability affect the combining ability estimates. With three crosses per parent, it is possible to obtain good concordance (correlation above 0.8) between the combining ability estimates.
On the complexity of a combined homotopy interior method for convex programming
NASA Astrophysics Data System (ADS)
Yu, Bo; Xu, Qing; Feng, Guochen
2007-03-01
In [G.C. Feng, Z.H. Lin, B. Yu, Existence of an interior pathway to a Karush-Kuhn-Tucker point of a nonconvex programming problem, Nonlinear Anal. 32 (1998) 761-768; G.C. Feng, B. Yu, Combined homotopy interior point method for nonlinear programming problems, in: H. Fujita, M. Yamaguti (Eds.), Advances in Numerical Mathematics, Proceedings of the Second Japan-China Seminar on Numerical Mathematics, Lecture Notes in Numerical and Applied Analysis, vol. 14, Kinokuniya, Tokyo, 1995, pp. 9-16; Z.H. Lin, B. Yu, G.C. Feng, A combined homotopy interior point method for convex programming problem, Appl. Math. Comput. 84 (1997) 193-211.], a combined homotopy was constructed for solving non-convex programming and convex programming with weaker conditions, without assuming the logarithmic barrier function to be strictly convex and the solution set to be bounded. It was proven that a smooth interior path from an interior point of the feasible set to a K-K-T point of the problem exists. This shows that combined homotopy interior point methods can solve the problem that commonly used interior point methods cannot solve. However, so far, there is no result on its complexity, even for linear programming. The main difficulty is that the objective function is not monotonically decreasing on the combined homotopy path. In this paper, by taking a piecewise technique, under commonly used conditions, polynomiality of a combined homotopy interior point method is given for convex nonlinear programming.
NASA Astrophysics Data System (ADS)
Kang, Kwang-Song; Hu, Nai-Lian; Sin, Chung-Sik; Rim, Song-Ho; Han, Eun-Cheol; Kim, Chol-Nam
2017-08-01
It is very important to obtain the mechanical parameters of rock mass for excavation design, support design, slope design and stability analysis of underground structures. In order to estimate the mechanical parameters of rock mass exactly, a new method combining a geological strength index (GSI) system with intelligent displacement back analysis is proposed in this paper. Firstly, the average spacing of joints (d), rock mass block rating (RBR, a new quantitative factor), surface condition rating (SCR) and joint condition factor (Jc) are obtained for in situ rock masses using the scanline method, and the GSI values of rock masses are obtained from a new quantitative GSI chart. A correction method for the GSI value is newly introduced by considering the influence of joint orientation and groundwater on rock mass mechanical properties, and then value ranges of rock mass mechanical parameters are chosen using the Hoek-Brown failure criterion. Secondly, on the basis of the measured vault settlements and horizontal convergence displacements of an in situ tunnel, optimal parameters are estimated by a combination of a genetic algorithm (GA) and numerical simulation analysis using FLAC3D. This method has been applied in a lead-zinc mine. By utilizing the improved GSI quantization, the correction method and displacement back analysis, the mechanical parameters of the ore body, hanging wall and footwall rock mass were determined, so that reliable foundations were provided for mining design and stability analysis.
Applying Meta-Analysis to Structural Equation Modeling
ERIC Educational Resources Information Center
Hedges, Larry V.
2016-01-01
Structural equation models play an important role in the social sciences. Consequently, there is an increasing use of meta-analytic methods to combine evidence from studies that estimate the parameters of structural equation models. Two approaches are used to combine evidence from structural equation models: A direct approach that combines…
USDA-ARS?s Scientific Manuscript database
Rice seeds of the temperate japonica cultivar Kitaake were mutagenized with sodium azide alone and in combination with methyl nitrosourea. Using the reduced representation sequencing method Restriction Enzyme Sequence Comparative Analysis (RESCAN), the mutation densities, types and local sequence co...
Jun, Goo; Flickinger, Matthew; Hetrick, Kurt N.; Romm, Jane M.; Doheny, Kimberly F.; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min
2012-01-01
DNA sample contamination is a serious problem in DNA sequencing studies and may result in systematic genotype misclassification and false positive associations. Although methods exist to detect and filter out cross-species contamination, few methods to detect within-species sample contamination are available. In this paper, we describe methods to identify within-species DNA sample contamination based on (1) a combination of sequencing reads and array-based genotype data, (2) sequence reads alone, and (3) array-based genotype data alone. Analysis of sequencing reads allows contamination detection after sequence data is generated but prior to variant calling; analysis of array-based genotype data allows contamination detection prior to generation of costly sequence data. Through a combination of analysis of in silico and experimentally contaminated samples, we show that our methods can reliably detect and estimate levels of contamination as low as 1%. We evaluate the impact of DNA contamination on genotype accuracy and propose effective strategies to screen for and prevent DNA contamination in sequencing studies. PMID:23103226
NASA Technical Reports Server (NTRS)
Reddy, C. J.; Deshpande, M. D.; Fralick, D. T.; Cockrell, C. R.; Beck, F. B.
1996-01-01
Radiation pattern prediction analysis of elliptically polarized cavity-backed aperture antennas in a finite ground plane is performed using a combined Finite Element Method/Method of Moments/Geometrical Theory of Diffraction (FEM/MoM/GTD) technique. The magnetic current on the cavity-backed aperture in an infinite ground plane is calculated using the combined FEM/MoM analysis. GTD, including the slope diffraction contribution, is used to calculate the diffracted fields caused by both soft and hard polarizations at the edges of the finite ground plane. Explicit expressions for regular diffraction coefficients and slope diffraction coefficients are presented. The slope of the incident magnetic field at the diffraction points is derived and analytical expressions are presented. Numerical results for the radiation patterns of a cavity-backed circular spiral microstrip patch antenna excited by a coaxial probe in a finite rectangular ground plane are computed and compared with experimental results.
Yoshida, T; Kondo, N; Hanifah, Y A; Hiramatsu, K
1997-01-01
We have previously reported the phenotypic characterization of methicillin-resistant Staphylococcus aureus (MRSA) clinical strains isolated in Malaya University Hospital in the period 1987 to 1989 using antibiogram, coagulase typing, plasmid profiles, and phage typing. Here, we report the analysis of the same strains with three genotyping methods: ribotyping, pulsed-field gel electrophoresis (PFGE) typing, and IS431 typing (a restriction enzyme fragment length polymorphism analysis using an IS431 probe). Ribotyping could discriminate the 46 clinical MRSA strains into 5 ribotypes, PFGE typing into 22 types, and IS431 typing into 15 types. Since the differences of the three genotyping patterns from strain to strain were quite independent of one another, the combined use of the three genotyping methods could discriminate the 46 strains into 39 genotypes. Thus, the powerful discriminatory ability of the combination was demonstrated.
Semantic classification of business images
NASA Astrophysics Data System (ADS)
Erol, Berna; Hull, Jonathan J.
2006-01-01
Digital cameras are becoming increasingly common for capturing information in business settings. In this paper, we describe a novel method for classifying images into the following semantic classes: document, whiteboard, business card, slide, and regular image. Our method is based on combining low-level image features, such as text color, layout, and handwriting features, with high-level OCR output analysis. Several Support Vector Machine classifiers are combined for multi-class classification of input images. The system yields 95% accuracy in classification.
Lefebvre, Alexandre; Rochefort, Gael Y.; Santos, Frédéric; Le Denmat, Dominique; Salmon, Benjamin; Pétillon, Jean-Marc
2016-01-01
Over the last decade, biomedical 3D-imaging tools have gained widespread use in the analysis of prehistoric bone artefacts. While initial attempts to characterise the major categories used in osseous industry (i.e. bone, antler, and dentine/ivory) have been successful, the taxonomic determination of prehistoric artefacts remains to be investigated. The distinction between reindeer and red deer antler can be challenging, particularly in cases of anthropic and/or taphonomic modifications. In addition to the range of destructive physicochemical identification methods available (mass spectrometry, isotopic ratio, and DNA analysis), X-ray micro-tomography (micro-CT) provides convincing non-destructive 3D images and analyses. This paper presents the experimental protocol (sample scans, image processing, and statistical analysis) we have developed in order to identify modern and archaeological antler collections (from Isturitz, France). This original method is based on bone microstructure analysis combined with advanced statistical support vector machine (SVM) classifiers. A combination of six microarchitecture biomarkers (bone volume fraction, trabecular number, trabecular separation, trabecular thickness, trabecular bone pattern factor, and structure model index) were screened using micro-CT in order to characterise internal alveolar structure. Overall, reindeer alveoli presented a tighter mesh than red deer alveoli, and statistical analysis allowed us to distinguish archaeological antler by species with an accuracy of 96%, regardless of anatomical location on the antler. In conclusion, micro-CT combined with SVM classifiers proves to be a promising additional non-destructive method for antler identification, suitable for archaeological artefacts whose degree of human modification and cultural heritage or scientific value has previously made it impossible (tools, ornaments, etc.). PMID:26901355
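A minimal sketch of the classification step, assuming the six trabecular biomarkers per alveolus have already been measured from micro-CT; the feature values below are synthetic placeholders.

```python
# SVM over six micro-CT trabecular biomarkers, with feature standardization.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
# columns: BV/TV, Tb.N, Tb.Sp, Tb.Th, Tb.Pf, SMI
X = np.vstack([rng.normal(0.0, 1, (40, 6)),     # reindeer-like samples
               rng.normal(1.0, 1, (40, 6))])    # red-deer-like samples
y = np.r_[np.zeros(40), np.ones(40)]

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

Cross-validated accuracy is the analogue of the 96% species-identification figure reported above.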
Evaluation of digestion methods for analysis of trace metals in mammalian tissues and NIST 1577c.
Binder, Grace A; Metcalf, Rainer; Atlas, Zachary; Daniel, Kenyon G
2018-02-15
Digestion techniques for ICP analysis have been poorly studied for biological samples. This report describes an optimized method for analysis of trace metals that can be used across a variety of sample types. Digestion methods were tested and optimized with the analysis of trace metals in cancerous as compared to normal tissue as the end goal. Anthropological, forensic, oncological and environmental research groups can employ this method reasonably cheaply and safely whilst still being able to compare between laboratories. We examined combined HNO 3 and H 2 O 2 digestion at 170 °C for human, porcine and bovine samples whether they are frozen, fresh or lyophilized powder. Little discrepancy is found between microwave digestion and PFA Teflon pressure vessels. The elements of interest (Cu, Zn, Fe and Ni) yielded consistently higher and more accurate values on standard reference material than samples heated to 75 °C or samples that utilized HNO 3 alone. Use of H 2 SO 4 does not improve homogeneity of the sample and lowers precision during ICP analysis. High temperature digestions (>165 °C) using a combination of HNO 3 and H 2 O 2 as outlined are proposed as a standard technique for all mammalian tissues, specifically, human tissues and yield greater than 300% higher values than samples digested at 75 °C regardless of the acid or acid combinations used. The proposed standardized technique is designed to accurately quantify potential discrepancies in metal loads between cancerous and healthy tissues and applies to numerous tissue studies requiring quick, effective and safe digestions. Copyright © 2017 Elsevier Inc. All rights reserved.
Ahmadi, Mehdi; Shahlaei, Mohsen
2015-01-01
P2X7 antagonist activity for a set of 49 molecules of the P2X7 receptor antagonists, derivatives of purine, was modeled with the aid of chemometric and artificial intelligence techniques. The activity of these compounds was estimated by means of combination of principal component analysis (PCA), as a well-known data reduction method, genetic algorithm (GA), as a variable selection technique, and artificial neural network (ANN), as a non-linear modeling method. First, a linear regression, combined with PCA, (principal component regression) was operated to model the structure-activity relationships, and afterwards a combination of PCA and ANN algorithm was employed to accurately predict the biological activity of the P2X7 antagonist. PCA preserves as much of the information as possible contained in the original data set. Seven most important PC's to the studied activity were selected as the inputs of ANN box by an efficient variable selection method, GA. The best computational neural network model was a fully-connected, feed-forward model with 7-7-1 architecture. The developed ANN model was fully evaluated by different validation techniques, including internal and external validation, and chemical applicability domain. All validations showed that the constructed quantitative structure-activity relationship model suggested is robust and satisfactory.
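A minimal sketch of the PCA-to-ANN portion of this workflow using scikit-learn; the GA-based selection of the seven components is omitted, and the descriptor matrix and activities are random placeholders.

```python
# PCA feature reduction feeding a small feed-forward ANN (7-7-1).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
X = rng.normal(size=(49, 120))       # 49 molecules x 120 descriptors
y = rng.normal(size=49)              # activity values (placeholder)

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=7),                               # 7 PCs as ANN inputs
    MLPRegressor(hidden_layer_sizes=(7,), max_iter=5000, random_state=0),
)
model.fit(X, y)                      # 7-7-1 feed-forward architecture
print("R^2 on training data:", model.score(X, y))
```

In the published workflow, a genetic algorithm chooses which principal components enter the network, and the model is then validated internally and externally rather than scored on training data alone.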
A collocation-shooting method for solving fractional boundary value problems
NASA Astrophysics Data System (ADS)
Al-Mdallal, Qasem M.; Syam, Muhammed I.; Anwar, M. N.
2010-12-01
In this paper, we discuss the numerical solution of a special class of fractional boundary value problems of order 2. The method of solution is based on conjugating collocation and spline analysis combined with a shooting method. A theoretical result on the existence and uniqueness of the exact solution for the present class is proven. Two examples involving the Bagley-Torvik equation subject to boundary conditions are also presented; numerical results illustrate the accuracy of the present scheme.
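A minimal sketch of the shooting component, applied for brevity to an ordinary (integer-order) second-order two-point BVP rather than the fractional Bagley-Torvik problem treated in the paper; the equation and bracketing interval are illustrative.

```python
# Shooting method: root-find the initial slope so the trajectory hits y(1) = B.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

A, B = 0.0, 1.0
f = lambda x, y: [y[1], -y[0] + x]          # y'' = -y + x (illustrative)

def endpoint_miss(slope):
    sol = solve_ivp(f, (0, 1), [A, slope])
    return sol.y[0, -1] - B                  # mismatch at x = 1

slope = brentq(endpoint_miss, -10.0, 10.0)   # bracket assumed to contain root
print(f"shooting slope y'(0) = {slope:.6f}")
```

In the paper's setting, the initial-value integration inside `endpoint_miss` would be replaced by the collocation-spline solver for the fractional operator.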
Shen, Songjie; Xu, Qianqian; Zhou, Yidong; Mao, Feng; Guan, Jinghong; Sun, Qiang
2018-05-22
There were limited data available for a head-to-head comparison of the identification rate and survival between the combined method of indocyanine green fluorescence and blue dye versus the traditional blue dye alone method for sentinel lymph node (SLN) biopsy. From January 2013 to December 2015, 523 eligible breast cancer patients were included in this nonrandomized prospective analysis. The identification rates, the number of SLNs identified, and the disease-free survival (DFS) between the two mapping methods were compared. The identification rate of SLNs was significantly higher with the combined method than that with the blue dye alone method (99.2% vs 93.3%, respectively; P < 0.001). The average number of SLNs identified per patient in the combined method group was 3.7 ± 2.4, which was more than that in the blue dye alone group (3.2 ± 1.6; P = 0.004). With a median follow-up of 29 months, 0.5% patients in the combined group, and 1.3% patients in the blue dye group had axillary recurrences. The DFS between the two groups showed no significant difference (P = 0.161). The combined method achieved a higher identification rate and lower rate of axillary recurrence compared to the blue dye alone method. © 2018 Wiley Periodicals, Inc.
Fatigue Analysis of Overhead Sign and Signal Structures
DOT National Transportation Integrated Search
1994-05-01
This report documents methods of fatigue analysis for overhead sign and signal structures. The main purpose of this report is to combine pertinent wind loading and vibration theory, fatigue damage theory, and experimental data into a useable fatigue ...
Noninvasive glucose monitoring by optical reflective and thermal emission spectroscopic measurements
NASA Astrophysics Data System (ADS)
Saetchnikov, V. A.; Tcherniavskaia, E. A.; Schiffner, G.
2005-08-01
Noninvasive method for blood glucose monitoring in cutaneous tissue based on reflective spectrometry combined with a thermal emission spectroscopy has been developed. Regression analysis, neural network algorithms and cluster analysis are used for data processing.
Lifshits, A M
1979-01-01
General characteristics of multivariate statistical analysis (MSA) are given. Methodological premises and criteria for the selection of an adequate MSA method applicable to pathoanatomic investigations of the epidemiology of multicausal diseases are presented. The experience of using MSA with computers and standard computing programs in studies of coronary artery atherosclerosis on the materials of 2060 autopsies is described. The combined use of 4 MSA methods (sequential, correlational, regressional, and discriminant) made it possible to quantify the contribution of each of the 8 examined risk factors to the development of atherosclerosis. The most important factors were found to be age, arterial hypertension, and heredity. Occupational hypodynamia and increased fatness were more important in men, whereas diabetes mellitus was more important in women. The registration of this combination of risk factors by MSA methods provides a more reliable prognosis of the likelihood of coronary heart disease with a fatal outcome than prognosis of the degree of coronary atherosclerosis alone.
Song, Yuqiao; Liao, Jie; Dong, Junxing; Chen, Li
2015-09-01
The seeds of grapevine (Vitis vinifera) are a byproduct of wine production. To examine the potential value of grape seeds, grape seeds from seven sources were subjected to fingerprinting using direct analysis in real time coupled with time-of-flight mass spectrometry combined with chemometrics. Firstly, we listed all reported components (56 components) from grape seeds and calculated the precise m/z values of the deprotonated ions [M−H]⁻. Secondly, the experimental conditions were systematically optimized based on the peak areas of total ion chromatograms of the samples. Thirdly, the seven grape seed samples were examined using the optimized method. Information about 20 grape seed components was utilized to represent characteristic fingerprints. Finally, hierarchical clustering analysis and principal component analysis were performed to analyze the data. Grape seeds from seven different sources were classified into two clusters; hierarchical clustering analysis and principal component analysis yielded similar results. The results of this study lay the foundation for appropriate utilization and exploitation of grape seed samples. Due to the absence of complicated sample preparation methods and chromatographic separation, the method developed in this study represents one of the simplest and least time-consuming methods for grape seed fingerprinting. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
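The chemometric step (hierarchical clustering analysis plus PCA on a samples-by-components peak-area matrix) can be sketched as follows; the 7 x 20 matrix is synthetic, standing in for the real peak areas of the 20 marker components.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
fingerprints = rng.random((7, 20))        # 7 seed sources x 20 component peak areas
fingerprints[:3] += 1.0                   # create two artificial groups

Z = linkage(fingerprints, method="ward")  # hierarchical clustering analysis
clusters = fcluster(Z, t=2, criterion="maxclust")

scores = PCA(n_components=2).fit_transform(fingerprints)  # PCA score coordinates
for s, c in zip(scores, clusters):
    print(f"PC1={s[0]:+.2f}  PC2={s[1]:+.2f}  cluster={c}")
```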
OPATs: Omnibus P-value association tests.
Chen, Chia-Wei; Yang, Hsin-Chou
2017-07-10
Combining statistical significances (P-values) from a set of single-locus association tests in genome-wide association studies is a proof-of-principle method for identifying disease-associated genomic segments, functional genes and biological pathways. We review P-value combinations for genome-wide association studies and introduce an integrated analysis tool, Omnibus P-value Association Tests (OPATs), which provides popular analysis methods of P-value combination. The OPATs software, programmed in R with an R graphical user interface, features a user-friendly interface. In addition to analysis modules for data quality control and single-locus association tests, OPATs provides three types of set-based association test: window-, gene- and biopathway-based. P-value combinations with or without threshold and rank truncation are provided. The significance of a set-based association test is evaluated by resampling procedures. The performance of the set-based association tests in OPATs has been evaluated through simulation studies and real data analyses. These set-based association tests help boost statistical power, alleviate the multiple-testing problem, reduce the impact of genetic heterogeneity, increase the replication efficiency of association tests, and facilitate the interpretation of association signals by streamlining the testing procedures and integrating the genetic effects of multiple variants in genomic regions of biological relevance. In summary, P-value combinations facilitate the identification of marker sets associated with disease susceptibility and help uncover missing heritability in association studies, thereby establishing a foundation for the genetic dissection of complex diseases and traits. OPATs provides an easy-to-use and statistically powerful analysis tool for P-value combinations. OPATs, examples, and a user guide can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/genetics/association/OPATs.htm. © The Author 2017. Published by Oxford University Press.
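For readers unfamiliar with P-value combination, the snippet below shows Fisher's and Stouffer's methods on an assumed window of single-locus P-values via SciPy; it is a generic illustration, not OPATs code, and the truncation step is shown only as set selection (the proper truncated-product null is more involved).

```python
from scipy.stats import combine_pvalues

window_pvals = [0.04, 0.20, 0.003, 0.51, 0.08]   # assumed single-SNP P-values

stat_f, p_fisher = combine_pvalues(window_pvals, method="fisher")
stat_s, p_stouffer = combine_pvalues(window_pvals, method="stouffer")
print(f"Fisher combined P = {p_fisher:.4f}, Stouffer combined P = {p_stouffer:.4f}")

# Threshold truncation at tau = 0.10: keep only P-values <= tau before combining.
tau = 0.10
truncated = [p for p in window_pvals if p <= tau]
print("truncated set:", truncated)
```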
Counseling Workers over 40: GULHEMP, a New Approach.
ERIC Educational Resources Information Center
Meredith, Jack
This series of presentations describes a method of job counseling and placement for the middle-aged that combines pre-employment physical worker analysis with job analysis for effective matching of job requirements with worker capacities. The matching process involves these steps: (1) job analysis by an industrial engineer; (2) worker examination…
ERIC Educational Resources Information Center
Hwang, Heungsun; Dillon, William R.; Takane, Yoshio
2006-01-01
An extension of multiple correspondence analysis is proposed that takes into account cluster-level heterogeneity in respondents' preferences/choices. The method involves combining multiple correspondence analysis and k-means in a unified framework. The former is used for uncovering a low-dimensional space of multivariate categorical variables…
Simultaneous Two-Way Clustering of Multiple Correspondence Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun; Dillon, William R.
2010-01-01
A 2-way clustering approach to multiple correspondence analysis is proposed to account for cluster-level heterogeneity of both respondents and variable categories in multivariate categorical data. Specifically, in the proposed method, multiple correspondence analysis is combined with k-means in a unified framework in which "k"-means is…
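The two Hwang et al. entries above combine multiple correspondence analysis with k-means. A rough tandem version (an MCA-style SVD of the indicator matrix followed by k-means) is sketched below on synthetic survey data; the authors' unified method alternates these steps in a single criterion, which this baseline does not reproduce.

```python
import numpy as np
from sklearn.preprocessing import OneHotEncoder
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
answers = rng.integers(0, 3, size=(100, 5)).astype(str)    # 100 respondents x 5 items

G = OneHotEncoder().fit_transform(answers).toarray()       # indicator matrix
P = G / G.sum()
r, c = P.sum(axis=1), P.sum(axis=0)
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))         # standardized residuals (MCA core)
U, d, Vt = np.linalg.svd(S, full_matrices=False)
scores = U[:, :2] * d[:2]                                  # 2-D respondent coordinates

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print("cluster sizes:", np.bincount(labels))
```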
Motif-Synchronization: A new method for analysis of dynamic brain networks with EEG
NASA Astrophysics Data System (ADS)
Rosário, R. S.; Cardoso, P. T.; Muñoz, M. A.; Montoya, P.; Miranda, J. G. V.
2015-12-01
The major aim of this work was to propose a new association method known as Motif-Synchronization. This method was developed to provide information about the degree and direction of synchronization between two nodes of a network by counting the number of occurrences of certain patterns shared between any two time series. The second objective of this work was to present a new methodology for the analysis of dynamic brain networks, combining the Time-Varying Graph (TVG) method with a directional association method. We further applied the new algorithms to a set of human electroencephalogram (EEG) signals to perform a dynamic analysis of brain functional networks (BFN).
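One plausible reading of the motif-counting idea is sketched below: encode each series as ordinal motifs and measure how often a motif in one series reappears in the other at a positive lag, so that the asymmetry of the two directions suggests a coupling direction. The motif length and lag window are assumptions, not the authors' definitions.

```python
import numpy as np

def motifs(x, m=3):
    """Ordinal motif sequence: the rank pattern of each length-m window."""
    windows = np.lib.stride_tricks.sliding_window_view(x, m)
    return [tuple(np.argsort(w)) for w in windows]

def sync(a, b, max_lag=5):
    """Mean rate at which a motif of `a` reappears in `b` at a positive lag."""
    ma, mb = motifs(a), motifs(b)
    n = len(ma) - max_lag
    return np.mean([[ma[i] == mb[i + l] for l in range(1, max_lag + 1)]
                    for i in range(n)])

rng = np.random.default_rng(3)
x = rng.normal(size=500)
y = np.roll(x, 2) + 0.2 * rng.normal(size=500)   # y follows x with a 2-sample lag
print("x->y:", round(sync(x, y), 3), " y->x:", round(sync(y, x), 3))
```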
NASA Astrophysics Data System (ADS)
Shayanfar, Mohsen Ali; Barkhordari, Mohammad Ali; Roudak, Mohammad Amin
2017-06-01
Monte Carlo simulation (MCS) is a useful tool for computing the probability of failure in reliability analysis. However, the large number of required random samples makes it time-consuming. The response surface method (RSM) is another common method in reliability analysis. Although RSM is widely used for its simplicity, it cannot be trusted in highly nonlinear problems due to its linear nature. In this paper, a new efficient algorithm, employing the combination of importance sampling, as a class of MCS, and RSM, is proposed. In the proposed algorithm, the analysis starts with importance sampling concepts, using a two-step design-point updating rule. This part finishes after a small number of samples are generated. Then RSM starts to work using Bucher's experimental design, with the last design point and an effective length as the center point and radius of Bucher's approach, respectively. Through illustrative numerical examples, the simplicity and efficiency of the proposed algorithm and the effectiveness of the proposed rules are shown.
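The importance-sampling half of the algorithm can be illustrated on a toy limit state; the function g, the design point, and the sampling density below are assumptions chosen so that the exact answer is known.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

g = lambda x: 3.0 - x[..., 0] - x[..., 1]     # toy limit state; failure when g < 0
design_point = np.array([1.5, 1.5])           # nearest point to the origin on g = 0

rng = np.random.default_rng(4)
n = 20_000
samples = rng.normal(design_point, 1.0, size=(n, 2))   # sample around the design point

f = multivariate_normal(mean=[0.0, 0.0]).pdf(samples)        # true standard-normal density
h = multivariate_normal(mean=design_point).pdf(samples)      # sampling density
pf = np.mean((g(samples) < 0) * f / h)                       # weighted failure indicator

print(f"P_f ~ {pf:.4e}   (exact {norm.sf(3 / np.sqrt(2)):.4e})")
```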
Secure Method for Biometric-Based Recognition with Integrated Cryptographic Functions
Chiou, Shin-Yan
2013-01-01
Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, certification in biometric systems need not achieve 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied. PMID:23762851
Methods to maximise recovery of environmental DNA from water samples.
Hinlo, Rheyda; Gleeson, Dianne; Lintermans, Mark; Furlan, Elise
2017-01-01
The environmental DNA (eDNA) method is a detection technique that is rapidly gaining credibility as a sensitive tool useful in the surveillance and monitoring of invasive and threatened species. Because eDNA analysis often deals with small quantities of short and degraded DNA fragments, methods that maximize eDNA recovery are required to increase detectability. In this study, we performed experiments at different stages of the eDNA analysis to show which combinations of methods give the best recovery rate for eDNA. Using the Oriental weatherloach (Misgurnus anguillicaudatus) as a study species, we show that various combinations of DNA capture, preservation and extraction methods can significantly affect DNA yield. Filtration using cellulose nitrate filter paper preserved in ethanol or stored in a -20°C freezer and extracted with the Qiagen DNeasy kit outperformed other combinations in terms of cost and efficiency of DNA recovery. Our results support the recommendation to filter water samples within 24 hours, but if this is not possible, our results suggest that refrigeration may be a better option than freezing for short-term storage (i.e., 3-5 days). This information is useful in designing eDNA detection of low-density invasive or threatened species, where small variations in DNA recovery can signify the difference between detection success and failure.
Clinical Trials With Large Numbers of Variables: Important Advantages of Canonical Analysis.
Cleophas, Ton J
2016-01-01
Canonical analysis assesses the combined effects of a set of predictor variables on a set of outcome variables, but it is little used in clinical trials despite the omnipresence of multiple variables. The aim of this study was to assess the performance of canonical analysis as compared with traditional multivariate methods using multivariate analysis of covariance (MANCOVA). As an example, a simulated data file with 12 gene expression levels and 4 drug efficacy scores was used. The correlation coefficient between the 12 predictor and 4 outcome variables was 0.87 (P = 0.0001), meaning that 76% of the variability in the outcome variables was explained by the 12 covariates. Repeated testing after the removal of 5 unimportant predictor variables and 1 outcome variable produced virtually the same overall result. The MANCOVA identified identical unimportant variables, but it was unable to provide overall statistics. (1) Canonical analysis is remarkable because it can handle many more variables than traditional multivariate methods such as MANCOVA can. (2) At the same time, it accounts for the relative importance of the separate variables, their interactions and differences in units. (3) Canonical analysis provides overall statistics of the effects of sets of variables, whereas traditional multivariate methods only provide the statistics of the separate variables. (4) Unlike other methods for combining the effects of multiple variables, such as factor analysis/partial least squares, canonical analysis is scientifically entirely rigorous. (5) Limitations include that it is less flexible than factor analysis/partial least squares, because only 2 sets of variables are used and because multiple solutions instead of one are offered. We hope that this article will stimulate clinical investigators to start using this remarkable method.
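A small worked example of canonical analysis in the spirit of the simulation (12 predictors, 4 outcomes) can be run with scikit-learn's CCA; the data below are synthetic stand-ins, not the paper's file.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(5)
genes = rng.normal(size=(250, 12))                       # 12 gene expression levels
efficacy = genes[:, :4] @ rng.normal(size=(4, 4)) + rng.normal(size=(250, 4))

cca = CCA(n_components=1).fit(genes, efficacy)
u, v = cca.transform(genes, efficacy)                    # canonical variate scores
r = np.corrcoef(u[:, 0], v[:, 0])[0, 1]
print(f"first canonical correlation ~ {r:.2f}, shared variance ~ {r**2:.0%}")
```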
Combining individual participant and aggregated data in a meta-analysis with correlational studies.
Pigott, Terri; Williams, Ryan; Polanin, Joshua
2012-12-01
This paper presents methods for combining individual participant data (IPD) with aggregated study level data (AD) in a meta-analysis of correlational studies. Although medical researchers have employed IPD in a wide range of studies, only a single example exists in the social sciences. New policies at the National Science Foundation requiring grantees to submit data archiving plans may increase social scientists' access to individual level data that could be combined with traditional meta-analysis. The methods presented here extend prior work on IPD to meta-analyses using correlational studies. The examples presented illustrate the synthesis of publicly available national datasets in education with aggregated study data from a meta-analysis examining the correlation of socioeconomic status measures and academic achievement. The major benefit of the inclusion of the individual level is that both within-study and between-study interactions among moderators of effect size can be estimated. Given the potential growth in data archives in the social sciences, we should see a corresponding increase in the ability to synthesize IPD and AD in a single meta-analysis, leading to a more complete understanding of how within-study and between-study moderators relate to effect size. Copyright © 2012 John Wiley & Sons, Ltd.
ERIC Educational Resources Information Center
O'Halloran, Kay L.; Tan, Sabine; Pham, Duc-Son; Bateman, John; Vande Moere, Andrew
2018-01-01
This article demonstrates how a digital environment offers new opportunities for transforming qualitative data into quantitative data in order to use data mining and information visualization for mixed methods research. The digital approach to mixed methods research is illustrated by a framework which combines qualitative methods of multimodal…
ERIC Educational Resources Information Center
Davis, Eric J.; Pauls, Steve; Dick, Jonathan
2017-01-01
Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…
Pathway Distiller - multisource biological pathway consolidation
2012-01-01
Background: One method to understand and evaluate an experiment that produces a large set of genes, such as a gene expression microarray analysis, is to identify overrepresentation or enrichment for biological pathways. Because pathways are able to functionally describe the set of genes, much effort has been made to collect curated biological pathways into publicly accessible databases. When combining disparate databases, highly related or redundant pathways exist, making their consolidation into pathway concepts essential. This will facilitate unbiased, comprehensive yet streamlined analysis of experiments that result in large gene sets.
Methods: After gene set enrichment finds representative pathways for large gene sets, pathways are consolidated into representative pathway concepts. Three complementary, but different methods of pathway consolidation are explored. Enrichment Consolidation combines the set of pathways enriched for the signature gene list through iterative combining of enriched pathways with other pathways with similar signature gene sets; Weighted Consolidation utilizes a protein-protein interaction network based gene-weighting approach that finds clusters of both enriched and non-enriched pathways limited to the experiment's resultant gene list; and finally the de novo Consolidation method uses several measurements of pathway similarity to find static pathway clusters independent of any given experiment.
Results: We demonstrate that the three consolidation methods provide unified yet different functional insights into a resultant gene set derived from a genome-wide profiling experiment. Results from the methods are presented, demonstrating their applications in biological studies and comparing them with a pathway web-based framework that also combines several pathway databases. Additionally, a web-based consolidation framework that encompasses all three methods discussed in this paper, Pathway Distiller (http://cbbiweb.uthscsa.edu/PathwayDistiller), is established to allow researchers access to the methods and example microarray data described in this manuscript, and the ability to analyze their own gene lists using our consolidation methods.
Conclusions: By combining several pathway systems, implementing different, but complementary pathway consolidation methods, and providing a user-friendly web-accessible tool, we have enabled users to extract functional explanations of their genome-wide experiments. PMID:23134636
Hein, Luis Rogerio de Oliveira; de Oliveira, José Alberto; de Campos, Kamila Amato
2013-04-01
Correlative fractography is a new term proposed here to describe a new method for the association of scanning electron microscopy (SEM) and light microscopy (LM) in the qualitative and quantitative analysis of fracture surfaces. This article presents a new method involving the fusion of an elevation map, obtained by extended depth-from-focus reconstruction in LM, with exactly the same area imaged by SEM and associated techniques, such as X-ray mapping. The true topographic information is perfectly associated with local fracture mechanisms by this new technique, presented here as an alternative to stereo-pair reconstruction for the investigation of fractured components. The great advantage of this technique resides in the possibility of combining any imaging methods associated with LM and SEM for the same observed field of the fracture surface.
ERIC Educational Resources Information Center
de Laat, Maarten; Lally, Vic; Lipponen, Lasse; Simons, Robert-Jan
2007-01-01
The focus of this study is to explore the advances that Social Network Analysis (SNA) can bring, in combination with other methods, when studying Networked Learning/Computer-Supported Collaborative Learning (NL/CSCL). We present a general overview of how SNA is applied in NL/CSCL research; we then go on to illustrate how this research method can…
A combination of selected mapping and clipping to increase energy efficiency of OFDM systems
Lee, Byung Moo; Rim, You Seung
2017-01-01
We propose an energy efficient combination design for OFDM systems based on selected mapping (SLM) and clipping peak-to-average power ratio (PAPR) reduction techniques, and show the related energy efficiency (EE) performance analysis. The combination of two different PAPR reduction techniques can provide a significant benefit in increasing EE, because it can take advantage of both techniques. For the combination, we choose the clipping and SLM techniques, since the former is quite simple and effective, and the latter does not cause any signal distortion. We provide the structure and the systematic operating method, and present various analyses to derive the EE gain based on the combined technique. Our analysis shows that the combined technique increases the EE by 69% compared to no PAPR reduction, and by 19.34% compared to using only the SLM technique. PMID:29023591
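A numpy sketch of the two techniques being combined is shown below; the FFT size, number of SLM candidates, and clipping ratio are arbitrary illustrative choices, not the values analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)
N = 64
sym = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N)   # QPSK subcarriers

def papr(x):
    return np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2)

# SLM: try U random phase rotations, keep the candidate with the lowest PAPR.
U = 8
phases = np.exp(1j * 2 * np.pi * rng.random((U, N)))
candidates = np.fft.ifft(sym * phases, axis=1)
best = candidates[np.argmin([papr(c) for c in candidates])]

# Clipping: limit the envelope to CR x RMS amplitude (introduces distortion).
cr = 1.4
limit = cr * np.sqrt(np.mean(np.abs(best) ** 2))
clipped = np.where(np.abs(best) > limit, limit * best / np.abs(best), best)

x0 = np.fft.ifft(sym)
print(f"PAPR: none {10*np.log10(papr(x0)):.1f} dB, "
      f"SLM {10*np.log10(papr(best)):.1f} dB, "
      f"SLM+clip {10*np.log10(papr(clipped)):.1f} dB")
```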
Albin, Thomas J; Vink, Peter
2014-11-01
Designers and ergonomists may occasionally be limited to using tables of percentiles of anthropometric data to model users. Design models that add or subtract percentiles produce unreliable estimates of the proportion of users accommodated, in part because they assume a perfect correlation between variables. Percentile data do not allow the use of more reliable modeling methods such as Principal Component Analysis. A better method is needed. A new method for modeling with limited data is described. First, using a measure of central tendency (median or mean) of the range of possible correlation values to estimate the combined variance is shown to reduce error compared to combining percentiles. Second, use of the Chebyshev inequality allows the designer to estimate the percent accommodation more reliably than combining percentiles when the distributions of the underlying anthropometric data are unknown. This paper describes a modeling method that is more accurate than combining percentiles when only limited data are available. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
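The arithmetic of the proposal can be made concrete as follows; the two segment statistics and the central-tendency correlation are invented numbers used only to show the mechanics.

```python
import math

mu_a, sd_a = 60.0, 3.0          # segment A mean and SD in cm (assumed values)
mu_b, sd_b = 45.0, 2.5          # segment B mean and SD in cm (assumed values)
rho = 0.5                       # median of the plausible correlation range

mu = mu_a + mu_b
sd = math.sqrt(sd_a**2 + sd_b**2 + 2 * rho * sd_a * sd_b)   # combined SD

# Adding percentiles implicitly assumes rho = 1, which overstates the spread:
sd_rho1 = sd_a + sd_b
print(f"combined SD: {sd:.2f} cm vs percentile-style SD: {sd_rho1:.2f} cm")

# Chebyshev: at least 1 - 1/k^2 of ANY distribution lies within k SDs of the mean.
k = 2.0
print(f"accommodation within {mu - k*sd:.1f} to {mu + k*sd:.1f} cm: "
      f">= {1 - 1/k**2:.0%}")
```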
Using qualitative comparative analysis in a systematic review of a complex intervention.
Kahwati, Leila; Jacobs, Sara; Kane, Heather; Lewis, Megan; Viswanathan, Meera; Golin, Carol E
2016-05-04
Systematic reviews evaluating complex interventions often encounter substantial clinical heterogeneity in intervention components and implementation features making synthesis challenging. Qualitative comparative analysis (QCA) is a non-probabilistic method that uses mathematical set theory to study complex phenomena; it has been proposed as a potential method to complement traditional evidence synthesis in reviews of complex interventions to identify key intervention components or implementation features that might explain effectiveness or ineffectiveness. The objective of this study was to describe our approach in detail and examine the suitability of using QCA within the context of a systematic review. We used data from a completed systematic review of behavioral interventions to improve medication adherence to conduct two substantive analyses using QCA. The first analysis sought to identify combinations of nine behavior change techniques/components (BCTs) found among effective interventions, and the second analysis sought to identify combinations of five implementation features (e.g., agent, target, mode, time span, exposure) found among effective interventions. For each substantive analysis, we reframed the review's research questions to be designed for use with QCA, calibrated sets (i.e., transformed raw data into data used in analysis), and identified the necessary and/or sufficient combinations of BCTs and implementation features found in effective interventions. Our application of QCA for each substantive analysis is described in detail. We extended the original review findings by identifying seven combinations of BCTs and four combinations of implementation features that were sufficient for improving adherence. We found reasonable alignment between several systematic review steps and processes used in QCA except that typical approaches to study abstraction for some intervention components and features did not support a robust calibration for QCA. QCA was suitable for use within a systematic review of medication adherence interventions and offered insights beyond the single dimension stratifications used in the original completed review. Future prospective use of QCA during a review is needed to determine the optimal way to efficiently integrate QCA into existing approaches to evidence synthesis of complex interventions.
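The set-theoretic core of QCA (consistency and coverage of a sufficient combination) can be illustrated in a few lines; the calibrated membership scores below are made up, standing in for the review's BCTs and implementation features.

```python
import numpy as np

# Fuzzy memberships per study: condition = "BCT1 AND BCT2", outcome = "adherent".
bct1 = np.array([0.9, 0.8, 0.2, 0.7, 0.1])
bct2 = np.array([0.8, 0.9, 0.3, 0.6, 0.9])
outcome = np.array([1.0, 0.7, 0.2, 0.8, 0.3])

condition = np.minimum(bct1, bct2)                 # fuzzy AND of the two conditions
consistency = np.sum(np.minimum(condition, outcome)) / np.sum(condition)
coverage = np.sum(np.minimum(condition, outcome)) / np.sum(outcome)
print(f"sufficiency consistency = {consistency:.2f}, coverage = {coverage:.2f}")
```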
NASA Astrophysics Data System (ADS)
Bratchenko, Ivan A.; Artemyev, Dmitry N.; Myakinin, Oleg O.; Khristoforova, Yulia A.; Moryatov, Alexander A.; Kozlov, Sergey V.; Zakharov, Valery P.
2017-02-01
The differentiation of skin melanomas and basal cell carcinomas (BCCs) was demonstrated based on combined analysis of Raman and autofluorescence spectra stimulated by visible and NIR lasers. It was tested ex vivo on 39 melanomas and 40 BCCs. Six spectroscopic criteria utilizing information about the alteration of melanin, porphyrin, flavin, lipid, and collagen content in tumors in comparison to healthy skin were proposed. The measured correlation between the proposed criteria makes it possible to define weakly correlated criteria groups for the application of discriminant analysis and principal component analysis. It was shown that the accuracy of cancerous tissue classification reaches 97.3% for a combined 6-criteria multimodal algorithm, while the accuracy determined separately for each modality does not exceed 79%. The combined 6-D method is a rapid and reliable tool for malignant skin lesion detection and classification.
Interaction Analysis of Longevity Interventions Using Survival Curves.
Nowak, Stefan; Neidhart, Johannes; Szendro, Ivan G; Rzezonka, Jonas; Marathe, Rahul; Krug, Joachim
2018-01-06
A long-standing problem in ageing research is to understand how different factors contributing to longevity should be expected to act in combination under the assumption that they are independent. Standard interaction analysis compares the extension of mean lifespan achieved by a combination of interventions to the prediction under an additive or multiplicative null model, but neither model is fundamentally justified. Moreover, the target of longevity interventions is not mean life span but the entire survival curve. Here we formulate a mathematical approach for predicting the survival curve resulting from a combination of two independent interventions based on the survival curves of the individual treatments, and quantify interaction between interventions as the deviation from this prediction. We test the method on a published data set comprising survival curves for all combinations of four different longevity interventions in Caenorhabditis elegans. We find that interactions are generally weak even when the standard analysis indicates otherwise.
Interaction Analysis of Longevity Interventions Using Survival Curves
Nowak, Stefan; Neidhart, Johannes; Szendro, Ivan G.; Rzezonka, Jonas; Marathe, Rahul; Krug, Joachim
2018-01-01
A long-standing problem in ageing research is to understand how different factors contributing to longevity should be expected to act in combination under the assumption that they are independent. Standard interaction analysis compares the extension of mean lifespan achieved by a combination of interventions to the prediction under an additive or multiplicative null model, but neither model is fundamentally justified. Moreover, the target of longevity interventions is not mean life span but the entire survival curve. Here we formulate a mathematical approach for predicting the survival curve resulting from a combination of two independent interventions based on the survival curves of the individual treatments, and quantify interaction between interventions as the deviation from this prediction. We test the method on a published data set comprising survival curves for all combinations of four different longevity interventions in Caenorhabditis elegans. We find that interactions are generally weak even when the standard analysis indicates otherwise. PMID:29316622
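One natural independence null for survival curves, assuming each intervention independently scales the hazard relative to control, predicts S_AB = S_A * S_B / S_0; whether this matches the authors' exact construction is an assumption. The sketch below computes the predicted curve and a simple deviation measure on synthetic curves.

```python
import numpy as np
from scipy.integrate import trapezoid

t = np.linspace(0.0, 40.0, 200)              # days (synthetic scale)
S0 = np.exp(-(t / 18.0) ** 2)                # control survival
SA = np.exp(-(t / 22.0) ** 2)                # intervention A alone
SB = np.exp(-(t / 21.0) ** 2)                # intervention B alone

# Independent hazard scaling relative to control predicts S_AB = SA * SB / S0.
with np.errstate(divide="ignore", invalid="ignore"):
    S_pred = np.where(S0 > 0, SA * SB / S0, 0.0)

S_obs = np.exp(-(t / 24.0) ** 2)             # pretend measured combination curve
deviation = trapezoid(S_obs - S_pred, t)     # signed area between the curves
print(f"integrated deviation from the independence null: {deviation:+.2f}")
```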
Jurczyszyn, Kamil; Osiecka, Beata J; Ziółkowski, Piotr
2012-01-01
Fractal dimension analysis (FDA) is a modern mathematical method widely used to describe complex and chaotic shapes where classical methods fail. The main aim of this study was to evaluate the influence of photodynamic therapy (PDT) with cysteine protease inhibitors (CPI) on the number and morphology of blood vessels inside tumors and on the increase in effectiveness of the combined therapy in contrast to PDT and CPI used separately. Animals were divided into four groups: control, treated using only PDT, treated using only CPI, and treated using the combined therapy, PDT and CPI. Results showed that animal survival time and depth of necrosis inside the tumor were significantly higher in the CPI+PDT group in contrast to the other groups. The highest value of fractal dimension (FD) was observed in the control group, while the lowest value was found in the group treated with cysteine protease inhibitors. Differences in FD were observed in the CPI group and the PDT+CPI group in comparison to the control group. Our results revealed that fractal dimension analysis is a very useful tool for estimating differences between irregular shapes such as blood vessels in PDT-treated tumors. Thus, the implementation of FDA algorithms could be a useful method for evaluating the efficacy of PDT.
Jurczyszyn, Kamil; Osiecka, Beata J.; Ziółkowski, Piotr
2012-01-01
Fractal dimension analysis (FDA) is a modern mathematical method widely used to describe complex and chaotic shapes where classical methods fail. The main aim of this study was to evaluate the influence of photodynamic therapy (PDT) with cysteine protease inhibitors (CPI) on the number and morphology of blood vessels inside tumors and on the increase in effectiveness of the combined therapy in contrast to PDT and CPI used separately. Animals were divided into four groups: control, treated using only PDT, treated using only CPI, and treated using the combined therapy, PDT and CPI. Results showed that animal survival time and depth of necrosis inside the tumor were significantly higher in the CPI+PDT group in contrast to the other groups. The highest value of fractal dimension (FD) was observed in the control group, while the lowest value was found in the group treated with cysteine protease inhibitors. Differences in FD were observed in the CPI group and the PDT+CPI group in comparison to the control group. Our results revealed that fractal dimension analysis is a very useful tool for estimating differences between irregular shapes such as blood vessels in PDT-treated tumors. Thus, the implementation of FDA algorithms could be a useful method for evaluating the efficacy of PDT. PMID:22991578
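The abstract does not specify the FDA algorithm; a common choice is box counting on a binary vessel mask, sketched below under that assumption.

```python
import numpy as np

def box_counting_dimension(mask):
    """Slope of log N(s) vs log(1/s) over dyadic box sizes s."""
    n = mask.shape[0]
    sizes = [2 ** k for k in range(1, int(np.log2(n)))]
    counts = []
    for s in sizes:
        # count boxes of side s containing at least one foreground pixel
        boxes = mask[: n - n % s, : n - n % s].reshape(n // s, s, n // s, s)
        counts.append(np.count_nonzero(boxes.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(7)
img = rng.random((256, 256)) < 0.02          # sparse random "vessel" pixels
print(f"estimated fractal dimension ~ {box_counting_dimension(img):.2f}")
```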
Rohaeti, Eti; Rafi, Mohamad; Syafitri, Utami Dyah; Heryanto, Rudi
2015-02-25
Turmeric (Curcuma longa), java turmeric (Curcuma xanthorrhiza) and cassumunar ginger (Zingiber cassumunar) are widely used in traditional Indonesian medicines (jamu). Their rhizomes are similar in color and they share some uses, so it is possible to substitute one for another. The identification and discrimination of these closely related plants is a crucial task in ensuring the quality of the raw materials. Therefore, a rapid, simple and accurate analytical method for discriminating these species using Fourier transform infrared spectroscopy (FTIR) combined with chemometrics methods was developed. FTIR spectra were acquired in the mid-IR region (4000–400 cm⁻¹). Standard normal variate, first- and second-order derivative spectra were compared for the spectral data. Principal component analysis (PCA) and canonical variate analysis (CVA) were used for the classification of the three species. Samples could be discriminated by visual analysis of the FTIR spectra using their marker bands. Discrimination of the three species was also possible through the combination of the pre-processed FTIR spectra with PCA and CVA, in which CVA gave clearer discrimination. The developed method could thus be used for the identification and discrimination of the three closely related plant species. Copyright © 2014 Elsevier B.V. All rights reserved.
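The pre-processing plus PCA step can be reproduced in a few lines; the spectra below are synthetic Gaussian bands standing in for the real 4000-400 cm^-1 data, and the CVA step is omitted.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
wavenumbers = np.linspace(4000, 400, 900)

def fake_spectrum(center):
    """One noisy synthetic absorbance band, standing in for a real FTIR spectrum."""
    return np.exp(-((wavenumbers - center) / 120.0) ** 2) + 0.05 * rng.normal(size=900)

# Five replicates each of three "species" with different marker bands.
spectra = np.array([fake_spectrum(c) for c in [1600] * 5 + [1740] * 5 + [1020] * 5])

# Standard normal variate: center and scale each spectrum individually.
snv = (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

scores = PCA(n_components=2).fit_transform(snv)
print(np.round(scores, 2))   # the three species should form three score clusters
```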
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2002-01-01
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
Optimization benefits analysis in production process of fabrication components
NASA Astrophysics Data System (ADS)
Prasetyani, R.; Rafsanjani, A. Y.; Rimantho, D.
2017-12-01
The determination of an optimal number of product combinations is important. The main problem of the part and service department at PT. United Tractors Pandu Engineering (PT. UTPE) is the optimization of the combination of fabrication component products (known as liner plates), which influences the profit the company obtains. The liner plate is a fabrication component that serves as a protector of the core structure of heavy-duty attachments, such as the HD Vessel, HD Bucket, HD Shovel, and HD Blade. The graph of liner plate sales from January to December 2016 fluctuated, and there is no direct conclusion about the optimization of production of such fabrication components. The optimal product combination can be achieved by calculating and plotting the amounts of production output and input appropriately. The method used in this study is linear programming with primal, dual, and sensitivity analysis, using QM software for Windows to obtain the optimal mix of fabrication components. At the optimal combination of components, PT. UTPE gains a profit increase of Rp. 105,285,000.00, for a total of Rp. 3,046,525,000.00 per month, with a total production of 71 units per product variant per month.
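A primal product-mix LP of the kind described can be set up with scipy.optimize.linprog; since the abstract gives no coefficients, the profits and machine-hour limits below are invented placeholders.

```python
from scipy.optimize import linprog

# Profit per unit of 3 liner-plate variants (negated because linprog minimizes).
profit = [-40.0, -55.0, -30.0]

A_ub = [[2.0, 3.0, 1.0],     # cutting hours per unit of each variant
        [1.0, 2.0, 2.0]]     # welding hours per unit of each variant
b_ub = [120.0, 90.0]         # available machine hours per month

res = linprog(c=profit, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print("units per variant:", res.x.round(1), " max profit:", -res.fun)
```

Dual values (shadow prices of the machine-hour constraints) and sensitivity ranges, which the study also reports, can be read from the solver's marginals in the same way.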
Mao, Zhi-Hua; Yin, Jian-Hua; Zhang, Xue-Xi; Wang, Xiao; Xia, Yang
2016-01-01
Fourier transform infrared spectroscopic imaging (FTIRI) can be used to obtain quantitative information on the content and spatial distribution of the principal components in cartilage when combined with chemometrics methods. In this study, FTIRI combined with principal component analysis (PCA) and Fisher's discriminant analysis (FDA) was applied to identify healthy and osteoarthritic (OA) articular cartilage samples. Ten 10-μm thick sections of canine cartilage were imaged at 6.25 μm/pixel in FTIRI. The infrared spectra extracted from the FTIR images were imported into SPSS software for PCA and FDA. Based on the PCA result with 2 principal components, the healthy and OA cartilage samples were effectively discriminated by the FDA, with a high accuracy of 94% for the initial samples (training set) and cross validation, as well as 86.67% for the prediction group. The study showed that cartilage degeneration became gradually weaker with increasing depth. FTIRI combined with chemometrics may become an effective method for distinguishing healthy and OA cartilage in the future. PMID:26977354
Ducci, Daniela; de Melo, M Teresa Condesso; Preziosi, Elisabetta; Sellerino, Mariangela; Parrone, Daniele; Ribeiro, Luis
2016-11-01
The natural background level (NBL) concept is revisited and combined with the indicator kriging method to analyze the spatial distribution of groundwater quality within a groundwater body (GWB). The aim is to provide a methodology to easily identify areas with the same probability of exceeding a given threshold (which may be a groundwater quality criterion, standard, or recommended limit for selected properties and constituents). Three case studies with different hydrogeological settings, located in two countries (Portugal and Italy), are used to derive NBLs using the preselection method and to validate the proposed methodology, illustrating its main advantages over conventional statistical water quality analysis. Indicator kriging analysis was used to create probability maps of three potential groundwater contaminants. The results clearly indicate the areas within a groundwater body that are potentially contaminated, where concentrations exceed the drinking water standards or even the local NBL and cannot be justified by geogenic origin. The combined methodology facilitates the management of groundwater quality because it allows for the spatial interpretation of NBL values. Copyright © 2016 Elsevier B.V. All rights reserved.
Enhancing the Modeling of PFOA Pharmacokinetics with Bayesian Analysis
The detail sufficient to describe the pharmacokinetics (PK) for perfluorooctanoic acid (PFOA) and the methods necessary to combine information from multiple data sets are both subjects of ongoing investigation. Bayesian analysis provides tools to accommodate these goals. We exa...
Combining Feature Extraction Methods to Assist the Diagnosis of Alzheimer's Disease.
Segovia, F; Górriz, J M; Ramírez, J; Phillips, C
2016-01-01
Neuroimaging data such as ¹⁸F-FDG PET are widely used to assist the diagnosis of Alzheimer's disease (AD). Looking for regions with hypoperfusion/hypometabolism, clinicians may predict or corroborate the diagnosis of the patients. Modern computer-aided diagnosis (CAD) systems based on the statistical analysis of whole neuroimages are more accurate than classical systems based on quantifying the uptake of some predefined regions of interest (ROIs). In addition, these new systems allow determining new ROIs and take advantage of the huge amount of information comprised in neuroimaging data. A major branch of modern CAD systems for AD is based on multivariate techniques, which analyze a neuroimage as a whole, considering not only the voxel intensities but also the relations among them. In order to deal with the vast dimensionality of the data, a number of feature extraction methods have been successfully applied. In this work, we propose a CAD system based on the combination of several feature extraction techniques. First, some commonly used feature extraction methods based on the analysis of variance (such as principal component analysis), on the factorization of the data (such as non-negative matrix factorization) and on classical magnitudes (such as Haralick features) were simultaneously applied to the original data. These feature sets were then combined by means of two different combination approaches: i) using a single classifier and a multiple kernel learning approach and ii) using an ensemble of classifiers and selecting the final decision by majority voting. The proposed approach was evaluated using a labelled neuroimaging database along with a cross-validation scheme. In conclusion, the proposed CAD system performed better than approaches using only one feature extraction technique. We also provide a fair comparison (using the same database) of the selected feature extraction methods.
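The second combination strategy (an ensemble voting over classifiers built on different feature extractions) can be sketched with scikit-learn; the synthetic matrix below stands in for the PET data, and Haralick features are omitted for brevity.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA, NMF
from sklearn.ensemble import VotingClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(9)
X = rng.random((120, 500))             # 120 scans x 500 voxels (synthetic, non-negative)
y = rng.integers(0, 2, size=120)
X[y == 1, :50] += 0.15                 # mild "hypometabolism" signal in one group

# One classifier per feature extraction; final decision by majority voting.
vote = VotingClassifier([
    ("pca", make_pipeline(PCA(n_components=10), SVC())),
    ("nmf", make_pipeline(NMF(n_components=10, max_iter=500), SVC())),
], voting="hard")

print("CV accuracy:", cross_val_score(vote, X, y, cv=5).mean())
```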
Meijer, Erik; Rohwedder, Susann; Wansbeek, Tom
2012-01-01
Survey data on earnings tend to contain measurement error. Administrative data are superior in principle, but they are worthless in case of a mismatch. We develop methods for prediction in mixture factor analysis models that combine both data sources to arrive at a single earnings figure. We apply the methods to a Swedish data set. Our results show that register earnings data perform poorly if there is a (small) probability of a mismatch. Survey earnings data are more reliable, despite their measurement error. Predictors that combine both and take conditional class probabilities into account outperform all other predictors.
Dong, Chongmei; Vincent, Kate; Sharp, Peter
2009-12-04
TILLING (Targeting Induced Local Lesions IN Genomes) is a powerful tool for reverse genetics, combining traditional chemical mutagenesis with high-throughput PCR-based mutation detection to discover induced mutations that alter protein function. The most popular mutation detection method for TILLING is a mismatch cleavage assay using the endonuclease CelI. For this method, locus-specific PCR is essential. Most wheat genes are present as three similar sequences with high homology in exons and low homology in introns. Locus-specific primers can usually be designed in introns. However, it is sometimes difficult to design locus-specific PCR primers in a conserved region with high homology among the three homoeologous genes, or in a gene lacking introns, or if information on introns is not available. Here we describe a mutation detection method which combines High Resolution Melting (HRM) analysis of mixed PCR amplicons containing three homoeologous gene fragments and sequence analysis using Mutation Surveyor software, aimed at simultaneous detection of mutations in three homoeologous genes. We demonstrate that High Resolution Melting (HRM) analysis can be used in mutation scans in mixed PCR amplicons containing three homoeologous gene fragments. Combining HRM scanning with sequence analysis using Mutation Surveyor is sensitive enough to detect a single nucleotide mutation in the heterozygous state in a mixed PCR amplicon containing three homoeoloci. The method was tested and validated in an EMS (ethylmethane sulfonate)-treated wheat TILLING population, screening mutations in the carboxyl terminal domain of the Starch Synthase II (SSII) gene. Selected identified mutations of interest can be further analysed by cloning to confirm the mutation and determine the genomic origin of the mutation. Polyploidy is common in plants. Conserved regions of a gene often represent functional domains and have high sequence similarity between homoeologous loci. The method described here is a useful alternative to locus-specific based methods for screening mutations in conserved functional domains of homoeologous genes. This method can also be used for SNP (single nucleotide polymorphism) marker development and eco-TILLING in polyploid species.
A method of power analysis based on piecewise discrete Fourier transform
NASA Astrophysics Data System (ADS)
Xin, Miaomiao; Zhang, Yanchi; Xie, Da
2018-04-01
This paper analyzes existing feature extraction methods. The characteristics of the discrete Fourier transform and piecewise aggregate approximation are analyzed. Combining the advantages of the two methods, a new piecewise discrete Fourier transform is proposed, and the method is used to analyze the lighting power of a large customer. The time series feature maps of four different cases are compared for the original data, the discrete Fourier transform, piecewise aggregate approximation, and the piecewise discrete Fourier transform. The new method can reflect both the overall trend of electricity-consumption change and its internal variations.
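One plausible reading of a piecewise DFT, offered as an assumption, is to segment the series as in piecewise aggregate approximation and keep the first few DFT coefficients of each segment, capturing the segment-level trend plus the within-segment shape:

```python
import numpy as np

def piecewise_dft(x, n_segments=8, n_coeffs=3):
    """Split x into equal segments; keep the first n_coeffs DFT terms of each."""
    segs = np.array_split(np.asarray(x, dtype=float), n_segments)
    return np.array([np.fft.rfft(s)[:n_coeffs] for s in segs])

t = np.arange(24 * 30)                                   # a month of hourly readings
load = 50 + 0.1 * t + 10 * np.sin(2 * np.pi * t / 24)    # trend + daily cycle

feats = piecewise_dft(load)
seg_len = len(load) // 8
# The DC coefficient of each segment recovers the PAA-style segment mean,
# while the higher coefficients describe the within-segment oscillation.
print("segment means:", np.round(feats[:, 0].real / seg_len, 1))
```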
Du, Lijuan; Lu, Weiying; Cai, Zhenzhen Julia; Bao, Lei; Hartmann, Christoph; Gao, Boyan; Yu, Liangli Lucy
2018-02-01
Flow injection mass spectrometry (FIMS) combined with chemometrics was evaluated for rapidly detecting economically motivated adulteration (EMA) of milk. Twenty-two pure milk samples and thirty-five counterparts adulterated with soybean, pea, and whey protein isolates at 0.5, 1, 3, 5, and 10% (w/w) levels were analyzed. The principal component analysis (PCA), partial least-squares discriminant analysis (PLS-DA), and support vector machine (SVM) classification models indicated that the adulterated milks could be successfully distinguished from the pure milks. FIMS combined with chemometrics might be an effective method to detect possible EMA in milk. Copyright © 2017 Elsevier Ltd. All rights reserved.
Liang, Zhenwei; Li, Yaoming; Zhao, Zhan; Xu, Lizhang
2015-01-01
Grain separation loss is a key parameter for weighing the performance of combine harvesters, and also a dominant factor for automatically adjusting their major working parameters. Traditional separation loss monitoring methods mainly rely on manual effort, which requires high labor intensity. With recent advancements in sensor technology, electronics and computational processing power, this paper presents an indirect method for monitoring grain separation losses in tangential-axial combine harvesters in real time. Firstly, we developed a mathematical monitoring model based on detailed comparative data analysis of different feeding quantities. Then, we developed a grain impact piezoelectric sensor utilizing a YT-5 piezoelectric ceramic as the sensing element, with a signal processing circuit designed according to differences in the voltage amplitude and rise time of collision signals. To improve the sensor performance, theoretical analysis was performed from a structural vibration point of view, and the optimal sensor structure was selected. Grain collision experiments showed that the sensor performance was greatly improved. Finally, we installed the sensor on a tangential-longitudinal axial combine harvester, and grain separation loss monitoring experiments were carried out in North China; the results showed that the monitoring method was feasible, with a largest relative measurement error of 4.63% when harvesting rice. PMID:25594592
Liang, Zhenwei; Li, Yaoming; Zhao, Zhan; Xu, Lizhang
2015-01-14
Grain separation loss is a key parameter for weighing the performance of combine harvesters, and also a dominant factor for automatically adjusting their major working parameters. Traditional separation loss monitoring methods mainly rely on manual effort, which requires high labor intensity. With recent advancements in sensor technology, electronics and computational processing power, this paper presents an indirect method for monitoring grain separation losses in tangential-axial combine harvesters in real time. Firstly, we developed a mathematical monitoring model based on detailed comparative data analysis of different feeding quantities. Then, we developed a grain impact piezoelectric sensor utilizing a YT-5 piezoelectric ceramic as the sensing element, with a signal processing circuit designed according to differences in the voltage amplitude and rise time of collision signals. To improve the sensor performance, theoretical analysis was performed from a structural vibration point of view, and the optimal sensor structure was selected. Grain collision experiments showed that the sensor performance was greatly improved. Finally, we installed the sensor on a tangential-longitudinal axial combine harvester, and grain separation loss monitoring experiments were carried out in North China; the results showed that the monitoring method was feasible, with a largest relative measurement error of 4.63% when harvesting rice.
Kajiyama, Shin'ichiro; Harada, Kazuo; Fukusaki, Eiichiro; Kobayashi, Akio
2006-12-01
The molecular constituents of the petal pigments of the Torenia plant (Torenia hybrida) were analyzed on a single-cell basis by a combination of newly developed laser-microsampling and nano-flow liquid chromatography-electrospray ionization mass spectrometry (LC-ESIMS) techniques. Our approach should provide a facile means of obtaining precise metabolic profiles of each cell in a single plant tissue.
NASA Astrophysics Data System (ADS)
Yuanyuan, Xu; Zhengmao, Zhang; Xiang, Fang; Yuanshuai, Xu; Xinxin, Song
2018-03-01
The combination of theory and practice is a difficult problem in dispatcher training. Through a typical case, this paper provides an effective case-teaching method for dispatcher training, combining theoretical discussion of empirical rules with concrete cases to make the material vivid. It helps students understand and grasp the key points of the theory and improve their practical skills.
Root Gravitropism: Quantification, Challenges, and Solutions.
Muller, Lukas; Bennett, Malcolm J; French, Andy; Wells, Darren M; Swarup, Ranjan
2018-01-01
Better understanding of root traits such as root angle and root gravitropism will be crucial for development of crops with improved resource use efficiency. This chapter describes a high-throughput, automated image analysis method to trace Arabidopsis (Arabidopsis thaliana) seedling roots grown on agar plates. The method combines a "particle-filtering algorithm with a graph-based method" to trace the center line of a root and can be adopted for the analysis of several root parameters such as length, curvature, and stimulus from original root traces.
Convergence analysis of a monotonic penalty method for American option pricing
NASA Astrophysics Data System (ADS)
Zhang, Kai; Yang, Xiaoqi; Teo, Kok Lay
2008-12-01
This paper is devoted to the convergence analysis of a monotonic penalty method for pricing American options. A monotonic penalty method is first proposed to solve the complementarity problem arising from the valuation of American options, which produces a nonlinear degenerate parabolic PDE with the Black-Scholes operator. Based on variational theory, the solvability and convergence properties of this penalty approach are established in a proper infinite-dimensional space. Moreover, the convergence rate of the combination of two power penalty functions is obtained.
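An explicit finite-difference solve of the penalized Black-Scholes PDE for an American put illustrates the mechanism; a linear penalty is used here for simplicity, whereas the paper analyzes power penalties and their convergence rates, and all parameters are arbitrary.

```python
import numpy as np

K, r, sigma, T = 100.0, 0.05, 0.3, 1.0
lam = 1e3                                   # penalty parameter (large)
S = np.linspace(0.0, 300.0, 151)            # asset-price grid
dS = S[1] - S[0]
dt = 2.0e-4                                 # small step for explicit stability
payoff = np.maximum(K - S, 0.0)

V = payoff.copy()                           # terminal condition at t = T
for _ in range(int(T / dt)):
    Vss = (V[2:] - 2 * V[1:-1] + V[:-2]) / dS**2
    Vs = (V[2:] - V[:-2]) / (2 * dS)
    rhs = 0.5 * sigma**2 * S[1:-1] ** 2 * Vss + r * S[1:-1] * Vs - r * V[1:-1]
    rhs += lam * np.maximum(payoff[1:-1] - V[1:-1], 0.0)   # penalty enforces V >= payoff
    V[1:-1] += dt * rhs                     # step backward in calendar time
    V[0], V[-1] = K, 0.0                    # put boundary conditions

print(f"American put value at S = K: {np.interp(K, S, V):.2f}")
```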
NASA Astrophysics Data System (ADS)
Huang, Jian; Yuen, Pong C.; Chen, Wen-Sheng; Lai, J. H.
2005-05-01
Many face recognition algorithms/systems have been developed in the last decade, and excellent performance has been reported when there is a sufficient number of representative training samples. In many real-life applications such as passport identification, only one well-controlled frontal sample image is available for training. In this situation, the performance of existing algorithms degrades dramatically, or they may not even be applicable. We propose a component-based linear discriminant analysis (LDA) method to solve the one-training-sample problem. The basic idea of the proposed method is to construct local facial feature component bunches by moving each local feature region in four directions. In this way, we not only generate more samples with lower dimension than the original image, but also account for face detection localization error during training. After that, we propose a subspace LDA method, tailor-made for a small number of training samples, for the local feature projection to maximize the discrimination power. Theoretical analysis and experimental results show that our proposed subspace LDA is efficient and overcomes the limitations of existing LDA methods. Finally, we combine the contributions of each local component bunch with a weighted combination scheme to reach the recognition decision. The FERET database is used for evaluating the proposed method, and the results are encouraging.
Cluster Correspondence Analysis.
van de Velden, M; D'Enza, A Iodice; Palumbo, F
2017-03-01
A method is proposed that combines dimension reduction and cluster analysis for categorical data by simultaneously assigning individuals to clusters and optimal scaling values to categories in such a way that a single between variance maximization objective is achieved. In a unified framework, a brief review of alternative methods is provided and we show that the proposed method is equivalent to GROUPALS applied to categorical data. Performance of the methods is appraised by means of a simulation study. The results of the joint dimension reduction and clustering methods are compared with the so-called tandem approach, a sequential analysis of dimension reduction followed by cluster analysis. The tandem approach is conjectured to perform worse when variables are added that are unrelated to the cluster structure. Our simulation study confirms this conjecture. Moreover, the results of the simulation study indicate that the proposed method also consistently outperforms alternative joint dimension reduction and clustering methods.
Who's in and why? A typology of stakeholder analysis methods for natural resource management.
Reed, Mark S; Graves, Anil; Dandy, Norman; Posthumus, Helena; Hubacek, Klaus; Morris, Joe; Prell, Christina; Quinn, Claire H; Stringer, Lindsay C
2009-04-01
Stakeholder analysis means many things to different people. Various methods and approaches have been developed in different fields for different purposes, leading to confusion over the concept and practice of stakeholder analysis. This paper asks how and why stakeholder analysis should be conducted for participatory natural resource management research. This is achieved by reviewing the development of stakeholder analysis in business management, development and natural resource management. The normative and instrumental theoretical basis for stakeholder analysis is discussed, and a stakeholder analysis typology is proposed. This consists of methods for: i) identifying stakeholders; ii) differentiating between and categorising stakeholders; and iii) investigating relationships between stakeholders. The range of methods that can be used to carry out each type of analysis is reviewed. These methods and approaches are then illustrated through a series of case studies funded through the Rural Economy and Land Use (RELU) programme. These case studies show the wide range of participatory and non-participatory methods that can be used, and discuss some of the challenges and limitations of existing methods for stakeholder analysis. The case studies also propose new tools and combinations of methods that can more effectively identify and categorise stakeholders and help understand their inter-relationships.
Innovating Method of Existing Mechanical Product Based on TRIZ Theory
NASA Astrophysics Data System (ADS)
Zhao, Cunyou; Shi, Dongyan; Wu, Han
The main route of product development is adaptive design and variant design based on existing products. In this paper, a conceptual design framework and its flow model for innovating products are put forward by combining conceptual design methods with TRIZ theory. A process system model of innovative design is constructed, comprising requirement analysis, total function analysis and decomposition, engineering problem analysis, finding solutions to the engineering problem, and preliminary design; this establishes the basis for innovative design based on existing products.
2016-12-01
chosen rather than complex ones, and responds to the criticism of the DTA approach. Chapter IV provides three separate case studies in defense R&D... defense R&D projects. To this end, the first section describes the case study method and the advantages of using simple models over more complex ones... the analysis lacked empirical data and relied on subjective data, the analysis successfully combined the DTA approach with the case study method and
NASA Astrophysics Data System (ADS)
Misawa, Tsuyoshi; Takahashi, Yoshiyuki; Yagi, Takahiro; Pyeon, Cheol Ho; Kimura, Masaharu; Masuda, Kai; Ohgaki, Hideaki
2015-10-01
For the detection of hidden special nuclear materials (SNMs), we have developed an active neutron-based interrogation system combining a D-D fusion pulsed neutron source with a neutron detection system. In the detection scheme, we adopted two new measurement techniques simultaneously: neutron noise analysis and neutron energy spectrum analysis. The validity of the neutron noise analysis method was experimentally studied at the Kyoto University Critical Assembly (KUCA) and applied to a cargo container inspection system by simulation.
NASA Technical Reports Server (NTRS)
Whitlow, W., Jr.; Bennett, R. M.
1982-01-01
Since the aerodynamic theory is nonlinear, the method requires the coupling of two iterative processes - an aerodynamic analysis and a structural analysis. A full potential analysis code, FLO22, is combined with a linear structural analysis to yield aerodynamic load distributions on and deflections of elastic wings. This method was used to analyze an aeroelastically-scaled wind tunnel model of a proposed executive-jet transport wing and an aeroelastic research wing. The results are compared with the corresponding rigid-wing analyses, and some effects of elasticity on the aerodynamic loading are noted.
Race and Older Mothers’ Differentiation: A Sequential Quantitative and Qualitative Analysis
Sechrist, Jori; Suitor, J. Jill; Riffin, Catherine; Taylor-Watson, Kadari; Pillemer, Karl
2011-01-01
The goal of this paper is to demonstrate a process by which qualitative and quantitative approaches are combined to reveal patterns in the data that are unlikely to be detected and confirmed by either method alone. Specifically, we take a sequential approach to combining qualitative and quantitative data to explore race differences in how mothers differentiate among their adult children. We began with a standard multivariate analysis examining race differences in mothers’ differentiation among their adult children regarding emotional closeness and confiding. Finding no race differences in this analysis, we conducted an in-depth comparison of the Black and White mothers’ narratives to determine whether there were underlying patterns that we had been unable to detect in our first analysis. Using this method, we found that Black mothers were substantially more likely than White mothers to emphasize interpersonal relationships within the family when describing differences among their children. In our final step, we developed a measure of familism based on the qualitative data and conducted a multivariate analysis to confirm the patterns revealed by the in-depth comparison of the mothers’ narratives. We conclude that using such a sequential mixed methods approach to data analysis has the potential to shed new light on complex family relations. PMID:21967639
Real time chemical exposure and risk monitor
Thrall, Karla D.; Kenny, Donald V.; Endres, George W. R.; Sisk, Daniel R.
1997-01-01
The apparatus of the present invention combines a breath interface and an external exposure dosimeter interface with a chemical analysis device, all controlled by an electronic processor that quantitatively analyzes chemical data from both the breath interface and the external exposure dosimeter to determine internal tissue dose. The method of the present invention comprises measuring an external dose, measuring breath content, and then analyzing the external dose and breath content to determine internal tissue dose.
Achieving Methodological Alignment When Combining QCA and Process Tracing in Practice
ERIC Educational Resources Information Center
Beach, Derek
2018-01-01
This article explores the practical challenges one faces when combining qualitative comparative analysis (QCA) and process tracing (PT) in a manner that is consistent with their underlying assumptions about the nature of causal relationships. While PT builds on a mechanism-based understanding of causation, QCA as a comparative method makes claims…
7 CFR 800.85 - Inspection of grain in combined lots.
Code of Federal Regulations, 2010 CFR
2010-01-01
... REGULATIONS Inspection Methods and Procedures § 800.85 Inspection of grain in combined lots. (a) General. The... (b) Weighted or mathematical average. Official factor and official criteria information shown on a certificate... section, be based on the weighted or mathematical averages of the analysis of the sublots in the lot and...
A Systematic Approach to Determining the Identifiability of Multistage Carcinogenesis Models.
Brouwer, Andrew F; Meza, Rafael; Eisenberg, Marisa C
2017-07-01
Multistage clonal expansion (MSCE) models of carcinogenesis are continuous-time Markov process models often used to relate cancer incidence to biological mechanism. Identifiability analysis determines what model parameter combinations can, theoretically, be estimated from given data. We use a systematic approach, based on differential algebra methods traditionally used for deterministic ordinary differential equation (ODE) models, to determine identifiable combinations for a generalized subclass of MSCE models with any number of preinitiation stages and one clonal expansion. Additionally, we determine the identifiable combinations of the generalized MSCE model with up to four clonal expansion stages, and conjecture the results for any number of clonal expansion stages. The results improve upon previous work in a number of ways and provide a framework to find the identifiable combinations for further variations on the MSCE models. Finally, our approach, which takes advantage of the Kolmogorov backward equations for the probability generating functions of the Markov process, demonstrates that identifiability methods used in engineering and mathematics for systems of ODEs can be applied to continuous-time Markov processes. © 2016 Society for Risk Analysis.
NASA Technical Reports Server (NTRS)
Barnett, Alan R.; Widrick, Timothy W.; Ludwiczak, Damian R.
1996-01-01
Solving for dynamic responses of free-free launch vehicle/spacecraft systems acted upon by buffeting winds is commonly performed throughout the aerospace industry. Due to the unpredictable nature of this wind loading event, these problems are typically solved using frequency response random analysis techniques. To generate dynamic responses for spacecraft with statically-indeterminate interfaces, spacecraft contractors prefer to develop models which have response transformation matrices developed for mode acceleration data recovery. This method transforms spacecraft boundary accelerations and displacements into internal responses. Unfortunately, standard MSC/NASTRAN modal frequency response solution sequences cannot be used to combine acceleration- and displacement-dependent responses required for spacecraft mode acceleration data recovery. External user-written computer codes can be used with MSC/NASTRAN output to perform such combinations, but these methods can be labor and computer resource intensive. Taking advantage of the analytical and computer resource efficiencies inherent within MSC/NASTRAN, a DMAP Alter has been developed to combine acceleration- and displacement-dependent modal frequency responses for performing spacecraft mode acceleration data recovery. The Alter has been used successfully to efficiently solve a common aerospace buffeting wind analysis.
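Per frequency line, the combination step described above reduces to summing an acceleration-dependent and a displacement-dependent contribution. A minimal numpy sketch, with made-up matrix sizes and random data standing in for the response transformation matrices and boundary motions (nothing here comes from the actual DMAP Alter):

```python
import numpy as np

rng = np.random.default_rng(0)
n_int, n_dof, n_freq = 50, 6, 200            # hypothetical problem sizes

Ta = rng.normal(size=(n_int, n_dof))          # acceleration-dependent transformation
Td = rng.normal(size=(n_int, n_dof))          # displacement-dependent transformation
acc = rng.normal(size=(n_dof, n_freq)) + 1j * rng.normal(size=(n_dof, n_freq))
disp = rng.normal(size=(n_dof, n_freq)) + 1j * rng.normal(size=(n_dof, n_freq))

# Mode acceleration data recovery: internal responses are the sum of the
# acceleration- and displacement-dependent contributions at each frequency.
internal = Ta @ acc + Td @ disp               # complex responses, shape (n_int, n_freq)
print(internal.shape)
```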
The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...
The composition of heterogeneous control laws
NASA Technical Reports Server (NTRS)
Kuipers, Benjamin; Astrom, Karl
1991-01-01
The fuzzy control literature and industrial practice provide certain nonlinear methods for combining heterogeneous control laws, but these methods have been very difficult to analyze theoretically. An alternate formulation and extension of this approach is presented that has several practical and theoretical benefits. An example of heterogeneous control is given and two alternate analysis methods are presented.
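As a loose illustration of combining heterogeneous control laws, the sketch below blends a fine linear law near the setpoint with a coarse saturating law far from it, using a smooth fuzzy-style membership weight; the two laws and the Gaussian weighting are invented for the example and are not taken from the paper.

```python
import numpy as np

def u_linear(x):
    """Fine control law, appropriate near the setpoint."""
    return -2.0 * x

def u_saturating(x):
    """Coarse control law, appropriate far from the setpoint."""
    return -5.0 * np.tanh(x)

def membership(x, width=1.0):
    """Smooth weight: close to 1 near the setpoint, close to 0 far away."""
    return np.exp(-(x / width) ** 2)

def u_combined(x):
    w = membership(x)
    return w * u_linear(x) + (1.0 - w) * u_saturating(x)

for x in [0.1, 0.5, 2.0, 5.0]:
    print(f"x={x:4.1f}  u={u_combined(x):7.3f}")
```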
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yunin, P. A., E-mail: yunin@ipmras.ru; Drozdov, Yu. N.; Drozdov, M. N.
2013-12-15
In this publication, we report the results of studying a multilayered nonperiodic SiGe/Si structure by the methods of X-ray diffractometry, grazing-angle X-ray reflectometry, and secondary-ion mass spectrometry (SIMS). Special attention is paid to the processing of the component distribution profile using the SIMS method and to consideration of the most significant experimental distortions introduced by this method. A method for processing the measured composition distribution profile with subsequent consideration of the influence of matrix effects, variation in the etching rate, and remnants of ion sputtering is suggested. The results of such processing are compared with a structure model obtained upon combined analysis of X-ray diffractometry and grazing-angle reflectometry data. Good agreement between the results is established. It is shown that the combined use of independent techniques makes it possible to improve the methods of secondary-ion mass spectrometry and grazing-incidence reflectometry as applied to an analysis of multilayered heteroepitaxial structures (to increase the accuracy and informativity of these methods).
Improvement of the GRACE star camera data based on the revision of the combination method
NASA Astrophysics Data System (ADS)
Bandikova, Tamara; Flury, Jakob
2014-11-01
The new release of the sensor and instrument data (Level-1B release 02) of the Gravity Recovery and Climate Experiment (GRACE) had a substantial impact on the improvement of the overall accuracy of the gravity field models, implying that improvements at the sensor data level can still contribute significantly to approaching the GRACE baseline accuracy. Recent analysis of the GRACE star camera data (SCA1B RL02) revealed unexpectedly high noise. As the star camera (SCA) data are essential for the processing of the K-band ranging data and the accelerometer data, thorough investigation of the data set was needed. We fully reexamined the SCA data processing from Level-1A to Level-1B, with focus on the method for combining the data delivered by the two SCA heads. In the first step, we produced and compared our own combined attitude solutions by applying two different combination methods to the SCA Level-1A data. The first method introduces information about the anisotropic accuracy of the star camera measurement in terms of a weighting matrix; this method was applied in the official processing as well. The alternative method merges only the well-determined SCA boresight directions and was implemented on the GRACE SCA data for the first time. Both methods were expected to provide an optimal solution characterized by full accuracy about all three axes, which was confirmed. In the second step, we analyzed the differences between the official SCA1B RL02 data generated by the Jet Propulsion Laboratory (JPL) and our solution. SCA1B RL02 contains systematically higher noise, by a factor of about 3-4. The data analysis revealed that the reason is the incorrect implementation of algorithms in the JPL processing routines. After correct implementation of the combination method, significant improvement across the whole spectrum was achieved. Based on these results, official reprocessing of the SCA data is suggested, as the SCA attitude data are one of the key observations needed for gravity field recovery.
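For intuition only, the fragment below combines two nearly equal attitude quaternions by a normalized weighted sum, which is a valid approximation when the two heads agree to within small angles. The paper's first method instead applies a full anisotropic weighting matrix reflecting the poor accuracy about each head's boresight; the scalar weights and sample quaternions here are deliberate simplifications.

```python
import numpy as np

def combine_attitudes(q1, q2, w1=0.5, w2=0.5):
    """Approximate weighted average of two nearly equal unit quaternions."""
    if np.dot(q1, q2) < 0.0:      # ensure both lie on the same hemisphere
        q2 = -q2
    q = w1 * q1 + w2 * q2
    return q / np.linalg.norm(q)

q1 = np.array([0.9998, 0.0100, 0.0150, 0.0050]); q1 /= np.linalg.norm(q1)
q2 = np.array([0.9998, 0.0102, 0.0148, 0.0052]); q2 /= np.linalg.norm(q2)
print(combine_attitudes(q1, q2))
```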
Pathway Distiller - multisource biological pathway consolidation.
Doderer, Mark S; Anguiano, Zachry; Suresh, Uthra; Dashnamoorthy, Ravi; Bishop, Alexander J R; Chen, Yidong
2012-01-01
One method to understand and evaluate an experiment that produces a large set of genes, such as a gene expression microarray analysis, is to identify overrepresentation or enrichment for biological pathways. Because pathways are able to functionally describe the set of genes, much effort has been made to collect curated biological pathways into publicly accessible databases. When combining disparate databases, highly related or redundant pathways exist, making their consolidation into pathway concepts essential. This will facilitate unbiased, comprehensive yet streamlined analysis of experiments that result in large gene sets. After gene set enrichment finds representative pathways for large gene sets, pathways are consolidated into representative pathway concepts. Three complementary but different methods of pathway consolidation are explored. Enrichment Consolidation combines the pathways enriched for the signature gene list through iterative combining of enriched pathways with other pathways with similar signature gene sets; Weighted Consolidation utilizes a protein-protein interaction network based gene-weighting approach that finds clusters of both enriched and non-enriched pathways limited to the experiment's resultant gene list; and finally the de novo Consolidation method uses several measures of pathway similarity to find static pathway clusters independent of any given experiment. We demonstrate that the three consolidation methods provide unified yet different functional insights into a resultant gene set derived from a genome-wide profiling experiment. Results from the methods are presented, demonstrating their applications in biological studies and comparing them with a web-based pathway framework that also combines several pathway databases. Additionally, a web-based consolidation framework that encompasses all three methods discussed in this paper, Pathway Distiller (http://cbbiweb.uthscsa.edu/PathwayDistiller), is established to allow researchers access to the methods and example microarray data described in this manuscript, and the ability to analyze their own gene lists using our unique consolidation methods. By combining several pathway systems, implementing different but complementary pathway consolidation methods, and providing a user-friendly web-accessible tool, we have enabled users to extract functional explanations of their genome-wide experiments.
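The flavor of pathway consolidation can be conveyed by a tiny greedy merge on gene-set similarity; the Jaccard threshold, pathway names and gene lists below are all invented, and Pathway Distiller's three methods are considerably richer than this sketch.

```python
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def consolidate(pathways, threshold=0.5):
    """Greedy single-pass merge of pathways with similar gene sets."""
    concepts = []
    for name, genes in pathways.items():
        for concept in concepts:
            if jaccard(genes, concept["genes"]) >= threshold:
                concept["members"].append(name)
                concept["genes"] |= set(genes)
                break
        else:
            concepts.append({"members": [name], "genes": set(genes)})
    return concepts

pathways = {
    "KEGG_apoptosis":     ["CASP3", "CASP9", "BAX", "TP53"],
    "Reactome_apoptosis": ["CASP3", "CASP9", "BAX", "BCL2"],
    "KEGG_glycolysis":    ["HK1", "PFKM", "PKM"],
}
for concept in consolidate(pathways):
    print(concept["members"])
```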
NASA Astrophysics Data System (ADS)
Sukmono, Abdi; Ardiansyah
2017-01-01
Paddy is one of the most important agricultural crops in Indonesia; rice consumption per capita in 2013 amounted to 78.82 kg/year. In 2017, the Indonesian government had the mission of making Indonesia self-sufficient in food. The government should therefore be able to ensure the stable fulfillment of basic food needs, which requires, among other things, rice field mapping. Accurate rice field mapping can use a quick and easy method such as remote sensing. In this study, multi-temporal Landsat 8 imagery is used to identify rice fields based on rice planting time, combined with several methods for extracting information from the imagery: the Normalized Difference Vegetation Index (NDVI), Principal Component Analysis (PCA), and band combination. Image classification used nine classes: water, settlements, mangrove, gardens, fields, and rice fields 1st to 4th. The results showed that the rice field area obtained was 50,009 ha with the PCA method, 51,016 ha with the band combination, and 45,893 ha with the NDVI method, with accuracy levels of 84.848% (PCA), 81.818% (band combination), and 75.758% (NDVI).
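Of the three methods, NDVI is the simplest to reproduce: for Landsat 8 it is the normalized difference of band 5 (near infrared) and band 4 (red). A small numpy sketch with toy reflectance values and an invented vegetation threshold:

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); for Landsat 8, NIR is band 5 and red is band 4."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-10)  # small epsilon avoids division by zero

# Toy 2x2 reflectance arrays standing in for Landsat 8 bands 5 and 4.
nir = np.array([[0.45, 0.50], [0.30, 0.10]])
red = np.array([[0.10, 0.08], [0.20, 0.09]])
mask = ndvi(nir, red) > 0.4   # hypothetical threshold for dense vegetation such as paddy
print(ndvi(nir, red).round(2), mask, sep="\n")
```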
Meta-Analysis of Tumor Stem-Like Breast Cancer Cells Using Gene Set and Network Analysis
Lee, Won Jun; Kim, Sang Cheol; Yoon, Jung-Ho; Yoon, Sang Jun; Lim, Johan; Kim, You-Sun; Kwon, Sung Won; Park, Jeong Hill
2016-01-01
Generally, cancer stem cells have epithelial-to-mesenchymal-transition characteristics and other aggressive properties that cause metastasis. However, there are no reliable markers for the identification of cancer stem cells, and comparative methods examining adherent and sphere cells are widely used to investigate the mechanisms underlying cancer stem cells, because sphere cells are known to maintain cancer stem cell characteristics. In this study, we conducted a meta-analysis that combined gene expression profiles from several studies that utilized tumorsphere technology to investigate tumor stem-like breast cancer cells. We used our own gene expression profiles along with three different gene expression profiles from the Gene Expression Omnibus, which we combined using the ComBat method, and obtained significant gene sets using gene set analysis of our datasets and the combined dataset. This analysis focused on four gene sets, such as cytokine-cytokine receptor interaction, that were significant in both datasets. Our observations demonstrated that, among the genes of the four significant gene sets, six genes were consistently up-regulated with p-values < 0.05, and our network analysis showed high connectivity for five genes. From these results, we established CXCR4, CXCL1 and HMGCS1, the intersecting genes of the datasets with high connectivity and p-values < 0.05, as significant genes in the identification of cancer stem cells. An additional experiment using quantitative reverse transcription-polymerase chain reaction showed significant up-regulation in MCF-7-derived sphere cells and confirmed the importance of these three genes. Taken together, using a meta-analysis that combines gene set and network analysis, we suggest CXCR4, CXCL1 and HMGCS1 as candidates involved in tumor stem-like breast cancer cells. Distinct from other meta-analyses, by using gene set analysis we selected possible markers that can explain the biological mechanisms, and we suggest network analysis as an additional criterion for selecting candidates. PMID:26870956
Structured Case Analysis: Developing Critical Thinking Skills in a Marketing Case Course
ERIC Educational Resources Information Center
Klebba, Joanne M.; Hamilton, Janet G.
2007-01-01
Structured case analysis is a hybrid pedagogy that flexibly combines diverse instructional methods with comprehensive case analysis as a mechanism to develop critical thinking skills. An incremental learning framework is proposed that allows instructors to develop and monitor content-specific theory and the corresponding critical thinking skills.…
Demodulation circuit for AC motor current spectral analysis
Hendrix, Donald E.; Smith, Stephen F.
1990-12-18
A motor current analysis method for the remote, noninvasive inspection of electric motor-operated systems. Synchronous amplitude demodulation and phase demodulation circuits are used singly and in combination along with a frequency analyzer to produce improved spectral analysis of load-induced frequencies present in the electric current flowing in a motor-driven system.
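A digital analogue of the synchronous amplitude demodulation described here can be sketched with the analytic-signal envelope; the carrier, modulation frequency and depth below are invented, and the patent itself uses analog circuits rather than software.

```python
import numpy as np
from scipy.signal import hilbert

fs = 10_000.0                       # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
carrier = 60.0                      # line frequency, Hz
fault = 37.0                        # hypothetical load-induced modulation frequency, Hz
i_motor = (1.0 + 0.05 * np.cos(2 * np.pi * fault * t)) * np.cos(2 * np.pi * carrier * t)

# Amplitude demodulation via the analytic signal: the envelope isolates the
# modulation, so the load-induced line appears at 37 Hz rather than 60 +/- 37 Hz.
envelope = np.abs(hilbert(i_motor))
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
print(freqs[np.argmax(spectrum)])   # ~37 Hz
```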
ERIC Educational Resources Information Center
Katz-Gerro, Tally; Talmud, Ilan
2005-01-01
This paper proposes a new analysis of consumption inequality using relational methods, derived from network images of social structure. We combine structural analysis with theoretical concerns in consumer research to propose a relational theory of consumption space, to construct a stratification indicator, and to demonstrate its analytical…
ERIC Educational Resources Information Center
Mulik, James D.; Sawicki, Eugene
1979-01-01
Accurate for the analysis of ions in solution, this form of analysis enables the analyst to directly assay many compounds that previously were difficult or impossible to analyze. The method is a combination of the methodologies of ion exchange, liquid chromatography, and conductimetric determination with eluant suppression. (Author/RE)
Safety and business benefit analysis of NASA's aviation safety program
DOT National Transportation Integrated Search
2004-09-20
NASA Aviation Safety Program elements encompass a wide range of products that require both public and private investment. Therefore, two methods of analysis, one relating to the public and the other to the private industry, must be combined to unders...
Combining Correlation Matrices: Simulation Analysis of Improved Fixed-Effects Methods
ERIC Educational Resources Information Center
Hafdahl, Adam R.
2007-01-01
The originally proposed multivariate meta-analysis approach for correlation matrices--analyze Pearson correlations, with each study's observed correlations replacing their population counterparts in its conditional-covariance matrix--performs poorly. Two refinements are considered: Analyze Fisher Z-transformed correlations, and substitute better…
Soil carbon analysis using gamma rays induced by neutrons
USDA-ARS?s Scientific Manuscript database
Agronomy is a research field where various physics concepts and experimental methods are widely used, particularly in agro-chemistry and soil elemental analysis. The evolution of methodology and instrumentation of nuclear physics combined with the availability of not highly expensive commercial prod...
Variable Selection in the Presence of Missing Data: Imputation-based Methods.
Zhao, Yize; Long, Qi
2017-01-01
Variable selection plays an essential role in regression analysis, as it identifies important variables that are associated with outcomes and is known to improve the predictive accuracy of resulting models. Variable selection methods have been widely investigated for fully observed data. However, in the presence of missing data, methods for variable selection need to be carefully designed to account for missing data mechanisms and the statistical techniques used for handling missing data. Since imputation is arguably the most popular method for handling missing data due to its ease of use, statistical methods for variable selection that are combined with imputation are of particular interest. These methods, valid under the assumptions of missing at random (MAR) and missing completely at random (MCAR), largely fall into three general strategies. The first strategy applies existing variable selection methods to each imputed dataset and then combines the variable selection results across all imputed datasets. The second strategy applies existing variable selection methods to stacked imputed datasets. The third strategy combines resampling techniques such as the bootstrap with imputation. Despite recent advances, this area remains under-developed and offers fertile ground for further research.
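The first strategy can be sketched in a few lines: run a selector (here the lasso) on each imputed dataset and keep the variables chosen in a majority of them. The perturbation standing in for multiple imputation and the 50% selection-frequency rule are illustrative assumptions only.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(1)
n, p = 200, 10
X = rng.normal(size=(n, p))
y = 2 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(size=n)

# Stand-in for m multiply imputed datasets: here we just perturb X slightly,
# where a real analysis would use an imputation model (e.g., chained equations).
m = 5
selected = np.zeros(p)
for _ in range(m):
    X_imp = X + 0.1 * rng.normal(size=X.shape)
    fit = LassoCV(cv=5).fit(X_imp, y)
    selected += (fit.coef_ != 0)

# Strategy 1: keep variables selected in a majority of imputed datasets.
print(np.where(selected / m >= 0.5)[0])
```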
Yoshida, Tsuyoshi; Kobayashi, Takumi; Itoda, Masaya; Muto, Taika; Miyaguchi, Ken; Mogushi, Kaoru; Shoji, Satoshi; Shimokawa, Kazuro; Iida, Satoru; Uetake, Hiroyuki; Ishikawa, Toshiaki; Sugihara, Kenichi; Mizushima, Hiroshi; Tanaka, Hiroshi
2010-07-29
Colorectal cancer (CRC) is one of the most frequently occurring cancers in Japan, and thus a wide range of methods have been deployed to study the molecular mechanisms of CRC. In this study, we performed a comprehensive analysis of CRC, incorporating copy number aberration (CNA) and gene expression data. For the last four years, we have been collecting data from CRC cases and organizing the information as an "omics" study by integrating many kinds of analysis into a single comprehensive investigation. In our previous studies, we had experienced difficulty in finding genes related to CRC, as we observed higher noise levels in the expression data than in the data for other cancers. Because chromosomal aberrations are often observed in CRC, here, we have performed a combination of CNA analysis and expression analysis in order to identify some new genes responsible for CRC. This study was performed as part of the Clinical Omics Database Project at Tokyo Medical and Dental University. The purpose of this study was to investigate the mechanism of genetic instability in CRC by this combination of expression analysis and CNA analysis, and to establish a new method for the diagnosis and treatment of CRC. Comprehensive gene expression analysis was performed on 79 CRC cases using an Affymetrix Gene Chip, and comprehensive CNA analysis was performed using an Affymetrix DNA Sty array. To avoid the contamination of cancer tissue with normal cells, laser micro-dissection was performed before DNA/RNA extraction. Data analysis was performed using original software written in the R language. We observed a high percentage of CNA in colorectal cancer, including copy number gains at 7, 8q, 13 and 20q, and copy number losses at 8p, 17p and 18. Gene expression analysis provided many candidates for CRC-related genes, but their association with CRC did not reach the level of statistical significance. The combination of CNA and gene expression analysis, together with the clinical information, suggested UGT2B28, LOC440995, CXCL6, SULT1B1, RALBP1, TYMS, RAB12, RNMT, ARHGDIB, S100A2, ABHD2, OIT3 and ABHD12 as genes that are possibly associated with CRC. Some of these genes have already been reported as being related to CRC. TYMS has been reported as being associated with resistance to the anti-cancer drug 5-fluorouracil, and we observed a copy number increase for this gene. RALBP1, ARHGDIB and S100A2 have been reported as oncogenes, and we observed copy number increases in each. ARHGDIB has been reported as a metastasis-related gene, and our data also showed copy number increases of this gene in cases with metastasis. The combination of CNA analysis and gene expression analysis was a more effective method for finding genes associated with the clinicopathological classification of CRC than either analysis alone. Using this combination of methods, we were able to detect genes that have already been associated with CRC. We also identified additional candidate genes that may be new markers or targets for this form of cancer.
Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R
2011-01-01
Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, also the economic aspects of the risk-reduction alternatives are commonly considered important. Drinking water supplies are complex systems and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantification of water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach for risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction. Copyright © 2010 Elsevier Ltd. All rights reserved.
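Once a fault tree model yields the residual risk of each alternative, the CEA step reduces to ranking alternatives by cost per unit of risk reduction. A minimal sketch with entirely made-up risks and costs:

```python
# Rank risk-reduction alternatives by cost-effectiveness ratio (CER), i.e.
# annualized cost per unit of risk reduction relative to the baseline.
baseline_risk = 0.040                 # baseline probability of failing the safety target

alternatives = {
    # name: (residual risk from the fault tree model, annualized cost)
    "extra UV barrier":        (0.020, 150_000.0),
    "enhanced source control": (0.028, 60_000.0),
    "network flushing plan":   (0.035, 20_000.0),
}

for name, (residual, cost) in sorted(
        alternatives.items(),
        key=lambda kv: kv[1][1] / (baseline_risk - kv[1][0])):
    cer = cost / (baseline_risk - residual)
    print(f"{name}: risk reduction {baseline_risk - residual:.3f}, CER {cer:,.0f}")
```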
Kim, Yeoun Jae; Seo, Jong Hyun; Kim, Hong Rae; Kim, Kwang Gi
2017-06-01
Clinicians who frequently perform ultrasound scanning procedures often suffer from musculoskeletal disorders, arthritis, and myalgias. To minimize their occurrence and to assist clinicians, ultrasound scanning robots have been developed worldwide. Although, to date, there is still no commercially available ultrasound scanning robot, many control methods have been suggested and researched. These control algorithms are either image based or force based. If the ultrasound scanning robot control algorithm was a combination of the two algorithms, it could benefit from the advantage of each one. However, there are no existing control methods for ultrasound scanning robots that combine force control and image analysis. Therefore, in this work, a control algorithm is developed for an ultrasound scanning robot using force feedback and ultrasound image analysis. A manipulator-type ultrasound scanning robot named 'NCCUSR' is developed and a control algorithm for this robot is suggested and verified. First, conventional hybrid position-force control is implemented for the robot and the hybrid position-force control algorithm is combined with ultrasound image analysis to fully control the robot. The control method is verified using a thyroid phantom. It was found that the proposed algorithm can be applied to control the ultrasound scanning robot and experimental outcomes suggest that the images acquired using the proposed control method can yield a rating score that is equivalent to images acquired directly by the clinicians. The proposed control method can be applied to control the ultrasound scanning robot. However, more work must be completed to verify the proposed control method in order to become clinically feasible. Copyright © 2016 John Wiley & Sons, Ltd. Copyright © 2016 John Wiley & Sons, Ltd.
A hybrid method in combining treatment effects from matched and unmatched studies.
Byun, Jinyoung; Lai, Dejian; Luo, Sheng; Risser, Jan; Tung, Betty; Hardy, Robert J
2013-12-10
The most common data structures in biomedical studies are matched or unmatched designs. Data structures resulting from a hybrid of the two may create challenges for statistical inference. The question may arise whether to use parametric or nonparametric methods on the hybrid data structure. The Early Treatment for Retinopathy of Prematurity study was a multicenter clinical trial sponsored by the National Eye Institute. The design produced data requiring a statistical method of a hybrid nature. An infant in this multicenter randomized clinical trial had high-risk prethreshold retinopathy of prematurity that was eligible for treatment in one or both eyes at entry into the trial. During follow-up, recognition visual acuity was assessed for both eyes. Data from both eyes (matched) and from only one eye (unmatched) were eligible to be used in the trial. The new hybrid nonparametric method is a meta-analysis based on combining the Hodges-Lehmann estimates of treatment effects from the Wilcoxon signed rank and rank sum tests. To compare the new method, we used the classic meta-analysis with the t-test method to combine estimates of treatment effects from the paired and two-sample t-tests. We used simulations to calculate the empirical size and power of the test statistics, as well as the bias, mean square error and confidence interval width of the corresponding estimators. The proposed method provides an effective tool to evaluate data from clinical trials and similar comparative studies. Copyright © 2013 John Wiley & Sons, Ltd.
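A rough sketch of the rank-based ingredients: the Hodges-Lehmann estimate is the median of Walsh averages for the paired (matched) part and the median of pairwise differences for the unmatched part, after which the two estimates are pooled. The data and the inverse-variance weights below are invented; the paper derives its weights from the variances of the rank estimators.

```python
import numpy as np

def hl_one_sample(d):
    """Hodges-Lehmann estimate for paired differences: median of Walsh averages."""
    d = np.asarray(d, float)
    i, j = np.triu_indices(len(d))        # all pairs with i <= j
    return np.median((d[i] + d[j]) / 2.0)

def hl_two_sample(x, y):
    """Hodges-Lehmann shift estimate: median of all pairwise differences."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.median(np.subtract.outer(x, y))

# Matched (both eyes) and unmatched (one eye) parts of a hybrid dataset.
paired_diff = np.array([0.10, 0.05, 0.20, -0.02, 0.12])
treated_only = np.array([0.55, 0.60, 0.48])
control_only = np.array([0.40, 0.45, 0.38, 0.42])

est_m = hl_one_sample(paired_diff)
est_u = hl_two_sample(treated_only, control_only)

# Pool the two estimates with hypothetical inverse-variance weights.
w_m, w_u = 1 / 0.004, 1 / 0.009
print((w_m * est_m + w_u * est_u) / (w_m + w_u))
```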
Liu, Rui-Sang; Jin, Guang-Huai; Xiao, Deng-Rong; Li, Hong-Mei; Bai, Feng-Wu; Tang, Ya-Jie
2015-01-01
Aroma results from the interplay of volatile organic compounds (VOCs), and the attributes of microbially produced aromas are significantly affected by fermentation conditions. Among the VOCs, only a few contribute to aroma. Thus, screening and identification of the key VOCs is critical for microbial aroma production. The traditional method is based on gas chromatography-olfactometry (GC-O), which is time-consuming and laborious. Taking the Tuber melanosporum fermentation system as an example, a new method to screen and identify the key VOCs by combining an aroma evaluation method with principal component analysis (PCA) was developed in this work. First, an aroma sensory evaluation method was developed to screen 34 potential favorite aroma samples from 504 fermentation samples. Second, PCA was employed to screen nine common key VOCs from these 34 samples. Third, seven key VOCs were identified by the traditional method. Finally, all seven key VOCs identified by the traditional method were also identified, along with four others, by the new strategy. These results indicate the reliability of the new method and demonstrate it to be a viable alternative to the traditional method. PMID:26655663
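The PCA screening step can be illustrated by ranking variables by the magnitude of their loadings on the leading components; the VOC names and peak areas below are placeholders, not the study's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Rows: 34 favorite-aroma samples; columns: hypothetical VOC peak areas.
vocs = ["ethanol", "3-methylbutanal", "DMS", "2-phenylethanol", "hexanal", "acetoin"]
X = rng.lognormal(size=(34, len(vocs)))

pca = PCA(n_components=2).fit(StandardScaler().fit_transform(X))

# Screen key VOCs by the magnitude of their loadings on the first components.
scores = np.abs(pca.components_).max(axis=0)
for name, s in sorted(zip(vocs, scores), key=lambda t: -t[1]):
    print(f"{name}: {s:.2f}")
```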
NASA Astrophysics Data System (ADS)
Geltner, I.; Hashimshony, D.; Zigler, A.
2002-07-01
We use a time-domain analysis method to characterize the outer layer of a multilayer structure regardless of the inner ones, thus simplifying the characterization of all the layers. We combine this method with THz reflection spectroscopy to detect nondestructively a hidden aluminum oxide layer under opaque paint and to measure its conductivity and high-frequency dielectric constant in the THz range.
A comparison of cosegregation analysis methods for the clinical setting.
Rañola, John Michael O; Liu, Quanhui; Rosenthal, Elisabeth A; Shirts, Brian H
2018-04-01
Quantitative cosegregation analysis can help evaluate the pathogenicity of genetic variants. However, genetics professionals without statistical training often use simple methods, reporting only qualitative findings. We evaluate the potential utility of quantitative cosegregation in the clinical setting by comparing three methods. One thousand pedigrees each were simulated for benign and pathogenic variants in BRCA1 and MLH1, using United States historical demographic data to produce pedigrees similar to those seen in the clinic. These pedigrees were analyzed using two robust methods, full likelihood Bayes factors (FLB) and cosegregation likelihood ratios (CSLR), and a simpler method, counting meioses. Both FLB and CSLR outperform counting meioses when dealing with pathogenic variants, though counting meioses is not far behind. For benign variants, FLB and CSLR greatly outperform it, as counting meioses is unable to generate evidence for benign variants. Comparing FLB and CSLR, we find that the two methods perform similarly, indicating that quantitative results from either could be combined in multifactorial calculations. Combining quantitative information will be important, as isolated use of cosegregation in single families will yield classification for less than 1% of variants. To encourage wider use of robust cosegregation analysis, we present a website (http://www.analyze.myvariant.org) which implements the CSLR, FLB, and counting meioses methods for ATM, BRCA1, BRCA2, CHEK2, MEN1, MLH1, MSH2, MSH6, and PMS2. We also present an R package, CoSeg, which performs the CSLR analysis on any gene with user-supplied parameters. Future variant classification guidelines should allow nuanced inclusion of cosegregation evidence against pathogenicity.
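The simple method is easy to state: each informative meiosis in which the variant cosegregates with disease contributes a factor of 1/2 under the chance hypothesis, giving a likelihood ratio of roughly 2^m in favor of pathogenicity. This is a textbook simplification that ignores the penetrance and demographic modeling done by FLB and CSLR.

```python
# Counting meioses (simplified): m informative cosegregating meioses have
# probability (1/2)**m under the "chance" hypothesis, so the evidence in
# favor of pathogenicity is a likelihood ratio of about 2**m.
def counting_meioses_lr(m: int) -> float:
    return 2.0 ** m

for m in (3, 5, 10):
    print(m, counting_meioses_lr(m))
```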
Wang, X-H; Zhang, G; Fan, Y-Y; Yang, X; Sui, W-J; Lu, X-X
2013-03-01
Rapid identification of bacterial pathogens from clinical specimens is essential to establish an adequate empirical antibiotic therapy to treat urinary tract infections (UTIs). We used matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) combined with UF-1000i urine flow cytometry of urine specimens to quickly and accurately identify bacteria causing UTIs. We divided each urine sample into three aliquots for conventional identification, UF-1000i, and MALDI-TOF MS, respectively. We compared the results of the conventional method with those of MALDI-TOF MS combined with UF-1000i, and discrepancies were resolved by 16S rRNA gene sequencing. We analyzed 1456 urine samples from patients with UTI symptoms, and 932 (64.0%) were negative using each of the three testing methods. The combined method used UF-1000i to eliminate negative specimens and then MALDI-TOF MS to identify the remaining positive samples. The combined method was consistent with the conventional method in 1373 of 1456 cases (94.3%), and gave the correct result in 1381 of 1456 cases (94.8%). Therefore, the combined method described here can directly provide a rapid, accurate, definitive bacterial identification for the vast majority of urine samples, though the MALDI-TOF MS software analysis capabilities should be improved, with regard to mixed bacterial infection. Copyright © 2012 Elsevier B.V. All rights reserved.
Choi, Ted; Eskin, Eleazar
2013-01-01
Gene expression data, in conjunction with information on genetic variants, have enabled studies to identify expression quantitative trait loci (eQTLs) or polymorphic locations in the genome that are associated with expression levels. Moreover, recent technological developments and cost decreases have further enabled studies to collect expression data in multiple tissues. One advantage of multiple tissue datasets is that studies can combine results from different tissues to identify eQTLs more accurately than examining each tissue separately. The idea of aggregating results of multiple tissues is closely related to the idea of meta-analysis which aggregates results of multiple genome-wide association studies to improve the power to detect associations. In principle, meta-analysis methods can be used to combine results from multiple tissues. However, eQTLs may have effects in only a single tissue, in all tissues, or in a subset of tissues with possibly different effect sizes. This heterogeneity in terms of effects across multiple tissues presents a key challenge to detect eQTLs. In this paper, we develop a framework that leverages two popular meta-analysis methods that address effect size heterogeneity to detect eQTLs across multiple tissues. We show by using simulations and multiple tissue data from mouse that our approach detects many eQTLs undetected by traditional eQTL methods. Additionally, our method provides an interpretation framework that accurately predicts whether an eQTL has an effect in a particular tissue. PMID:23785294
Watanabe, Eiki; Kobara, Yuso; Miyake, Shiro
2013-06-01
With the aim of expanding the applicability of a kit-based enzyme-linked immunosorbent assay (ELISA) for the neonicotinoid insecticide thiamethoxam, the ELISA was newly applied to three kinds of agricultural samples (green pepper, eggplant and spinach). To offer the ELISA as a screening analysis for thiamethoxam residues, a rapid and simple method of extraction by hand-shaking was used, and speed-up and simplification of the sample treatment before the ELISA analysis were examined. Finally, the validity of the ELISA combined with the proposed extraction method was verified against a reference high-performance liquid chromatography (HPLC) method using real-world agricultural samples. There were no marked matrix effects derived from green pepper and eggplant extracts. On the other hand, although the effect due to a pigment in spinach extract on the assay performance was significant, it was effectively avoided by increasing the dilution level of the spinach extract. For thiamethoxam-spiked samples, acceptable recoveries of 97.9-109.1% and coefficients of variation of 0.3-11.5% were obtained. Inspection of the validity of the ELISA by comparison with the reference HPLC method showed that the two analytical results were very similar, and a high correlation was found between them (r>0.997). The evaluated ELISA combined with hand-shaking extraction provided a rapid and simple screening analysis that was quantitative and reliable for the detection of thiamethoxam in complex agricultural products. © 2012 Society of Chemical Industry.
Mathematical models for exploring different aspects of genotoxicity and carcinogenicity databases.
Benigni, R; Giuliani, A
1991-12-01
One great obstacle to understanding and using the information contained in the genotoxicity and carcinogenicity databases is the very size of such databases. Their vastness makes them difficult to read; this leads to inadequate exploitation of the information, which becomes costly in terms of time, labor, and money. In its search for adequate approaches to the problem, the scientific community has, curiously, almost entirely neglected an existent series of very powerful methods of data analysis: the multivariate data analysis techniques. These methods were specifically designed for exploring large data sets. This paper presents the multivariate techniques and reports a number of applications to genotoxicity problems. These studies show how biology and mathematical modeling can be combined and how successful this combination is.
2014-01-01
Background: Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, or their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods: 126 hypothetical trial scenarios were evaluated (126,000 datasets), each with continuous data simulated using a combination of levels of treatment effect, pretest-posttest correlation, and direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results: Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when the pretest-posttest correlation is ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. The apparently greater power of ANOVA and CSA at certain imbalances is achieved at the expense of a biased treatment effect. Conclusions: Across a range of correlations between pre- and post-treatment scores and at varying levels and directions of baseline imbalance, ANCOVA remains the optimal statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304
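One scenario of such a simulation fits all three estimators to the same dataset with a deliberate baseline imbalance; with pretest-posttest correlation rho, ANOVA is biased by roughly rho times the imbalance and CSA by (rho - 1) times it, while ANCOVA stays approximately unbiased. A single-replicate sketch (the sample size, imbalance and rho are invented):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n, effect, rho = 100, 0.5, 0.6
group = np.repeat([0, 1], n)
pre = rng.normal(size=2 * n) + 0.3 * group        # deliberate baseline imbalance
post = rho * pre + np.sqrt(1 - rho**2) * rng.normal(size=2 * n) + effect * group
df = pd.DataFrame({"group": group, "pre": pre, "post": post})

anova = smf.ols("post ~ group", df).fit().params["group"]
csa = smf.ols("I(post - pre) ~ group", df).fit().params["group"]
ancova = smf.ols("post ~ group + pre", df).fit().params["group"]
print(f"ANOVA {anova:.2f}  CSA {csa:.2f}  ANCOVA {ancova:.2f}  (true {effect})")
```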
NASA Astrophysics Data System (ADS)
Anggraeni, Anni; Arianto, Fernando; Mutalib, Abdul; Pratomo, Uji; Bahti, Husein H.
2017-05-01
Rare earth elements (REE) have many applications, for example in metallurgy, optical devices, and the manufacture of electronic devices. REE occur in minerals in which the individual elements have very similar properties. Currently, REE content is determined with instruments such as ICP-OES, ICP-MS, XRF, and HPLC, but each instrument has weaknesses. An alternative analytical method for determining rare earth metal content is therefore needed; one option is a combination of UV-visible spectrophotometry and multivariate analysis, including principal component analysis (PCA), principal component regression (PCR), and partial least squares regression (PLS). The purpose of this experiment is to determine the content of light and medium rare earth elements in the mineral monazite, without chemical separation, using a combination of multivariate analysis and UV-visible spectrophotometric methods. A training set of 22 concentration variations was created and the absorbance measured using a UV-Vis spectrophotometer; the data were then processed by PCA, PCR, and PLSR. The results were compared and validated to obtain the mathematical model with the smallest percent error. In this experiment, the model obtained with the PLS method was better than PCR after validation, with RMSE values for La, Ce, Pr, Nd, Gd, Sm, Eu, and Tb of 0.095, 0.573, 0.538, 0.440, 3.387, 1.240, 1.870, and 0.639, respectively.
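The PLS calibration step can be sketched with scikit-learn on a synthetic Beer-Lambert mixture model; the 22-sample design, pure-component spectra and noise level are placeholders for the real monazite training set.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
# Synthetic stand-in for the training set: 22 mixtures, absorbance at 200
# wavelengths, 8 analyte concentrations (La, Ce, Pr, Nd, Gd, Sm, Eu, Tb).
C = rng.uniform(0, 10, size=(22, 8))              # concentrations
S = rng.uniform(0, 1, size=(8, 200))              # pure-component spectra
A = C @ S + 0.01 * rng.normal(size=(22, 200))     # measured absorbance matrix

A_tr, A_te, C_tr, C_te = train_test_split(A, C, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=8).fit(A_tr, C_tr)
rmse = np.sqrt(((pls.predict(A_te) - C_te) ** 2).mean(axis=0))
print(rmse.round(3))                              # one RMSE per element
```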
Anderson, Annette Carola; Hellwig, Elmar; Vespermann, Robin; Wittmer, Annette; Schmid, Michael; Karygianni, Lamprini; Al-Ahmad, Ali
2012-01-01
Persistence of microorganisms or reinfections are the main reasons for failure of root canal therapy. Very few studies to date have included culture-independent methods to assess the microbiota, including non-cultivable microorganisms. The aim of this study was to combine culture methods with culture-independent cloning methods to analyze the microbial flora of root-filled teeth with periradicular lesions. Twenty-one samples from previously root-filled teeth were collected from patients with periradicular lesions. Microorganisms were cultivated, isolated and biochemically identified. In addition, ribosomal DNA of bacteria, fungi and archaea derived from the same samples was amplified and the PCR products were used to construct clone libraries. DNA of selected clones was sequenced and microbial species were identified, comparing the sequences with public databases. Microorganisms were found in 12 samples with culture-dependent and -independent methods combined. The number of bacterial species ranged from 1 to 12 in one sample. The majority of the 26 taxa belonged to the phylum Firmicutes (14 taxa), followed by Actinobacteria, Proteobacteria and Bacteroidetes. One sample was positive for fungi, and archaea could not be detected. The results obtained with both methods differed. The cloning technique detected several as-yet-uncultivated taxa. Using a combination of both methods 13 taxa were detected that had not been found in root-filled teeth so far. Enterococcus faecalis was only detected in two samples using culture methods. Combining the culture-dependent and –independent approaches revealed new candidate endodontic pathogens and a high diversity of the microbial flora in root-filled teeth with periradicular lesions. Both methods yielded differing results, emphasizing the benefit of combined methods for the detection of the actual microbial diversity in apical periodontitis. PMID:23152922
Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin
2017-01-01
Background Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD) whose complex care pathway spans multiple sectors. Methods Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted along with data visualisation designed to inform service improvement within the context of limited resources. Results A ‘Rich Picture’ was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning. Conclusions When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further consideration. PMID:28062603
Application of p-Multigrid to Discontinuous Galerkin Formulations of the Poisson Equation
NASA Technical Reports Server (NTRS)
Helenbrook, B. T.; Atkins, H. L.
2006-01-01
We investigate p-multigrid as a solution method for several different discontinuous Galerkin (DG) formulations of the Poisson equation. Different combinations of relaxation schemes and basis sets have been combined with the DG formulations to find the best-performing combination. The damping factors of the schemes have been determined using Fourier analysis for both one- and two-dimensional problems. One important finding is that when using DG formulations, the standard approach of forming the coarse-p matrices separately for each level of multigrid is often unstable. To ensure stability, the coarse-p matrices must be constructed from the fine-grid matrices using algebraic multigrid techniques. Of the relaxation schemes, we find that the combination of Jacobi relaxation with the spectral element basis is fairly effective. The results using this combination are p-sensitive in both one and two dimensions, but reasonable convergence rates can still be achieved for moderate values of p and isotropic meshes. A competitive alternative is block Gauss-Seidel relaxation, which actually outperforms a more expensive line relaxation when the mesh is isotropic. When the mesh becomes highly anisotropic, the implicit line method and the Gauss-Seidel implicit line method are the only effective schemes. Adding the Gauss-Seidel terms to the implicit line method gives a significant improvement over the line relaxation method.
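The classical Fourier damping analysis is easy to reproduce for the much simpler finite-difference Poisson stencil, which conveys the idea even though the paper's computation is for DG discretizations: the weighted-Jacobi amplification factor of Fourier mode theta is 1 - omega*(1 - cos theta), and the smoothing factor is its maximum over the high-frequency range.

```python
import numpy as np

# Fourier (local mode) analysis of weighted Jacobi on the 1D Poisson stencil
# (-1, 2, -1): amplification factor g(theta) = 1 - w*(1 - cos(theta)).
# The multigrid smoothing factor is max |g| over theta in [pi/2, pi].
theta = np.linspace(np.pi / 2, np.pi, 1001)
for w in [0.5, 2.0 / 3.0, 0.8, 1.0]:
    g = 1.0 - w * (1.0 - np.cos(theta))
    print(f"omega={w:.3f}: smoothing factor {np.abs(g).max():.3f}")
```

The familiar optimum omega = 2/3 gives a smoothing factor of 1/3 for this model problem.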
Jiang, Gang; Quan, Hong; Wang, Cheng; Gong, Qiyong
2012-12-01
In this paper, a new method combining translation-invariant (TI) and wavelet-threshold (WT) algorithms to distinguish weak, overlapping signals in proton magnetic resonance spectroscopy (1H-MRS) is presented. First, the 1H-MRS spectrum is transformed into the wavelet domain and its wavelet coefficients are obtained. Then, the TI and WT methods are applied to detect weak signals overlapped by strong ones. Analysis of simulated data shows that both the frequency and amplitude of small signals can be obtained accurately by the algorithm and that, combined with signal fitting, the area under weak signal peaks can be calculated quantitatively.
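The TI (cycle-spinning) plus soft-thresholding combination can be sketched with PyWavelets; the wavelet, decomposition level, shift count and test signal are arbitrary choices, and the paper's quantification via signal fitting is not reproduced here.

```python
import numpy as np
import pywt

def ti_wavelet_denoise(signal, wavelet="db4", level=4, n_shifts=8):
    """Translation-invariant denoising by cycle spinning: shift, soft-threshold
    the wavelet coefficients, unshift, and average over all shifts."""
    sigma = np.median(np.abs(pywt.wavedec(signal, wavelet, level=level)[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(signal)))     # universal threshold
    out = np.zeros(len(signal))
    for s in range(n_shifts):
        shifted = np.roll(signal, s)
        coeffs = pywt.wavedec(shifted, wavelet, level=level)
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        out += np.roll(pywt.waverec(coeffs, wavelet)[: len(signal)], -s)
    return out / n_shifts

t = np.linspace(0, 1, 1024)
clean = np.exp(-((t - 0.5) / 0.01) ** 2) + 0.1 * np.exp(-((t - 0.52) / 0.005) ** 2)
noisy = clean + 0.02 * np.random.default_rng(5).normal(size=t.size)
print(np.abs(ti_wavelet_denoise(noisy) - clean).max())
```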
Elhaj, Fatin A; Salim, Naomie; Harris, Arief R; Swee, Tan Tian; Ahmed, Taqwa
2016-04-01
Arrhythmia is a cardiac condition caused by abnormal electrical activity of the heart, and an electrocardiogram (ECG) is the non-invasive method used to detect arrhythmias or heart abnormalities. Due to the presence of noise, the non-stationary nature of the ECG signal (i.e. the changing morphology of the ECG signal with respect to time) and the irregularity of the heartbeat, physicians face difficulties in the diagnosis of arrhythmias. Computer-aided analysis of ECG results assists physicians in detecting cardiovascular diseases. The development of many existing arrhythmia systems has depended on findings from linear experiments on ECG data, which achieve high performance on noise-free data. However, nonlinear methods characterize the ECG signal more effectively, extract hidden information in the signal, and achieve good performance under noisy conditions. This paper investigates the representation ability of linear and nonlinear features and proposes a combination of such features in order to improve the classification of ECG data. In this study, five types of beat classes of arrhythmia as recommended by the Association for the Advancement of Medical Instrumentation are analyzed: non-ectopic beats (N), supra-ventricular ectopic beats (S), ventricular ectopic beats (V), fusion beats (F) and unclassifiable and paced beats (U). The characterization ability of nonlinear features such as higher-order statistics and cumulants, and of nonlinear feature-reduction methods such as independent component analysis, is combined with linear features, namely the principal component analysis of discrete wavelet transform coefficients. The features are tested for their ability to differentiate the classes using different classifiers, namely support vector machine and neural network methods with tenfold cross-validation. Our proposed method is able to classify the N, S, V, F and U arrhythmia classes with high accuracy (98.91%) using a combined support vector machine and radial basis function method. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
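A stripped-down version of such a linear-plus-nonlinear feature pipeline: DWT detail coefficients (linear) concatenated with higher-order statistics (nonlinear), reduced by PCA and classified by an RBF SVM with tenfold cross-validation. The synthetic two-class beats below stand in for the five AAMI classes and the real ECG data.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis, skew
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(6)
# Synthetic stand-in for heartbeat segments (two classes), not real recordings.
beats = rng.normal(size=(300, 180))
beats[150:] += np.sin(np.linspace(0, 3 * np.pi, 180))
labels = np.repeat([0, 1], 150)

def features(beat):
    # Linear part: DWT detail coefficients; nonlinear part: higher-order statistics.
    cA, cD = pywt.dwt(beat, "db4")
    return np.concatenate([cD[:20], [skew(beat), kurtosis(beat)]])

X = np.array([features(b) for b in beats])
clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
print(cross_val_score(clf, X, labels, cv=10).mean())
```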
Roetzheim, Richard G.; Freund, Karen M.; Corle, Don K.; Murray, David M.; Snyder, Frederick R.; Kronman, Andrea C.; Jean-Pierre, Pascal; Raich, Peter C.; Holden, Alan E. C.; Darnell, Julie S.; Warren-Mears, Victoria; Patierno, Steven; PNRP Design and Analysis Committee
2013-01-01
Background: The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, each employing its own unique study design. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from members of the PNRP Design and Analysis Committee. Purpose: To review possible methodologies for analyzing combined data arising from heterogeneous study designs. Methods: The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. Conclusions were based on simple consensus. The five approaches reviewed included: 1) analyzing and reporting each project separately; 2) combining data from all projects and performing an individual-level analysis; 3) pooling data from projects having similar study designs; 4) analyzing pooled data using a prospective meta-analytic technique; 5) analyzing pooled data utilizing a novel simulated group-randomized design. Results: Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and in how they were affected by differing project sample sizes. Limitations: The conclusions reached were based on expert opinion and not derived from actual analyses performed. Conclusions: The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multi-site community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study designs are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs. PMID:22273587
Stolarczyk, Mariusz; Hubicka, Urszula; Żuromska-Witek, Barbara; Krzek, Jan
2015-01-01
A new sensitive, simple, rapid, and precise HPLC method with diode array detection has been developed for the separation and simultaneous determination of hydrochlorothiazide, furosemide, torasemide, losartan, quinapril, valsartan, spironolactone, and canrenone in combined pharmaceutical dosage forms. The chromatographic analysis of the tested drugs was performed on an ACE C18, 100 Å, 250×4.6 mm, 5 μm particle size column with a 0.05 M phosphate buffer (pH 3.00)-acetonitrile-methanol (30:20:50, v/v/v) mobile phase at a flow rate of 1.0 mL/min. The column was thermostatted at 25°C. UV detection was performed at 230 nm. The analysis time was 10 min. The developed method meets the acceptance criteria for specificity, linearity, sensitivity, accuracy, and precision. The proposed method was successfully applied to the determination of the studied drugs in the selected combined dosage forms.
Methods for evaluating a mature substance abuse prevention/early intervention program.
Becker, L R; Hall, M; Fisher, D A; Miller, T R
2000-05-01
The authors describe methods for work in progress to evaluate four workplace prevention and/or early intervention programs designed to change occupational norms and reduce substance abuse at a major U.S. transportation company. The four programs are an employee assistance program, random drug testing, managed behavioral health care, and a peer-led intervention program. An elaborate mixed-methods evaluation combines data collection and analysis techniques from several traditions. A process-improvement evaluation focuses on the peer-led component to describe its evolution, document the implementation process for those interested in replicating it, and provide information for program improvement. An outcome-assessment evaluation examines impacts of the four programs on job performance measures (e.g., absenteeism, turnover, injury, and disability rates) and includes a cost-offset and employer cost-savings analysis. Issues related to using archival data, combining qualitative and quantitative designs, and working in a corporate environment are discussed.
NASA Astrophysics Data System (ADS)
Sicard, Emeline; Sabatier, Robert; Niel, Hélène; Cadier, Eric
2002-12-01
The objective of this paper is to implement an original method for spatial and multivariate data, combining a method of three-way array analysis (STATIS) with geostatistical tools. The variables of interest are the monthly amounts of rainfall in the Nordeste region of Brazil, recorded from 1937 to 1975. The principle of the technique is the calculation of a linear combination of the initial variables, containing a large part of the initial variability and taking into account the spatial dependencies. It is a promising method that is able to analyze triple variability: spatial, seasonal, and interannual. In our case, the first component obtained discriminates a group of rain gauges, corresponding approximately to the Agreste, from all the others. The monthly variables of July and August strongly influence this separation. Furthermore, an annual study brings out the stability of the spatial structure of components calculated for each year.
Combination of Thin Lenses--A Computer Oriented Method.
ERIC Educational Resources Information Center
Flerackers, E. L. M.; And Others
1984-01-01
Suggests a method for treating geometric optics that uses a microcomputer to do the calculations of image formation. Calculations are based on the connection between the composition of lenses and the mathematics of fractional linear equations. The logic of the analysis and an example problem are included. (JM)
User's manual for interfacing a leading edge, vortex rollup program with two linear panel methods
NASA Technical Reports Server (NTRS)
Desilva, B. M. E.; Medan, R. T.
1979-01-01
Sufficient instructions are provided for interfacing the Mangler-Smith leading-edge vortex rollup program with a vortex lattice (POTFAN) method and an advanced higher order linear singularity analysis for computing the vortex effects for simple canard-wing combinations.
Global thermal analysis of air-air cooled motor based on thermal network
NASA Astrophysics Data System (ADS)
Hu, Tian; Leng, Xue; Shen, Li; Liu, Haidong
2018-02-01
Air-air cooled motors, with their high efficiency, large starting torque, strong overload capacity, low noise, and small vibration, are widely used across national industry; however, their cooling structure is complex and places high demands on motor thermal management technology. The thermal network method is a common method for calculating the temperature field of a motor; it has the advantages of low computational cost and short solution time, and can save considerable time in the initial design phase of the motor. In this work, a global thermal analysis of an air-air cooled motor and its cooler was carried out using the thermal network method: a combined thermal network model was established, the temperatures of the main internal motor components and the external cooler were calculated and analyzed, and the results were compared with temperature rise tests to verify the correctness of the combined thermal network model. The calculation method satisfies the needs of engineering design and provides a reference for the initial and optimum design of such motors.
[The role of meta-analysis in assessing the treatment of advanced non-small cell lung cancer].
Pérol, M; Pérol, D
2004-02-01
Meta-analysis is a statistical method allowing an evaluation of the direction and quantitative importance of a treatment effect observed in randomized trials which have tested the treatment but have not provided a definitive conclusion. In the present review, we discuss the methodology and the contribution of meta-analyses to the treatment of advanced-stage or metastatic non-small-cell lung cancer. In this area of oncology, meta-analyses have provided determining information demonstrating the impact of chemotherapy on patient survival. They have also helped define a two-drug regimen based on cisplatin as the gold standard treatment for patients with a satisfactory general status. Recently, the meta-analysis method was used to measure the influence of gemcitabine in combination with platinum salts and demonstrated a small but significant benefit in survival, confirming that gemcitabine remains the gold standard treatment in combination with cisplatin.
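As a minimal sketch of how such trial results are combined, the following Python snippet implements DerSimonian-Laird random-effects pooling of per-trial log hazard ratios; the trial values are hypothetical, not those of the cited meta-analyses.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling of per-trial effect estimates
    (e.g., log hazard ratios) via the DerSimonian-Laird estimator."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                      # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)   # Cochran's Q statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-trial variance
    w_star = 1.0 / (variances + tau2)        # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical log hazard ratios and variances from five trials
logHR = [-0.22, -0.10, -0.35, -0.05, -0.18]
var = [0.02, 0.05, 0.04, 0.03, 0.06]
pooled, se, tau2 = dersimonian_laird(logHR, var)
print(f"pooled HR = {np.exp(pooled):.2f}, 95% CI "
      f"({np.exp(pooled - 1.96 * se):.2f}, {np.exp(pooled + 1.96 * se):.2f})")
```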
Use of prior knowledge for the analysis of high-throughput transcriptomics and metabolomics data
2014-01-01
Background. High-throughput omics technologies have enabled the measurement of many genes or metabolites simultaneously. The resulting high-dimensional experimental data poses significant challenges to transcriptomics and metabolomics data analysis methods, which may lead to spurious instead of biologically relevant results. One strategy to improve the results is the incorporation of prior biological knowledge in the analysis. This strategy is used to reduce the solution space and/or to focus the analysis on biologically meaningful regions. In this article, we review a selection of these methods used in transcriptomics and metabolomics. We organize the reviewed methods into three groups based on the underlying mathematical model: exploratory methods, supervised methods and estimation of the covariance matrix. We discuss which prior knowledge has been used, how it is incorporated and how it modifies the mathematical properties of the underlying methods. PMID:25033193
Zhu, Hongbin; Wang, Chunyan; Qi, Yao; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying
2012-11-08
This study presents a novel and rapid method to identify chemical markers for the quality control of Radix Aconiti Preparata, a widely used traditional herbal medicine. In the method, samples prepared with a fast extraction procedure were analyzed using direct analysis in real time mass spectrometry (DART MS) combined with multivariate data analysis. At present, the quality assessment approach for Radix Aconiti Preparata is based on the two processing methods recorded in the Chinese Pharmacopoeia for the purpose of reducing the toxicity of Radix Aconiti and ensuring its clinical therapeutic efficacy. To ensure safety and effectiveness in clinical use, the processing degree of Radix Aconiti should be well controlled and assessed. In this paper, hierarchical cluster analysis and principal component analysis were performed to evaluate the DART MS data of Radix Aconiti Preparata samples at different processing times. The results showed that well-processed Radix Aconiti Preparata, insufficiently processed samples and raw Radix Aconiti could be clustered reasonably according to their constituents. The loading plot shows that the main chemical markers having the most influence on the discrimination between the qualified and unqualified samples were mainly monoester diterpenoid aconitines and diester diterpenoid aconitines, i.e. benzoylmesaconine, hypaconitine, mesaconitine, neoline, benzoylhypaconine, benzoylaconine, fuziline, aconitine and 10-OH-mesaconitine. The established DART MS approach in combination with multivariate data analysis provides a very flexible and reliable method for the quality assessment of toxic herbal medicine. Copyright © 2012 Elsevier B.V. All rights reserved.
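As an illustration of the chemometric step described above, here is a minimal Python sketch that runs PCA and hierarchical cluster analysis on a peak-intensity matrix; the matrix is synthetic and stands in for DART MS fingerprints, and the scikit-learn/SciPy routines are generic, not the authors' workflow.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical peak-intensity matrix: 12 samples x 40 m/z features,
# with three planted groups (e.g., raw / under-processed / well-processed)
rng = np.random.default_rng(0)
raw = rng.random((12, 40)) + np.repeat(np.eye(3), 4, axis=0) @ rng.random((3, 40))

X = (raw - raw.mean(axis=0)) / raw.std(axis=0)   # autoscale each feature

pca = PCA(n_components=2)
scores = pca.fit_transform(X)       # sample scores: grouping visible in PC space
loadings = pca.components_.T        # loading plot: markers driving the separation

Z = linkage(X, method="ward")       # hierarchical cluster analysis
labels = fcluster(Z, t=3, criterion="maxclust")
print(labels, pca.explained_variance_ratio_)
```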
A single-loop optimization method for reliability analysis with second order uncertainty
NASA Astrophysics Data System (ADS)
Xie, Shaojun; Pan, Baisong; Du, Xiaoping
2015-08-01
Reliability analysis may involve random variables and interval variables. In addition, some of the random variables may have interval distribution parameters owing to limited information. This kind of uncertainty is called second order uncertainty. This article develops an efficient reliability method for problems involving the three aforementioned types of uncertain input variables. The analysis produces the maximum and minimum reliability and is computationally demanding because two loops are needed: a reliability analysis loop with respect to random variables and an interval analysis loop for extreme responses with respect to interval variables. The first order reliability method and nonlinear optimization are used for the two loops, respectively. For computational efficiency, the two loops are combined into a single loop by treating the Karush-Kuhn-Tucker (KKT) optimal conditions of the interval analysis as constraints. Three examples are presented to demonstrate the proposed method.
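To make the inner reliability loop concrete, the sketch below computes a first-order reliability index as the minimum distance from the origin to the limit-state surface in standard normal space. The limit-state function is hypothetical, and the single-loop KKT coupling with interval variables described in the abstract is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical limit-state function in standard normal space (u1, u2);
# failure corresponds to g(u) <= 0.
def g(u):
    return 3.0 - u[0] - 0.5 * u[1] ** 2

# FORM: the reliability index beta is the distance from the origin to the
# most probable point on the surface g(u) = 0.
res = minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
               constraints={"type": "eq", "fun": g}, method="SLSQP")
beta = np.sqrt(res.fun)
print(f"beta = {beta:.3f}, Pf approx. {norm.cdf(-beta):.4f}")
```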
Coding Classroom Interactions for Collective and Individual Engagement
ERIC Educational Resources Information Center
Ryu, Suna; Lombardi, Doug
2015-01-01
This article characterizes "engagement in science learning" from a sociocultural perspective and offers a mixed method approach to measuring engagement that combines critical discourse analysis (CDA) and social network analysis (SNA). Conceptualizing engagement from a sociocultural perspective, the article discusses the advantages of a…
Phantom Effects in Multilevel Compositional Analysis: Problems and Solutions
ERIC Educational Resources Information Center
Pokropek, Artur
2015-01-01
This article combines statistical and applied research perspectives, showing problems that might arise when measurement error is ignored in multilevel compositional effects analysis. This article focuses on data where independent variables are constructed measures. Simulation studies are conducted evaluating methods that could overcome the…
Sabir, Aryani; Rafi, Mohamad; Darusman, Latifah K
2017-04-15
HPLC fingerprint analysis combined with chemometrics was developed to discriminate between the red and the white rice bran grown in Indonesia. The major component in rice bran is γ-oryzanol, which consists of 4 main compounds, namely cycloartenol ferulate, cyclobranol ferulate, campesterol ferulate and β-sitosterol ferulate. Separation of these four compounds, along with other compounds, was performed using a C18 column and a methanol-acetonitrile gradient elution system. Using these intensity variations, principal component analysis and discriminant analysis were performed to discriminate the two samples. Discriminant analysis successfully discriminated the red from the white rice bran, and the predictive ability of the model showed satisfactory classification for the test samples. The results of this study indicated that the developed method is suitable as a quality control method for rice bran, in terms of identification and discrimination of the red and the white rice bran. Copyright © 2016 Elsevier Ltd. All rights reserved.
Markiewicz-Keszycka, Maria; Casado-Gavalda, Maria P; Cama-Moncunill, Xavier; Cama-Moncunill, Raquel; Dixit, Yash; Cullen, Patrick J; Sullivan, Carl
2018-04-01
Gluten free (GF) diets are prone to mineral deficiency, so effective monitoring of the elemental composition of GF products is important to ensure a balanced micronutrient diet. The objective of this study was to test the potential of laser-induced breakdown spectroscopy (LIBS) analysis combined with chemometrics for at-line monitoring of the ash, potassium and magnesium content of GF flours: tapioca, potato, maize, buckwheat, brown rice and a GF flour mixture. Concentrations of ash, potassium and magnesium were determined with reference methods and LIBS. PCA was performed and showed the potential for discriminating the six GF flours. For the quantification analysis, PLSR models were developed; calibration R² values were 0.99 for magnesium and potassium and 0.97 for ash. The study revealed that LIBS combined with chemometrics is a convenient method for quantifying concentrations of ash, potassium and magnesium, and shows potential for classifying different types of flours. Copyright © 2017 Elsevier Ltd. All rights reserved.
Zang, Guiyan; Tejasvi, Sharma; Ratner, Albert; Lora, Electo Silva
2018-05-01
The Biomass Integrated Gasification Combined Cycle (BIGCC) power system is believed to be a potentially highly efficient way to utilize biomass to generate power. However, there is no comparative study of BIGCC systems that examines all the latest improvements in gasification agents, gas turbine combustion methods, and CO2 capture and storage options. This study examines the impact of recent advancements on BIGCC performance through exergy analysis using Aspen Plus. Results show that the exergy efficiency of these systems ranges from 22.3% to 37.1%. Furthermore, exergy analysis indicates that the gas turbine with external combustion has relatively high exergy efficiency, and the Selexol CO2 removal method has low exergy destruction. Moreover, the sensitivity analysis shows that the system exergy efficiency is more sensitive to the initial temperature and pressure ratio of the gas turbine, whereas it depends relatively weakly on the initial temperature and initial pressure of the steam turbine. Copyright © 2018 Elsevier Ltd. All rights reserved.
Oh, Se Yeon; Shin, Hyun Du; Kim, Sung Jean; Hong, Jongki
2008-03-07
A novel analytical method using fast gas chromatography combined with a surface acoustic wave sensor (GC/SAW) has been developed for the detection of volatile aroma compounds emanating from lilac blossom (Syringa species: Syringa vulgaris variginata and Syringa dilatata). GC/SAW detected and quantified the various fragrances emitted from lilac blossom, providing fragrance pattern analysis results. The fragrance pattern analysis could easily characterize the delicate differences in aromas caused by substantial differences in chemical composition according to the different colors and shapes of petals. Moreover, method validation of GC/SAW was performed for the analysis of the actual volatile floral aroma, achieving high reproducibility and excellent sensitivity. From the validation results, GC/SAW could serve as an alternative analytical technique for the analysis of the actual volatile floral aroma of lilac. In addition, headspace solid-phase microextraction (HS-SPME) GC-MS was employed to further confirm the identification of the fragrances emitted from lilac blossom, and the results were compared with GC/SAW.
The tolerance of the femoral shaft in combined axial compression and bending loading.
Ivarsson, B Johan; Genovese, Daniel; Crandall, Jeff R; Bolton, James R; Untaroiu, Costin D; Bose, Dipan
2009-11-01
The likelihood of a front seat occupant sustaining a femoral shaft fracture in a frontal crash has traditionally been assessed by an injury criterion relying solely on the axial force in the femur. However, recently published analyses of real world data indicate that femoral shaft fracture occurs at axial load levels below those found experimentally. One hypothesis attempting to explain this discrepancy suggests that femoral shaft fracture tends to occur as a result of combined axial compression and applied bending. The current study aims to evaluate this hypothesis by investigating how these two loading components interact. Femoral shafts harvested from human cadavers were loaded to failure in axial compression, sagittal plane bending, and combined axial compression and sagittal plane bending. All specimens subjected to bending and combined loading fractured midshaft, whereas the specimens loaded in axial compression demonstrated a variety of failure locations including midshaft and distal end. The interaction between the recorded levels of applied moment and axial compression force at fracture was evaluated using two different analysis methods: fitting of an analytical model to the experimental data and multiple regression analysis. The two analysis methods yielded very similar relationships between applied moment and axial compression force at midshaft fracture. The results indicate that posteroanterior bending reduces the tolerance of the femoral shaft to axial compression and that this type of combined loading may therefore contribute to the high prevalence of femoral shaft fracture in frontal crashes.
Chen, Yunbo; Zhang, Guijuan; Chen, Xiaoping; Jiang, Xuefeng; Yuan, Naijun; Wang, Yurong; Hao, Xiaoqian
2018-01-01
Objective. To investigate the effects of Jianpi Bushen (JPBS), a traditional Chinese medicine that is used to invigorate the spleen and tonify the kidney, combined with chemotherapy for the treatment of gastric cancer. Methods. Literature retrieval was performed in PubMed, EMBASE, Cochrane Library, MEDLINE, CNKI, Wanfang Data Information Site, and VIP from inception to October 2017. Randomized controlled trials to evaluate the effects of JPBS combined with chemotherapy were identified. The primary reported outcomes were KPS (Karnofsky Performance Status), clinical curative efficiency, immune function, blood system, and nonhematologic system. Review Manager 5.3 (RevMan 5.3) was used for data analysis, and the quality of the studies was also appraised. Results. A total of 26 studies were included with 3098 individuals. The results of the meta-analysis indicated that treatment of gastric cancer with the combination of JPBS and chemotherapy resulted in better outcomes compared to chemotherapy alone. Conclusion. Evidence from the meta-analysis suggested that JPBS combined with chemotherapy has a positive effect on gastric cancer treatment. However, additional rigorously designed and large sample randomized controlled trials are required to confirm the efficacy and safety of this treatment. PMID:29675052
2009-01-01
Background. A central task in contemporary biosciences is the identification of biological processes showing response in genome-wide differential gene expression experiments. Two types of analysis are common. Either one generates an ordered list based on the differential expression values of the probed genes and examines the tail areas of the list for over-representation of various functional classes. Alternatively, one monitors the average differential expression level of genes belonging to a given functional class. So far these two types of method have not been combined. Results. We introduce a scoring function, Gene Set Z-score (GSZ), for the analysis of functional class over-representation that combines the two previous analysis methods. GSZ encompasses popular functions such as correlation, the hypergeometric test, Max-Mean and Random Sets as limiting cases. GSZ is stable against changes in class size as well as across different positions of the analysed gene list in tests with randomized data. GSZ shows the best overall performance in a detailed comparison to popular functions using artificial data. Likewise, GSZ stands out in a cross-validation of methods using split real data. A comparison of empirical p-values further shows a strong difference in favour of GSZ, which clearly reports better p-values for top classes than the other methods. Furthermore, GSZ detects relevant biological themes that are missed by the other methods. These observations also hold when comparing GSZ with popular program packages. Conclusion. GSZ and improved versions of earlier methods are a useful contribution to the analysis of differential gene expression. The methods and supplementary material are available from the website http://ekhidna.biocenter.helsinki.fi/users/petri/public/GSZ/GSZscore.html. PMID:19775443
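The Random Sets limiting case mentioned above can be sketched in a few lines of Python: standardize a gene set's mean score against the exact mean and variance of a same-size random set drawn without replacement. This is a simplified illustration, not the published GSZ function.

```python
import numpy as np

def random_set_zscore(gene_scores, member_idx):
    """Standardize the mean differential-expression score of a gene set
    against the exact mean/variance of a same-size random set."""
    x = np.asarray(gene_scores, dtype=float)
    m, N = len(member_idx), len(x)
    set_mean = x[member_idx].mean()
    mu = x.mean()                               # expected mean of a random m-set
    # variance of the mean of m values drawn without replacement from N
    var = x.var(ddof=0) / m * (N - m) / (N - 1)
    return (set_mean - mu) / np.sqrt(var)

rng = np.random.default_rng(1)
scores = rng.normal(size=5000)                  # e.g., per-gene t-statistics
myset = rng.choice(5000, size=50, replace=False)
scores[myset] += 0.5                            # inject coordinated up-regulation
print(f"z = {random_set_zscore(scores, myset):.2f}")
```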
Horacek, Micha; Hansel-Hohl, Karin; Burg, Kornel; Soja, Gerhard; Okello-Anyanga, Walter; Fluch, Silvia
2015-01-01
The indication of origin of sesame seeds and sesame oil is one of the important factors influencing its price, as it is produced in many regions worldwide and certain provenances are especially sought after. We joined stable carbon and hydrogen isotope analysis with DNA based molecular marker analysis to study their combined potential for the discrimination of different origins of sesame seeds. For the stable carbon and hydrogen isotope data a positive correlation between both isotope parameters was observed, indicating a dominant combined influence of climate and water availability. This enabled discrimination between sesame samples from tropical and subtropical/moderate climatic provenances. Carbon isotope values also showed differences between oil from black and white sesame seeds from identical locations, indicating higher water use efficiency of plants producing black seeds. DNA based markers gave independent evidence for geographic variation as well as provided information on the genetic relatedness of the investigated samples. Depending on the differences in ambient environmental conditions and in the genotypic fingerprint, a combination of both analytical methods is a very powerful tool to assess the declared geographic origin. To our knowledge this is the first paper on food authenticity combining the stable isotope analysis of bio-elements with DNA based markers and their combined statistical analysis. PMID:25831054
Zhu, Hongbin; Wang, Chunyan; Qi, Yao; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying
2013-01-15
A fingerprinting approach was developed by means of UPLC-ESI/MS(n) (ultra-performance liquid chromatography-electrospray ionization/mass spectrometry) for the quality control of processed Radix Aconiti, a widely used toxic traditional herbal medicine. The present fingerprinting approach was based on the two processing methods recorded in the Chinese Pharmacopoeia for the purpose of reducing toxicity and ensuring clinical therapeutic efficacy. Similarity evaluation, hierarchical cluster analysis and principal component analysis were performed to evaluate the similarity and variation of the samples. The results showed that the well-processed, insufficiently processed and raw Radix Aconiti could be clustered reasonably according to the contents of their constituents. The loading plot shows that the main chemical markers having the most influence on the discrimination between the qualified and unqualified samples were mainly monoester diterpenoid aconitines and diester diterpenoid aconitines. Finally, the UPLC-UV and UPLC-ESI/MS(n) characteristic fingerprints were established from the well-processed and purchased qualified samples. At the same time, a complementary quantification method for six aconitine-type alkaloids was developed using UPLC-UV and UPLC-ESI/MS. The average recovery of the monoester diterpenoid aconitines was 95.4-99.1% and the average recovery of the diester diterpenoid aconitines was 103-112%. The proposed combined quantification method by UPLC-UV and UPLC-ESI/MS allows samples to be analyzed over a wide concentration range. Therefore, the established fingerprinting approach in combination with chemometric analysis provides a flexible and reliable method for the quality assessment of toxic herbal medicine. Copyright © 2012 Elsevier B.V. All rights reserved.
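Similarity evaluation of fingerprints, as used above, is commonly implemented as a cosine (congruence) coefficient between peak-intensity vectors. A minimal sketch with hypothetical peak areas:

```python
import numpy as np

def congruence(a, b):
    """Cosine similarity between two chromatographic fingerprints,
    a common 'similarity evaluation' measure in herbal quality control."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical peak-area vectors aligned on a common retention-time grid
reference = np.array([1.0, 0.8, 0.1, 0.4, 0.05, 0.3])
sample    = np.array([0.9, 0.85, 0.12, 0.35, 0.06, 0.28])
print(f"similarity = {congruence(reference, sample):.3f}")  # near 1 = qualified
```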
Combining multiple tools outperforms individual methods in gene set enrichment analyses.
Alhamdoosh, Monther; Ng, Milica; Wilson, Nicholas J; Sheridan, Julie M; Huynh, Huy; Wilson, Michael J; Ritchie, Matthew E
2017-02-01
Gene set enrichment (GSE) analysis allows researchers to efficiently extract biological insight from long lists of differentially expressed genes by interrogating them at a systems level. In recent years, there has been a proliferation of GSE analysis methods and hence it has become increasingly difficult for researchers to select an optimal GSE tool based on their particular dataset. Moreover, the majority of GSE analysis methods do not allow researchers to simultaneously compare gene set level results between multiple experimental conditions. The ensemble of gene set enrichment analyses (EGSEA) is a method developed for RNA-sequencing data that combines results from twelve algorithms and calculates collective gene set scores to improve the biological relevance of the highest ranked gene sets. EGSEA's gene set database contains around 25 000 gene sets from sixteen collections. It has multiple visualization capabilities that allow researchers to view gene sets at various levels of granularity. EGSEA has been tested on simulated data and on a number of human and mouse datasets and, based on biologists' feedback, consistently outperforms the individual tools that have been combined. Our evaluation demonstrates the superiority of the ensemble approach for GSE analysis, and its utility to effectively and efficiently extrapolate biological functions and potential involvement in disease processes from lists of differentially regulated genes. EGSEA is available as an R package at http://www.bioconductor.org/packages/EGSEA/ . The gene sets collections are available in the R package EGSEAdata from http://www.bioconductor.org/packages/EGSEAdata/ . monther.alhamdoosh@csl.com.au mritchie@wehi.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
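EGSEA itself is an R/Bioconductor package; as a language-agnostic illustration of the ensemble idea, the sketch below aggregates gene-set rankings from several methods by average rank. The p-values are hypothetical and the aggregation rule is a simplification of EGSEA's collective scoring.

```python
import numpy as np

def ensemble_rank(pvalue_matrix):
    """Combine gene-set rankings from several GSE methods by average rank.
    pvalue_matrix: rows = gene sets, columns = methods."""
    P = np.asarray(pvalue_matrix, dtype=float)
    # rank each method's p-values per column (0 = most significant)
    ranks = np.argsort(np.argsort(P, axis=0), axis=0)
    avg = ranks.mean(axis=1)
    return np.argsort(avg)        # gene sets ordered by ensemble consensus

# Hypothetical p-values for 5 gene sets from 3 methods
P = [[0.001, 0.02, 0.005],
     [0.200, 0.15, 0.300],
     [0.010, 0.01, 0.020],
     [0.500, 0.60, 0.450],
     [0.050, 0.30, 0.100]]
print(ensemble_rank(P))           # -> [0 2 4 1 3]
```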
Stress analysis method for clearance-fit joints with bearing-bypass loads
NASA Technical Reports Server (NTRS)
Naik, R. A.; Crews, J. H., Jr.
1989-01-01
Within a multi-fastener joint, fastener holes may be subjected to the combined effects of bearing loads and loads that bypass the hole to be reacted elsewhere in the joint. The analysis of a joint subjected to such combined bearing and bypass loads is complicated by the usual clearance between the hole and the fastener. A simple analysis method for such clearance-fit joints subjected to bearing-bypass loading has been developed in the present study. It uses an inverse formulation with a linear elastic finite-element analysis. Conditions along the bolt-hole contact arc are specified by displacement constraint equations. The present method is simple to apply and can be implemented with most general purpose finite-element programs since it does not use complicated iterative-incremental procedures. The method was used to study the effects of bearing-bypass loading on bolt-hole contact angles and local stresses. In this study, a rigid, frictionless bolt was used with a plate having the properties of a quasi-isotropic graphite/epoxy laminate. Results showed that the contact angle as well as the peak stresses around the hole and their locations were strongly influenced by the ratio of bearing and bypass loads. For single contact, tension and compression bearing-bypass loading had opposite effects on the contact angle. For some compressive bearing-bypass loads, the hole tended to close on the fastener leading to dual contact. It was shown that dual contact reduces the stress concentration at the fastener and would, therefore, increase joint strength in compression. The results illustrate the general importance of accounting for bolt-hole clearance and contact to accurately compute local bolt-hole stresses under combined bearing and bypass loading.
Context-Aware Adaptive Hybrid Semantic Relatedness in Biomedical Science
NASA Astrophysics Data System (ADS)
Emadzadeh, Ehsan
Text mining of biomedical literature and clinical notes is a very active field of research in biomedical science. Semantic analysis is one of the core modules of different Natural Language Processing (NLP) solutions. Methods for calculating the semantic relatedness of two concepts can be very useful in solutions to different problems such as relationship extraction, ontology creation and question answering [1--6]. Several techniques exist for calculating the semantic relatedness of two concepts. These techniques utilize different knowledge sources and corpora. So far, researchers have attempted to find the best hybrid method for each domain by combining semantic relatedness techniques and data sources manually. In this work, attempts were made to eliminate the need to manually combine semantic relatedness methods for each new context or resource by proposing an automated method that attempts to find the best combination of semantic relatedness techniques and resources to achieve the best semantic relatedness score in every context. This may help the research community find the best hybrid method for each context, considering the available algorithms and resources.
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2004-03-23
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
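A minimal numerical sketch of the augmentation idea: fit a classical least squares (CLS) model with the calibrated pure-component shapes, then re-fit with the shape matrix augmented by a spectral shape (here, a synthetic baseline drift) that was absent from calibration. This illustrates only the shape-augmentation step, not the full patented hybrid CLS/inverse-method procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
wl = np.linspace(0, 1, 200)                       # wavelength axis
k1 = np.exp(-((wl - 0.3) / 0.05) ** 2)            # pure-component spectrum 1
k2 = np.exp(-((wl - 0.6) / 0.08) ** 2)            # pure-component spectrum 2
K = np.vstack([k1, k2])                           # calibrated spectral shapes

drift = wl                                        # un-calibrated effect (baseline drift)
sample = 0.7 * k1 + 0.2 * k2 + 0.5 * drift + rng.normal(0, 0.01, wl.size)

# Classical least squares with only the calibrated shapes: biased by the drift
c_cls, *_ = np.linalg.lstsq(K.T, sample, rcond=None)

# Hybrid step: augment the shape matrix with the drift spectrum, then re-fit
K_aug = np.vstack([K, drift])
c_hyb, *_ = np.linalg.lstsq(K_aug.T, sample, rcond=None)

print("plain CLS :", np.round(c_cls, 3))          # distorted concentrations
print("augmented :", np.round(c_hyb[:2], 3))      # close to (0.7, 0.2)
```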
Duemichen, E; Braun, U; Senz, R; Fabian, G; Sturm, H
2014-08-08
For the analysis of gaseous thermal decomposition products of polymers, the common techniques are thermogravimetry combined with Fourier transform infrared spectroscopy (TGA-FTIR) and with mass spectrometry (TGA-MS). These methods offer a simple approach to the decomposition mechanism, especially for small decomposition molecules. Complex spectra of gaseous mixtures are often hard to identify because of overlapping signals. In this paper a new method is described in which the decomposition products are adsorbed under controlled conditions in TGA onto solid-phase extraction (SPE) material: twisters. Subsequently the twisters were analysed by thermal desorption gas chromatography mass spectrometry (TDS-GC-MS), which allows the decomposition products to be separated and identified using an MS library. The thermoplastics polyamide 66 (PA 66) and polybutylene terephthalate (PBT) were used as example polymers. The influence of the sample mass and of the purge gas flow during the decomposition process was investigated in TGA. The advantages and limitations of the method are presented in comparison to the common analysis techniques, TGA-FTIR and TGA-MS. Copyright © 2014 Elsevier B.V. All rights reserved.
Estimating carnivoran diets using a combination of carcass observations and scats from GPS clusters
Tambling, C.J.; Laurence, S.D.; Bellan, S.E.; Cameron, E.Z.; du Toit, J.T.; Getz, W.M.
2011-01-01
Scat analysis is one of the most frequently used methods to assess carnivoran diets and Global Positioning System (GPS) cluster methods are increasingly being used to locate feeding sites for large carnivorans. However, both methods have inherent biases that limit their use. GPS methods to locate kill sites are biased towards large carcasses, while scat analysis over-estimates the biomass consumed from smaller prey. We combined carcass observations and scats collected along known movement routes, assessed using GPS data from four African lion (Panthera leo) prides in the Kruger National Park, South Africa, to determine how a combination of these two datasets change diet estimates. As expected, using carcasses alone under-estimated the number of feeding events on small species, primarily impala (Aepyceros melampus) and warthog (Phacochoerus africanus), in our case by more than 50% and thus significantly under-estimated the biomass consumed per pride per day in comparison to when the diet was assessed using carcass observations alone. We show that an approach that supplements carcass observations with scats that enables the identification of potentially missed feeding events increases the estimates of food intake rates for large carnivorans, with possible ramifications for predator-prey interaction studies dealing with biomass intake rate. PMID:22408290
Waskitho, Dri; Lukitaningsih, Endang; Sudjadi; Rohman, Abdul
2016-01-01
Analysis of lard extracted from a lipstick formulation containing castor oil has been performed using an FTIR spectroscopic method combined with multivariate calibration. Three different extraction methods were compared: saponification followed by liquid/liquid extraction with hexane/dichloromethane/ethanol/water, saponification followed by liquid/liquid extraction with dichloromethane/ethanol/water, and the Bligh & Dyer method using chloroform/methanol/water as the extracting solvent. Qualitative and quantitative analysis of lard were performed using principal component analysis (PCA) and partial least squares (PLS) analysis, respectively. The results showed that, for all samples prepared by the three extraction methods, PCA was capable of identifying lard in the wavenumber region of 1200-800 cm⁻¹, with the best result obtained by the Bligh & Dyer method. Furthermore, PLS analysis in the same region used for qualification showed that Bligh & Dyer was the most suitable extraction method, giving the highest determination coefficient (R²) and the lowest root mean square error of calibration (RMSEC) and root mean square error of prediction (RMSEP) values.
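A minimal sketch of the PLS calibration step with scikit-learn, on synthetic spectra where one band tracks the analyte concentration; RMSEP-style figures of merit are computed on a held-out split. This is generic chemometrics, not the authors' calibration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Hypothetical FTIR absorbance matrix (samples x wavenumbers) where the
# analyte contributes one spectral band whose height tracks concentration
rng = np.random.default_rng(8)
wn = np.linspace(800, 1200, 150)
pure = np.exp(-((wn - 1000) / 30) ** 2)
conc = rng.uniform(0, 100, 60)                    # hypothetical % lard
X = np.outer(conc, pure) + rng.normal(0, 0.5, (60, 150))
Xtr, Xte, ytr, yte = train_test_split(X, conc, random_state=0)

pls = PLSRegression(n_components=3).fit(Xtr, ytr)
pred = pls.predict(Xte).ravel()
rmsep = np.sqrt(np.mean((pred - yte) ** 2))
print(f"R2 = {pls.score(Xte, yte):.3f}, RMSEP = {rmsep:.2f}")
```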
Weighted combination of LOD values split into frequency windows
NASA Astrophysics Data System (ADS)
Fernandez, L. I.; Gambis, D.; Arias, E. F.
In this analysis a one-day combined time series of LOD (length-of-day) estimates is presented. We use individual data series derived by 7 GPS and 3 SLR analysis centers that routinely contributed to the IERS database over a recent 27-month period (Jul 1996 - Oct 1998). The result is compared to the multi-technique combined series C04 produced by the Central Bureau of the IERS, which is commonly used as a reference for the study of Earth rotation variations. The Frequency Windows Combined Series procedure yields a time series that is close to C04 but shows an amplitude difference that might explain the periodic behavior evident in the differences between these two combined series. This method could be useful for generating a new time series to serve as a reference in studies of high-frequency variations of the Earth's rotation.
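A sketch of the frequency-windows idea in Python: split each individual LOD series into frequency windows with FFT masks, then recombine with per-window weights. The series, windows, and weights below are hypothetical; the actual procedure derives its weights from comparisons against a reference series.

```python
import numpy as np

def band_split(x, fmin, fmax, dt=1.0):
    """Return the component of series x within [fmin, fmax) cycles/day."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), d=dt)
    mask = (f >= fmin) & (f < fmax)
    return np.fft.irfft(X * mask, n=len(x))

def combine(series, weights, windows, dt=1.0):
    """Weighted combination of several LOD series, window by window.
    weights[k][w] is the weight of series k in frequency window w."""
    out = np.zeros_like(series[0])
    for w, (lo, hi) in enumerate(windows):
        parts = [band_split(s, lo, hi, dt) for s in series]
        wts = np.array([weights[k][w] for k in range(len(series))])
        wts = wts / wts.sum()
        out += sum(wt * p for wt, p in zip(wts, parts))
    return out

# Two hypothetical daily LOD series sharing a signal, with different noise
t = np.arange(512.0)
signal = 0.3 * np.sin(2 * np.pi * t / 13.66)        # fortnightly tidal term
s1 = signal + np.random.default_rng(3).normal(0, 0.05, t.size)
s2 = signal + np.random.default_rng(4).normal(0, 0.10, t.size)
windows = [(0.0, 0.02), (0.02, 0.5)]                # long-period vs. sub-monthly
weights = [[0.5, 0.8], [0.5, 0.2]]                  # hypothetical per-window weights
lod = combine([s1, s2], weights, windows)
print(f"residual rms vs. signal: {np.std(lod - signal):.3f}")
```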
Data Mining for Anomaly Detection
NASA Technical Reports Server (NTRS)
Biswas, Gautam; Mack, Daniel; Mylaraswamy, Dinkar; Bharadwaj, Raj
2013-01-01
The Vehicle Integrated Prognostics Reasoner (VIPR) program describes methods for enhanced diagnostics as well as a prognostic extension to the current state-of-the-art Aircraft Diagnostic and Maintenance System (ADMS). VIPR introduced a new anomaly detection function for discovering previously undetected and undocumented situations, where there are clear deviations from nominal behavior. Once a baseline (nominal model of operations) is established, the detection and analysis is split between on-aircraft outlier generation and off-aircraft expert analysis to characterize and classify events that may not have been anticipated by individual system providers. Offline expert analysis is supported by data curation and data mining algorithms that can be applied in the contexts of supervised and unsupervised learning. In this report, we discuss efficient methods to implement the Kolmogorov complexity measure using compression algorithms, and run a systematic empirical analysis to determine the best compression measure. Our experiments established that the combination of the DZIP compression algorithm and the CiDM distance measure provides the best results for capturing relevant properties of time series data encountered in aircraft operations. This combination was used as the basis for developing an unsupervised learning algorithm to define "nominal" flight segments using historical flight segments.
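The report's DZIP/CiDM combination is not publicly standard, but the compression-based idea can be sketched with the widely used Normalized Compression Distance and zlib; the flight-segment strings below are invented for illustration.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: a computable proxy for the
    (uncomputable) Kolmogorov-complexity based distance."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"climb cruise cruise descend approach land" * 20
b_ = b"climb cruise cruise descend go-around climb" * 20
c = b"taxi takeoff stall warning recovery descent" * 20
print(f"d(a,b) = {ncd(a, b_):.3f}  d(a,c) = {ncd(a, c):.3f}")  # similar pair scores lower
```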
Zhang, Jiang; Liu, Qi; Chen, Huafu; Yuan, Zhen; Huang, Jin; Deng, Lihua; Lu, Fengmei; Zhang, Junpeng; Wang, Yuqing; Wang, Mingwen; Chen, Liangyin
2015-01-01
Clustering analysis methods have been widely applied to identify the functional brain networks of multitask paradigms. However, previously used clustering analysis techniques are computationally expensive and thus impractical for clinical applications. In this study a novel method, called SOM-SAPC, that combines self-organizing mapping (SOM) and supervised affinity propagation clustering (SAPC), is proposed and implemented to identify the motor execution (ME) and motor imagery (MI) networks. In SOM-SAPC, SOM is first performed to process the fMRI data and SAPC is then utilized to cluster the patterns of functional networks. As a result, SOM-SAPC significantly reduces the computational cost of brain network analysis. Simulation and clinical tests involving ME and MI were conducted based on SOM-SAPC, and the analysis results indicated that functional brain networks were clearly identified with different response patterns and reduced computational cost. In particular, three activation clusters were clearly revealed, which include parts of the visual, ME and MI functional networks. These findings validate that SOM-SAPC is an effective and robust method for analyzing multitask fMRI data.
NASA Astrophysics Data System (ADS)
Ganiev, R. F.; Reviznikov, D. L.; Rogoza, A. N.; Slastushenskiy, Yu. V.; Ukrainskiy, L. E.
2017-03-01
A description of a complex approach to investigation of nonlinear wave processes in the human cardiovascular system based on a combination of high-precision methods of measuring a pulse wave, mathematical methods of processing the empirical data, and methods of direct numerical modeling of hemodynamic processes in an arterial tree is given.
Hamad, Eradah O; Savundranayagam, Marie Y; Holmes, Jeffrey D; Kinsella, Elizabeth Anne
2016-01-01
Background. Twitter’s 140-character microblog posts are increasingly used to access information and facilitate discussions among health care professionals and between patients with chronic conditions and their caregivers. Recently, efforts have emerged to investigate the content of health care-related posts on Twitter. This marks a new area for researchers to investigate and apply content analysis (CA). In current infodemiology, infoveillance and digital disease detection research initiatives, quantitative and qualitative Twitter data are often combined, and there are no clear guidelines for researchers to follow when collecting and evaluating Twitter-driven content. Objective. The aim of this study was to identify studies on health care and social media that used Twitter feeds as a primary data source and CA as an analysis technique. We evaluated the resulting 18 studies based on a narrative review of previous methodological studies and textbooks to determine the criteria and main features of quantitative and qualitative CA. We then used the key features of CA and mixed-methods research designs to propose the combined content-analysis (CCA) model as a solid research framework for designing, conducting, and evaluating investigations of Twitter-driven content. Methods. We conducted a PubMed search to collect studies published between 2010 and 2014 that used CA to analyze health care-related tweets. The PubMed search and reference list checks of selected papers identified 21 papers. We excluded 3 papers and further analyzed 18. Results. Results suggest that the methods used in these studies were not purely quantitative or qualitative, and the mixed-methods design was not explicitly chosen for data collection and analysis. A solid research framework is needed for researchers who intend to analyze Twitter data through the use of CA. Conclusions. We propose the CCA model as a useful framework that provides a straightforward approach to guide Twitter-driven studies and that adds rigor to health care social media investigations. We provide suggestions for the use of the CCA model in elder care-related contexts. PMID:26957477
The integrative review: updated methodology.
Whittemore, Robin; Knafl, Kathleen
2005-12-01
The aim of this paper is to distinguish the integrative review method from other review methods and to propose methodological strategies specific to the integrative review method to enhance the rigour of the process. Recent evidence-based practice initiatives have increased the need for and the production of all types of reviews of the literature (integrative reviews, systematic reviews, meta-analyses, and qualitative reviews). The integrative review method is the only approach that allows for the combination of diverse methodologies (for example, experimental and non-experimental research), and has the potential to play a greater role in evidence-based practice for nursing. With respect to the integrative review method, strategies to enhance data collection and extraction have been developed; however, methods of analysis, synthesis, and conclusion drawing remain poorly formulated. A modified framework for research reviews is presented to address issues specific to the integrative review method. Issues related to specifying the review purpose, searching the literature, evaluating data from primary sources, analysing data, and presenting the results are discussed. Data analysis methods of qualitative research are proposed as strategies that enhance the rigour of combining diverse methodologies as well as empirical and theoretical sources in an integrative review. An updated integrative review method has the potential to allow for diverse primary research methods to become a greater part of evidence-based practice initiatives.
Darwish, Hany W.; Bakheit, Ahmed H.; Naguib, Ibrahim A.
2016-01-01
This paper presents novel methods for spectrophotometric determination of ascorbic acid (AA) in presence of rutin (RU) (coformulated drug) in their combined pharmaceutical formulation. The seven methods are ratio difference (RD), isoabsorptive_RD (Iso_RD), amplitude summation (A_Sum), isoabsorptive point, first derivative of the ratio spectra (1DD), mean centering (MCN), and ratio subtraction (RS). On the other hand, RU was determined directly by measuring the absorbance at 358 nm in addition to the two novel Iso_RD and A_Sum methods. The work introduced in this paper aims to compare these different methods, showing the advantages for each and making a comparison of analysis results. The calibration curve is linear over the concentration range of 4–50 μg/mL for AA and RU. The results show the high performance of proposed methods for the analysis of the binary mixture. The optimum assay conditions were established and the proposed methods were successfully applied for the assay of the two drugs in laboratory prepared mixtures and combined pharmaceutical tablets with excellent recoveries. No interference was observed from common pharmaceutical additives. PMID:26885440
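As a minimal illustration of one of these methods, the sketch below applies the first derivative of the ratio spectra (1DD) to a synthetic two-component mixture: dividing by the interferent's spectrum turns its contribution into a constant, which the derivative removes. The Gaussian band shapes and concentrations are hypothetical.

```python
import numpy as np

wl = np.linspace(220, 420, 401)                     # wavelength grid, nm
band = lambda c, w: np.exp(-((wl - c) / w) ** 2)
eps_AA = band(265, 18)                              # hypothetical AA spectrum
eps_RU = band(358, 25) + 0.4 * band(260, 15)        # hypothetical RU spectrum

mixture = 12.0 * eps_AA + 8.0 * eps_RU              # Beer's law, unit path length

ratio = mixture / eps_RU                            # divisor: RU spectrum
d_ratio = np.gradient(ratio, wl)                    # 1DD: the constant RU term vanishes

win = (wl > 240) & (wl < 300)                       # read amplitude near the AA band
amp = d_ratio[win][np.argmax(np.abs(d_ratio[win]))]
print(f"1DD amplitude {amp:.2f} is proportional to the AA concentration")
```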
Decomposing the Apoptosis Pathway Into Biologically Interpretable Principal Components
Wang, Min; Kornblau, Steven M; Coombes, Kevin R
2018-01-01
Principal component analysis (PCA) is one of the most common techniques in the analysis of biological data sets, but applying PCA raises 2 challenges. First, one must determine the number of significant principal components (PCs). Second, because each PC is a linear combination of genes, it rarely has a biological interpretation. Existing methods to determine the number of PCs are either subjective or computationally intensive. We review several methods and describe a new R package, PCDimension, that implements additional methods, the most important being an algorithm that extends and automates a graphical Bayesian method. Using simulations, we compared the methods. Our newly automated procedure is competitive with the best methods when considering both accuracy and speed and is the most accurate when the number of objects is small compared with the number of attributes. We applied the method to a proteomics data set from patients with acute myeloid leukemia. Proteins in the apoptosis pathway could be explained using 6 PCs. By clustering the proteins in PC space, we were able to replace the PCs by 6 “biological components,” 3 of which could be immediately interpreted from the current literature. We expect this approach combining PCA with clustering to be widely applicable. PMID:29881252
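The package's Bayesian procedure is not reproduced here; as a sketch of the same decision, a widely used alternative, Horn's parallel analysis, keeps the PCs whose eigenvalues exceed those of permuted (null) data:

```python
import numpy as np

def parallel_analysis(X, n_perm=200, seed=0):
    """Horn's parallel analysis: keep PCs whose eigenvalues exceed the
    95th percentile of eigenvalues from column-permuted (null) data."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)
    real = np.linalg.svd(Xc, compute_uv=False) ** 2
    null = np.empty((n_perm, len(real)))
    for i in range(n_perm):
        Xp = np.column_stack([rng.permutation(col) for col in Xc.T])
        null[i] = np.linalg.svd(Xp, compute_uv=False) ** 2
    thresh = np.percentile(null, 95, axis=0)
    return int(np.sum(real > thresh))

rng = np.random.default_rng(1)
# 40 samples x 30 proteins with 2 planted components plus noise
scores = rng.normal(size=(40, 2))
load = rng.normal(size=(2, 30))
X = scores @ load + 0.5 * rng.normal(size=(40, 30))
print("significant PCs:", parallel_analysis(X))    # expect 2
```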
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandford, M.T. II; Bradley, J.N.; Handel, T.G.
Data embedding is a new steganographic method for combining digital information sets. This paper describes the data embedding method and gives examples of its application using software written in the C-programming language. Sandford and Handel produced a computer program (BMPEMBED, Ver. 1.51 written for IBM PC/AT or compatible, MS/DOS Ver. 3.3 or later) that implements data embedding in an application for digital imagery. Information is embedded into, and extracted from, Truecolor or color-pallet images in Microsoft® bitmap (.BMP) format. Hiding data in the noise component of a host, by means of an algorithm that modifies or replaces the noise bits, is termed 'steganography.' Data embedding differs markedly from conventional steganography, because it uses the noise component of the host to insert information with few or no modifications to the host data values or their statistical properties. Consequently, the entropy of the host data is affected little by using data embedding to add information. The data embedding method applies to host data compressed with transform, or 'lossy' compression algorithms, as for example ones based on discrete cosine transform and wavelet functions. Analysis of the host noise generates a key required for embedding and extracting the auxiliary data from the combined data. The key is stored easily in the combined data. Images without the key cannot be processed to extract the embedded information. To provide security for the embedded data, one can remove the key from the combined data and manage it separately. The image key can be encrypted and stored in the combined data or transmitted separately as a ciphertext much smaller in size than the embedded data. The key size is typically ten to one-hundred bytes, and it is derived from the original host data by an analysis algorithm.
NASA Astrophysics Data System (ADS)
Sandford, Maxwell T., II; Bradley, Jonathan N.; Handel, Theodore G.
1996-01-01
Data embedding is a new steganographic method for combining digital information sets. This paper describes the data embedding method and gives examples of its application using software written in the C-programming language. Sandford and Handel produced a computer program (BMPEMBED, Ver. 1.51 written for IBM PC/AT or compatible, MS/DOS Ver. 3.3 or later) that implements data embedding in an application for digital imagery. Information is embedded into, and extracted from, Truecolor or color-pallet images in Microsoft™ bitmap (.BMP) format. Hiding data in the noise component of a host, by means of an algorithm that modifies or replaces the noise bits, is termed 'steganography.' Data embedding differs markedly from conventional steganography, because it uses the noise component of the host to insert information with few or no modifications to the host data values or their statistical properties. Consequently, the entropy of the host data is affected little by using data embedding to add information. The data embedding method applies to host data compressed with transform, or 'lossy' compression algorithms, as for example ones based on discrete cosine transform and wavelet functions. Analysis of the host noise generates a key required for embedding and extracting the auxiliary data from the combined data. The key is stored easily in the combined data. Images without the key cannot be processed to extract the embedded information. To provide security for the embedded data, one can remove the key from the combined data and manage it separately. The image key can be encrypted and stored in the combined data or transmitted separately as a ciphertext much smaller in size than the embedded data. The key size is typically ten to one-hundred bytes, and it is derived from the original host data by an analysis algorithm.
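For illustration, the sketch below performs conventional least-significant-bit replacement, the simplest form of noise-bit hiding; real data embedding, as described above, first analyzes the host noise to derive a key, which this toy example omits.

```python
import numpy as np

def embed(host: np.ndarray, payload: bytes) -> np.ndarray:
    """Replace the least-significant bit of each 8-bit pixel with payload bits.
    (Simplified: actual data embedding analyzes the host noise first.)"""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = host.flatten()                  # copy; the host image is untouched
    if bits.size > flat.size:
        raise ValueError("payload too large for host")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(host.shape)

def extract(stego: np.ndarray, n_bytes: int) -> bytes:
    bits = stego.flatten()[:8 * n_bytes] & 1
    return np.packbits(bits).tobytes()

img = np.random.default_rng(5).integers(0, 256, (64, 64), dtype=np.uint8)
stego = embed(img, b"auxiliary data")
print(extract(stego, len(b"auxiliary data")))      # b'auxiliary data'
```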
Probabilistic finite elements for fatigue and fracture analysis
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Liu, Wing Kam
1992-01-01
Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and on its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variances and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.
Geneletti, Davide
2010-02-01
This paper presents a method based on the combination of stakeholder analysis and spatial multicriteria evaluation (SMCE) to first design possible sites for an inert landfill, and then rank them according to their suitability. The method was tested for the siting of an inert landfill in the Sarca Plain, located in south-western Trentino, an alpine region in northern Italy. Firstly, stakeholder analysis was conducted to identify a set of criteria to be satisfied by new inert landfill sites. SMCE techniques were then applied to combine the criteria and obtain a suitability map of the study region. Subsequently, the most suitable sites were extracted, taking into account thresholds based on size and shape. These sites were then compared and ranked according to their visibility, accessibility and dust pollution. All of these criteria were assessed through GIS modelling. Sensitivity analyses were performed to assess the stability of the ranking with respect to variations in the input (criterion scores and weights). The study concluded that the three top-ranking sites are located close to each other, in the northernmost sector of the study area. A more general finding was that the use of different criteria in the different stages of the analysis made it possible to better differentiate the suitability of the potential landfill sites.
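The core SMCE combination step reduces to a weighted sum of normalized criterion maps. A minimal sketch with hypothetical criteria, weights, and a one-at-a-time weight-sensitivity check:

```python
import numpy as np

# Hypothetical normalized criterion maps on a common grid (1 = best, 0 = worst)
rng = np.random.default_rng(6)
criteria = {"visibility": rng.random((50, 50)),
            "accessibility": rng.random((50, 50)),
            "dust": rng.random((50, 50))}
weights = {"visibility": 0.4, "accessibility": 0.35, "dust": 0.25}  # from stakeholders

# Weighted linear combination -> suitability map
suitability = sum(weights[k] * criteria[k] for k in criteria)

# Candidate sites: cells in the top decile of suitability
mask = suitability > np.percentile(suitability, 90)
print("candidate cells:", int(mask.sum()))

# Simple one-at-a-time sensitivity: perturb each weight by +10% and
# check how strongly the resulting map correlates with the original
for k in weights:
    w = dict(weights); w[k] *= 1.1
    total = sum(w.values())
    s2 = sum(w[j] / total * criteria[j] for j in criteria)
    print(k, round(np.corrcoef(suitability.ravel(), s2.ravel())[0, 1], 4))
```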
[Vision-astigmatometer and methods of its use].
Dashevskiĭ, A I; Kirrilov, Iu A
1991-01-01
A combination of astigmatic figures with black stripes in different directions every 45 degrees, and of two mutually perpendicular figures combined with an angle, on a rotating disk on the front side of the astigmatometer, together with a combination of an angle and a visometric cross of Landolt optotypes on its back side with a similar disk, and a table of optotypes on the same side, is suggested and has been tried clinically. The directions of the optotype ring gaps lie along 8 meridians. The front side of the astigmatometer shows a scheme for vector analysis of lenticular astigmatism. The method employed by the authors simplifies and accelerates the examination, making clouding and the use of cross cylinders unnecessary.
Cheng, Ting; Nebel, Oliver; Sossi, Paolo A.; Chen, Fukun
2014-01-01
A combined procedure for separating Fe and Hf from a single rock digestion is presented. In a two-stage chromatographic extraction process, a purified Fe fraction is first quantitatively separated from the rock matrix using AG-MP-1M resin in HCl. Hafnium is subsequently isolated using a modified version of a commonly applied method using Eichrom LN-Spec resin. Our combined method includes:
•Purification of Fe from the rock matrix using HCl, ready for mass spectrometric analysis.
•Direct loading of the matrix onto the resin that is used for Hf purification.
•Collection of a Fe-free Hf fraction. PMID:26150946
Structural-change localization and monitoring through a perturbation-based inverse problem.
Roux, Philippe; Guéguen, Philippe; Baillet, Laurent; Hamze, Alaa
2014-11-01
Structural-change detection and characterization, or structural-health monitoring, is generally based on modal analysis for the detection, localization, and quantification of changes in a structure. Classical methods combine variations in both frequencies and mode shapes, which requires accurate and spatially distributed measurements. In this study, the detection and localization of a local perturbation are assessed by analysis of frequency changes (in the fundamental mode and overtones) combined with a perturbation-based linear inverse method and a deconvolution process. This perturbation method is applied first to a bending beam, with the change treated as a local perturbation of the Young's modulus, using a one-dimensional finite-element model for modal analysis. Localization is successful, even for extended and multiple changes. In a second step, the method is numerically tested under ambient-noise vibration from the beam support, with local changes shifted step by step along the beam. The frequency values are extracted using the random decrement technique applied to the time-evolving vibrations recorded by one sensor at the free extremity of the beam. Finally, the inversion method is experimentally demonstrated at the laboratory scale with data recorded at the free end of a Plexiglas beam attached to a metallic support.
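The perturbation-based linear inverse step can be sketched as regularized least squares: given a sensitivity matrix G mapping local stiffness changes to modal frequency shifts, invert the observed shifts for the change profile. G here is synthetic, not derived from the paper's finite-element model.

```python
import numpy as np

rng = np.random.default_rng(7)
n_modes, n_elems = 8, 40

# Sensitivity matrix G[i, j]: frequency shift of mode i per unit
# Young's-modulus change in element j (synthetic; in practice this
# comes from a finite-element modal model)
G = np.abs(rng.normal(size=(n_modes, n_elems)))

dE = np.zeros(n_elems)
dE[17] = -0.05                                 # 5% local stiffness reduction
df = G @ dE + rng.normal(0, 1e-4, n_modes)     # observed frequency shifts

# Tikhonov-regularized least squares: argmin ||G dE - df||^2 + lam ||dE||^2
lam = 1e-2
dE_hat = np.linalg.solve(G.T @ G + lam * np.eye(n_elems), G.T @ df)

# The most negative entries localize the change; element 17 should rank
# among them (exact recovery is limited with only 8 modes for 40 elements)
print("most reduced elements:", np.argsort(dE_hat)[:3])
```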
Atmospheric Transformation of Volatile Organic Compounds
2008-03-01
Study Analysis: Reactant mixtures and standards from product identification experiments were sampled by exposing a 100% polydimethylsiloxane solid... later using the DNPH derivatization method described above and confirmed against a commercial standard. HPLC analysis of the DNPH cartridges also... reaction mixture for a combined total photolysis time of approximately 50 seconds. 2.3. Kinetic Study Analysis: Samples from kinetic studies were...
ERIC Educational Resources Information Center
Pence, Brian Wells; Miller, William C.; Gaynes, Bradley N.
2009-01-01
Prevalence and validation studies rely on imperfect reference standard (RS) diagnostic instruments that can bias prevalence and test characteristic estimates. The authors illustrate 2 methods to account for RS misclassification. Latent class analysis (LCA) combines information from multiple imperfect measures of an unmeasurable latent condition to…
Reduction method with system analysis for multiobjective optimization-based design
NASA Technical Reports Server (NTRS)
Azarm, S.; Sobieszczanski-Sobieski, J.
1993-01-01
An approach for reducing the number of variables and constraints, which is combined with System Analysis Equations (SAE), for multiobjective optimization-based design is presented. In order to develop a simplified analysis model, the SAE is computed outside an optimization loop and then approximated for use by an operator. Two examples are presented to demonstrate the approach.
Analyzing Students' Learning in Classroom Discussions about Socioscientific Issues
ERIC Educational Resources Information Center
Rudsberg, Karin; Ohman, Johan; Ostman, Leif
2013-01-01
In this study, the purpose is to develop and illustrate a method that facilitates investigations of students' learning processes in classroom discussions about socioscientific issues. The method, called transactional argumentation analysis, combines a transactional perspective on meaning making based on John Dewey's pragmatic philosophy and an…
Agnihotri, Samira; Sundeep, P. V. D. S.; Seelamantula, Chandra Sekhar; Balakrishnan, Rohini
2014-01-01
Objective identification and description of mimicked calls is a primary component of any study on avian vocal mimicry, but few studies have adopted a quantitative approach. We used spectral feature representations commonly used in human speech analysis in combination with various distance metrics to distinguish between mimicked and non-mimicked calls of the greater racket-tailed drongo, Dicrurus paradiseus, and cross-validated the results with human assessment of spectral similarity. We found that the automated method and human subjects performed similarly in terms of the overall number of correct matches of mimicked calls to putative model calls. However, the two methods also misclassified different subsets of calls, and we achieved a maximum accuracy of ninety-five per cent only when we combined the results of both methods. This study is the first to use Mel-frequency Cepstral Coefficients and Relative Spectral Amplitude-filtered Linear Predictive Coding coefficients to quantify vocal mimicry. Our findings also suggest that in spite of several advances in automated methods of song analysis, corresponding cross-validation by humans remains essential. PMID:24603717
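The spectral-feature comparison described (MFCCs plus a distance metric) can be sketched with librosa; the synthetic tones below stand in for real recordings, which would be loaded with librosa.load, and the simple mean-MFCC Euclidean distance is only one of several metrics the study considers:

```python
import numpy as np
import librosa

sr = 22050

def mfcc_signature(y, sr=sr, n_mfcc=13):
    """Summarize a call as the mean MFCC vector over time."""
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).mean(axis=1)

t = np.linspace(0, 1, sr, endpoint=False)
# Synthetic stand-ins: a mimicked call and two putative model-species calls.
mimic = np.sin(2 * np.pi * 2000 * t) + 0.1 * np.random.default_rng(12).standard_normal(sr)
model_a = np.sin(2 * np.pi * 2050 * t)   # spectrally close to the mimic
model_b = np.sin(2 * np.pi * 800 * t)    # spectrally distant

d_a = np.linalg.norm(mfcc_signature(mimic) - mfcc_signature(model_a))
d_b = np.linalg.norm(mfcc_signature(mimic) - mfcc_signature(model_b))
print("closest putative model:", "A" if d_a < d_b else "B")   # expected: A
```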
Wang, Hetang; Li, Jia; Wang, Deming; Huang, Zonghou
2017-01-01
Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents. PMID:28793348
Sato, Kae; Sato, Kiichi; Okubo, Akira; Yamazaki, Sunao
2005-01-01
A capillary electrophoresis method was developed for the analysis of oligosaccharides combined with derivatization with 2-aminobenzoic acid. Glycosaminoglycan delta-disaccharides were effectively resolved on a fused-silica capillary tube using 150 mM borate, pH 8.5, as a running electrolyte solution. This analytical method was applied to the identification of glycosaminoglycan in combination with enzymatic digestion. The separation of N-glycans or glucose-oligomers was performed with a phosphate buffer containing polyethylene glycol or borate as an electrolyte solution. This method is expected to be useful in the determination of oligosaccharide structures in a glycoprotein.
Numerical Investigation of Laminar-Turbulent Transition in a Flat Plate Wake
1990-03-02
Difference Methods, Oxford University Press. [3] Swarztrauber, P. N. (1977). "The Methods of Cyclic Reduction, Fourier Analysis and The FACR Algorithm for... streamwise and transverse directions. For the temporal discretization, a combination of ADI, Crank-Nicolson, and Adams-Bashforth methods is employed. The... [Table of contents fragment: 5. Numerical Method; 5.1 Spanwise Spectral Approximation; 5.1.1 Fourier...]
Data is presented showing the progress made towards the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry for the single run analysis of water samples containing a broad range of acid, base and neutral compounds...
NESSUS/EXPERT - An expert system for probabilistic structural analysis methods
NASA Technical Reports Server (NTRS)
Millwater, H.; Palmer, K.; Fink, P.
1988-01-01
An expert system (NESSUS/EXPERT) is presented which provides assistance in using probabilistic structural analysis methods. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator. NESSUS/EXPERT was developed with a combination of FORTRAN and CLIPS, a C language expert system tool, to exploit the strengths of each language.
NASA Technical Reports Server (NTRS)
Middleton, W. D.; Lundry, J. L.
1975-01-01
An integrated system of computer programs has been developed for the design and analysis of supersonic configurations. The system uses linearized theory methods for the calculation of surface pressures and supersonic area rule concepts in combination with linearized theory for calculation of aerodynamic force coefficients. Interactive graphics are optional at the user's request. This part presents a general description of the system and describes the theoretical methods used.
Analysis of multiple pulse NMR in solids. III
NASA Technical Reports Server (NTRS)
Burum, D. P.; Rhim, W. K.
1979-01-01
The paper introduces principles which greatly simplify the process of designing and analyzing compound pulse cycles. These principles are demonstrated by applying them to the design and analysis of several cycles, including a 52-pulse cycle; this pulse cycle combines six different REV-8 cycles and has substantially more resolving power than previously available techniques. Also, a new 24-pulse cycle is introduced which combines three different REV-8 cycles and has a resolving ability equivalent to that of the 52-pulse cycle. The principle of pulse-cycle decoupling provides a method for systematically combining pulse groups into compound cycles in order to achieve enhanced performance. This method is illustrated by a logical development from the two-pulse solid echo sequence to the WAHUHA (Waugh et al., 1968), the REV-8, and the new 24-pulse and 52-pulse cycles, along with the 14-pulse and 12-pulse cycles. Proton chemical shift tensor components for several organic solids, measured by using the 52-pulse cycle, are reported without detailed discussion.
Mean Comparison: Manifest Variable versus Latent Variable
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Bentler, Peter M.
2006-01-01
An extension of multiple correspondence analysis is proposed that takes into account cluster-level heterogeneity in respondents' preferences/choices. The method involves combining multiple correspondence analysis and k-means in a unified framework. The former is used for uncovering a low-dimensional space of multivariate categorical variables…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Wei; Reddy, T. A.; Gurian, Patrick
2007-01-31
A companion paper to Jiang and Reddy, presenting a general and computationally efficient methodology for dynamic scheduling and optimal control of complex primary HVAC&R plants using a deterministic engineering optimization approach.
NASA Astrophysics Data System (ADS)
Clerici, Aldo; Perego, Susanna; Tellini, Claudio; Vescovi, Paolo
2006-08-01
Among the many GIS-based multivariate statistical methods for landslide susceptibility zonation, the so-called “Conditional Analysis method” holds a special place for its conceptual simplicity. In this method, landslide susceptibility is simply expressed as the landslide density in correspondence with different combinations of instability-factor classes. To overcome the operational complexity of the long, tedious and error-prone sequence of commands required by the procedure, a shell script mainly based on the GRASS GIS was created. The script, starting from a landslide inventory map and a number of factor maps, automatically carries out the whole procedure, resulting in the construction of a map with five landslide susceptibility classes. A validation procedure allows the reliability of the resulting model to be assessed, while the simple mean deviation of the density values in the factor-class combinations helps to evaluate the goodness of the landslide density distribution. The procedure was applied to a relatively small basin (167 km2) in the Italian Northern Apennines, considering three landslide types, namely rotational slides, flows and complex landslides, for a total of 1,137 landslides, and five factors, namely lithology, slope angle and aspect, elevation, and slope/bedding relations. The analysis of the 31 different models obtained by combining the five factors confirms the role of lithology, slope angle and slope/bedding relations in influencing slope stability.
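The density computation at the heart of the Conditional Analysis method is a grouping operation: landslide density per unique combination of factor classes, mapped back to cells and binned into susceptibility classes. A minimal pandas sketch on a hypothetical cell table (the class codes and landslide fraction are invented):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n_cells = 10_000  # raster cells of the study basin, flattened to a table

cells = pd.DataFrame({
    "lithology": rng.integers(0, 4, n_cells),   # 4 lithology classes
    "slope": rng.integers(0, 5, n_cells),       # 5 slope-angle classes
    "landslide": rng.random(n_cells) < 0.05,    # cell intersects a mapped landslide
})

# Landslide density for each factor-class combination = susceptibility score.
density = (cells.groupby(["lithology", "slope"])["landslide"]
                .mean()
                .rename("density"))

# Map each cell back to its combination's density, then bin into 5 classes.
cells["susceptibility"] = pd.qcut(
    cells.join(density, on=["lithology", "slope"])["density"],
    q=5, labels=False, duplicates="drop")
print(density.head())
```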
Measuring User Similarity Using Electric Circuit Analysis: Application to Collaborative Filtering
Yang, Joonhyuk; Kim, Jinwook; Kim, Wonjoon; Kim, Young Hwan
2012-01-01
We propose a new technique of measuring user similarity in collaborative filtering using electric circuit analysis. Electric circuit analysis is used to measure the potential differences between nodes on an electric circuit. In this paper, by applying this method to transaction networks comprising users and items, i.e., user–item matrix, and by using the full information about the relationship structure of users in the perspective of item adoption, we overcome the limitations of one-to-one similarity calculation approach, such as the Pearson correlation, Tanimoto coefficient, and Hamming distance, in collaborative filtering. We found that electric circuit analysis can be successfully incorporated into recommender systems and has the potential to significantly enhance predictability, especially when combined with user-based collaborative filtering. We also propose four types of hybrid algorithms that combine the Pearson correlation method and electric circuit analysis. One of the algorithms exceeds the performance of the traditional collaborative filtering by 37.5% at most. This work opens new opportunities for interdisciplinary research between physics and computer science and the development of new recommendation systems. PMID:23145095
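The potential-difference idea maps onto the graph Laplacian: the effective resistance between two nodes of the user-item network can be computed from the Laplacian's pseudoinverse, with smaller resistance meaning more similar users. A minimal NumPy sketch on a toy user-item adjacency matrix (the matrix itself is invented for illustration):

```python
import numpy as np

# Toy user-item adoption matrix: 4 users x 5 items (1 = user adopted item).
R = np.array([[1, 1, 0, 0, 1],
              [1, 1, 0, 0, 0],
              [0, 0, 1, 1, 0],
              [0, 0, 1, 1, 1]])
n_users, n_items = R.shape

# Bipartite adjacency: nodes 0..3 are users, 4..8 are items.
n = n_users + n_items
A = np.zeros((n, n))
A[:n_users, n_users:] = R
A[n_users:, :n_users] = R.T

L = np.diag(A.sum(axis=1)) - A   # graph Laplacian
Lp = np.linalg.pinv(L)           # Moore-Penrose pseudoinverse

def effective_resistance(i, j):
    """Potential difference per unit current injected at i and drawn at j."""
    return Lp[i, i] + Lp[j, j] - 2 * Lp[i, j]

# Users sharing adopted items have lower resistance (higher similarity).
print(effective_resistance(0, 1))   # similar users
print(effective_resistance(0, 2))   # dissimilar users
```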
Malay sentiment analysis based on combined classification approaches and Senti-lexicon algorithm.
Al-Saffar, Ahmed; Awang, Suryanti; Tao, Hai; Omar, Nazlia; Al-Saiagh, Wafaa; Al-Bared, Mohammed
2018-01-01
Sentiment analysis techniques are increasingly exploited to categorize opinion text into one or more predefined sentiment classes for the creation and automated maintenance of review-aggregation websites. In this paper, a Malay sentiment analysis classification model is proposed to improve classification performance based on semantic orientation and machine learning approaches. First, a total of 2,478 Malay sentiment-lexicon phrases and words are assigned a synonym and stored with the help of more than one Malay native speaker, and the polarity is manually allotted a score. In addition, the supervised machine learning approaches and the lexicon knowledge method are combined for Malay sentiment classification, evaluating thirteen features. Finally, three individual classifiers and a combined classifier are used to evaluate the classification accuracy. In the experimental results, a wide range of comparative experiments is conducted on a Malay Reviews Corpus (MRC), demonstrating that feature extraction improves the performance of Malay sentiment analysis based on the combined classification. However, the results depend on three factors: the features, the number of features and the classification approach. PMID:29684036
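The combination of lexicon knowledge with supervised learning can be sketched by appending a lexicon polarity score to bag-of-words features before training a classifier. This is a generic scikit-learn illustration, not the authors' Malay pipeline; the tiny corpus and toy lexicon are invented:

```python
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

docs = ["sangat bagus dan murah", "teruk dan mahal", "bagus sekali", "sangat teruk"]
labels = [1, 0, 1, 0]   # 1 = positive, 0 = negative
lexicon = {"bagus": 1.0, "murah": 0.5, "teruk": -1.0, "mahal": -0.5}  # toy senti-lexicon

def lexicon_score(doc):
    return sum(lexicon.get(tok, 0.0) for tok in doc.split())

vec = TfidfVectorizer()
X_bow = vec.fit_transform(docs)
X_lex = csr_matrix(np.array([[lexicon_score(d)] for d in docs]))
X = hstack([X_bow, X_lex])   # combined lexicon + machine-learning feature matrix

clf = LogisticRegression().fit(X, labels)
test = ["murah dan bagus"]
X_test = hstack([vec.transform(test), csr_matrix([[lexicon_score(test[0])]])])
print(clf.predict(X_test))   # expected: [1]
```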
New methods for image collection and analysis in scanning Auger microscopy
NASA Technical Reports Server (NTRS)
Browning, R.
1985-01-01
While scanning Auger micrographs are used extensively for illustrating the stoichiometry of complex surfaces and for indicating areas of interest for fine-point Auger spectroscopy, there are many problems in the quantification and analysis of Auger images. These problems include multiple contrast mechanisms and the lack of meaningful relationships with other Auger data. Collection of multielemental Auger images allows some new approaches to image analysis and presentation. Information about the distribution and quantity of elemental combinations at a surface is retrievable, and particular combinations of elements, such as alloy phases, can be imaged. Results from the precipitate-hardened alloy Al-2124 illustrate multispectral Auger imaging.
Zhou, Xu; Wang, Qilin; Jiang, Guangming; Zhang, Xiwang; Yuan, Zhiguo
2014-12-01
Improvement of sludge dewaterability is crucial for reducing the costs of sludge disposal in wastewater treatment plants. This study presents a novel method based on combined conditioning with zero-valent iron (ZVI) and hydrogen peroxide (HP) at pH 2.0 to improve the dewaterability of a full-scale waste activated sludge (WAS). The combination of ZVI (0-750 mg/L) and HP (0-750 mg/L) at pH 2.0 substantially improved the WAS dewaterability due to Fenton-like reactions. The highest improvement in WAS dewaterability was attained at 500 mg ZVI/L and 250 mg HP/L, when the capillary suction time of the WAS was reduced by approximately 50%. Particle size distribution indicated that the sludge flocs were decomposed after conditioning. Economic analysis showed that combined conditioning with ZVI and HP was a more economically favorable method for improving WAS dewaterability than the classical Fenton reaction based method initiated by ferrous salts and HP. Copyright © 2014 Elsevier Ltd. All rights reserved.
Utility of correlation techniques in gravity and magnetic interpretation
NASA Technical Reports Server (NTRS)
Chandler, V. W.; Koski, J. S.; Braile, L. W.; Hinze, W. J.
1977-01-01
Two methods of quantitative combined analysis, internal correspondence and clustering, are presented. Model studies are used to illustrate the implementation and interpretation procedures of these methods, particularly internal correspondence. Analysis of the results of applying these methods to data from the midcontinent and a transcontinental profile shows they can be useful in identifying crustal provinces, providing information on horizontal and vertical variations of physical properties over province-size zones, validating long-wavelength anomalies, and isolating geomagnetic field removal problems. Thus, these techniques are useful in considering regional data acquired by satellites.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenberg, M.; Ebel, D.S.
2009-03-19
We present a nondestructive 3D system for analysis of whole Stardust tracks, using a combination of Laser Confocal Scanning Microscopy and synchrotron XRF. 3D deconvolution is used for optical corrections, and results of quantitative analyses of several tracks are presented. The Stardust mission to comet Wild 2 trapped many cometary and ISM particles in aerogel, leaving behind 'tracks' of melted silica aerogel on both sides of the collector. Collected particles and their tracks range in size from submicron to millimeter scale. Interstellar dust collected on the obverse of the aerogel collector is thought to have an average track length of ~15 µm. It has been our goal to perform a total non-destructive 3D textural and XRF chemical analysis on both types of tracks. To that end, we use a combination of Laser Confocal Scanning Microscopy (LCSM) and X-Ray Fluorescence (XRF) spectrometry. Utilized properly, the combination of 3D optical data and chemical data provides total nondestructive characterization of full tracks, prior to flattening or other destructive analysis methods. Our LCSM techniques allow imaging at 0.075 µm/pixel, without the use of oil-based lenses. A full textural analysis on track No. 82 is presented here, as well as analysis of 6 additional tracks contained within 3 keystones (No. 128, No. 129 and No. 140). We present a method of removing the axial distortion inherent in LCSM images by means of a computational 3D deconvolution algorithm, and present some preliminary experiments with computed point spread functions. The combination of 3D LCSM data and XRF data provides invaluable information, while preserving the integrity of the samples for further analysis. It is imperative that these samples, the first extraterrestrial solids returned since the Apollo era, be fully mapped nondestructively in 3D, to preserve the maximum amount of information prior to other, destructive analysis.
Karunathilaka, Sanjeewa R; Kia, Ali-Reza Fardin; Srigley, Cynthia; Chung, Jin Kyu; Mossoba, Magdi M
2016-10-01
A rapid tool for evaluating authenticity was developed and applied to the screening of extra virgin olive oil (EVOO) retail products by using Fourier-transform near infrared (FT-NIR) spectroscopy in combination with univariate and multivariate data analysis methods. Using disposable glass tubes, spectra for 62 reference EVOO, 10 edible oil adulterants, 20 blends consisting of EVOO spiked with adulterants, 88 retail EVOO products and other test samples were rapidly measured in the transmission mode without any sample preparation. The univariate conformity index (CI) and the multivariate supervised soft independent modeling of class analogy (SIMCA) classification tool were used to analyze the various olive oil products which were tested for authenticity against a library of reference EVOO. Better discrimination between the authentic EVOO and some commercial EVOO products was observed with SIMCA than with CI analysis. Approximately 61% of all EVOO commercial products were flagged by SIMCA analysis, suggesting that further analysis be performed to identify quality issues and/or potential adulterants. Due to its simplicity and speed, FT-NIR spectroscopy in combination with multivariate data analysis can be used as a complementary tool to conventional official methods of analysis to rapidly flag EVOO products that may not belong to the class of authentic EVOO. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.
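SIMCA's one-class logic, building a PCA model of the authentic class and flagging samples whose residual to the model is too large, can be sketched as follows. The simulated spectra and the 95th-percentile critical limit are assumptions, and a full SIMCA implementation would also use Hotelling's T² and calibrated limits:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
wavelengths = 200

# Simulated FT-NIR spectra: 62 authentic EVOO samples, 5 adulterated ones.
authentic = rng.standard_normal((62, wavelengths))
adulterated = authentic[:5] + 3.0 * rng.standard_normal((5, wavelengths))

pca = PCA(n_components=5).fit(authentic)   # class model for authentic EVOO

def q_residual(X):
    """Squared reconstruction error (Q statistic) against the class model."""
    X_hat = pca.inverse_transform(pca.transform(X))
    return ((X - X_hat) ** 2).sum(axis=1)

# Critical limit set empirically at the 95th percentile of training residuals.
limit = np.percentile(q_residual(authentic), 95)
print("flagged as non-authentic:", (q_residual(adulterated) > limit).sum(), "of 5")
```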
Markerless gating for lung cancer radiotherapy based on machine learning techniques
NASA Astrophysics Data System (ADS)
Lin, Tong; Li, Ruijiang; Tang, Xiaoli; Dy, Jennifer G.; Jiang, Steve B.
2009-03-01
In lung cancer radiotherapy, radiation to a mobile target can be delivered by respiratory gating, for which we need to know whether the target is inside or outside a predefined gating window at any time point during the treatment. This can be achieved by tracking one or more fiducial markers implanted inside or near the target, either fluoroscopically or electromagnetically. However, the clinical implementation of marker tracking is limited for lung cancer radiotherapy mainly due to the risk of pneumothorax. Therefore, gating without implanted fiducial markers is a promising clinical direction. We have developed several template-matching methods for fluoroscopic marker-less gating. Recently, we have modeled the gating problem as a binary pattern classification problem, in which principal component analysis (PCA) and support vector machine (SVM) are combined to perform the classification task. Following the same framework, we investigated different combinations of dimensionality reduction techniques (PCA and four nonlinear manifold learning methods) and two machine learning classification methods (artificial neural networks—ANN and SVM). Performance was evaluated on ten fluoroscopic image sequences of nine lung cancer patients. We found that among all combinations of dimensionality reduction techniques and classification methods, PCA combined with either ANN or SVM achieved a better performance than the other nonlinear manifold learning methods. ANN when combined with PCA achieves a better performance than SVM in terms of classification accuracy and recall rate, although the target coverage is similar for the two classification methods. Furthermore, the running time for both ANN and SVM with PCA is within tolerance for real-time applications. Overall, ANN combined with PCA is a better candidate than other combinations we investigated in this work for real-time gated radiotherapy.
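The PCA-plus-SVM branch of the framework maps directly onto a scikit-learn pipeline: flatten each fluoroscopic frame, reduce with PCA, and classify beam-on versus beam-off. The frames below are synthetic stand-ins and the dimensions are invented; swapping SVC for MLPClassifier would give the ANN variant compared in the study:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(4)
n_frames, h, w = 200, 32, 32

# Synthetic fluoroscopic frames: label 1 = target inside the gating window.
X = rng.standard_normal((n_frames, h * w))
y = rng.integers(0, 2, n_frames)
X[y == 1] += 0.5   # crude class signal for the toy example

gating = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
gating.fit(X[:150], y[:150])

# At treatment time, each incoming frame yields a beam-on/beam-off decision.
print("beam on?", bool(gating.predict(X[150:151])[0]))
print("held-out accuracy:", gating.score(X[150:], y[150:]))
```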
Combining multiple ChIP-seq peak detection systems using combinatorial fusion.
Schweikert, Christina; Brown, Stuart; Tang, Zuojian; Smith, Phillip R; Hsu, D Frank
2012-01-01
Due to the recent rapid development in ChIP-seq technologies, which uses high-throughput next-generation DNA sequencing to identify the targets of Chromatin Immunoprecipitation, there is an increasing amount of sequencing data being generated that provides us with greater opportunity to analyze genome-wide protein-DNA interactions. In particular, we are interested in evaluating and enhancing computational and statistical techniques for locating protein binding sites. Many peak detection systems have been developed; in this study, we utilize the following six: CisGenome, MACS, PeakSeq, QuEST, SISSRs, and TRLocator. We define two methods to merge and rescore the regions of two peak detection systems and analyze the performance based on average precision and coverage of transcription start sites. The results indicate that ChIP-seq peak detection can be improved by fusion using score or rank combination. Our method of combination and fusion analysis would provide a means for generic assessment of available technologies and systems and assist researchers in choosing an appropriate system (or fusion method) for analyzing ChIP-seq data. This analysis offers an alternate approach for increasing true positive rates, while decreasing false positive rates and hence improving the ChIP-seq peak identification process.
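Rank combination, one of the two fusion operations mentioned, can be sketched by ranking each candidate region under both scoring systems and averaging the ranks; score combination would instead normalize the two score columns to a common scale and average those. The peak lists and scores below are invented:

```python
import pandas as pd

# Hypothetical scores for the same candidate regions from two peak callers.
peaks = pd.DataFrame({
    "region": ["chr1:100", "chr1:900", "chr2:400", "chr3:250"],
    "macs_score": [250.0, 80.0, 150.0, 60.0],
    "quest_score": [9.1, 8.5, 3.2, 7.7],
})

# Rank fusion: rank within each system (1 = best), then average the ranks.
peaks["macs_rank"] = peaks["macs_score"].rank(ascending=False)
peaks["quest_rank"] = peaks["quest_score"].rank(ascending=False)
peaks["fused_rank"] = peaks[["macs_rank", "quest_rank"]].mean(axis=1)

print(peaks.sort_values("fused_rank"))   # lower fused rank = stronger joint support
```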
Time-resolved x-ray scattering instrumentation
Borso, C.S.
1985-11-21
An apparatus and method for increased speed and efficiency of data compilation and analysis in real time is presented in this disclosure. Data is sensed and grouped in combinations in accordance with predetermined logic. The combinations are grouped so that a simplified reduced signal results, such as pairwise summing of data values having offsetting algebraic signs, thereby reducing the magnitude of the net pair sum. Bit storage requirements are reduced and speed of data compilation and analysis is increased by manipulation of shorter bit length data values, making real time evaluation possible.
Application of Higuchi's fractal dimension from basic to clinical neurophysiology: A review.
Kesić, Srdjan; Spasić, Sladjana Z
2016-09-01
For more than 20 years, Higuchi's fractal dimension (HFD), as a nonlinear method, has occupied an important place in the analysis of biological signals. The use of HFD has evolved from EEG and single neuron activity analysis to the most recent application in automated assessments of different clinical conditions. Our objective is to provide an updated review of the HFD method applied in basic and clinical neurophysiological research. This article summarizes and critically reviews a broad literature and major findings concerning the applications of HFD for measuring the complexity of neuronal activity during different neurophysiological conditions. The source of information used in this review comes from the PubMed, Scopus, Google Scholar and IEEE Xplore Digital Library databases. The review process substantiated the significance, advantages and shortcomings of HFD application within all key areas of basic and clinical neurophysiology. Therefore, the paper discusses HFD application alone, combined with other linear or nonlinear measures, or as a part of automated methods for analyzing neurophysiological signals. The speed, accuracy and cost of applying the HFD method for research and medical diagnosis make it stand out from the widely used linear methods. However, only a combination of HFD with other nonlinear methods ensures reliable and accurate analysis of a wide range of neurophysiological signals. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
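For reference, Higuchi's algorithm itself is compact: for each delay k, average the normalized curve length over the k possible offsets, then estimate the fractal dimension as the slope of log L(k) against log(1/k). A standard NumPy sketch (k_max and the test signals are illustrative):

```python
import numpy as np

def higuchi_fd(x, k_max=8):
    """Higuchi fractal dimension of a 1D signal."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    L = []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):                       # k offset sub-series
            idx = np.arange(m, N, k)
            diff = np.abs(np.diff(x[idx])).sum()
            norm = (N - 1) / ((len(idx) - 1) * k)  # Higuchi normalization factor
            lengths.append(diff * norm / k)
        L.append(np.mean(lengths))
    # Slope of log(L) vs log(1/k) is the fractal dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)), np.log(L), 1)
    return slope

rng = np.random.default_rng(5)
print(higuchi_fd(np.sin(np.linspace(0, 8 * np.pi, 1000))))  # smooth signal: near 1
print(higuchi_fd(rng.standard_normal(1000)))                # white noise: near 2
```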
NASA Astrophysics Data System (ADS)
Liu, Ping; Qi, Chu-Bo; Zhu, Quan-Fei; Yuan, Bi-Feng; Feng, Yu-Qi
2016-02-01
Precursor ion scan and multiple reaction monitoring scan (MRM) are two typical scan modes in mass spectrometry analysis. Here, we developed a strategy combining stable isotope labeling (IL) with liquid chromatography-mass spectrometry (LC-MS) under double precursor ion scan (DPI) and MRM for the analysis of thiols in urine from 5 types of human cancer. Firstly, the IL-LC-DPI-MS method was applied for non-targeted profiling of thiols from cancer samples. Compared to the traditional full scan mode, the DPI method significantly improved identification selectivity and accuracy. 103 thiol candidates were discovered in all cancers and 6 thiols were identified by their standards. It is worth noting that pantetheine was identified in human urine for the first time. Secondly, the IL-LC-MRM-MS method was developed for relative quantification of thiols in cancers compared to healthy controls. All the MRM transitions of light and heavy labeled thiols were acquired from urine using the DPI method. Compared to the DPI method, the sensitivity of MRM improved 2.1- to 11.3-fold. In addition, the concentrations of homocysteine, γ-glutamylcysteine and pantetheine were more than twofold higher in cancer patients than in healthy controls. Taken together, the method demonstrated to be a promising strategy for identification and comprehensive quantification of thiols in human urine. PMID:26888486
NASA Astrophysics Data System (ADS)
Foster, Hyacinth Carmen
Science educators and administrators support the idea that inquiry-based and didactic-based instructional strategies have varying effects on students' acquisition of science concepts. The research problem addressed whether incorporating the two approaches covered the learning requirements of all students in science classes, enabling them to meet state and national standards. The purpose of this quasi-experimental, posttest design research study was to determine whether student learning and achievement in high school biology classes differed for each type of instructional method. Constructivism theory suggests that each learner creates knowledge over time as a result of the learner's interactions with the environment. Identifying the optimal teaching method (didactic/teacher-directed, inquiry-based, or a combination of the two approaches) therefore becomes essential if students are to discover ways to learn information. The research question examined which form of instruction had a significant effect on student achievement in biology. The data analysis consisted of a single-factor, independent-measures analysis of variance (ANOVA) that tested the hypotheses of the research study. Locally, the results indicated greater and statistically significant differences in standardized laboratory scores for students who were taught using the combination of the two approaches. Based on these results, biology instructors will gain new insights into ways of improving the instructional process. Social change may occur as science curriculum leadership applies the combination of the two instructional approaches to improve the acquisition of science concepts by biology students.
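For reference, the single-factor independent-measures ANOVA named here takes a few lines with SciPy; the three score arrays are invented placeholders, not the study's data:

```python
from scipy import stats

# Hypothetical standardized laboratory scores per instructional group.
didactic = [72, 68, 75, 70, 74, 69]
inquiry = [74, 71, 77, 73, 70, 76]
combined = [80, 78, 83, 79, 82, 77]

f_stat, p_value = stats.f_oneway(didactic, inquiry, combined)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < .05 suggests the group means differ
```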
Hallisey, Elaine; Tai, Eric; Berens, Andrew; Wilt, Grete; Peipins, Lucy; Lewis, Brian; Graham, Shannon; Flanagan, Barry; Lunsford, Natasha Buchanan
2017-08-07
Transforming spatial data from one scale to another is a challenge in geographic analysis. As part of a larger, primary study to determine a possible association between travel barriers to pediatric cancer facilities and adolescent cancer mortality across the United States, we examined methods to estimate mortality within zones at varying distances from these facilities: (1) geographic centroid assignment, (2) population-weighted centroid assignment, (3) simple areal weighting, (4) combined population and areal weighting, and (5) geostatistical areal interpolation. For the primary study, we used county mortality counts from the National Center for Health Statistics (NCHS) and population data by census tract for the United States to estimate zone mortality. In this paper, to evaluate the five mortality estimation methods, we employed address-level mortality data from the state of Georgia in conjunction with census data. Our objective here is to identify the simplest method that returns accurate mortality estimates. The distribution of Georgia county adolescent cancer mortality counts mirrors the Poisson distribution of the NCHS counts for the U.S. Likewise, zone value patterns, along with the error measures of hierarchy and fit, are similar for the state and the nation. Therefore, Georgia data are suitable for methods testing. The mean absolute value arithmetic differences between the observed counts for Georgia and the five methods were 5.50, 5.00, 4.17, 2.74, and 3.43, respectively. Comparing the methods through paired t-tests of absolute value arithmetic differences showed no statistical difference among the methods. However, we found a strong positive correlation (r = 0.63) between estimated Georgia mortality rates and combined weighting rates at zone level. Most importantly, Bland-Altman plots indicated acceptable agreement between paired arithmetic differences of Georgia rates and combined population and areal weighting rates. This research contributes to the literature on areal interpolation, demonstrating that combined population and areal weighting, compared to other tested methods, returns the most accurate estimates of mortality in transforming small counts by county to aggregated counts for large, non-standard study zones. This conceptually simple cartographic method should be of interest to public health practitioners and researchers limited to analysis of data for relatively large enumeration units.
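The combined population and areal weighting the study favors reduces to a small amount of arithmetic once a GIS overlay has produced tract-level overlap fractions: each county's deaths are apportioned to a zone by the share of the county's population living in the overlapping tracts. A minimal sketch with invented counties, tracts, and counts:

```python
import pandas as pd

# County mortality counts (the source scale).
county_deaths = pd.Series({"A": 10, "B": 4})

# Census tracts with population, parent county, and the fraction of each
# tract's area that falls inside the target zone (from a GIS overlay).
tracts = pd.DataFrame({
    "county": ["A", "A", "B", "B"],
    "population": [5000, 15000, 8000, 2000],
    "frac_in_zone": [1.0, 0.25, 0.0, 1.0],
})

# Population-weighted share of each county falling inside the zone.
tracts["pop_in_zone"] = tracts["population"] * tracts["frac_in_zone"]
share = (tracts.groupby("county")["pop_in_zone"].sum()
         / tracts.groupby("county")["population"].sum())

zone_deaths = (county_deaths * share).sum()
print(f"estimated zone mortality count: {zone_deaths:.2f}")
```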
Wang, Xinyu; Gao, Jing-Lin; Du, Chaohui; An, Jing; Li, MengJiao; Ma, Haiyan; Zhang, Lina; Jiang, Ye
2017-01-01
Interest in biosafety risks in clinical bioanalysis is growing, and a safe, simple and effective preparation method is urgently needed. To improve the biosafety of clinical analysis, we used the antiviral drugs adefovir and tenofovir as model drugs and developed a safe pretreatment method combining a sealing technique with a direct injection technique. The inter- and intraday precision (RSD %) of the method were <4%, and the extraction recoveries ranged from 99.4 to 100.7%. Meanwhile, the results showed that standard solution could be used to prepare the calibration curve instead of spiked plasma, yielding more accurate results. Compared with traditional methods, the novel method not only significantly improved the biosafety of the pretreatment step, but also achieved several advantages including higher precision, favorable sensitivity and satisfactory recovery. With these highly practical and desirable characteristics, the novel method may become a feasible platform in bioanalysis.
SELF-ORGANIZING MAPS FOR INTEGRATED ASSESSMENT OF THE MID-ATLANTIC REGION
A new method was developed to perform an environmental assessment for the
Mid-Atlantic Region (MAR). This was a combination of the self-organizing map (SOM) neural network and principal component analysis (PCA). The method is capable of clustering ecosystems in terms of envi...
Probabilistic finite elements for transient analysis in nonlinear continua
NASA Technical Reports Server (NTRS)
Liu, W. K.; Belytschko, T.; Mani, A.
1985-01-01
The probabilistic finite element method (PFEM), which is a combination of finite element methods and second-moment analysis, is formulated for linear and nonlinear continua with inhomogeneous random fields. Analogous to the discretization of the displacement field in finite element methods, the random field is also discretized. The formulation is simplified by transforming the correlated variables to a set of uncorrelated variables through an eigenvalue orthogonalization. Furthermore, it is shown that a reduced set of the uncorrelated variables is sufficient for the second-moment analysis. Based on the linear formulation of the PFEM, the method is then extended to transient analysis in nonlinear continua. The accuracy and efficiency of the method is demonstrated by application to a one-dimensional, elastic/plastic wave propagation problem. The moments calculated compare favorably with those obtained by Monte Carlo simulation. Also, the procedure is amenable to implementation in deterministic FEM based computer programs.
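A minimal illustration of the second-moment half of the approach: propagate the mean and variance of a random input through the model via first-order sensitivities, then check against Monte Carlo, as the paper does. The elastic-bar function is a stand-in assumption for the finite-element solve:

```python
import numpy as np

def tip_displacement(E):
    """Stand-in for a finite-element solve: elastic bar under a fixed load."""
    P, L, A = 1000.0, 2.0, 0.01   # load [N], length [m], cross-section [m^2]
    return P * L / (A * E)

mu_E, sigma_E = 200e9, 20e9       # mean and std of the random Young's modulus

# First-order second-moment (mean-value) approximation:
# mean(u) ~= u(mu_E); var(u) ~= (du/dE)^2 * var(E).
h = 1e-3 * mu_E
dudE = (tip_displacement(mu_E + h) - tip_displacement(mu_E - h)) / (2 * h)
mean_u = tip_displacement(mu_E)
std_u = abs(dudE) * sigma_E

# Monte Carlo check.
rng = np.random.default_rng(6)
samples = tip_displacement(rng.normal(mu_E, sigma_E, 100_000))
print(f"FOSM:        mean={mean_u:.3e}, std={std_u:.3e}")
print(f"Monte Carlo: mean={samples.mean():.3e}, std={samples.std():.3e}")
```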
Cest Analysis: Automated Change Detection from Very-High Remote Sensing Images
NASA Astrophysics Data System (ADS)
Ehlers, M.; Klonus, S.; Jarmer, T.; Sofina, N.; Michel, U.; Reinartz, P.; Sirmacek, B.
2012-08-01
Fast detection, visualization and assessment of change in areas of crisis or catastrophe are important requirements for the coordination and planning of help. Through the availability of new satellites and/or airborne sensors with very high spatial resolutions (e.g., WorldView, GeoEye), new remote sensing data are available for a better detection, delineation and visualization of change. For automated change detection, a large number of algorithms has been proposed and developed. From previous studies, however, it is evident that to date no single algorithm has the potential for being a reliable change detector for all possible scenarios. This paper introduces the Combined Edge Segment Texture (CEST) analysis, a decision-tree based cooperative suite of algorithms for automated change detection that is especially designed for the new generation of satellites with very high spatial resolution. The method incorporates frequency based filtering, texture analysis, and image segmentation techniques. For the frequency analysis, different band pass filters can be applied to identify the relevant frequency information for change detection. After transforming the multitemporal images via a fast Fourier transform (FFT) and applying the most suitable band pass filter, different methods are available to extract changed structures: differencing and correlation in the frequency domain and correlation and edge detection in the spatial domain. Best results are obtained using edge extraction. For the texture analysis, different 'Haralick' parameters can be calculated (e.g., energy, correlation, contrast, inverse distance moment), with 'energy' so far providing the most accurate results. These algorithms are combined with a prior segmentation of the image data as well as with morphological operations for a final binary change result. A rule-based combination (CEST) of the change algorithms is applied to calculate the probability of change for a particular location. CEST was tested with high-resolution satellite images of the crisis areas of Darfur (Sudan). CEST results are compared with a number of standard algorithms for automated change detection such as image difference, image ratioing, principal component analysis, the delta cue technique and post-classification change detection. The new combined method shows superior results, averaging between 15% and 45% improvement in accuracy.
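The frequency-domain branch described above (FFT, band-pass filtering, differencing) can be sketched in a few lines of NumPy; the band limits, threshold, and synthetic images are illustrative assumptions, not the CEST parameters:

```python
import numpy as np

def bandpass(img, r_lo=5, r_hi=40):
    """Keep spatial frequencies with radius in [r_lo, r_hi] (FFT band-pass)."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    F[(r < r_lo) | (r > r_hi)] = 0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

rng = np.random.default_rng(7)
before = rng.random((128, 128))
after = before.copy()
after[40:60, 40:60] += 1.0   # simulated new structure (the change)

diff = np.abs(bandpass(after) - bandpass(before))
change_mask = diff > diff.mean() + 3 * diff.std()   # simple thresholding
print("changed pixels detected:", change_mask.sum())
```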
NASA Technical Reports Server (NTRS)
Reddy, C. J.; Deshpande, M. D.; Cockrell, C. R.; Beck, F. B.
1995-01-01
A combined finite element method (FEM) and method of moments (MoM) technique is presented to analyze the radiation characteristics of a cavity-fed aperture in three dimensions. Generalized feed modeling has been done using the modal expansion of fields in the feed structure. Numerical results for some feeding structures such as a rectangular waveguide, circular waveguide, and coaxial line are presented. The method also uses the geometrical theory of diffraction (GTD) to predict the effect of a finite ground plane on radiation characteristics. Input admittance calculations for open radiating structures such as a rectangular waveguide, a circular waveguide, and a coaxial line are shown. Numerical data for a coaxial-fed cavity with finite ground plane are verified with experimental data.
NASA Astrophysics Data System (ADS)
Subedi, Kiran; Trejos, Tatiana; Almirall, José
2015-01-01
Elemental analysis, using either LA-ICP-MS or LIBS, can be used for the chemical characterization of materials of forensic interest, both to discriminate between materials originating from different sources and to associate materials known to originate from the same source. In this study, a tandem LIBS/LA-ICP-MS system that combines the benefits of both LIBS and LA-ICP-MS was evaluated for the characterization of samples of printing inks (toners, inkjets, intaglio and offset). The performance of both laser sampling methods is presented. A subset of 9 black laser toners, 10 colored (CMYK) inkjet samples, 12 colored (CMYK) offset samples and 12 intaglio inks originating from different manufacturing sources were analyzed to evaluate the discrimination capability of the tandem method. These samples were selected because they presented a very similar elemental profile by LA-ICP-MS. Although typical discrimination between different ink sources is found to be > 99% for a variety of inks when only LA-ICP-MS was used for the analysis, additional discrimination was achieved by combining the elemental results from the LIBS analysis with the LA-ICP-MS analysis in the tandem technique, enhancing the overall discrimination capability of the individual laser ablation methods. The LIBS measurements of the Ca, Fe, K and Si signals, in particular, improved the discrimination for this specific set of different ink samples previously shown to exhibit very similar LA-ICP-MS elemental profiles. The combination of these two techniques in a single setup resulted in better discrimination of the printing inks with two distinct fingerprint spectra, providing information from atomic/ionic emissions and isotopic composition (m/z) for each ink sample.
Single Channel EEG Artifact Identification Using Two-Dimensional Multi-Resolution Analysis.
Taherisadr, Mojtaba; Dehzangi, Omid; Parsaei, Hossein
2017-12-13
As a diagnostic monitoring approach, electroencephalogram (EEG) signals can be decoded by signal processing methodologies for various health monitoring purposes. However, EEG recordings are contaminated by other interferences, particularly facial and ocular artifacts generated by the user. This is specifically an issue during continuous EEG recording sessions, so identifying such artifacts among the useful EEG components is a key step in using EEG signals for either physiological monitoring and diagnosis or brain-computer interfacing. In this study, we aim to design a new generic framework to process and characterize EEG recordings as multi-component and non-stationary signals, with the aim of localizing and identifying their components (e.g., artifacts). In the proposed method, we gather three complementary algorithms together to enhance the efficiency of the system. The algorithms include time-frequency (TF) analysis and representation, two-dimensional multi-resolution analysis (2D MRA), and feature extraction and classification. Then, a combination of spectro-temporal and geometric features is extracted by combining key instantaneous TF space descriptors, which enables the system to characterize the non-stationarities in the EEG dynamics. We fit a curvelet transform (as an MRA method) to the 2D TF representation of EEG segments to decompose the given space into various levels of resolution. Such a decomposition efficiently improves the analysis of TF spaces with different characteristics (e.g., resolution). Our experimental results demonstrate that the combination of expansion to TF space, analysis using MRA, and extraction of a set of suitable features with a proper predictive model is effective in enhancing EEG artifact identification performance. We also compare the performance of the designed system with another common EEG signal processing technique, namely the 1D wavelet transform. Our experimental results reveal that the proposed method outperforms the 1D wavelet approach.
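Only the first stage, the time-frequency representation and a simple spectro-temporal feature, is easy to sketch generically; a curvelet-based 2D MRA is not available in SciPy, so this illustration stops at the TF stage, on a synthetic signal with an invented artifact:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(13)
fs = 256                       # sampling rate [Hz]
t = np.arange(0, 4, 1 / fs)

# Synthetic single-channel EEG: 10 Hz alpha rhythm plus an artifact burst.
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
eeg[512:640] += 3.0 * np.sin(2 * np.pi * 2 * t[512:640])   # ocular-like artifact

# Stage 1: time-frequency representation (spectrogram as the 2D TF image).
f, tt, S = signal.spectrogram(eeg, fs=fs, nperseg=128, noverlap=96)

# Simple spectro-temporal feature per time slice: low-frequency power ratio.
low = S[f < 5].sum(axis=0) / S.sum(axis=0)
print("suspected artifact slices:", np.where(low > 0.5)[0])
```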
Setting technical standards for visual assessment procedures
Kenneth H. Craik; Nickolaus R. Feimer
1979-01-01
Under the impetus of recent legislative and administrative mandates concerning analysis and management of the landscape, governmental agencies are being called upon to adopt or develop visual resource and impact assessment (VRIA) systems. A variety of techniques that combine methods of psychological assessment and landscape analysis to serve these purposes is being...
USDA-ARS?s Scientific Manuscript database
A multiresidue analytical method using a modification of the “quick, easy, cheap, effective, rugged, and safe” (QuEChERS) sample preparation approach combined with liquid chromatography–tandem mass spectrometry (LC-MS/MS) analysis was established and validated for the rapid determination of 69 pesti...
A rapid and sensitive method has been developed for the analysis of 48 human prescription active pharmaceutical ingredients (APIs) and 6 metabolites of interest, utilizing selective solid-phase extraction (SPE) and ultra performance liquid chromatography in combination with tripl...
Hiroyasu, Tomoyuki; Hayashinuma, Katsutoshi; Ichikawa, Hiroshi; Yagi, Nobuaki
2015-08-01
A preprocessing method for endoscopy image analysis using texture analysis is proposed. In a previous study, we proposed a feature value that combines a co-occurrence matrix and a run-length matrix to analyze the extent of early gastric cancer from images taken with narrow-band imaging endoscopy. However, the obtained feature value does not identify lesion zones correctly due to the influence of noise and halation. Therefore, we propose a new preprocessing method with a non-local means filter for de-noising and contrast limited adaptive histogram equalization. We have confirmed that the pattern of gastric mucosa in images can be improved by the proposed method. Furthermore, the lesion zone is shown more correctly by the obtained color map.
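The two preprocessing stages named here are available directly in OpenCV; a minimal sketch on a synthetic grayscale frame (the parameter values are placeholders, and a real frame would be loaded with cv2.imread):

```python
import cv2
import numpy as np

# Synthetic stand-in for a grayscale endoscopy frame.
rng = np.random.default_rng(11)
img = (rng.random((256, 256)) * 255).astype(np.uint8)

# Stage 1: non-local means de-noising (h controls filter strength).
denoised = cv2.fastNlMeansDenoising(img, None, h=10,
                                    templateWindowSize=7, searchWindowSize=21)

# Stage 2: contrast limited adaptive histogram equalization.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(denoised)
print(enhanced.shape, enhanced.dtype)
```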
2011-01-01
Background: Fluorescence in situ hybridization (FISH) is a very accurate method for measuring HER2 gene copies, as a sign of potential breast cancer. This method requires small tissue samples and has a high sensitivity for detecting abnormalities in a histological section. By using multiple colors, this method allows the detection of multiple targets simultaneously. The target parts in the cells become visible as colored dots. The HER-2 probes are visible as orange stained spots under a fluorescent microscope, while probes for centromere 17 (CEP-17), the chromosome on which the gene HER-2/neu is located, are visible as green spots. Methods: The conventional analysis involves scoring the ratio of HER-2/neu over CEP 17 dots within each cell nucleus and then averaging the scores over 60 cells. A ratio of 2.0 of HER-2/neu to CEP 17 copy number denotes amplification. Several methods have been proposed for the detection and automated evaluation (dot counting) of FISH signals. In this paper a combined method based on mathematical morphology (MM) and inverse multifractal (IMF) analysis is suggested. A similar method was applied recently in the detection of microcalcifications in digital mammograms, and was very successful. Results: The combined method, using MM top-hat and bottom-hat filters together with IMF analysis, was applied to FISH images from the Molecular Biology Lab, Department of Pathology, Wielkoposka Cancer Center, Poznan. Initial results indicate that this method can be applied to FISH images for the evaluation of HER2/neu status. Conclusions: Mathematical morphology and a multifractal approach are used for colored dot detection and counting in FISH images. Initial results derived on clinical cases are promising. Note that the overlapping of colored dots, particularly red/orange dots, needs additional improvement in post-processing. PMID:21489192
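The morphological stage of the dot-detection step can be sketched with scikit-image: a white top-hat removes the slowly varying nuclear background so that small bright probe signals can be thresholded and counted per channel. The synthetic image, structuring-element size, and threshold are invented for illustration, and the IMF stage is not shown:

```python
import numpy as np
from skimage.morphology import white_tophat, disk
from skimage.measure import label

rng = np.random.default_rng(8)

# Synthetic single-channel FISH image: smooth nuclear background + 3 bright dots.
img = rng.normal(0.2, 0.02, (64, 64))
for y, x in [(10, 12), (30, 40), (50, 20)]:
    img[y - 1:y + 2, x - 1:x + 2] += 0.6   # HER-2/neu signal dots

# White top-hat removes slowly varying background, keeping small bright spots.
dots = white_tophat(img, footprint=disk(3))

# Threshold and count connected components = dot count for this channel.
labeled = label(dots > 0.3)
print("dots counted:", labeled.max())       # expected: 3
```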
Grouping individual independent BOLD effects: a new way to ICA group analysis
NASA Astrophysics Data System (ADS)
Duann, Jeng-Ren; Jung, Tzyy-Ping; Sejnowski, Terrence J.; Makeig, Scott
2009-04-01
A new group analysis method to summarize task-related BOLD responses based on independent component analysis (ICA) is presented. In contrast to the previously proposed group ICA (gICA) method, which first combines multi-subject fMRI data in either the temporal or the spatial domain and applies ICA decomposition only once to the combined fMRI data to extract the task-related BOLD effects, the method presented here applies ICA decomposition to each individual subject's fMRI data to first find the independent BOLD effects specific to that subject. The task-related independent BOLD component is then selected among the resulting independent components from the single-subject ICA decomposition and grouped across subjects to derive the group inference. In this new ICA group analysis (ICAga) method, one does not need to assume that the task-related BOLD time courses are identical across brain areas and subjects, as is assumed in the grand ICA decomposition of spatially concatenated fMRI data. Neither does one need to assume that, after spatial normalization, the voxels at the same coordinates represent exactly the same functional or structural brain anatomies across different subjects. These two assumptions have been problematic given recent BOLD activation evidence. Further, since the independent BOLD effects are obtained from each individual subject, the ICAga method can better account for individual differences in the task-related BOLD effects, unlike the gICA approach, whereby the task-related BOLD effects can only be accounted for by a single unified BOLD model across multiple subjects. As a result, the newly proposed ICAga method is able to better fit the task-related BOLD effects at the individual level and thus allows grouping of more appropriate multisubject BOLD effects in the group analysis.
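The select-then-group logic maps onto per-subject FastICA followed by matching each subject's components to the task. A schematic scikit-learn sketch on synthetic data, using temporal ICA for brevity (fMRI studies more often use spatial ICA, but the component-selection and grouping steps are the same); all dimensions and the task regressor are invented:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(9)
n_time, n_voxels = 120, 500
task = (np.arange(n_time) // 10) % 2   # boxcar task regressor

group_maps = []
for subject in range(5):               # single-subject ICA, one per subject
    # Synthetic BOLD data: task-locked signal in voxels 0-49 plus noise.
    X = rng.standard_normal((n_time, n_voxels))
    X[:, :50] += 2.0 * task[:, None]

    ica = FastICA(n_components=10, random_state=0)
    sources = ica.fit_transform(X)     # component time courses (n_time x 10)

    # Select the component whose time course best matches the task.
    r = [abs(np.corrcoef(sources[:, k], task)[0, 1]) for k in range(10)]
    best = int(np.argmax(r))
    group_maps.append(ica.mixing_[:, best])   # that component's spatial map

# Group inference: combine the matched per-subject maps.
group_map = np.mean(group_maps, axis=0)
print("peak task-related voxel:", int(np.argmax(np.abs(group_map))))  # expected < 50
```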
NASA Astrophysics Data System (ADS)
Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel
2017-07-01
Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generate a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km2) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
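For the statistical half of the comparison, the information value of a factor class is the log ratio of the landslide density inside that class to the overall density, and a cell's susceptibility is the sum of its classes' values across factors. A minimal sketch for one factor with invented class counts:

```python
import numpy as np
import pandas as pd

# Hypothetical factor classes: total cells and landslide cells per class.
classes = pd.DataFrame({
    "class": ["clay", "sandstone", "limestone"],
    "cells": [40_000, 35_000, 25_000],
    "landslide_cells": [900, 300, 100],
})

total_density = classes["landslide_cells"].sum() / classes["cells"].sum()

# IV_i = ln(density_in_class / overall_density); positive = more susceptible.
classes["iv"] = np.log(
    (classes["landslide_cells"] / classes["cells"]) / total_density)
print(classes[["class", "iv"]])
```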
Sheikhaboumasoudi, Rouhollah; Bagheri, Maryam; Hosseini, Sayed Abbas; Ashouri, Elaheh; Elahi, Nasrin
2018-01-01
Background: The fundamentals of nursing course is a prerequisite to providing comprehensive nursing care. Despite the development of technology in nursing education, the effectiveness of e-learning methods for the fundamentals of nursing course in the clinical skills laboratory is unclear. The aim of this study was to compare the effect of blended learning (combining e-learning with traditional learning methods) with traditional learning alone on nursing students' scores. Materials and Methods: A two-group post-test experimental study was administered from February 2014 to February 2015. Two groups of nursing students who were taking the fundamentals of nursing course in Iran were compared. Sixty nursing students were selected as the control group (traditional learning methods only) and the experimental group (combining e-learning with traditional learning methods) over two consecutive semesters. Both groups participated in an Objective Structured Clinical Examination (OSCE) and were evaluated in the same way using a prepared checklist and a satisfaction questionnaire. Statistical analysis was conducted with SPSS software version 16. Results: The mean midterm (t = 2.00, p = 0.04) and final (t = 2.50, p = 0.01) scores of the intervention group (combining e-learning with traditional learning methods) were significantly higher than those of the control group (traditional learning methods). The satisfaction of male students in the intervention group was higher than that of females (t = 2.60, p = 0.01). Conclusions: Based on the findings, this study suggests that combining traditional learning methods with e-learning methods, such as educational websites and interactive online resources, for fundamentals of nursing course instruction can be an effective supplement for improving nursing students' clinical skills. PMID:29861761
Combining computer and manual overlays—Willamette River Greenway Study
Asa Hanamoto; Lucille Biesbroeck
1979-01-01
We present a method of combining computer mapping with manual overlays. An example of its use is the Willamette River Greenway Study produced for the State of Oregon Department of Transportation in 1974. This one-year planning study included analysis of data relevant to a 286-mile river system. The product is a "wise use" plan which conserves the basic...
ERIC Educational Resources Information Center
Kelly, Nick; Montenegro, Maximiliano; Gonzalez, Carlos; Clasing, Paula; Sandoval, Augusto; Jara, Magdalena; Saurina, Elvira; Alarcón, Rosa
2017-01-01
Purpose: The purpose of this paper is to demonstrate the utility of combining event-centred and variable-centred approaches when analysing big data for higher education institutions. It uses a large, university-wide data set to demonstrate the methodology for this analysis by using the case study method. It presents empirical findings about…
NASA Astrophysics Data System (ADS)
Zhang, Jinhua; Fang, Bin; Hong, Jun; Wan, Shaoke; Zhu, Yongsheng
2017-12-01
Combined angular contact ball bearings are widely used in automation, aerospace and machine tool applications, but few studies on combined angular contact ball bearings have been reported. The preload and stiffness of combined bearings are mutually influenced rather than being simply the superposition of multiple single bearings; their characteristics must therefore be calculated by coupling the load and deformation analysis of the individual bearings. In this paper, based on the Jones quasi-static model and an analytical stiffness model, a new iterative algorithm and model are proposed for calculating the preload and stiffness of combined bearings, with dynamic effects, including centrifugal force and gyroscopic moment, taken into account. The new method is shown to have general applicability: the preload factors of combined bearings are calculated for different design preloads, the static and dynamic stiffness for various arrangements of combined bearings are comparatively studied and analyzed, and the influences of the design preload magnitude, axial load and rotating speed are discussed in detail. The variation of the dynamic contact angles of combined bearings with rotating speed is also discussed. The results show that bearing arrangement mode, rotating speed and design preload magnitude have a significant influence on the preload and stiffness of combined bearings. The proposed formulation provides a useful tool for dynamic analysis of complex bearing-rotor systems.
[Discrimination of Rice Syrup Adulterant of Acacia Honey Using Near-Infrared Spectroscopy].
Zhang, Yan-nan; Chen, Lan-zhen; Xue, Xiao-feng; Wu, Li-ming; Li, Yi; Yang, Juan
2015-09-01
At present, rice syrup, as a low-priced sweetener, is often adulterated into acacia honey and the adulterated honeys are sold in honey markets, yet there is no suitable and fast method to identify honey adulterated with rice syrup. In this study, near-infrared (NIR) spectroscopy combined with chemometric methods was used to discriminate the authenticity of honey. Twenty unprocessed acacia honey samples from different honey-producing areas were mixed with different proportions of rice syrup to prepare seven concentration gradients, giving 121 samples in total. The NIR instrument and spectrum-processing software were applied to spectrum scanning and data conversion of the adulterated samples. The data were then analyzed by principal component analysis (PCA) and canonical discriminant analysis in order to discriminate adulterated honey. The results showed that after principal component analysis the first two principal components accounted for 97.23% of the total variation, but the regionalism of the score plot of the first two PCs was not obvious, so canonical discriminant analysis was used for further discrimination; all samples were discriminated correctly, and the first two discriminant functions accounted for 91.6% among the six canonical discriminant functions. The different concentrations of adulterated samples could thus be discriminated correctly, illustrating that canonical discriminant analysis combined with NIR spectroscopy is both feasible and practical for rapid and effective discrimination of rice syrup adulteration of acacia honey.
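A minimal sketch of such a two-step chemometric workflow, assuming scikit-learn and synthetic stand-in spectra; LinearDiscriminantAnalysis stands in for the canonical discriminant analysis used in the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Placeholder NIR spectra: 121 samples x 700 wavelengths, 7 adulteration levels.
levels = np.repeat(np.arange(7), 18)[:121]
spectra = rng.normal(size=(121, 700)) + levels[:, None] * 0.05

# Step 1: PCA to inspect how much variance the leading components capture.
pca = PCA(n_components=10).fit(spectra)
print("variance in first two PCs:", pca.explained_variance_ratio_[:2].sum())

# Step 2: canonical discriminant analysis (here LDA) on the PCA scores,
# since the PCA score plot alone did not separate the concentration classes.
scores = pca.transform(spectra)
lda = LinearDiscriminantAnalysis().fit(scores, levels)
print("training accuracy:", lda.score(scores, levels))
```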
Coformer screening using thermal analysis based on binary phase diagrams.
Yamashita, Hiroyuki; Hirakura, Yutaka; Yuda, Masamichi; Terada, Katsuhide
2014-08-01
The advent of cocrystals has created a growing need for efficient and comprehensive coformer screening in the search for better development forms, including salt forms. Here, we investigated a coformer screening system for salts and cocrystals based on binary phase diagrams using thermal analysis and examined the effectiveness of the method. Indomethacin and tenoxicam were used as models of active pharmaceutical ingredients (APIs). Physical mixtures of an API and 42 kinds of coformers were analyzed using differential scanning calorimetry (DSC) and X-ray DSC. We also conducted coformer screening using a conventional slurry method and compared these results with those from the thermal analysis method and previous studies. Compared with the slurry method, the thermal analysis method was a high-performance screening system, particularly for APIs with low solubility and/or a propensity to form solvates. However, the method faces hurdles when kinetic hindrance prevents salt or cocrystal formation during heating, or when degradation occurs near the metastable eutectic temperature. The thermal analysis and slurry methods are considered complementary to each other for coformer screening. The feasibility of the thermal analysis method in drug discovery practice is ensured given its small scale and high throughput.
Hayashi, Yukako; Ohara, Kazuaki; Taki, Rika; Saeki, Tomomi; Yamaguchi, Kentaro
2018-03-12
The crystalline sponge (CS) method, which employs single-crystal X-ray diffraction to determine the structure of an analyte present as a liquid or an oil and having a low melting point, was used in combination with laser desorption ionization mass spectrometry (LDI-MS). 1,3-Benzodioxole derivatives were encapsulated in CS and their structures were determined by combining X-ray crystallography and MS. After the X-ray analysis, the CS was subjected to imaging mass spectrometry (IMS) with an LDI spiral-time-of-flight mass spectrometer (TOF-MS). The ion detection area matched the microscopic image of the encapsulated CS. In addition, the accumulated 1D mass spectra showed that fragmentation of the guest molecule (hereafter, guest) can be easily visualized without any interference from the fragment ions of CS except for two strong ion peaks derived from the tridentate ligand TPT (2,4,6-tris(4-pyridyl)-1,3,5-triazine) of the CS and its fragment. X-ray analysis clearly showed the presence of the guest as well as the π-π, CH-halogen, and CH-O interactions between the guest and the CS framework. However, some guests remained randomly diffused in the nanopores of CS. In addition, the detection limit was less than sub-pmol order based on the weight and density of CS determined by X-ray analysis. Spectroscopic data, such as UV-vis and NMR, also supported the encapsulation of the guest through the interaction between the guest and CS components. The results denote that the CS-LDI-MS method, which combines CS, X-ray analysis and LDI-MS, is effective for structure determination.
Parish, Chad M.; Miller, Michael K.
2014-12-09
Nanostructured ferritic alloys (NFAs) exhibit complex microstructures consisting of 100-500 nm ferrite grains, grain boundary solute enrichment, and multiple populations of precipitates and nanoclusters (NCs). Understanding these materials' excellent creep and radiation-tolerance properties requires a combination of multiple atomic-scale experimental techniques. Recent advances in scanning transmission electron microscopy (STEM) hardware and data analysis methods have the potential to revolutionize nanometer to micrometer scale materials analysis. These methods are applied to NFAs as a test case and compared to both conventional STEM methods and complementary methods such as scanning electron microscopy and atom probe tomography. In this paper, we review past results and present new results illustrating the effectiveness of latest-generation STEM instrumentation and data analysis.
Jeux, François; Desfarges-Berthelemot, Agnès; Kermène, Vincent; Barthelemy, Alain
2012-12-17
We report experiments on a new laser architecture involving phase contrast filtering to coherently combine an array of fiber lasers. We demonstrate that the new technique yields a more stable phase-locking than standard methods using only amplitude filtering. A spectral analysis of the output beams shows that the new scheme generates more resonant frequencies common to the coupled lasers. This property can enhance the combining efficiency when the number of lasers to be coupled is large.
Focused Ion Beam Methods for Research and Control of HEMT Fabrication
NASA Astrophysics Data System (ADS)
Pevtsov, E. Ph; Bespalov, A. V.; Demenkova, T. A.; Luchnikov, P. A.
2017-04-01
The combination of ion-beam sputtering and scanning electron microscopy makes it possible to obtain images of sections of growth-related defects in GaN-based epitaxial films with nanoscale resolution, and to carry out their analysis and classification irrespective of the fabrication conditions. Results of applying these methods to the analysis of technological operations in the formation of microwave transistors are considered: gate formation, via-hole production and contact deposition.
Probabilistic structural analysis methods and applications
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.
1988-01-01
An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
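The fast probability integration algorithm itself is not reproduced here, but a plain Monte Carlo sketch conveys the underlying idea of propagating random loads, material properties, and geometry through a response function; the closed-form bar response stands in for the finite-element code, and all distributions are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Random inputs (all distributions invented for illustration).
P = rng.normal(1.0e4, 1.0e3, n)      # axial load [N]
E = rng.normal(2.0e11, 1.0e10, n)    # Young's modulus [Pa]
A = rng.normal(1.0e-3, 5.0e-5, n)    # cross-sectional area [m^2]
L0 = 1.0                             # bar length [m]

# Closed-form bar response standing in for the finite-element solver.
stress = P / A
elongation = P * L0 / (E * A)

# Probabilistic statements about the response distribution.
print("P(stress > 12 MPa) =", np.mean(stress > 1.2e7))
print("99th percentile elongation [m]:", np.percentile(elongation, 99))
```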
Maione, Camila; Barbosa, Rommel Melgaço
2018-01-24
Rice is one of the most important staple foods around the world. Authentication of rice is one of the most addressed concerns in the present literature, which includes recognition of its geographical origin and variety, certification of organic rice and many other issues. Good results have been achieved by multivariate data analysis and data mining techniques when combined with specific parameters for ascertaining authenticity and many other useful characteristics of rice, such as quality, yield and others. This paper brings a review of the recent research projects on discrimination and authentication of rice using multivariate data analysis and data mining techniques. We found that data obtained from image processing, molecular and atomic spectroscopy, elemental fingerprinting, genetic markers, molecular content and others are promising sources of information regarding geographical origin, variety and other aspects of rice, being widely used combined with multivariate data analysis techniques. Principal component analysis and linear discriminant analysis are the preferred methods, but several other data classification techniques such as support vector machines, artificial neural networks and others are also frequently present in some studies and show high performance for discrimination of rice.
Vázquez-Rowe, Ian; Iribarren, Diego
2015-01-01
Life-cycle (LC) approaches play a significant role in energy policy making to determine the environmental impacts associated with the choice of energy source. Data envelopment analysis (DEA) can be combined with LC approaches to provide quantitative benchmarks that orientate the performance of energy systems towards environmental sustainability, with different implications depending on the selected LC + DEA method. The present paper examines currently available LC + DEA methods and develops a novel method combining carbon footprinting (CFP) and DEA. Thus, the CFP + DEA method is proposed, a five-step structure including data collection for multiple homogenous entities, calculation of target operating points, evaluation of current and target carbon footprints, and result interpretation. As the current context for energy policy implies an anthropocentric perspective with focus on the global warming impact of energy systems, the CFP + DEA method is foreseen to be the most consistent LC + DEA approach to provide benchmarks for energy policy making. The fact that this method relies on the definition of operating points with optimised resource intensity helps to moderate the concerns about the omission of other environmental impacts. Moreover, the CFP + DEA method benefits from CFP specifications in terms of flexibility, understanding, and reporting.
Vázquez-Rowe, Ian
2015-01-01
Life-cycle (LC) approaches play a significant role in energy policy making to determine the environmental impacts associated with the choice of energy source. Data envelopment analysis (DEA) can be combined with LC approaches to provide quantitative benchmarks that orientate the performance of energy systems towards environmental sustainability, with different implications depending on the selected LC + DEA method. The present paper examines currently available LC + DEA methods and develops a novel method combining carbon footprinting (CFP) and DEA. Thus, the CFP + DEA method is proposed, a five-step structure including data collection for multiple homogenous entities, calculation of target operating points, evaluation of current and target carbon footprints, and result interpretation. As the current context for energy policy implies an anthropocentric perspective with focus on the global warming impact of energy systems, the CFP + DEA method is foreseen to be the most consistent LC + DEA approach to provide benchmarks for energy policy making. The fact that this method relies on the definition of operating points with optimised resource intensity helps to moderate the concerns about the omission of other environmental impacts. Moreover, the CFP + DEA method benefits from CFP specifications in terms of flexibility, understanding, and reporting. PMID:25654136
Morgenstern, Hai; Rafaely, Boaz
2018-02-01
Spatial analysis of room acoustics is an ongoing research topic. Microphone arrays have been employed for spatial analyses with an important objective being the estimation of the direction-of-arrival (DOA) of direct sound and early room reflections using room impulse responses (RIRs). An optimal method for DOA estimation is the multiple signal classification algorithm. When RIRs are considered, this method typically fails due to the correlation of room reflections, which leads to rank deficiency of the cross-spectrum matrix. Preprocessing methods for rank restoration, which may involve averaging over frequency, for example, have been proposed exclusively for spherical arrays. However, these methods fail in the case of reflections with equal time delays, which may arise in practice and could be of interest. In this paper, a method is proposed for systems that combine a spherical microphone array and a spherical loudspeaker array, referred to as multiple-input multiple-output systems. This method, referred to as modal smoothing, exploits the additional spatial diversity for rank restoration and succeeds where previous methods fail, as demonstrated in a simulation study. Finally, combining modal smoothing with a preprocessing method is proposed in order to increase the number of DOAs that can be estimated using low-order spherical loudspeaker arrays.
Liu, Yingchun; Liu, Zhongbo; Sun, Guoxiang; Wang, Yan; Ling, Junhong; Gao, Jiayue; Huang, Jiahao
2015-01-01
A combination method of multi-wavelength fingerprinting and multi-component quantification by high performance liquid chromatography (HPLC) coupled with diode array detector (DAD) was developed and validated to monitor and evaluate the quality consistency of herbal medicines (HM) in the classical preparation Compound Bismuth Aluminate tablets (CBAT). The validation results demonstrated that our method met the requirements of fingerprint analysis and quantification analysis with suitable linearity, precision, accuracy, limits of detection (LOD) and limits of quantification (LOQ). In the fingerprint assessments, rather than using conventional qualitative "Similarity" as a criterion, the simple quantified ratio fingerprint method (SQRFM) was recommended, which has an important quantified fingerprint advantage over the "Similarity" approach. SQRFM qualitatively and quantitatively offers the scientific criteria for traditional Chinese medicines (TCM)/HM quality pyramid and warning gate in terms of three parameters. In order to combine the comprehensive characterization of multi-wavelength fingerprints, an integrated fingerprint assessment strategy based on information entropy was set up involving a super-information characteristic digitized parameter of fingerprints, which reveals the total entropy value and absolute information amount about the fingerprints and, thus, offers an excellent method for fingerprint integration. The correlation results between quantified fingerprints and quantitative determination of 5 marker compounds, including glycyrrhizic acid (GLY), liquiritin (LQ), isoliquiritigenin (ILG), isoliquiritin (ILQ) and isoliquiritin apioside (ILA), indicated that multi-component quantification could be replaced by quantified fingerprints. The Fenton reaction was employed to determine the antioxidant activities of CBAT samples in vitro, and they were correlated with HPLC fingerprint components using the partial least squares regression (PLSR) method. In summary, the method of multi-wavelength fingerprints combined with antioxidant activities has been proved to be a feasible and scientific procedure for monitoring and evaluating the quality consistency of CBAT.
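The published SQRFM criteria are not reproduced here; the sketch below only illustrates the general idea of pairing a qualitative similarity measure with a crude quantitative content ratio for a chromatographic fingerprint, using invented peak areas.

```python
import numpy as np

# Illustrative fingerprints: peak areas at shared retention times.
reference = np.array([12.0, 8.5, 30.2, 5.1, 18.7])
sample    = np.array([11.1, 9.0, 27.9, 4.6, 17.2])

# Qualitative "similarity": correlation between the two fingerprints.
similarity = np.corrcoef(reference, sample)[0, 1]

# Crude quantitative check: ratio of total fingerprint content, standing in
# for the quantified ratio idea (this is not the published SQRFM formula).
content_ratio = sample.sum() / reference.sum()

print(f"similarity = {similarity:.3f}, content ratio = {content_ratio:.2%}")
```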
Dual ant colony operational modal analysis parameter estimation method
NASA Astrophysics Data System (ADS)
Sitarz, Piotr; Powałka, Bartosz
2018-01-01
Operational Modal Analysis (OMA) is a common technique used to examine the dynamic properties of a system. Contrary to experimental modal analysis, the input signal is generated by the object's ambient environment. Operational modal analysis mainly aims at determining the number of pole pairs and at estimating modal parameters. Many methods are used for parameter identification. Some methods operate in the time domain while others in the frequency domain; the former use correlation functions, the latter spectral density functions. While some methods require the user to select poles from a stabilisation diagram, others try to automate the selection process. The dual ant colony operational modal analysis parameter estimation method (DAC-OMA) presents a new approach to the problem, avoiding the issues involved in the stabilisation diagram. The presented algorithm is fully automated. It uses deterministic methods to define the intervals of the estimated parameters, thus reducing the problem to an optimisation task which is conducted with dedicated software based on an ant colony optimisation algorithm. The combination of deterministic methods restricting parameter intervals and artificial intelligence yields very good results, also for closely spaced modes and significantly varied mode shapes within one measurement point.
An analysis of hypercritical states in elastic and inelastic systems
NASA Astrophysics Data System (ADS)
Kowalczk, Maciej
The author raises a wide range of problems whose common characteristic is an analysis of hypercritical states in elastic and inelastic systems. The article consists of two basic parts. The first part primarily discusses problems of modelling hypercritical states, while the second analyzes numerical methods (so-called continuation methods) used to solve non-linear problems. The original approaches for modelling hypercritical states found in this article include the combination of plasticity theory and an energy condition for cracking, accounting for the variability and cyclical nature of the forms of fracture of a brittle material under a die, and the combination of plasticity theory and a simplified description of the phenomenon of localization along a discontinuity line. The author presents analytical solutions of three non-linear problems for systems made of elastic/brittle/plastic and elastic/ideally plastic materials. The author proceeds to discuss the analytical basics of continuation methods and analyzes the significance of the parameterization of non-linear problems, provides a method for selecting control parameters based on an analysis of the rank of a rectangular matrix of a uniform system of increment equations, and also provides a new method for selecting an equilibrium path originating from a bifurcation point. The author provides a general outline of continuation methods based on an analysis of the rank of a matrix of a corrective system of equations. The author supplements his theoretical solutions with numerical solutions of non-linear problems for rod systems and problems of the plastic disintegration of a notched rectangular plastic plate.
Takahashi, Hiro; Nemoto, Takeshi; Yoshida, Teruhiko; Honda, Hiroyuki; Hasegawa, Tadashi
2006-01-01
Background Recent advances in genome technologies have provided an excellent opportunity to determine the complete biological characteristics of neoplastic tissues, resulting in improved diagnosis and selection of treatment. To accomplish this objective, it is important to establish a sophisticated algorithm that can deal with large quantities of data such as gene expression profiles obtained by DNA microarray analysis. Results Previously, we developed the projective adaptive resonance theory (PART) filtering method as a gene filtering method. This is one of the clustering methods that can select specific genes for each subtype. In this study, we applied the PART filtering method to analyze microarray data that were obtained from soft tissue sarcoma (STS) patients for the extraction of subtype-specific genes. The performance of the filtering method was evaluated by comparison with other widely used methods, such as signal-to-noise, significance analysis of microarrays, and nearest shrunken centroids. In addition, various combinations of filtering and modeling methods were used to extract essential subtype-specific genes. The combination of the PART filtering method and boosting – the PART-BFCS method – showed the highest accuracy. Seven genes among the 15 genes that are frequently selected by this method – MIF, CYFIP2, HSPCB, TIMP3, LDHA, ABR, and RGS3 – are known prognostic marker genes for other tumors. These genes are candidate marker genes for the diagnosis of STS. Correlation analysis was performed to extract marker genes that were not selected by PART-BFCS. Sixteen genes among those extracted are also known prognostic marker genes for other tumors, and they could be candidate marker genes for the diagnosis of STS. Conclusion The procedure that consisted of two steps, such as the PART-BFCS and the correlation analysis, was proposed. The results suggest that novel diagnostic and therapeutic targets for STS can be extracted by a procedure that includes the PART filtering method. PMID:16948864
NASA Astrophysics Data System (ADS)
Rohmah, D. N.; Saputro, S.; Masykuri, M.; Mahardiani, L.
2018-03-01
The purpose of this research was to determine the effect of combining rice husk and coconut shell activated adsorbents on the adsorption of Pb(II) ion, and to find the most effective mass ratio of the combination, using the SPS method. This research used an experimental method. Data collection was carried out in several stages: (1) carbonization of the rice husk and coconut shell adsorbents in a muffle furnace at a temperature of 350°C for an hour; (2) activation of the rice husk and coconut shell adsorbents using NaOH 1 N and ZnCl2 15% activators; (3) contacting the activated rice husk and coconut shell adsorbents with simulated Pb(II) liquid waste using rice husk to coconut shell ratios of 1:0, 0:1, 1:1, 2:1 and 1:2; (4) analysis of Pb(II) using Solid-Phase Spectrophotometry (SPS); (5) characterization of the combined rice husk and coconut shell activated adsorbent using FTIR. The results show that combining rice husk and coconut shell activated adsorbents increases the ability of the adsorbent to absorb Pb(II) ion, and that the optimum adsorbent mass ratio for absorbing 20 mL of Pb(II) ion with a concentration of 49.99 µg/L is 2:1, with an absorption level of 97.06%. Solid-Phase Spectrophotometry (SPS) is an effective method at the µg/L level, as marked by its Limit of Detection (LOD) of 0.03 µg/L.
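A small worked check of the reported figures, plus the general percent-removal formula such studies rely on (the helper name is ours):

```python
# Percent removal of Pb(II) from the initial and residual concentrations
# measured by solid-phase spectrophotometry (numbers follow the abstract).
c_initial = 49.99   # ug/L
removal = 97.06     # % reported for the 2:1 husk:shell ratio
c_residual = c_initial * (1 - removal / 100)
print(f"residual Pb(II): {c_residual:.2f} ug/L")

# General form used to compare the mass ratios.
def percent_adsorbed(c0, c_eq):
    return 100.0 * (c0 - c_eq) / c0
```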
Linge, Annett; Schötz, Ulrike; Löck, Steffen; Lohaus, Fabian; von Neubeck, Cläre; Gudziol, Volker; Nowak, Alexander; Tinhofer, Inge; Budach, Volker; Sak, Ali; Stuschke, Martin; Balermpas, Panagiotis; Rödel, Claus; Bunea, Hatice; Grosu, Anca-Ligia; Abdollahi, Amir; Debus, Jürgen; Ganswindt, Ute; Lauber, Kirsten; Pigorsch, Steffi; Combs, Stephanie E; Mönnich, David; Zips, Daniel; Baretton, Gustavo B; Buchholz, Frank; Krause, Mechthild; Belka, Claus; Baumann, Michael
2018-04-01
To compare six HPV detection methods in pre-treatment FFPE tumour samples from patients with locally advanced head and neck squamous cell carcinoma (HNSCC) who received postoperative (N = 175) or primary (N = 90) radiochemotherapy. HPV analyses included detection of (i) HPV16 E6/E7 RNA, (ii) HPV16 DNA (PCR-based arrays, A-PCR), (iii) HPV DNA (GP5+/GP6+ qPCR, GP-PCR), (iv) p16 (immunohistochemistry, p16 IHC), (v) combining p16 IHC and the A-PCR result and (vi) combining p16 IHC and the GP-PCR result. Differences between HPV positive and negative subgroups were evaluated for the primary endpoint loco-regional control (LRC) using Cox regression. Correlation between the HPV detection methods was high (chi-squared test, p < 0.001). While p16 IHC analysis resulted in several false positive classifications, A-PCR, GP-PCR and the combination of p16 IHC and A-PCR or GP-PCR led to results comparable to RNA analysis. In both cohorts, Cox regression analyses revealed significantly prolonged LRC for patients with HPV positive tumours irrespective of the detection method. The most stringent classification was obtained by detection of HPV16 RNA, or by combining p16 IHC with A-PCR or GP-PCR. This approach revealed the lowest rate of recurrence in patients with tumours classified as HPV positive and therefore appears most suited for patient stratification in HPV-based clinical studies. Copyright © 2017 Elsevier B.V. All rights reserved.
Project delay analysis of HRSG
NASA Astrophysics Data System (ADS)
Silvianita; Novega, A. S.; Rosyid, D. M.; Suntoyo
2017-08-01
Completion of an HRSG (Heat Recovery Steam Generator) fabrication project sometimes does not meet the target time written in the contract. A delay in the fabrication process can cause several disadvantages for the fabricator, including forfeit payments and delays in the HRSG construction process up to the HRSG trials. In this paper, the authors apply a semi-quantitative analysis to HRSG pressure part fabrication delay, for a plant configuration of 1 GT (Gas Turbine) + 1 HRSG + 1 STG (Steam Turbine Generator), using the bow-tie analysis method. Bow-tie analysis is a combination of FTA (fault tree analysis) and ETA (event tree analysis) used to develop the risk matrix of the HRSG. The results from the FTA are used as threats to be addressed by preventive measures, and the results from the ETA as impacts of the fabrication delay.
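A minimal sketch of the FTA half of a bow-tie, assuming independent basic events; the event names and probabilities are invented, not taken from the paper.

```python
# Minimal fault tree evaluation for independent basic events.
def p_or(*ps):   # OR gate: at least one input event occurs
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):  # AND gate: all input events occur
    q = 1.0
    for p in ps:
        q *= p
    return q

# Hypothetical basic event probabilities.
late_material = 0.10
rework        = 0.05
machine_down  = 0.02
staff_short   = 0.03

# Top event: fabrication delay if material is late AND needs rework,
# OR the machine is down, OR staffing falls short.
p_delay = p_or(p_and(late_material, rework), machine_down, staff_short)
print(f"P(fabrication delay) = {p_delay:.4f}")
```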
NASA Technical Reports Server (NTRS)
Ray, Ronald J.
1994-01-01
New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
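As a rough sketch of the frequency-domain idea, one can estimate an empirical throttle-to-thrust transfer function from a step maneuver; the signals and the first-order engine lag below are synthetic stand-ins, not the report's thrust models.

```python
import numpy as np

fs = 50.0                                   # sample rate [Hz], assumed
t = np.arange(0, 20, 1 / fs)
throttle = (t > 1.0).astype(float)          # throttle step at t = 1 s
tau = 0.8                                   # assumed engine time constant [s]
thrust = 1 - np.exp(-np.clip(t - 1.0, 0, None) / tau)
thrust[t <= 1.0] = 0.0

# Empirical transfer function estimate from input/output spectra.
U = np.fft.rfft(throttle)
Y = np.fft.rfft(thrust)
f = np.fft.rfftfreq(t.size, 1 / fs)

good = np.abs(U) > 1e-9                     # skip bins with no input energy
H = Y[good] / U[good]
fg = f[good]
k = np.argmin(np.abs(fg - 0.2))
print(f"gain at {fg[k]:.2f} Hz: {np.abs(H[k]):.3f}")
```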
Artificial intelligence: A social spin on language analysis
NASA Astrophysics Data System (ADS)
Rosé, Carolyn Penstein
2017-05-01
Understanding the prevalence and impact of personal attacks in online discussions is challenging. A method that combines crowdsourcing and machine learning provides a way forward, but caveats must be considered.
NASA Astrophysics Data System (ADS)
Saputro, S.; Mahardiani, L.; Wulandari, D. A.
2018-03-01
This research aimed to investigate the use of teak wood sawdust and rice husk waste as Pb(II) ion adsorbents in simulated liquid waste, the optimum combined adsorbent mass required to adsorb Pb(II) ion, and the sensitivity of the solid-phase spectrophotometry (SPS) method in determining the decrease of Pb(II) ion levels at the μg/L level. This research was conducted by an experimental method in the laboratory. The adsorbents used in this study were charcoal from teak wood sawdust activated using 15% ZnCl2 solution and rice husk activated using 2 N NaOH solution. The adsorption of Pb(II) solution by sawdust and rice husk was carried out with mass combinations in ratios of 1:0, 0:1, 1:1, 1:2 and 2:1. The Pb(II) ion concentration was analyzed using SPS, and the sawdust and rice husk adsorbents were characterized using FTIR. The results showed that activated charcoal from teak wood sawdust and rice husks can be used as Pb(II) ion adsorbents with an adsorption capacity of 0.86 μg/L; the adsorbent with the optimum mass combination of sawdust and rice husk of 2:1, 3 grams in total, can adsorb 42.80 μg/L. Solid-phase spectrophotometry is a sensitive method for analyzing the decrease in Pb(II) ion concentration after absorption by teak wood sawdust and rice husk, with a limit of detection (LOD) of 0.06 μg/L.
Chen, Mao-Gen; Wang, Xiao-Ping; Ju, Wei-Qiang; Zhao, Qiang; Wu, Lin-Wei; Ren, Qing-Qi; Guo, Zhi-Yong; Wang, Dong-Ping; Zhu, Xiao-Feng; Ma, Yi; He, Xiao-Shun
2017-01-01
Objectives Elevated plasma fibrinogen (Fib) correlated with patient's prognosis in several solid tumors. However, few studies have illuminated the relationship between preoperative Fib and prognosis of HCC after liver transplantation. We aimed to clarify the prognostic value of Fib and whether the prognostic accuracy can be enhanced by the combination of Fib and neutrophil–lymphocyte ratio (NLR). Results Fib was correlated with Child-pugh stage, alpha-fetoprotein (AFP), size of largest tumor, macro- and micro-vascular invasion. Univariate analysis showed preoperative Fib, AFP, NLR, size of largest tumor, tumor number, macro- and micro- vascular invasion were significantly associated with disease-free survival (DFS) and overall survival (OS) in HCC patients with liver transplantation. After multivariate analysis, only Fib and macro-vascular invasion were independently correlated with DFS and OS. Survival analysis showed that preoperative Fib > 2.345 g/L predicted poor prognosis of patients HCC after liver transplantation. Preoperative Fib showed prognostic value in various subgroups of HCC. Furthermore, the predictive range was expanded by the combination of Fib and NLR. Materials and Methods Data were collected retrospectively from 130 HCC patients who underwent liver transplantation. Preoperative Fib, NLR and clinicopathologic variables were analyzed. The survival analysis was performed by the Kaplan-Meier method, and compared by the log-rank test. Univariate and multivariate analyses were performed to identify the prognostic factors for DFS and OS. Conclusions Preoperative Fib is an independent effective predictor of prognosis for HCC patients, higher levels of Fib predict poorer outcomes and the combination of Fib and NLR enlarges the prognostic accuracy of testing. PMID:27935864
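A minimal sketch of the survival analysis described, using the lifelines package on synthetic follow-up data; the group sizes and event rates are invented, and only the fibrinogen cutoff follows the abstract.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
# Synthetic follow-up: months to recurrence and event indicators,
# split by the fibrinogen cutoff (2.345 g/L per the abstract).
t_low  = rng.exponential(40, 60);  e_low  = rng.binomial(1, 0.5, 60)
t_high = rng.exponential(20, 70);  e_high = rng.binomial(1, 0.7, 70)

# Kaplan-Meier estimate for the low-fibrinogen group.
kmf = KaplanMeierFitter()
kmf.fit(t_low, event_observed=e_low, label="Fib <= 2.345 g/L")
print("median survival:", kmf.median_survival_time_)

# Log-rank comparison of the two fibrinogen groups.
res = logrank_test(t_low, t_high, event_observed_A=e_low, event_observed_B=e_high)
print("log-rank p =", res.p_value)
```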
Dickman, Andrew; Bickerstaff, Matthew; Jackson, Richard; Schneider, Jennifer; Mason, Stephen; Ellershaw, John
2017-03-23
A continuous subcutaneous infusion (CSCI) delivered via syringe pump is a method of drug administration used to maintain symptom control when a patient is no longer able to tolerate oral medication. Several classes of drugs, such as opioids, antiemetics, anticholinergics, antipsychotics and benzodiazepines are routinely administered by CSCI alone or in combinations. Previous studies attempting to identify the most-common CSCI combinations are now several years old and no longer reflect current clinical practice. The aim of this work was to review current clinical practice and identify CSCI drug combinations requiring analysis for chemical compatibility and stability. UK pharmacy professionals involved in the delivery of care to palliative patients in hospitals and hospices were invited to enter CSCI combinations comprised of two or more drugs onto an electronic database over a 12-month period. In addition, a separate Delphi study with a panel of 15 expert healthcare professionals was completed to identify a maximum of five combinations of drugs used to treat more complex, but less commonly encountered symptoms unlikely to be identified by the national survey. A total of 57 individuals representing 33 separate palliative care services entered 1,945 drug combinations suitable for analysis, with 278 discrete combinations identified. The top 40 drug combinations represented nearly two-thirds of combinations recorded. A total of 23 different drugs were administered in combination and the median number of drugs in a combination was three. The Delphi study identified five combinations for the relief of complex or refractory symptoms. This study represents the first step towards developing authoritative national guidance on the administration of drugs by CSCI. Further work will ensure healthcare practitioners have the knowledge and confidence that a prescribed combination will be both safe and efficacious.
NASA Astrophysics Data System (ADS)
Kaloop, Mosbeh R.; Yigit, Cemal O.; Hu, Jong W.
2018-03-01
Recently, the high-rate global navigation satellite system precise point positioning (GNSS-PPP) technique has been used to detect the dynamic behavior of structures. This study aimed to increase the accuracy of extracting the oscillation properties of structural movements based on the high-rate (10 Hz) GNSS-PPP monitoring technique. A model based on the combination of wavelet packet transformation (WPT) de-noising and neural network (NN) prediction was proposed to improve the detection of the dynamic behavior of structures with the GNSS-PPP method. A complicated numerical simulation involving highly noisy data and 13 experimental cases with different loads were utilized to confirm the efficiency of the proposed model design and the monitoring technique in detecting the dynamic behavior of structures. The results revealed that, when combined with the proposed model, the GNSS-PPP method can be used to accurately detect the dynamic behavior of engineering structures as an alternative to the relative GNSS method.
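Only the WPT de-noising half of the proposed WPT + NN model is sketched below, using PyWavelets on a synthetic displacement record; the wavelet, decomposition level, and universal threshold are our assumptions, not the paper's settings.

```python
import numpy as np
import pywt

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 1000)
# Synthetic GNSS-like displacement: low-frequency oscillation plus noise.
clean = 0.01 * np.sin(2 * np.pi * 0.5 * t)
noisy = clean + rng.normal(0, 0.005, t.size)

# Wavelet packet de-noising: soft-threshold the terminal node coefficients.
wp = pywt.WaveletPacket(noisy, wavelet="db4", maxlevel=4)
for node in wp.get_level(4, order="natural"):
    sigma = np.median(np.abs(node.data)) / 0.6745      # noise scale estimate
    thr = sigma * np.sqrt(2 * np.log(node.data.size))  # universal threshold
    node.data = pywt.threshold(node.data, thr, mode="soft")
denoised = wp.reconstruct(update=True)[: t.size]

print("rms error before:", np.std(noisy - clean), "after:", np.std(denoised - clean))
```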
Clinical perspective of cell-free DNA testing for fetal aneuploidies.
Gratacós, Eduard; Nicolaides, Kypros
2014-01-01
Cell-free DNA testing in maternal blood provides the most effective method of screening for trisomy 21, with a reported detection rate of 99% and a false positive rate of less than 0.1%. After many years of research, this method is now commercially available and is carried out in an increasing number of patients, and there is an expanding number of conditions that can be screened for. However, the application of these methods in clinical practice requires a careful analysis. Current first-trimester screening strategies are based on a complex combination of tests, aiming at detecting fetal defects and predicting the risk of main pregnancy complications. It is therefore necessary to define the optimal way of combining cell-free DNA testing with current first-trimester screening methods. In this concise review we describe the basis of cell-free DNA testing and discuss the potential approaches for its implementation in combination with current tests in the first trimester. © 2014 S. Karger AG, Basel.
Ozseven, Ayşe Gül; Sesli Çetin, Emel; Ozseven, Levent
2012-07-01
In recent years, owing to the presence of multi-drug resistant nosocomial bacteria, combination therapies are more frequently applied. Thus there is more need to investigate the in vitro activity of drug combinations against multi-drug resistant bacteria. Checkerboard synergy testing is among the most widely used standard techniques to determine the activity of antibiotic combinations. It is based on microdilution susceptibility testing of antibiotic combinations. Although this test has a standardised procedure, there are many different methods for interpreting the results. In many previous studies carried out with multi-drug resistant bacteria, different rates of synergy have been reported with various antibiotic combinations using the checkerboard technique. These differences might be attributed to the different features of the strains. However, different synergy rates detected by the checkerboard method have also been reported in other studies using the same drug combinations and the same types of bacteria. It was thought that these differences in synergy rates might be due to the different methods of interpretation of synergy test results. In recent years, multi-drug resistant Acinetobacter baumannii has been the most commonly encountered nosocomial pathogen, especially in intensive-care units. For this reason, multi-drug resistant A.baumannii has been the subject of a considerable amount of research on antimicrobial combinations. In the present study, the in vitro activities of combinations frequently preferred in A.baumannii infections, namely imipenem plus ampicillin/sulbactam and meropenem plus ampicillin/sulbactam, were tested by the checkerboard synergy method against 34 multi-drug resistant A.baumannii isolates. Minimum inhibitory concentration (MIC) values for imipenem, meropenem and ampicillin/sulbactam were determined by the broth microdilution method. Subsequently the activity of the two combinations was tested in the dilution range of 4 x MIC to 0.03 x MIC in 96-well checkerboard plates. The results were obtained separately using four different interpretation methods frequently preferred by researchers. The aim was thus to detect to what extent the rates of synergistic, indifferent and antagonistic interactions were affected by different interpretation methods. The differences between the interpretation methods were tested by chi-square analysis for each combination used. Statistically significant differences were detected between the four interpretation methods for the determination of synergistic and indifferent interactions (p< 0.0001). The highest rates of synergy were observed with both combinations by the method that used the lowest fractional inhibitory concentration index of all the non-turbid wells along the turbidity/non-turbidity interface. There was no statistically significant difference between the four methods for the detection of antagonism (p> 0.05). In conclusion, although there is a standard procedure for checkerboard synergy testing, it fails to exhibit standard results owing to the different methods of interpretation of the results. Thus, there is a need to standardise the interpretation method for checkerboard synergy testing. To determine the most appropriate method of interpretation, further studies investigating the clinical benefits of synergistic combinations, and additionally comparing the consistency of the results with those obtained from other standard combination tests such as time-kill studies, are required.
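The fractional inhibitory concentration index (FICI) underlying checkerboard interpretation can be computed as below; the thresholds shown follow one common convention (synergy at FICI of 0.5 or less, antagonism above 4), which is precisely the kind of choice the study shows can alter the reported synergy rates.

```python
# FICI for one checkerboard well:
# FICI = MIC_A(combo)/MIC_A(alone) + MIC_B(combo)/MIC_B(alone).
def fici(mic_a_combo, mic_a, mic_b_combo, mic_b):
    return mic_a_combo / mic_a + mic_b_combo / mic_b

def interpret(index):
    # One common convention; other cutoffs appear in the literature.
    if index <= 0.5:
        return "synergy"
    if index > 4:
        return "antagonism"
    return "indifference"

idx = fici(mic_a_combo=1, mic_a=8, mic_b_combo=2, mic_b=16)
print(idx, interpret(idx))
```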
Selecting Single Model in Combination Forecasting Based on Cointegration Test and Encompassing Test
Jiang, Chuanjin; Zhang, Jing; Song, Fugen
2014-01-01
Combination forecasting takes all the characteristics of each single forecasting method into consideration and combines them to form a composite, which increases forecasting accuracy. Existing research on combination forecasting selects the single models arbitrarily, neglecting the internal characteristics of the forecasting object. After discussing the function of the cointegration test and the encompassing test in the selection of single models, supplemented by empirical analysis, the paper gives guidance for single model selection: no more than five suitable single models should be selected from the many alternatives for a given forecasting target, which increases accuracy and stability. PMID:24892061
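The paper's cointegration and encompassing tests govern which single models enter the combination; the downstream combination step itself, for two models, is often the classic minimum-variance (Bates-Granger) weighting sketched here with invented forecast errors.

```python
import numpy as np

# Minimum-variance combination weights for two forecasting models,
# estimated from their historical forecast errors (illustrative values).
e1 = np.array([0.8, -0.3, 0.5, -0.6, 0.2])   # errors of model 1
e2 = np.array([-0.4, 0.6, -0.2, 0.3, -0.5])  # errors of model 2

s11 = np.var(e1, ddof=1)
s22 = np.var(e2, ddof=1)
s12 = np.cov(e1, e2)[0, 1]
w1 = (s22 - s12) / (s11 + s22 - 2 * s12)     # weight on model 1
w2 = 1 - w1

print(f"w1 = {w1:.3f}, w2 = {w2:.3f}")
```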
Modified GMDH-NN algorithm and its application for global sensitivity analysis
NASA Astrophysics Data System (ADS)
Song, Shufang; Wang, Lu
2017-11-01
Global sensitivity analysis (GSA) is a very useful tool for evaluating the influence of input variables over their whole distribution range. Sobol' method is the most commonly used among the variance-based methods, which are efficient and popular GSA techniques. High dimensional model representation (HDMR) is a popular way to compute Sobol' indices; however, its drawbacks cannot be ignored. We show that the modified GMDH-NN algorithm can calculate the coefficients of the metamodel efficiently, so this paper combines it with HDMR and proposes the GMDH-HDMR method. The new method shows higher precision and a faster convergence rate. Several numerical and engineering examples are used to confirm its advantages.
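The GMDH-HDMR machinery is not reproduced here, but the Sobol' indices it targets can be illustrated with a plain pick-freeze Monte Carlo estimator on the Ishigami function, a standard GSA test case:

```python
import numpy as np

# Ishigami function, a standard global sensitivity analysis benchmark.
def model(x):
    return (np.sin(x[:, 0]) + 7 * np.sin(x[:, 1]) ** 2
            + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

rng = np.random.default_rng(4)
n, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
yA, yB = model(A), model(B)
var_y = np.var(yA)

# Saltelli-style pick-freeze estimators for first-order and total indices.
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]          # replace only input i
    yABi = model(ABi)
    S1 = np.mean(yB * (yABi - yA)) / var_y
    ST = 0.5 * np.mean((yA - yABi) ** 2) / var_y
    print(f"x{i + 1}: S1 = {S1:.3f}, ST = {ST:.3f}")
```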
Real-Time PCR in Clinical Microbiology: Applications for Routine Laboratory Testing
Espy, M. J.; Uhl, J. R.; Sloan, L. M.; Buckwalter, S. P.; Jones, M. F.; Vetter, E. A.; Yao, J. D. C.; Wengenack, N. L.; Rosenblatt, J. E.; Cockerill, F. R.; Smith, T. F.
2006-01-01
Real-time PCR has revolutionized the way clinical microbiology laboratories diagnose many human microbial infections. This testing method combines PCR chemistry with fluorescent probe detection of amplified product in the same reaction vessel. In general, both PCR and amplified product detection are completed in an hour or less, which is considerably faster than conventional PCR detection methods. Real-time PCR assays provide sensitivity and specificity equivalent to that of conventional PCR combined with Southern blot analysis, and since amplification and detection steps are performed in the same closed vessel, the risk of releasing amplified nucleic acids into the environment is negligible. The combination of excellent sensitivity and specificity, low contamination risk, and speed has made real-time PCR technology an appealing alternative to culture- or immunoassay-based testing methods for diagnosing many infectious diseases. This review focuses on the application of real-time PCR in the clinical microbiology laboratory. PMID:16418529
NASA Astrophysics Data System (ADS)
Zou, Shuzhen; Chen, Han; Yu, Haijuan; Sun, Jing; Zhao, Pengfei; Lin, Xuechun
2017-12-01
We demonstrate a new method for fabricating a (6 + 1) × 1 pump-signal combiner based on the reduction of signal fiber diameter by corrosion. This method avoids the mismatch loss of the splice between the signal fiber and the output fiber caused by the signal fiber taper processing. The optimum radius of the corroded signal fiber was calculated according to the analysis of the influence of the cladding thickness on the laser propagating in the fiber core. Besides, we also developed a two-step splicing method to complete the high-precision alignment between the signal fiber core and the output fiber core. A high-efficiency (6 + 1) × 1 pump-signal combiner was produced with an average pump power transmission efficiency of 98.0% and a signal power transmission efficiency of 97.7%, which is well suitable for application to high-power fiber laser system.
Liu, Bin; Wang, Xiaolong; Lin, Lei; Dong, Qiwen; Wang, Xuan
2008-12-01
Protein remote homology detection and fold recognition are central problems in bioinformatics. Currently, discriminative methods based on support vector machine (SVM) are the most effective and accurate methods for solving these problems. A key step to improve the performance of the SVM-based methods is to find a suitable representation of protein sequences. In this paper, a novel building block of proteins called Top-n-grams is presented, which contains the evolutionary information extracted from the protein sequence frequency profiles. The protein sequence frequency profiles are calculated from the multiple sequence alignments outputted by PSI-BLAST and converted into Top-n-grams. The protein sequences are transformed into fixed-dimension feature vectors by the occurrence times of each Top-n-gram. The training vectors are evaluated by SVM to train classifiers which are then used to classify the test protein sequences. We demonstrate that the prediction performance of remote homology detection and fold recognition can be improved by combining Top-n-grams and latent semantic analysis (LSA), which is an efficient feature extraction technique from natural language processing. When tested on superfamily and fold benchmarks, the method combining Top-n-grams and LSA gives significantly better results compared to related methods. The method based on Top-n-grams significantly outperforms the methods based on many other building blocks including N-grams, patterns, motifs and binary profiles. Therefore, Top-n-gram is a good building block of the protein sequences and can be widely used in many tasks of the computational biology, such as the sequence alignment, the prediction of domain boundary, the designation of knowledge-based potentials and the prediction of protein binding sites.
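A simplified sketch of the pipeline's shape: plain character n-grams stand in for the profile-based Top-n-grams, TruncatedSVD provides the LSA step, and an SVM does the classification; the sequences and labels are toy data.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

# Toy protein sequences and fold labels; plain residue 3-grams stand in
# for the profile-derived Top-n-grams of the paper.
seqs = ["MKTAYIAKQR", "MKTAYLAKQR", "GGSHHHHHHG", "GGSHHHHHGG"]
labels = [0, 0, 1, 1]

ngrams = CountVectorizer(analyzer="char", ngram_range=(3, 3))
clf = make_pipeline(ngrams,
                    TruncatedSVD(n_components=2),   # LSA step
                    SVC(kernel="linear"))
clf.fit(seqs, labels)
print(clf.predict(["MKTAYIAKQG", "GGSHHHHHHH"]))
```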
Prediction Analysis for Measles Epidemics
NASA Astrophysics Data System (ADS)
Sumi, Ayako; Ohtomo, Norio; Tanaka, Yukio; Sawamura, Sadashi; Olsen, Lars Folke; Kobayashi, Nobumichi
2003-12-01
A newly devised procedure of prediction analysis, which is a linearized version of the nonlinear least squares method combined with the maximum entropy spectral analysis method, was proposed. This method was applied to time series data of measles case notification in several communities in the UK, USA and Denmark. The dominant spectral lines observed in each power spectral density (PSD) can be safely assigned as fundamental periods. The optimum least squares fitting (LSF) curve calculated using these fundamental periods can essentially reproduce the underlying variation of the measles data. An extension of the LSF curve can be used to predict measles case notification quantitatively. Some discussions including a predictability of chaotic time series are presented.
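The MEM spectral step is omitted here; assuming the dominant periods have already been identified, the least squares fitting (LSF) stage reduces to an ordinary linear fit of a trend plus sine/cosine pairs, as sketched with synthetic weekly case counts:

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(0, 520)                 # weeks
periods = [52.0, 26.0]                # dominant periods assumed found by MEM
y = (100 + 20 * np.sin(2 * np.pi * t / 52)
     + 5 * np.cos(2 * np.pi * t / 26)
     + rng.normal(0, 3, t.size))      # synthetic case-notification series

# Design matrix: constant, linear trend, and a sine/cosine pair per period.
cols = [np.ones(t.size), t.astype(float)]
for P in periods:
    cols += [np.sin(2 * np.pi * t / P), np.cos(2 * np.pi * t / P)]
X = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
fit = X @ coef                        # extendable beyond t for prediction
print("residual rms:", np.std(y - fit))
```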
NASA Astrophysics Data System (ADS)
Ivanov, Konstantin; Pinchuk, Irina; Gorodnichev, Roman; Polyanskaya, Lubov
2016-04-01
Establishing methods for estimating the size of soil microbial cells is called for by the current needs of research in microbial ecology. Some of these methods need to be improved to give a more detailed view of the changes that happen in the microbiome of terrestrial ecosystems. The combination of traditional microscopy, fluorescence and filtration methods, in addition to cutting-edge DNA analysis, gives soil microbial ecologists a wide range of approaches to their research questions. In most cases bacterial cell size is limited by natural conditions, such as lack of nutrients or stress factors due to the heterogeneity of the soil system. In samples of soils, lake and river sediments, snow and rain water, bacterial cells as small as 0.2 microns were detected. We established a combination of cascade filtration and fluorescence microscopy for the complex analysis of different terrestrial ecosystems and various soil types. Our modification is based on successively filtering a soil suspension through membranes of decreasing pore size to collect the microbes. Combination with fluorescence microscopy and DNA analysis via the FISH method gave a picture of microbial interactions and an overview of the ecological strategies of soil microorganisms. Humus horizons of primitive arctic soil were the most favorable for bacterial growth. The quantified biomass of soil bacteria depends on the dominance of cells of specific dimensions caused by stress factors. The average bacterial size in different soils varied from 0.23 to 0.38 microns; however, in humus horizons of arctic soil we detected a contrasting dominance of bigger bacterial cells, 1.85 microns in size. Fungi in this case contributed to increasing the availability of organic matter for bacteria, because fungal mycelium forms an appreciable part of the microbial biomass of primitive arctic soil. The dominant content of bigger bacterial cells in forest and fallow soils, as well as the opposite situation in arable soils, is caused by the availability of nutrients (glucose) and the degree of agricultural anthropogenic stress. Various combinations of stress factors (anaerobiosis, acidity and temperature) influenced bacterial size; when these stress factors decreased, bacterial cells in soil returned to their original size. Furthermore, a modification for quantifying gram-negative bacteria was developed and combined with the FISH method and DNA extraction. We made a methodological comparison of gram-negative bacteria groups under aerobic and anaerobic conditions. Since there was no significant difference between the most frequent soil gram-negative bacteria groups, we concluded that gram-negative bacteria play an important ecological role as a common group of microorganisms in natural polymer degradation. Depending on the nutrient (glucose, cellulose, chitin), gram-negative bacteria competed with actinomycetes for available nutrients at different times, which is explained by the ecological flexibility of this group of soil bacteria. The experiments showed markedly faster chitinolytic activity of soil gram-negative bacteria compared to actinomycetes. Thus our approach of combining both traditional and cutting-edge methods forms a unique basis for research and opens wide doors to the design of new scientific experiments in the ecology of terrestrial ecosystems, especially in soil microbial ecology.
Molins, C; Hogendoorn, E A; Dijkman, E; Heusinkveld, H A; Baumann, R A
2000-02-11
The combination of microwave-assisted solvent extraction (MASE) and reversed-phase liquid chromatography (RPLC) with UV detection has been investigated for the efficient determination of phenylurea herbicides in soils involving the single-residue method (SRM) approach (linuron) and the multi-residue method (MRM) approach (monuron, monolinuron, isoproturon, metobromuron, diuron and linuron). Critical parameters of MASE, viz, extraction temperature, water content and extraction solvent were varied in order to optimise recoveries of the analytes while simultaneously minimising co-extraction of soil interferences. The optimised extraction procedure was applied to different types of soil with an organic carbon content of 0.4-16.7%. Besides freshly spiked soil samples, method validation included the analysis of samples with aged residues. A comparative study between the applicability of RPLC-UV without and with the use of column switching for the processing of uncleaned extracts, was carried out. For some of the tested analyte/matrix combinations the one-column approach (LC mode) is feasible. In comparison to LC, coupled-column LC (LC-LC mode) provides high selectivity in single-residue analysis (linuron) and, although less pronounced in multi-residue analysis (all six phenylurea herbicides), the clean-up performance of LC-LC improves both time of analysis and sample throughput. In the MRM approach the developed procedure involving MASE and LC-LC-UV provided acceptable recoveries (range, 80-120%) and RSDs (<12%) at levels of 10 microg/kg (n=9) and 50 microg/kg (n=7), respectively, for most analyte/matrix combinations. Recoveries from aged residue samples spiked at a level of 100 microg/kg (n=7) ranged, depending of the analyte/soil type combination, from 41-113% with RSDs ranging from 1-35%. In the SRM approach the developed LC-LC procedure was applied for the determination of linuron in 28 sandy soil samples collected in a field study. Linuron could be determined in soil with a limit of quantitation of 10 microg/kg.
Analyzing Planck and low redshift data sets with advanced statistical methods
NASA Astrophysics Data System (ADS)
Eifler, Tim
The recent ESA/NASA Planck mission has provided a key data set to constrain cosmology that is most sensitive to physics of the early Universe, such as inflation and primordial non-Gaussianity (Planck 2015 results XIII). In combination with cosmological probes of the Large-Scale Structure (LSS), the Planck data set is a powerful source of information to investigate late time phenomena (Planck 2015 results XIV), e.g. the accelerated expansion of the Universe, the impact of baryonic physics on the growth of structure, and the alignment of galaxies in their dark matter halos. It is the main objective of this proposal to re-analyze the archival Planck data, 1) with different, more recently developed statistical methods for cosmological parameter inference, and 2) by combining Planck and ground-based observations in an innovative way. We will make the corresponding analysis framework publicly available and believe that it will set a new standard for future CMB-LSS analyses. Advanced statistical methods, such as the Gibbs sampler (Jewell et al 2004, Wandelt et al 2004), have been critical in the analysis of Planck data. More recently, Approximate Bayesian Computation (ABC, see Weyant et al 2012, Akeret et al 2015, Ishida et al 2015, for cosmological applications) has matured into an interesting tool for cosmological likelihood analyses. It circumvents several assumptions that enter the standard Planck (and most LSS) likelihood analyses, most importantly the assumption that the functional form of the likelihood of the CMB observables is a multivariate Gaussian. Beyond applying new statistical methods to Planck data in order to cross-check and validate existing constraints, we plan to combine Planck and DES data in a new and innovative way and run multi-probe likelihood analyses of CMB and LSS observables. The complexity of multi-probe likelihood analyses scales (non-linearly) with the level of correlations amongst the individual probes that are included. For the multi-probe analysis proposed here we will use the existing CosmoLike software, a computationally efficient analysis framework that is unique in its integrated ansatz of jointly analyzing probes of the large-scale structure (LSS) of the Universe. We plan to combine CosmoLike with publicly available CMB analysis software (Camb, CLASS) to include modeling capabilities for CMB temperature, polarization, and lensing measurements. The resulting analysis framework will be capable of independently and jointly analyzing data from the CMB and from various probes of the LSS of the Universe. After completion we will utilize this framework to check for consistency amongst the individual probes and subsequently run a joint likelihood analysis of the probes that are not in tension. The inclusion of Planck information in a joint likelihood analysis substantially reduces DES uncertainties in cosmological parameters, and allows for unprecedented constraints on parameters that describe astrophysics. In their recent review Observational Probes of Cosmic Acceleration (Weinberg et al 2013) the authors emphasize the value of a balanced program that employs several of the most powerful methods in combination, both to cross-check systematic uncertainties and to take advantage of complementary information. The work we propose follows exactly this idea: 1) cross-checking existing Planck results with alternative methods in the data analysis, 2) checking for consistency of Planck and DES data, and 3) running a joint analysis to constrain cosmology and astrophysics.
It is now expedient to develop and refine multi-probe analysis strategies that allow the comparison and inclusion of information from disparate probes to optimally constrain cosmology and astrophysics. Analyzing Planck and DES data offers an ideal opportunity for this purpose, and the corresponding lessons will be of great value for the science preparation of Euclid and WFIRST.
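To make the ABC idea above concrete, the following is a minimal rejection-ABC sketch, not the proposers' pipeline. Everything in it is an assumption for illustration: a toy band-power "simulator" stands in for a CMB forward model, and the prior ranges, tolerance epsilon, and distance function are invented. The point is only that no multivariate-Gaussian likelihood is ever evaluated.

```python
import numpy as np

rng = np.random.default_rng(0)
ells = np.linspace(50, 2000, 20)  # 20 band powers (toy binning, assumed)

def simulate(amp, tilt):
    """Toy forward model standing in for a CMB band-power simulator."""
    signal = amp * (ells / 500.0) ** tilt
    return signal + rng.normal(scale=0.05 * signal)  # 5% noise per band

def distance(d1, d2):
    """Summary-statistic distance between two band-power vectors."""
    return np.sqrt(np.mean((d1 - d2) ** 2))

data = simulate(1.0, -0.04)  # "observed" data at fiducial parameters

# Rejection ABC: draw from the prior, simulate, and keep draws whose
# simulation lands within epsilon of the data.
epsilon, accepted = 0.10, []
for _ in range(20000):
    amp, tilt = rng.uniform(0.5, 1.5), rng.uniform(-0.2, 0.1)
    if distance(simulate(amp, tilt), data) < epsilon:
        accepted.append((amp, tilt))

posterior = np.array(accepted)
print(len(posterior), posterior.mean(axis=0))
```

In practice the tolerance is shrunk over successive populations (ABC-SMC) rather than fixed, but the rejection form above already shows why the Gaussian-likelihood assumption can be dropped.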
NASA Astrophysics Data System (ADS)
Delgado, Carlos; Cátedra, Manuel Felipe
2018-05-01
This work presents a technique that substantially relaxes the computational requirements of full-wave electromagnetic simulations based on the Method of Moments. A ray-tracing analysis of the geometry is performed to extract the critical points with significant contributions. These points are then used to generate a reduced mesh, considering the regions of the geometry that surround each critical point and taking into account the electrical path followed from the source. The electromagnetic analysis of the reduced mesh produces very accurate results while requiring a fraction of the resources that the conventional analysis would use.
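A hedged sketch of the mesh-reduction step as described: keep only the mesh cells that lie near a ray-tracing critical point. The wavelength, keep radius, and random stand-in geometry below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

wavelength = 0.03          # 10 GHz in metres (assumed)
keep_radius = 4 * wavelength  # region kept around each critical point (assumed)

centroids = np.random.rand(100000, 3)    # stand-in for full-mesh cell centroids
critical_points = np.random.rand(25, 3)  # stand-in for ray-tracing output

# Spatial index over the full mesh, then gather cells near any critical point.
tree = cKDTree(centroids)
keep = set()
for p in critical_points:
    keep.update(tree.query_ball_point(p, keep_radius))

reduced = centroids[sorted(keep)]
print(f"reduced mesh: {len(reduced)} of {len(centroids)} cells "
      f"({100 * len(reduced) / len(centroids):.1f}%)")
```

The MoM system would then be assembled only over the reduced cell set, which is where the claimed savings come from.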
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
Optimal guidance law development for an advanced launch system
NASA Technical Reports Server (NTRS)
Calise, Anthony J.; Leung, Martin S. K.
1995-01-01
The objective of this research effort was to develop a real-time guidance approach for launch vehicle ascent to orbit injection. Various analytical approaches combined with a variety of model order and model complexity reductions were investigated. Singular perturbation methods were attempted first and found to be unsatisfactory. A second approach, based on regular perturbation analysis, was subsequently investigated. It also failed, because the aerodynamic effects (ignored in the zero-order solution) are too large to be treated as perturbations. The study therefore demonstrates that perturbation methods alone (both regular and singular) are inadequate for developing a guidance algorithm for the atmospheric flight phase of a launch vehicle. During a second phase of the research effort, a hybrid analytic/numerical approach was developed and evaluated. The approach combines the numerical method of collocation with the analytical method of regular perturbations, and introduces the concept of choosing intelligent interpolating functions. Regular perturbation analysis allows the use of a crude representation for the collocation solution, and intelligent interpolating functions further reduce the number of elements without sacrificing approximation accuracy. As a result, the combined method forms a powerful tool for solving real-time optimal control problems. Details of the approach are illustrated in a fourth-order nonlinear example. The hybrid approach is then applied to the launch vehicle problem. The collocation solution is derived from a bilinear tangent steering law, and results in a guidance solution for the entire flight regime, covering both the atmospheric and exoatmospheric flight phases.
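Collocation is the numerical backbone of the hybrid scheme. As a self-contained illustration, and not the launch-vehicle equations themselves, the sketch below uses SciPy's solve_bvp, itself a collocation method, on a textbook two-point boundary-value problem. A crude initial guess is refined by the solver, which is the role the bilinear-tangent ansatz plays in the full problem.

```python
import numpy as np
from scipy.integrate import solve_bvp

# Two-point BVP: y'' = -y, y(0) = 0, y(pi/2) = 1. Exact solution: y = sin(t).
def rhs(t, y):
    # First-order form: y[0]' = y[1], y[1]' = -y[0]
    return np.vstack([y[1], -y[0]])

def bc(ya, yb):
    # Boundary residuals at t = 0 and t = pi/2.
    return np.array([ya[0], yb[0] - 1.0])

t = np.linspace(0, np.pi / 2, 10)  # coarse collocation mesh
y0 = np.zeros((2, t.size))         # crude initial guess, refined by the solver
sol = solve_bvp(rhs, bc, t, y0)
print("max error vs sin(t):", np.max(np.abs(sol.sol(t)[0] - np.sin(t))))
```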
Erich, Sarah; Schill, Sandra; Annweiler, Eva; Waiblinger, Hans-Ulrich; Kuballa, Thomas; Lachenmeier, Dirk W; Monakhova, Yulia B
2015-12-01
The increased sales of organically produced food create a strong need for analytical methods that can authenticate organic and conventional products. Combined chemometric analysis of (1)H NMR and (13)C NMR spectroscopy data, stable isotope data (IRMS), and α-linolenic acid content (gas chromatography) was used to differentiate organic and conventional milk. In total, 85 raw, pasteurized and ultra-heat-treated (UHT) milk samples (52 organic and 33 conventional) were collected between August 2013 and May 2014. The carbon isotope ratios of milk protein and milk fat, as well as the α-linolenic acid content of these samples, were determined. Additionally, the milk fat was analyzed by (1)H and (13)C NMR spectroscopy. The chemometric analysis of the combined data (IRMS, GC, NMR) resulted in more precise authentication of German raw and retail milk, with a considerably increased classification rate of 95% compared with 81% for NMR and 90% for IRMS using linear discriminant analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.
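The data-fusion step can be sketched as block concatenation followed by linear discriminant analysis. Everything below is a simulated stand-in (feature counts, block names, and data are assumptions); only the sample counts loosely follow the abstract.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 85
nmr = rng.normal(size=(n, 40))   # binned 1H/13C NMR features (assumed width)
irms = rng.normal(size=(n, 2))   # delta 13C of milk protein and milk fat
gc = rng.normal(size=(n, 1))     # alpha-linolenic acid content
y = np.r_[np.ones(52), np.zeros(33)]  # 52 organic, 33 conventional

X = np.hstack([nmr, irms, gc])   # low-level fusion: concatenate feature blocks
clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
print("CV classification rate:", cross_val_score(clf, X, y, cv=5).mean())
```

With real, informative blocks the cross-validated rate is what the abstract compares (95% fused vs. 81%/90% single-block); with the random stand-ins here it will sit near chance.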
Ivorra, Eugenio; Verdu, Samuel; Sánchez, Antonio J; Grau, Raúl; Barat, José M
2016-10-19
A technique that combines the spatial resolution of a 3D structured-light (SL) imaging system with the spectral analysis of a hyperspectral short-wave near-infrared system was developed for freshness predictions of gilthead sea bream on the first storage days (Days 0-6). This novel approach allows the hyperspectral analysis of very specific fish areas, which provides more information for freshness estimations. The SL system obtains a 3D reconstruction of the fish, and an automatic method locates the gilthead's pupils and irises. Once these regions are positioned, the hyperspectral camera acquires spectral information and a multivariate statistical study is done. The best region is the pupil, with an R² of 0.92 and an RMSE of 0.651 for predictions. We conclude that the combination of 3D technology with hyperspectral analysis offers plenty of potential and is a very promising technique for non-destructively predicting gilthead freshness.
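A hedged sketch of a plausible regression step: PLS regression of pupil-region spectra onto storage day, with R² and RMSE reported as in the abstract. The abstract does not state the exact multivariate model, and all data here are simulated stand-ins (60 fish, 121 bands assumed).

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
days = rng.integers(0, 7, size=60).astype(float)  # storage day, 0-6
# Simulated spectra: a day-dependent spectral signature plus noise.
spectra = np.outer(days, rng.normal(size=121)) + rng.normal(size=(60, 121))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, days, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()
print(f"R2 = {r2_score(y_te, pred):.2f}, "
      f"RMSE = {mean_squared_error(y_te, pred) ** 0.5:.3f}")
```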
Ivorra, Eugenio; Verdu, Samuel; Sánchez, Antonio J.; Grau, Raúl; Barat, José M.
2016-01-01
A technique that combines the spatial resolution of a 3D structured-light (SL) imaging system with the spectral analysis of a hyperspectral short-wave near-infrared system was developed for freshness predictions of gilthead sea bream on the first storage days (Days 0–6). This novel approach allows the hyperspectral analysis of very specific fish areas, which provides more information for freshness estimations. The SL system obtains a 3D reconstruction of the fish, and an automatic method locates the gilthead's pupils and irises. Once these regions are positioned, the hyperspectral camera acquires spectral information and a multivariate statistical study is done. The best region is the pupil, with an R2 of 0.92 and an RMSE of 0.651 for predictions. We conclude that the combination of 3D technology with hyperspectral analysis offers plenty of potential and is a very promising technique for non-destructively predicting gilthead freshness. PMID:27775556
NASA Astrophysics Data System (ADS)
Tavakoli, Vahid; Stoddard, Marcus F.; Amini, Amir A.
2013-03-01
Quantitative motion analysis of echocardiographic images helps clinicians with the diagnosis and therapy of patients suffering from cardiac disease. Quantitative analysis is usually based on TDI (Tissue Doppler Imaging) or speckle tracking. These methods rest on two independent techniques: the Doppler effect and image registration, respectively. In order to increase the accuracy of the speckle tracking technique and cope with the angle dependency of TDI, a combined approach dubbed TDIOF (Tissue Doppler Imaging Optical Flow) is proposed herein. TDIOF is formulated as a combination of B-mode and Doppler energy terms in an optical flow framework and is minimized using algebraic equations. In this paper, we report on validations with simulated data, a physical cardiac phantom, and in-vivo patient data. It is shown that the additional Doppler term is able to increase the accuracy of speckle tracking, the basis for several commercially available echocardiography analysis techniques.
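The core idea, a Doppler term added to an optical-flow energy, can be shown at a single pixel. This is a minimal sketch, not the authors' full algorithm: the B-mode term enforces brightness constancy, the Doppler term constrains the velocity component along the beam, and setting the gradient of the combined quadratic energy to zero yields a 2x2 linear system. Gradients, beam direction, and the weight lam are invented.

```python
import numpy as np

def tdiof_pixel(Ix, Iy, It, beam, doppler, lam=1.0):
    """Minimize (Ix*u + Iy*v + It)^2 + lam*(b . v - d)^2 for one pixel."""
    g = np.array([Ix, Iy])
    b = np.asarray(beam, dtype=float)
    A = np.outer(g, g) + lam * np.outer(b, b)   # normal equations
    rhs = -It * g + lam * doppler * b
    return np.linalg.solve(A, rhs)

# Toy check: true motion (0.6, -0.2), beam along y (axial direction).
v_true = np.array([0.6, -0.2])
Ix, Iy = 0.8, 0.3                          # image gradients (assumed)
It = -(Ix * v_true[0] + Iy * v_true[1])    # temporal derivative consistent with v_true
d = v_true[1]                              # Doppler measures the axial component
print(tdiof_pixel(Ix, Iy, It, beam=[0.0, 1.0], doppler=d))  # ~ [0.6, -0.2]
```

Note how the Doppler term resolves the aperture problem: brightness constancy alone gives one equation in two unknowns, and the beam-direction constraint supplies the second.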
NASA Astrophysics Data System (ADS)
Dontu, S.; Miclos, S.; Savastru, D.; Tautan, M.
2017-09-01
In recent years, many optoelectronic techniques have been developed to improve devices for tissue analysis. Spectral-Domain Optical Coherence Tomography (SD-OCT) is a medical interferometric imaging modality that provides depth-resolved tissue structure information with resolution in the μm range. However, SD-OCT has its own limitations and cannot provide biochemical information about the tissue. These data can be obtained with hyperspectral imaging, a non-invasive, sensitive and real-time technique. In the present study, we have combined SD-OCT with Hyperspectral Imaging (HSI) for tissue analysis; both methods have demonstrated significant potential in this context. Preliminary results on different tissues have highlighted the capabilities of this combined technique.
Multimodal biometric method that combines veins, prints, and shape of a finger
NASA Astrophysics Data System (ADS)
Kang, Byung Jun; Park, Kang Ryoung; Yoo, Jang-Hee; Kim, Jeong Nyeo
2011-01-01
Multimodal biometrics provides high recognition accuracy and population coverage by using various biometric features. A single finger contains finger veins, fingerprints, and finger geometry features; by using multimodal biometrics, information on these multiple features can be simultaneously obtained in a short time and their fusion can outperform the use of a single feature. This paper proposes a new finger recognition method based on the score-level fusion of finger veins, fingerprints, and finger geometry features. This research is novel in the following four ways. First, the performances of the finger-vein and fingerprint recognition are improved by using a method based on a local derivative pattern. Second, the accuracy of the finger geometry recognition is greatly increased by combining a Fourier descriptor with principal component analysis. Third, a fuzzy score normalization method is introduced; its performance is better than the conventional Z-score normalization method. Fourth, finger-vein, fingerprint, and finger geometry recognitions are combined by using three support vector machines and a weighted SUM rule. Experimental results showed that the equal error rate of the proposed method was 0.254%, which was lower than those of the other methods.
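Score-level fusion itself is compact enough to sketch. The snippet below shows only the baseline the paper improves on, z-score normalization followed by a weighted SUM rule; the paper's fuzzy normalization and per-modality SVMs are not reproduced, and all statistics and weights are invented.

```python
import numpy as np

def weighted_sum_fusion(scores, means, stds, weights):
    """Z-normalize each modality's match score, then combine with a weighted sum."""
    z = (np.asarray(scores) - means) / stds
    return float(np.dot(weights, z))

# Assumed development-set statistics: finger vein, fingerprint, finger geometry.
means = np.array([0.42, 0.55, 0.48])
stds = np.array([0.10, 0.12, 0.15])
weights = np.array([0.50, 0.35, 0.15])   # geometry weighted least (assumed)

probe = [0.61, 0.70, 0.50]               # one probe's three match scores
print("fused score:", weighted_sum_fusion(probe, means, stds, weights))
```

A decision threshold on the fused score then trades false accepts against false rejects; the equal error rate quoted in the abstract is the operating point where the two coincide.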
NASA Astrophysics Data System (ADS)
Nikitaev, V. G.; Pronichev, A. N.; Polyakov, E. V.; Zaharenko, Yu V.
2018-01-01
The paper considers the problem of leukocyte segmentation in microscopic images of bone marrow smears for automated diagnosis of diseases of the blood system. A method is proposed to solve the problem of segmenting contacting leukocytes in images of bone marrow smears. The method is based on analysis of the structure of objects with a separation-and-distances filter, in combination with the watershed method and the distance transform.
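The classical building block the method combines, watershed on the negated distance transform, separates touching blobs and can be sketched with scikit-image. The synthetic two-disc mask below stands in for a pair of contacting leukocytes; the paper's separation filter is not reproduced.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Two overlapping discs as a stand-in for contacting cells.
mask = np.zeros((80, 80), dtype=bool)
yy, xx = np.ogrid[:80, :80]
mask |= (yy - 40) ** 2 + (xx - 30) ** 2 < 15 ** 2
mask |= (yy - 40) ** 2 + (xx - 52) ** 2 < 15 ** 2

dist = ndi.distance_transform_edt(mask)                      # distance transform
peaks = peak_local_max(dist, labels=mask, min_distance=10)   # one seed per cell
markers = np.zeros_like(dist, dtype=int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
labels = watershed(-dist, markers, mask=mask)                # flood from the seeds
print("objects found:", labels.max())                        # expect 2
```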
Laser-based methods for the analysis of low molecular weight compounds in biological matrices.
Kiss, András; Hopfgartner, Gérard
2016-07-15
Laser-based desorption and/or ionization methods play an important role in the analysis of low molecular weight compounds (LMWCs) because they allow direct analysis with high-throughput capabilities. In recent years there have been several improvements in ionization methods, with the emergence of novel atmospheric ion sources such as laser ablation electrospray ionization and laser diode thermal desorption with atmospheric pressure chemical ionization, and in sample preparation methods, with the development of new matrix compounds for matrix-assisted laser desorption/ionization (MALDI). The combination of ion mobility separation with laser-based ionization methods is also gaining popularity with access to commercial systems. These developments have been driven mainly by the emergence of new application fields such as MS imaging and non-chromatographic analytical approaches for quantification. This review presents these new developments in laser-based methods for the analysis of low molecular weight compounds by MS, along with several potential applications. Copyright © 2016 Elsevier Inc. All rights reserved.
Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A.; Zhang, Wenbo
2016-01-01
Objective Combined source imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a non-invasive fashion. Source imaging techniques have been used successfully to either determine the source of activity or to extract source time-courses for Granger causality analysis, previously. In this work, we utilize source imaging algorithms to both find the network nodes (regions of interest) and then extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses and apply Granger analysis on the extracted series to study brain networks under realistic conditions. Methods Source imaging methods are used to identify network nodes and extract time-courses and then Granger causality analysis is applied to delineate the directional functional connectivity of underlying brain networks. Computer simulations studies where the underlying network (nodes and connectivity pattern) is known were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from inter-ictal and ictal signals recorded by EEG and/or MEG. Results Localization errors of network nodes are less than 5 mm and normalized connectivity errors of ~20% in estimating underlying brain networks in simulation studies. Additionally, two focal epilepsy patients were studied and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Conclusion Our study indicates that combined source imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network nodes location and internodal connectivity). Significance The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions. PMID:27740473
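Once node time-courses have been extracted, the directional step reduces to pairwise Granger tests. A minimal sketch with statsmodels and simulated series follows; the study's actual pipeline (source imaging, multivariate models) is more elaborate, and the coupling below is invented.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Simulated node time-courses: x drives y with a one-sample lag.
rng = np.random.default_rng(3)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.8 * x[t - 1] + 0.2 * rng.normal()

# Column order is (effect, cause): does x help predict y beyond y's own past?
res = grangercausalitytests(np.column_stack([y, x]), maxlag=2, verbose=False)
print("p-value (lag 1):", res[1][0]["ssr_ftest"][1])   # expect ~0
```

Running the test in both directions for every node pair yields the directed connectivity graph the abstract refers to.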
van den Berg, Irene; Boichard, Didier; Lund, Mogens Sandø
2016-11-01
The objective of this study was to compare the mapping precision and power of within-breed and multibreed genome-wide association studies (GWAS) and to compare the results obtained by the multibreed GWAS with 3 meta-analysis methods. The multibreed GWAS was expected to improve mapping precision compared with a within-breed GWAS because linkage disequilibrium is conserved over shorter distances across breeds than within breeds. The multibreed GWAS was also expected to increase detection power for quantitative trait loci (QTL) segregating across breeds. GWAS were performed for production traits in dairy cattle, using imputed full genome sequences of 16,031 bulls originating from 6 French and Danish dairy cattle populations. Our results show that a multibreed GWAS can be a valuable tool for the detection and fine mapping of quantitative trait loci. The number of QTL detected with the multibreed GWAS was larger than the number detected by the within-breed GWAS, indicating an increase in power, especially when the 2 Holstein populations were combined. The largest number of QTL was detected when all populations were combined. The analysis combining all breeds was, however, dominated by Holstein, and QTL segregating in other breeds but not in Holstein were sometimes overshadowed by larger QTL segregating in Holstein. The GWAS combining all breeds except Holstein was therefore useful for detecting such peaks. Combining all breeds except Holstein resulted in smaller QTL intervals on average, but this was not the case when the Holstein populations were included in the analysis. Although no decrease in the average QTL size was observed, mapping precision did improve for several QTL. Of the 3 multibreed meta-analysis methods, the weighted z-scores model gave results most similar to the full multibreed GWAS and can be a useful alternative to it. Differences between the multibreed GWAS and the meta-analyses were larger when different breeds were combined than when the 2 Holstein populations were combined. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
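The weighted z-scores model singled out above is the Stouffer combination and fits in a few lines. The per-population z-values and sample sizes below are invented; the weighting by the square root of sample size is the standard choice for this method.

```python
import numpy as np
from scipy.stats import norm

def weighted_z(z_scores, n_samples):
    """Stouffer weighted z-score meta-analysis for one variant across populations."""
    w = np.sqrt(np.asarray(n_samples, dtype=float))
    z = np.dot(w, z_scores) / np.sqrt(np.sum(w ** 2))
    return z, 2 * norm.sf(abs(z))   # combined z and two-sided p-value

# Assumed per-population results (z, N) for one SNP in 6 populations.
z = [2.1, 1.8, 0.4, 2.5, 1.1, 1.9]
n = [3000, 2500, 900, 4500, 1200, 3900]
print("combined z = %.2f, p = %.2e" % weighted_z(z, n))
```

Because only summary statistics enter, this variant avoids sharing genotype-level data across populations, which is one practical reason to prefer it over a full joint GWAS.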
Detection of anomaly in human retina using Laplacian Eigenmaps and vectorized matched filtering
NASA Astrophysics Data System (ADS)
Yacoubou Djima, Karamatou A.; Simonelli, Lucia D.; Cunningham, Denise; Czaja, Wojciech
2015-03-01
We present a novel method for automated anomaly detection on autofluorescence data provided by the National Institutes of Health (NIH). This is motivated by the need for new tools to improve the capability of diagnosing macular degeneration in its early stages, tracking its progression over time, and testing the effectiveness of new treatment methods. In previous work, macular anomalies have been detected automatically through multiscale analysis procedures, such as wavelet analysis, or through dimensionality reduction algorithms followed by a classification algorithm, e.g., a Support Vector Machine. The method we propose is a Vectorized Matched Filtering (VMF) algorithm combined with Laplacian Eigenmaps (LE), a nonlinear dimensionality reduction algorithm with locality-preserving properties. By applying LE, we are able to represent the data in the form of eigenimages, some of which accentuate the visibility of anomalies. We pick significant eigenimages and proceed with the VMF algorithm, which classifies anomalies across all of these eigenimages simultaneously. To evaluate performance, we compare our method to two other schemes: a matched filtering algorithm based on anomaly detection in single images, and a combination of PCA and VMF. LE combined with VMF performs best, yielding a high rate of accurate anomaly detection. This shows the advantage of using a nonlinear approach to represent the data and the effectiveness of VMF, which operates on the images as a data cube rather than as individual images.
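A hedged two-stage sketch of the idea: scikit-learn's SpectralEmbedding (a Laplacian-Eigenmaps-style embedding) turns a registered image stack into eigenimages, and a simple matched filter is then run over each. This simplifies the authors' VMF, which operates on all eigenimages jointly as a data cube; the template, injected anomaly, and data are invented.

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding

rng = np.random.default_rng(4)
h = w = 32
stack = rng.normal(size=(h * w, 6))   # 6 registered images, pixels as samples
stack[500:520] += 3.0                 # injected "anomaly" pixels

embed = SpectralEmbedding(n_components=3, n_neighbors=10)
eigenimages = embed.fit_transform(stack)   # (pixels, 3) eigenimage stack

template = np.ones(20)                # assumed anomaly signature length
template /= np.linalg.norm(template)
for k in range(eigenimages.shape[1]):
    e = eigenimages[:, k]
    # Matched filter: correlate the template with each window of the eigenimage.
    score = np.correlate(e - e.mean(), template, mode="valid")
    print(f"eigenimage {k}: peak response at pixel {np.abs(score).argmax()}")
```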
Ozyurt, A Sinem; Selby, Thomas L
2008-07-01
This study describes a method to computationally assess the function of homologous enzymes through small molecule binding interaction energy. Three experimentally determined X-ray structures and four enzyme models from ornithine cyclo-deaminase, alanine dehydrogenase, and mu-crystallin were used in combination with nine small molecules to derive a function score (FS) for each enzyme-model combination. While energy values varied for a single molecule-enzyme combination due to differences in the active sites, we observe that the binding energies for the entire pathway were proportional for each set of small molecules investigated. This proportionality of energies for a reaction pathway appears to be dependent on the amino acids in the active site and their direct interactions with the small molecules, which allows a function score (FS) to be calculated to assess the specificity of each enzyme. Potential of mean force (PMF) calculations were used to obtain the energies, and the resulting FS values demonstrate that a measurement of function may be obtained using differences between these PMF values. Additionally, limitations of this method are discussed based on: (a) larger substrates with significant conformational flexibility; (b) low homology enzymes; and (c) open active sites. This method should be useful in accurately predicting specificity for single enzymes that have multiple steps in their reactions and in high throughput computational methods to accurately annotate uncharacterized proteins based on active site interaction analysis. 2008 Wiley-Liss, Inc.
Gui, Jiang; Andrew, Angeline S.; Andrews, Peter; Nelson, Heather M.; Kelsey, Karl T.; Karagas, Margaret R.; Moore, Jason H.
2010-01-01
A central goal of human genetics is to identify and characterize susceptibility genes for common complex human diseases. An important challenge in this endeavor is the modeling of gene-gene interaction, or epistasis, which can result in non-additivity of genetic effects. The multifactor dimensionality reduction (MDR) method was developed as a machine learning alternative to parametric logistic regression for detecting interactions in the absence of significant marginal effects. The goal of MDR is to reduce the dimensionality inherent in modeling combinations of polymorphisms using a computational approach called constructive induction. Here, we propose a Robust Multifactor Dimensionality Reduction (RMDR) method that performs constructive induction using Fisher's exact test rather than a predetermined threshold. The advantage of this approach is that only those genotype combinations determined to be statistically significant are considered in the MDR analysis. We use two simulation studies to demonstrate that this approach increases the success rate of MDR when only a few genotype combinations are significantly associated with case-control status, and we show that there is no loss of success rate when this is not the case. We then apply the RMDR method to the detection of gene-gene interactions in genotype data from a population-based study of bladder cancer in New Hampshire. PMID:21091664
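The RMDR screening step can be sketched directly: test each genotype combination against case/control status with Fisher's exact test and label only the significant cells. The counts below are invented.

```python
from scipy.stats import fisher_exact

# counts[combo] = (cases, controls) carrying that two-locus genotype combination.
counts = {
    ("AA", "BB"): (30, 10),
    ("AA", "Bb"): (22, 25),
    ("Aa", "BB"): (12, 13),
    ("Aa", "Bb"): (8, 31),
}
total_cases = sum(c for c, _ in counts.values())
total_ctrls = sum(c for _, c in counts.values())

kept = {}
for combo, (ca, co) in counts.items():
    # 2x2 table: this cell vs. all remaining samples.
    table = [[ca, co], [total_cases - ca, total_ctrls - co]]
    _, p = fisher_exact(table)
    if p < 0.05:   # only statistically significant cells enter the MDR step
        kept[combo] = ("high-risk" if ca / total_cases > co / total_ctrls
                       else "low-risk")
print(kept)
```

Cells that fail the test are left unlabeled rather than forced into a risk class, which is exactly how RMDR avoids the noise introduced by sparse genotype combinations.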
Entropy and generalized least square methods in assessment of the regional value of streamgages
Markus, M.; Knapp, H. Vernon; Tasker, Gary D.
2003-01-01
The Illinois State Water Survey performed a study to assess the streamgaging network in the State of Illinois. One important aspect of the study was to assess the regional value of each station through the information transfer among gaging records for low-, average-, and high-flow conditions. This analysis was performed for the main hydrologic regions in the State, and the stations were initially evaluated using a new approach based on entropy analysis. To determine the regional value of each station within a region, several information parameters, including total net information, were defined based on entropy. Stations were ranked based on the total net information. For comparison, the regional value of the same stations was assessed using the generalized least square regression (GLS) method developed by the US Geological Survey. Finally, a hybrid combination of GLS and entropy was created by including a function of the negative net information as a penalty function in the GLS. The weights of the combined model were determined to maximize the average correlation with the results of GLS and entropy. The entropy and GLS methods were evaluated using the high-flow data from southern Illinois stations. The combined method was compared with the entropy and GLS approaches using the high-flow data from eastern Illinois stations. © 2003 Elsevier B.V. All rights reserved.
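The entropy side of the hybrid can be sketched with histogram mutual information between gage records: a station whose record is largely predictable from its neighbours carries little net information. The paper's "total net information" is defined more carefully, so treat this as a toy analogue with simulated flows.

```python
import numpy as np

def mutual_info(x, y, bins=10):
    """Histogram estimate of mutual information I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(5)
flows = rng.lognormal(size=(6, 1000))          # 6 stations, stand-in records
flows[1] = 0.7 * flows[0] + 0.3 * flows[1]     # station 1 largely echoes station 0

# Rank stations by how much information they share with the rest of the network.
for i in range(len(flows)):
    shared = sum(mutual_info(flows[i], flows[j])
                 for j in range(len(flows)) if j != i)
    print(f"station {i}: shared info with network = {shared:.3f} nats")
```

In the hybrid scheme, a negative-net-information function of such quantities enters the GLS objective as a penalty term.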
Chai, Hua; Li, Zi-Na; Meng, De-Yu; Xia, Liang-Yong; Liang, Yong
2017-10-12
Gene selection is an attractive and important task in cancer survival analysis. Most existing supervised learning methods can use only the labeled biological data, while the censored data (weakly labeled data), which far exceed the labeled data, are ignored in model building. To utilize the information in the censored data, a semi-supervised learning framework (the Cox-AFT model), combining the Cox proportional hazards (Cox) model and the accelerated failure time (AFT) model, has been used in cancer research; it performs better than the single Cox or AFT model. This method, however, is easily affected by noise. To alleviate this problem, in this paper we combine the Cox-AFT model with the self-paced learning (SPL) method to employ the information in the censored data more effectively, in a self-learning way. SPL is a reliable and stable learning mechanism, recently proposed for simulating the human learning process, which helps the AFT model automatically identify and include samples of high confidence in training, minimizing interference from high noise. Utilizing the SPL method produces two direct advantages: (1) the utilization of censored data is further promoted; (2) the noise delivered to the model is greatly decreased. The experimental results demonstrate the effectiveness of the proposed model compared to the traditional Cox-AFT model.
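The SPL mechanism itself is simple to sketch: admit only samples whose current loss is below a threshold, retrain, and grow the threshold so harder samples enter later. Below, a generic ridge regressor stands in for the Cox-AFT estimator; the thresholds and data are invented.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(6)
X = rng.normal(size=(200, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=200)
y[:20] += 5 * rng.normal(size=20)            # 20 high-noise samples

model = Ridge(alpha=1.0).fit(X, y)           # initial fit on all data
for lam in [0.5, 1.0, 2.0, 4.0]:             # growing "age" parameter
    loss = (y - model.predict(X)) ** 2
    v = loss < lam                           # hard 0/1 sample-selection weights
    model = Ridge(alpha=1.0).fit(X[v], y[v]) # retrain on the easy samples
    print(f"lambda={lam}: {v.sum()} samples admitted")
```

High-noise samples keep a large loss and so enter late or never, which is the noise-suppression effect the abstract describes.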
NASA Astrophysics Data System (ADS)
Chang, Yufei; Hou, Hu; Li, Bafang
2016-06-01
Codfish is an abyssal fish species of great value to the food industry. However, the flavor of codfish, especially its unpleasant odor, causes serious problems in processing. To accurately identify the volatile compounds in codfish, a combination of the solid-phase micro-extraction (SPME) and simultaneous distillation extraction (SDE) methods was used to extract the volatiles. Gas chromatography-mass spectrometry (GC-MS), along with Kovats indices (KI) and authentic standard compounds, was used to identify the volatiles. The results showed that a total of 86 volatile compounds were identified in codfish: 24 were extracted by SDE, 69 by SPME, and 10 by both SDE and SPME. Seventy volatile compounds were found to have specific odors, of which 7 typical compounds contributed significantly to the flavor of codfish. Alcohols (e.g., (E)-2-penten-1-ol and 2-octanol), esters (e.g., ethyl butyrate and methyl geranate) and aldehydes (e.g., 2-dodecenal and pentadecanal) contributed the most to fresh flavor, while nitrogen compounds, sulphur compounds, furans, and some ketones (e.g., 2-hydroxy-3-pentanone) brought unpleasant odors, such as fishy and earthy notes. This indicates that the combination of multiple extraction methods with GC-MS analysis can enhance the accuracy of identification and provides a reference for further study of the flavor of aquatic products.
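Identification by retention index reduces to a small interpolation. The sketch below uses the linear (van den Dool) form appropriate for temperature-programmed GC; the retention times are invented for illustration.

```python
# Retention index of a peak eluting between the n-alkanes with carbon
# numbers n and n+1, under linear temperature programming.
def kovats_index(t_peak, t_n, t_n1, n):
    """Linear (van den Dool) retention index for temperature-programmed GC."""
    return 100.0 * (n + (t_peak - t_n) / (t_n1 - t_n))

# Assumed retention times (min): C8 at 7.90, C9 at 9.75, unknown at 8.96.
ri = kovats_index(8.96, t_n=7.90, t_n1=9.75, n=8)
print(f"retention index = {ri:.0f}")   # compare against library KI values
```

Matching the computed index against tabulated KI values, alongside mass spectra and authentic standards, is the three-way confirmation the abstract describes.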
The Use of a Corpus in Contrastive Studies.
ERIC Educational Resources Information Center
Filipovic, Rudolf
1973-01-01
Before beginning the Serbocroatian-English Contrastive Project, it was necessary to determine whether to base the analysis on a corpus or on native intuitions. It seemed that the best method would combine the theoretical and the empirical. A translation method based on a corpus of text was adopted. The Brown University "Standard Sample of…
T.M. Barrett
2004-01-01
During the 1990s, forest inventories for California, Oregon, and Washington were conducted by different agencies using different methods. The Pacific Northwest Research Station Forest Inventory and Analysis program recently integrated these inventories into a single database. This document briefly describes potential statistical methods for estimating population totals...
ERIC Educational Resources Information Center
Potter, Penny F.; Graham-Moore, Brian E.
Most organizations planning to assess adverse impact or perform a stock analysis for affirmative action planning must correctly classify their jobs into appropriate occupational categories. Two methods of job classification were assessed in a combination archival and field study. Classification results from expert judgment of functional job…
A Simple and Inexpensive Capillary Holder for Thin-Layer Chromatography
ERIC Educational Resources Information Center
Pintea, Beniamin-Nicolae V.
2011-01-01
Thin-layer chromatography (TLC) is a widely used method of qualitative analysis in organic synthesis, as it uniquely combines low cost, rapidity, simplicity, versatility, small quantities of sample and low detection limits. The simplest and most economical method for the application of samples onto TLC plates is by hand, using glass capillaries.…