Sample records for incremental support vector

  1. Support vector machine incremental learning triggered by wrongly predicted samples

    NASA Astrophysics Data System (ADS)

    Tang, Ting-long; Guan, Qiu; Wu, Yi-rong

    2018-05-01

    According to the classic Karush-Kuhn-Tucker (KKT) theorem, at every step of incremental support vector machine (SVM) learning, a newly added sample that violates the KKT conditions becomes a new support vector (SV) and causes old samples to migrate between the SV set and the non-support vector (NSV) set; the learning model should then be updated based on the SVs. However, it is not known in advance which of the old samples will move between the SV and NSV sets. Additionally, the learning model may be updated unnecessarily, which does little to increase its accuracy but decreases the training speed. Therefore, how the new SVs are chosen from the old sets during the incremental stages, and when the incremental steps are processed, greatly influence the accuracy and efficiency of incremental SVM learning. In this work, a new algorithm is proposed that selects candidate SVs and, at the same time, uses wrongly predicted samples to trigger the incremental processing. Experimental results show that the proposed algorithm achieves good performance, with high efficiency, high speed and good accuracy.
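
    The trigger idea above can be sketched in a few lines (a minimal illustration with assumed helper names, not the authors' code): a new sample is tested against the current model, and only a wrong prediction causes the candidate set to grow and the SVM to be refitted; the related KKT margin test is shown alongside.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def violates_kkt(model, x, y, tol=1e-3):
        # A sample outside the SV set satisfies the KKT conditions of a trained
        # soft-margin SVM only if y * f(x) >= 1; otherwise it must become an SV.
        return y * model.decision_function(x.reshape(1, -1))[0] < 1.0 - tol

    def incremental_step(model, X_pool, y_pool, x_new, y_new):
        """Update the model only when the new sample is wrongly predicted (the
        'trigger'); violates_kkt is the related, stricter margin criterion."""
        if model.predict(x_new.reshape(1, -1))[0] == y_new:
            return model, X_pool, y_pool              # no trigger: skip the update
        X_pool = np.vstack([X_pool, x_new])           # candidate SVs + new sample
        y_pool = np.append(y_pool, y_new)
        return SVC(kernel="rbf", C=1.0).fit(X_pool, y_pool), X_pool, y_pool
    ```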

  2. Product Quality Modelling Based on Incremental Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Wang, J.; Zhang, W.; Qin, B.; Shi, W.

    2012-05-01

    Incremental support vector machine (ISVM) learning is a method developed in recent years on the foundations of statistical learning theory. It is suitable for problems with sequentially arriving field data and has been widely used for product quality prediction and production process optimization. However, traditional ISVM learning does not consider the quality of the incremental data, which may contain noise and redundant samples that strongly affect the learning speed and accuracy. In order to improve SVM training speed and accuracy, a modified incremental support vector machine (MISVM) is proposed in this paper. Firstly, the margin vectors are extracted according to the Karush-Kuhn-Tucker (KKT) condition; then the distance from each margin vector to the final decision hyperplane is calculated to evaluate its importance, and margin vectors whose distance exceeds a specified value are removed; finally, the original SVs and the remaining margin vectors are used to update the SVM. The proposed MISVM can not only eliminate unimportant samples such as noise, but also preserve the important ones. The MISVM has been tested on two public datasets and one field dataset of zinc coating weight in strip hot-dip galvanizing, and the results show that the proposed method effectively improves prediction accuracy and training speed. Furthermore, it can provide the decision support and analysis tools needed for automatic control of product quality, and can be extended to other process industries, such as chemical and manufacturing processes.
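
    A hedged sketch of the filtering step described above (helper names assumed, not taken from the paper): candidate margin vectors whose distance to the current decision hyperplane exceeds a threshold are dropped, and the SVM is retrained on the old support vectors plus the surviving vectors.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def misvm_update(model, X_sv, y_sv, X_margin, y_margin, max_distance):
        """Evaluate margin vectors by their distance to the decision hyperplane
        and retrain on the old SVs plus the margin vectors that pass the test."""
        # |decision_function| is proportional to the distance from the hyperplane.
        dist = np.abs(model.decision_function(X_margin))
        keep = dist <= max_distance                   # far-away vectors are dropped
        X_train = np.vstack([X_sv, X_margin[keep]])
        y_train = np.concatenate([y_sv, y_margin[keep]])
        return SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
    ```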

  3. Incremental classification learning for anomaly detection in medical images

    NASA Astrophysics Data System (ADS)

    Giritharan, Balathasan; Yuan, Xiaohui; Liu, Jianguo

    2009-02-01

    Computer-aided diagnosis usually screens thousands of instances to find only a few positive cases that indicate the probable presence of disease. The amount of patient data increases consistently over time. In the diagnosis of new instances, disagreement occurs between a CAD system and physicians, which suggests inaccurate classifiers. Intuitively, misclassified instances and the previously acquired data should be used to retrain the classifier. This, however, is very time consuming and, when the dataset is too large, becomes infeasible. In addition, among the patient data, only a small percentage shows positive signs, a situation known as imbalanced data. We present an incremental Support Vector Machine (SVM) as a solution for the class imbalance problem in the classification of anomalies in medical images. The support vectors provide a concise representation of the distribution of the training data. Here we use bootstrapping to identify potential candidate support vectors for future iterations. Experiments were conducted using images from endoscopy videos, and the sensitivity and specificity were close to those of an SVM trained using all samples available at a given incremental step, with significantly improved efficiency in training the classifier.
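
    A minimal illustration of the bootstrapping idea (binary labels and scikit-learn's SVC assumed; a generic sketch, not the authors' implementation): the current pool is resampled several times, an SVM is fitted to each replicate, and points that repeatedly appear as support vectors are kept as candidates for the next incremental step, with class weighting to counter the imbalance.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def candidate_support_vectors(X, y, n_rounds=10, vote_threshold=0.3, seed=0):
        """Mark samples that are frequently selected as support vectors
        across bootstrap replicates of the current data pool."""
        rng = np.random.default_rng(seed)
        votes = np.zeros(len(X))
        for _ in range(n_rounds):
            idx = rng.choice(len(X), size=len(X), replace=True)
            if len(np.unique(y[idx])) < 2:             # degenerate replicate, skip
                continue
            svm = SVC(kernel="rbf", class_weight="balanced").fit(X[idx], y[idx])
            votes[np.unique(idx[svm.support_])] += 1   # map back to original indices
        return votes / n_rounds >= vote_threshold      # boolean candidate mask
    ```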

  4. Modelling and Prediction of Spark-ignition Engine Power Performance Using Incremental Least Squares Support Vector Machines

    NASA Astrophysics Data System (ADS)

    Wong, Pak-kin; Vong, Chi-man; Wong, Hang-cheong; Li, Ke

    2010-05-01

    Modern automotive spark-ignition (SI) engine power performance usually refers to output power and torque, which are significantly affected by the setup of control parameters in the engine management system (EMS). EMS calibration is done empirically through tests on the dynamometer (dyno) because no exact mathematical engine model is yet available. With the emerging nonlinear function estimation technique of least squares support vector machines (LS-SVM), an approximate power performance model of an SI engine can be determined by training on sample data acquired from the dyno. A novel incremental algorithm based on the typical LS-SVM is also proposed in this paper, so that the power performance models built with the incremental LS-SVM can be updated whenever new training data arrive. As the models are updated, their accuracy can be continuously increased. Predictions from the models estimated with the incremental LS-SVM are in good agreement with the actual test results and have almost the same average accuracy as models retrained from scratch, but the incremental algorithm significantly shortens the model construction time when new training data arrive.
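
    The kind of update such an algorithm exploits can be sketched as follows (a generic incremental LS-SVM regressor under assumed names and notation, not the paper's algorithm): the LS-SVM solution comes from a regularised kernel linear system, and when one new sample arrives the stored inverse is grown with a block (Schur-complement) update instead of refitting from scratch.

    ```python
    import numpy as np

    def rbf(X1, X2, gamma=0.5):
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    class IncrementalLSSVR:
        """Least-squares SVM regressor whose kernel-system inverse is grown
        one sample at a time (a sketch of an incremental LS-SVM update)."""

        def __init__(self, C=10.0, gamma=0.5):
            self.C, self.gamma = C, gamma

        def fit(self, X, y):
            self.X = X.copy()
            H = rbf(X, X, self.gamma) + np.eye(len(X)) / self.C
            self.H_inv = np.linalg.inv(H)
            self._solve(np.asarray(y, dtype=float))
            return self

        def add_sample(self, x_new, y_new):
            k = rbf(self.X, x_new[None, :], self.gamma).ravel()
            c = 1.0 + 1.0 / self.C                    # k(x, x) = 1 for the RBF kernel
            Hk = self.H_inv @ k
            s = c - k @ Hk                            # Schur complement
            top = self.H_inv + np.outer(Hk, Hk) / s
            self.H_inv = np.block([[top, -Hk[:, None] / s],
                                   [-Hk[None, :] / s, np.array([[1.0 / s]])]])
            self.X = np.vstack([self.X, x_new])
            self._solve(np.append(self.y, y_new))

        def _solve(self, y):
            # LS-SVM dual: (K + I/C) alpha + b*1 = y with sum(alpha) = 0.
            self.y = y
            ones = np.ones(len(y))
            self.b = (ones @ self.H_inv @ y) / (ones @ self.H_inv @ ones)
            self.alpha = self.H_inv @ (y - self.b)

        def predict(self, Xq):
            return rbf(Xq, self.X, self.gamma) @ self.alpha + self.b
    ```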

  5. Bayesian Kernel Methods for Non-Gaussian Distributions: Binary and Multi-class Classification Problems

    DTIC Science & Technology

    2013-05-28

    those of the support vector machine and relevance vector machine, and the model runs more quickly than the other algorithms . When one class occurs...incremental support vector machine algorithm for online learning when fewer than 50 data points are available. (a) Papers published in peer-reviewed journals...learning environments, where data processing occurs one observation at a time and the classification algorithm improves over time with new

  6. Reduced Order Model Basis Vector Generation: Generates Basis Vectors for ROMs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arrighi, Bill

    2016-03-03

    libROM is a library that implements order reduction via singular value decomposition (SVD) of sampled state vectors. It implements two parallel, incremental SVD algorithms and one serial, non-incremental algorithm. It also provides a mechanism for adaptive sampling of basis vectors.
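
    The incremental flavour of SVD that such libraries rely on can be illustrated with a single rank-one update step (a textbook-style sketch in the spirit of incremental SVD, not libROM's API; the function name is assumed): the new state vector is split into its component inside the current basis and an orthogonal residual, and a small core matrix is re-decomposed to refresh the basis.

    ```python
    import numpy as np

    def incremental_svd_step(U, s, v_new, tol=1e-10):
        """Fold one new state vector v_new (length d) into a thin SVD basis
        U (d x k) with singular values s, returning the updated factors."""
        p = U.T @ v_new                      # projection onto the current basis
        r = v_new - U @ p                    # residual orthogonal to the basis
        r_norm = np.linalg.norm(r)
        if r_norm < tol:                     # sample adds no new direction
            r_norm, j = 0.0, np.zeros_like(v_new)
        else:
            j = r / r_norm
        # Small (k+1) x (k+1) core matrix whose SVD refreshes the factors.
        K = np.block([[np.diag(s), p[:, None]],
                      [np.zeros((1, len(s))), np.array([[r_norm]])]])
        Uk, s_new, _ = np.linalg.svd(K)
        U_new = np.hstack([U, j[:, None]]) @ Uk
        return U_new, s_new
    ```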

  7. Incremental Support Vector Machine Framework for Visual Sensor Networks

    NASA Astrophysics Data System (ADS)

    Awad, Mariette; Jiang, Xianhua; Motai, Yuichi

    2006-12-01

    Motivated by the emerging requirements of surveillance networks, we present in this paper an incremental multiclassification support vector machine (SVM) technique as a new framework for action classification based on real-time multivideo collected by homogeneous sites. The technique is based on an adaptation of the least squares SVM (LS-SVM) formulation but extends beyond the static image-based learning of current SVM methodologies. In applying the technique, an initial supervised offline learning phase is followed by visual behavior data acquisition and an online learning phase during which the cluster head performs an ensemble of model aggregations based on the sensor nodes' inputs. The cluster head then selectively switches on designated sensor nodes for future incremental learning. Combining sensor data offers an improvement over single-camera sensing, especially when the latter has an occluded view of the target object. The optimization involved alleviates the burdens of power consumption and communication bandwidth requirements. The resulting misclassification error rate, the iterative error reduction rate of the proposed incremental learning, and the decision fusion technique prove its validity when applied to visual sensor networks. Furthermore, the enabled online learning allows adaptive domain knowledge insertion and offers the advantage of reducing both the model training time and the information storage requirements of the overall system, which makes it even more attractive for distributed sensor network communication.

  8. Non-metallic coating thickness prediction using artificial neural network and support vector machine with time resolved thermography

    NASA Astrophysics Data System (ADS)

    Wang, Hongjin; Hsieh, Sheng-Jen; Peng, Bo; Zhou, Xunfei

    2016-07-01

    A method that does not require knowledge of the thermal properties of coatings or substrates would be of interest for industrial applications. Supervised machine learning regression may provide a possible solution to this problem. This paper compares the performance of two regression models (artificial neural networks (ANN) and support vector machines for regression (SVM)) for coating thickness estimation based on surface temperature increments collected via time-resolved thermography. We describe the role of SVM in coating thickness prediction. Non-dimensional analyses are conducted to illustrate the effects of coating thickness and various other factors on surface temperature increments; it is theoretically possible to correlate coating thickness with the surface temperature increment. Based on the analyses, the laser power is selected so that, during heating, the temperature increment is high enough to resolve the coating thickness variation but low enough to avoid surface melting. Sixty-one paint-coated samples with coating thicknesses varying from 63.5 μm to 571 μm are used to train the models. Hyper-parameters of the models are optimized by 10-fold cross-validation. Another 28 sets of data are then collected to test the performance of the methods. The study shows that SVM can provide reliable predictions for unknown data, due to its deterministic characteristics, and that it works well when trained on a small data set. The SVM model generates more accurate coating thickness estimates than the ANN model.
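
    A generic sketch of the tuning step described above (placeholder data shapes and parameter grid, not the authors' pipeline): an RBF-kernel support vector regressor is fitted to temperature-increment features and its hyper-parameters are chosen by 10-fold cross-validation.

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import GridSearchCV

    # Placeholder shapes: one row of surface temperature increments per sample,
    # target is the coating thickness in micrometres.
    X = np.random.rand(61, 50)
    y = np.random.uniform(63.5, 571.0, size=61)

    model = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
    search = GridSearchCV(model,
                          param_grid={"svr__C": [1, 10, 100],
                                      "svr__gamma": ["scale", 0.01, 0.1]},
                          cv=10, scoring="neg_mean_absolute_error")
    search.fit(X, y)
    print(search.best_params_, -search.best_score_)   # MAE of the best model
    ```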

  9. A Novel Unsupervised Adaptive Learning Method for Long-Term Electromyography (EMG) Pattern Recognition

    PubMed Central

    Huang, Qi; Yang, Dapeng; Jiang, Li; Zhang, Huajie; Liu, Hong; Kotani, Kiyoshi

    2017-01-01

    In the long term, the performance of pattern recognition-based myoelectric control methods degrades due to a variety of interfering factors. This paper proposes an adaptive learning method with low computational cost to mitigate this effect in unsupervised adaptive learning scenarios. We present a particle adaptive classifier (PAC), constructed from a particle adaptive learning strategy and a universal incremental least squares support vector classifier (LS-SVC). We compared PAC performance with an incremental support vector classifier (ISVC) and a non-adapting SVC (NSVC) in a long-term pattern recognition task in both unsupervised and supervised adaptive learning scenarios. Retraining time cost and recognition accuracy were compared by validating the classification performance on both simulated and realistic long-term EMG data. The classification results on realistic long-term EMG data showed that the PAC significantly decreased the performance degradation in unsupervised adaptive learning scenarios compared with NSVC (9.03% ± 2.23%, p < 0.05) and ISVC (13.38% ± 2.62%, p = 0.001), and reduced the retraining time cost compared with ISVC (2 ms per updating cycle vs. 50 ms per updating cycle). PMID:28608824

  10. A Novel Unsupervised Adaptive Learning Method for Long-Term Electromyography (EMG) Pattern Recognition.

    PubMed

    Huang, Qi; Yang, Dapeng; Jiang, Li; Zhang, Huajie; Liu, Hong; Kotani, Kiyoshi

    2017-06-13

    In the long term, the performance of pattern recognition-based myoelectric control methods degrades due to a variety of interfering factors. This paper proposes an adaptive learning method with low computational cost to mitigate this effect in unsupervised adaptive learning scenarios. We present a particle adaptive classifier (PAC), constructed from a particle adaptive learning strategy and a universal incremental least squares support vector classifier (LS-SVC). We compared PAC performance with an incremental support vector classifier (ISVC) and a non-adapting SVC (NSVC) in a long-term pattern recognition task in both unsupervised and supervised adaptive learning scenarios. Retraining time cost and recognition accuracy were compared by validating the classification performance on both simulated and realistic long-term EMG data. The classification results on realistic long-term EMG data showed that the PAC significantly decreased the performance degradation in unsupervised adaptive learning scenarios compared with NSVC (9.03% ± 2.23%, p < 0.05) and ISVC (13.38% ± 2.62%, p = 0.001), and reduced the retraining time cost compared with ISVC (2 ms per updating cycle vs. 50 ms per updating cycle).
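
    To make the retraining-cost comparison above concrete, here is a generic timing sketch (synthetic data and off-the-shelf scikit-learn estimators, not the PAC or ISVC implementations): a linear SVM updated with partial_fit absorbs a new batch far faster than refitting a kernel SVM on the full accumulated data, which is the kind of gap the per-update-cycle figures quantify.

    ```python
    import time
    import numpy as np
    from sklearn.linear_model import SGDClassifier
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 16))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    X_new = rng.normal(size=(20, 16))
    y_new = (X_new[:, 0] + X_new[:, 1] > 0).astype(int)

    inc = SGDClassifier(loss="hinge")                  # linear SVM trained online
    inc.partial_fit(X, y, classes=np.array([0, 1]))
    SVC(kernel="rbf").fit(X, y)                        # batch kernel SVM baseline

    t0 = time.perf_counter()
    inc.partial_fit(X_new, y_new)                      # incremental update only
    t_inc = time.perf_counter() - t0

    t0 = time.perf_counter()
    SVC(kernel="rbf").fit(np.vstack([X, X_new]), np.append(y, y_new))
    t_full = time.perf_counter() - t0

    print(f"incremental update: {t_inc * 1e3:.1f} ms, full retrain: {t_full * 1e3:.1f} ms")
    ```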

  11. Bubble vector in automatic merging

    NASA Technical Reports Server (NTRS)

    Pamidi, P. R.; Butler, T. G.

    1987-01-01

    It is shown that it is within the capability of the DMAP language to build a set of vectors that can grow incrementally to be applied automatically and economically within a DMAP loop that serves to append sub-matrices that are generated within a loop to a core matrix. The method of constructing such vectors is explained.

  12. Boosted Regression Trees Outperforms Support Vector Machines in Predicting (Regional) Yields of Winter Wheat from Single and Cumulated Dekadal Spot-VGT Derived Normalized Difference Vegetation Indices

    NASA Astrophysics Data System (ADS)

    Stas, Michiel; Dong, Qinghan; Heremans, Stien; Zhang, Beier; Van Orshoven, Jos

    2016-08-01

    This paper compares two machine learning techniques to predict regional winter wheat yields. The models, based on Boosted Regression Trees (BRT) and Support Vector Machines (SVM), are constructed from Normalized Difference Vegetation Indices (NDVI) derived from low-resolution SPOT VEGETATION satellite imagery. Three types of NDVI-related predictors were used: Single NDVI, Incremental NDVI and Targeted NDVI. BRT and SVM were first used to select features with high relevance for predicting yield. Although the exact selections differed between the prefectures, certain periods with high influence scores for multiple prefectures could be identified. The same period of high influence, stretching from March to June, was detected by both machine learning methods. After feature selection, BRT and SVM models were applied to the subset of selected features for actual yield forecasting. Whereas both machine learning methods returned very low prediction errors, BRT seems to slightly but consistently outperform SVM.
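
    A compact sketch of such a comparison (synthetic NDVI features and yields; scikit-learn's gradient boosting stands in for BRT, so this is an illustration rather than the study's setup): both models are scored with cross-validated errors, and the boosted trees additionally expose feature importances that can support the dekad-selection step described above.

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score

    # Placeholder data: 36 dekadal NDVI predictors per year, regional yield target.
    rng = np.random.default_rng(1)
    X = rng.random((60, 36))
    y = 2.0 + 3.0 * X[:, 8] + rng.normal(scale=0.1, size=60)

    brt = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05)
    svm = SVR(kernel="rbf", C=10.0)

    for name, model in [("BRT", brt), ("SVM", svm)]:
        err = -cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
        print(name, "MAE:", err.mean())

    # Feature relevance from the boosted trees (dekads with high influence).
    print(np.argsort(brt.fit(X, y).feature_importances_)[::-1][:5])
    ```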

  13. Using Hand Grip Force as a Correlate of Longitudinal Acceleration Comfort for Rapid Transit Trains

    PubMed Central

    Guo, Beiyuan; Gan, Weide; Fang, Weining

    2015-01-01

    Longitudinal acceleration comfort is one of the essential metrics used to evaluate the ride comfort of trains. The aim of this study was to investigate the effectiveness of using hand grip force as a correlate of the longitudinal acceleration comfort of rapid transit trains. In the paper, a motion simulation system was set up and a two-stage experiment was designed to investigate the role of grip force in the longitudinal comfort of rapid transit trains. The results of the experiment show that the incremental grip force was linearly correlated with the longitudinal acceleration value, while the incremental grip force had no correlation with the direction of the longitudinal acceleration vector. The results also show that the effects of incremental grip force and acceleration duration on the longitudinal comfort of rapid transit trains were significant. Based on multiple regression analysis, a step function model was established to predict the longitudinal comfort of rapid transit trains using the incremental grip force and the acceleration duration. The feasibility and practicability of the model were verified by a field test. Furthermore, a comparative analysis shows that the motion simulation system and the grip force-based model were valid for supporting laboratory studies on the longitudinal comfort of rapid transit trains. PMID:26147730

  14. Attitude Determination Algorithm based on Relative Quaternion Geometry of Velocity Incremental Vectors for Cost Efficient AHRS Design

    NASA Astrophysics Data System (ADS)

    Lee, Byungjin; Lee, Young Jae; Sung, Sangkyung

    2018-05-01

    A novel attitude determination method is investigated that is computationally efficient and implementable on low-cost sensors and embedded platforms. A recent result on attitude reference system design is adapted to develop a three-dimensional attitude determination algorithm based on relative velocity incremental measurements. For this, velocity increment vectors, computed respectively from the INS and GPS at different update rates, are compared to generate the filter measurement for attitude estimation. In the quaternion-based Kalman filter configuration, an Euler-like attitude perturbation angle is introduced to reduce the filter states and simplify the propagation processes. Furthermore, assuming a small-angle approximation between attitude update periods, it is shown that the reduced-order filter greatly simplifies the propagation processes. For performance verification, both simulation and experimental studies are completed. A low-cost MEMS IMU and GPS receiver are employed for system integration, and comparison with the true trajectory or a high-grade navigation system demonstrates the performance of the proposed algorithm.

  15. Humanlike agents with posture planning ability

    NASA Astrophysics Data System (ADS)

    Jung, Moon R.; Badler, Norman I.

    1992-11-01

    Human body models are geometric structures which may be ultimately controlled by kinematically manipulating their joints, but for animation, it is desirable to control them in terms of task-level goals. We address a fundamental problem in achieving task-level postural goals: controlling massively redundant degrees of freedom. We reduce the degrees of freedom by introducing significant control points and vectors, e.g., the pelvis forward vector, palm up vector, and torso up vector. This reduced set of parameters is used to enumerate primitive motions and motion dependencies among them, and thus to select from a small set of alternative postures (e.g., bend versus squat to lower shoulder height). A plan for a given goal is found by incrementally constructing a goal/constraint set based on the given goal, motion dependencies, collision avoidance requirements, and discovered failures. Global postures satisfying a given goal/constraint set are determined with the help of incremental mental simulation, which uses a robust inverse kinematics algorithm. The contributions of the present work are: (1) there is no need to specify beforehand the final goal configuration, which is unrealistic for the human body, and (2) the degrees-of-freedom problem becomes easier by representing body configurations in terms of 'lumped' control parameters, that is, control points and vectors.

  16. Human-like agents with posture planning ability

    NASA Technical Reports Server (NTRS)

    Jung, Moon R.; Badler, Norman

    1992-01-01

    Human body models are geometric structures which may be ultimately controlled by kinematically manipulating their joints, but for animation, it is desirable to control them in terms of task-level goals. We address a fundamental problem in achieving task-level postural goals: controlling massively redundant degrees of freedom. We reduce the degrees of freedom by introducing significant control points and vectors, e.g., the pelvis forward vector, palm up vector, and torso up vector. This reduced set of parameters is used to enumerate primitive motions and motion dependencies among them, and thus to select from a small set of alternative postures (e.g., bend vs. squat to lower shoulder height). A plan for a given goal is found by incrementally constructing a goal/constraint set based on the given goal, motion dependencies, collision avoidance requirements, and discovered failures. Global postures satisfying a given goal/constraint set are determined with the help of incremental mental simulation, which uses a robust inverse kinematics algorithm. The contributions of the present work are: (1) there is no need to specify beforehand the final goal configuration, which is unrealistic for the human body, and (2) the degrees-of-freedom problem becomes easier by representing body configurations in terms of 'lumped' control parameters, that is, control points and vectors.

  17. 39 CFR 3050.23 - Documentation supporting incremental cost estimates in the Postal Service's section 3652 report.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... incremental cost model shall be reported. ... 39 Postal Service 1 2010-07-01 2010-07-01 false Documentation supporting incremental cost... REGULATORY COMMISSION PERSONNEL PERIODIC REPORTING § 3050.23 Documentation supporting incremental cost...

  18. Using support vector machine to predict beta- and gamma-turns in proteins.

    PubMed

    Hu, Xiuzhen; Li, Qianzhong

    2008-09-01

    By using the composite vector with increment of diversity, a position conservation scoring function, and predicted secondary structures to express the sequence information, a support vector machine (SVM) algorithm for predicting beta- and gamma-turns in proteins is proposed. The 426 and 320 nonhomologous protein chains described by Guruprasad and Rajkumar (J Biosci 2000, 25, 143) are used for training and testing the predictive models of the beta- and gamma-turns, respectively. The overall prediction accuracy and the Matthews correlation coefficient in 7-fold cross-validation are 79.8% and 0.47, respectively, for the beta-turns. The overall prediction accuracy in 5-fold cross-validation is 61.0% for the gamma-turns. These results are significantly better than those of other algorithms for the prediction of beta- and gamma-turns on the same datasets. In addition, the 547 and 823 nonhomologous protein chains described by Fuchs and Alix (Proteins: Struct Funct Bioinform 2005, 59, 828) are used for training and testing the predictive models of the beta- and gamma-turns, and better results are obtained. This algorithm may help to improve the performance of protein turn prediction. To assess the ability of the SVM method to correctly classify beta-turns and non-beta-turns (gamma-turns and non-gamma-turns), receiver operating characteristic threshold-independent measure curves are provided. (c) 2008 Wiley Periodicals, Inc.

  19. A Semisupervised Support Vector Machines Algorithm for BCI Systems

    PubMed Central

    Qin, Jianzhao; Li, Yuanqing; Sun, Wei

    2007-01-01

    As an emerging technology, brain-computer interfaces (BCIs) bring us new communication interfaces which translate brain activities into control signals for devices like computers, robots, and so forth. In this study, we propose a semisupervised support vector machine (SVM) algorithm for brain-computer interface (BCI) systems, aiming at reducing the time-consuming training process. In this algorithm, we apply a semisupervised SVM for translating the features extracted from the electrical recordings of brain into control signals. This SVM classifier is built from a small labeled data set and a large unlabeled data set. Meanwhile, to reduce the time for training semisupervised SVM, we propose a batch-mode incremental learning method, which can also be easily applied to the online BCI systems. Additionally, it is suggested in many studies that common spatial pattern (CSP) is very effective in discriminating two different brain states. However, CSP needs a sufficient labeled data set. In order to overcome the drawback of CSP, we suggest a two-stage feature extraction method for the semisupervised learning algorithm. We apply our algorithm to two BCI experimental data sets. The offline data analysis results demonstrate the effectiveness of our algorithm. PMID:18368141
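
    A generic sketch of the batch-mode incremental idea (self-training with an SVM on labels coded 0/1; not the paper's algorithm, and the CSP-based feature extraction is omitted): the classifier starts from the small labeled set, and unlabeled trials whose decision values are confident enough are pseudo-labeled and folded in one batch at a time.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def batch_self_training(X_lab, y_lab, X_unlab, batch_size=50, confidence=1.0):
        """Self-training with an SVM, one batch of unlabeled trials at a time.
        Assumes binary labels coded as 0/1."""
        model = SVC(kernel="linear").fit(X_lab, y_lab)
        for start in range(0, len(X_unlab), batch_size):
            batch = X_unlab[start:start + batch_size]
            scores = model.decision_function(batch)
            keep = np.abs(scores) >= confidence            # confident trials only
            if keep.any():
                X_lab = np.vstack([X_lab, batch[keep]])
                y_lab = np.append(y_lab, (scores[keep] > 0).astype(int))
                model = SVC(kernel="linear").fit(X_lab, y_lab)   # batch update
        return model
    ```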

  20. A Bayesian least squares support vector machines based framework for fault diagnosis and failure prognosis

    NASA Astrophysics Data System (ADS)

    Khawaja, Taimoor Saleem

    A high-belief low-overhead Prognostics and Health Management (PHM) system is desired for online real-time monitoring of complex non-linear systems operating in a complex (possibly non-Gaussian) noise environment. This thesis presents a Bayesian Least Squares Support Vector Machine (LS-SVM) based framework for fault diagnosis and failure prognosis in nonlinear non-Gaussian systems. The methodology assumes the availability of real-time process measurements, definition of a set of fault indicators and the existence of empirical knowledge (or historical data) to characterize both nominal and abnormal operating conditions. An efficient yet powerful Least Squares Support Vector Machine (LS-SVM) algorithm, set within a Bayesian Inference framework, not only allows for the development of real-time algorithms for diagnosis and prognosis but also provides a solid theoretical framework to address key concepts related to classification for diagnosis and regression modeling for prognosis. SVM machines are founded on the principle of Structural Risk Minimization (SRM) which tends to find a good trade-off between low empirical risk and small capacity. The key features in SVM are the use of non-linear kernels, the absence of local minima, the sparseness of the solution and the capacity control obtained by optimizing the margin. The Bayesian Inference framework linked with LS-SVMs allows a probabilistic interpretation of the results for diagnosis and prognosis. Additional levels of inference provide the much coveted features of adaptability and tunability of the modeling parameters. The two main modules considered in this research are fault diagnosis and failure prognosis. With the goal of designing an efficient and reliable fault diagnosis scheme, a novel Anomaly Detector is suggested based on the LS-SVM machines. The proposed scheme uses only baseline data to construct a 1-class LS-SVM machine which, when presented with online data is able to distinguish between normal behavior and any abnormal or novel data during real-time operation. The results of the scheme are interpreted as a posterior probability of health (1 - probability of fault). As shown through two case studies in Chapter 3, the scheme is well suited for diagnosing imminent faults in dynamical non-linear systems. Finally, the failure prognosis scheme is based on an incremental weighted Bayesian LS-SVR machine. It is particularly suited for online deployment given the incremental nature of the algorithm and the quick optimization problem solved in the LS-SVR algorithm. By way of kernelization and a Gaussian Mixture Modeling (GMM) scheme, the algorithm can estimate "possibly" non-Gaussian posterior distributions for complex non-linear systems. An efficient regression scheme associated with the more rigorous core algorithm allows for long-term predictions, fault growth estimation with confidence bounds and remaining useful life (RUL) estimation after a fault is detected. 
The leading contributions of this thesis are (a) the development of a novel Bayesian Anomaly Detector for efficient and reliable Fault Detection and Identification (FDI) based on Least Squares Support Vector Machines, (b) the development of a data-driven real-time architecture for long-term Failure Prognosis using Least Squares Support Vector Machines, (c) Uncertainty representation and management using Bayesian Inference for posterior distribution estimation and hyper-parameter tuning, and finally (d) the statistical characterization of the performance of diagnosis and prognosis algorithms in order to relate the efficiency and reliability of the proposed schemes.
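
    The anomaly-detection part of such a framework can be caricatured with a one-class SVM trained only on baseline (nominal) data. The sketch below uses scikit-learn's OneClassSVM rather than a Bayesian LS-SVM, and the logistic mapping to a "probability of health" is an assumed stand-in for the posterior described above, so treat it as an illustration of the idea only.

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import OneClassSVM

    # Placeholder baseline data: feature vectors recorded under nominal operation.
    baseline = np.random.normal(size=(500, 8))

    detector = make_pipeline(StandardScaler(), OneClassSVM(kernel="rbf", nu=0.05))
    detector.fit(baseline)

    def health_probability(x):
        """Squash the one-class decision value into a rough [0, 1] health score;
        values near 0 indicate anomalous (possibly faulty) behaviour."""
        d = detector.decision_function(x.reshape(1, -1))[0]
        return 1.0 / (1.0 + np.exp(-5.0 * d))
    ```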

  1. A new approach to impulsive rendezvous near circular orbit

    NASA Astrophysics Data System (ADS)

    Carter, Thomas; Humi, Mayer

    2012-04-01

    A new approach is presented for the problem of planar optimal impulsive rendezvous of a spacecraft in an inertial frame near a circular orbit in a Newtonian gravitational field. The total characteristic velocity to be minimized is replaced by a related characteristic-value function and this related optimization problem can be solved in closed form. The solution of this problem is shown to approach the solution of the original problem in the limit as the boundary conditions approach those of a circular orbit. Using a form of primer-vector theory the problem is formulated in a way that leads to relatively easy calculation of the optimal velocity increments. A certain vector that can easily be calculated from the boundary conditions determines the number of impulses required for solution of the optimization problem and also is useful in the computation of these velocity increments. Necessary and sufficient conditions for boundary conditions to require exactly three nonsingular non-degenerate impulses for solution of the related optimal rendezvous problem, and a means of calculating these velocity increments are presented. A simple example of a three-impulse rendezvous problem is solved and the resulting trajectory is depicted. Optimal non-degenerate nonsingular two-impulse rendezvous for the related problem is found to consist of four categories of solutions depending on the four ways the primer vector locus intersects the unit circle. Necessary and sufficient conditions for each category of solutions are presented. The region of the boundary values that admit each category of solutions of the related problem are found, and in each case a closed-form solution of the optimal velocity increments is presented. Similar results are presented for the simpler optimal rendezvous that require only one-impulse. For brevity degenerate and singular solutions are not discussed in detail, but should be presented in a following study. Although this approach is thought to provide simpler computations than existing methods, its main contribution may be in establishing a new approach to the more general problem.

  2. Incremental Transductive Learning Approaches to Schistosomiasis Vector Classification

    NASA Astrophysics Data System (ADS)

    Fusco, Terence; Bi, Yaxin; Wang, Haiying; Browne, Fiona

    2016-08-01

    The key issue in collecting epidemic disease data for our analysis purposes is that it is a labour-intensive, time-consuming and expensive process, resulting in sparse sample data from which we develop prediction models. To address this sparse data issue, we present novel Incremental Transductive methods that circumvent the data collection process by applying previously acquired data to provide consistent, confidence-based labelling alternatives to field survey research. We investigated various reasoning approaches for semi-supervised machine learning, including Bayesian models, for labelling data. The results show that, using the proposed methods, we can label instances of data with a class of vector density at a high level of confidence. By applying the Liberal and Strict Training Approaches, we provide a labelling and classification alternative to standalone algorithms. The methods in this paper are components in the process of reducing the proliferation of the Schistosomiasis disease and its effects.

  3. Launching the first postgraduate diploma in medical entomology and disease vector control in Pakistan.

    PubMed

    Rathor, H R; Mnzava, A; Bile, K M; Hafeez, A; Zaman, S

    2010-01-01

    The Health Services Academy has launched a 12-month postgraduate diploma course in medical entomology and disease vector control. The objective is to create a core of experts trained to prevent and control vector-borne diseases. The course is a response to the serious health and socioeconomic burden caused by a number of vector-borne diseases in Pakistan. The persistence, emergence and re-emergence of these diseases are mainly attributed to the scarcity of trained vector-control experts. The training course attempts to fill the gap in trained manpower and thus reduce the morbidity and mortality due to these diseases, resulting in incremental gains to public health. This paper aims to outline the steps taken to establish the course and the perceived challenges to be addressed in order to sustain its future implementation.

  4. Innovative dengue vector control interventions in Latin America: what do they cost?

    PubMed Central

    Basso, César; Beltrán-Ayala, Efraín; Mitchell-Foster, Kendra; Cortés, Sebastián; Manrique-Saide, Pablo; Guillermo-May, Guillermo; Carvalho de Lima, Edilmar

    2016-01-01

    Background Five studies were conducted in Fortaleza (Brazil), Girardot (Colombia), Machala (Ecuador), Acapulco (Mexico), and Salto (Uruguay) to assess dengue vector control interventions tailored to the context. The studies involved the community explicitly in the implementation, and focused on the most productive breeding places for Aedes aegypti. This article reports the cost analysis of these interventions. Methods We conducted the costing from the perspective of the vector control program. We collected data on quantities and unit costs of the resources used to deliver the interventions. Comparable information was requested for the routine activities. Cost items were classified, analyzed descriptively, and aggregated to calculate total costs, costs per house reached, and incremental costs. Results Costs per house of the interventions were $18.89 (Fortaleza), $21.86 (Girardot), $30.61 (Machala), $39.47 (Acapulco), and $6.98 (Salto). Intervention components that focused mainly on changes to the established vector control programs seem affordable; cost savings were identified in Salto (−21%) and for the clean patio component in Machala (−12%). An incremental cost of 10% was estimated in Fortaleza. On the other hand, there were also completely new components that would require sizeable financial efforts (installing insecticide-treated nets in Girardot and Acapulco cost $16.97 and $24.96 per house, respectively). Conclusions The interventions are promising, seem affordable and may improve the cost profile of the established vector control programs. The costs of the new components could be considerable, and should be assessed in relation to the benefits in reduced dengue burden. PMID:26924235

  5. Rate determination from vector observations

    NASA Technical Reports Server (NTRS)

    Weiss, Jerold L.

    1993-01-01

    Vector observations are a common class of attitude data provided by a wide variety of attitude sensors. Attitude determination from vector observations is a well-understood process, and numerous algorithms such as the TRIAD algorithm exist. These algorithms require measurement of the line-of-sight (LOS) vector to reference objects and knowledge of the LOS directions in some predetermined reference frame. Once attitude is determined, it is a simple matter to synthesize vehicle rate using some form of lead-lag filter, and then use it for vehicle stabilization. Many situations arise, however, in which rate knowledge is required but knowledge of the nominal LOS directions is not available. This paper presents two methods for determining spacecraft angular rates from vector observations without a priori knowledge of the vector directions. The first approach uses an extended Kalman filter with a spacecraft dynamic model and a kinematic model representing the motion of the observed LOS vectors. The second approach uses a 'differential' TRIAD algorithm to compute the incremental direction cosine matrix, from which vehicle rate is then derived.

  6. Global Combat Support System - Army Increment 2 (GCSS-A Inc 2)

    DTIC Science & Technology

    2016-03-01

    2016 Major Automated Information System Annual Report Global Combat Support System - Army Increment 2 (GCSS-A Inc 2) Defense Acquisition...Secretary of Defense PB - President’s Budget RDT&E - Research, Development, Test, and Evaluation SAE - Service Acquisition Executive TBD - To Be...Date Assigned: Program Information Program Name Global Combat Support System - Army Increment 2 (GCSS-A Inc 2) DoD Component Army Responsible

  7. Polarized object detection in crabs: a two-channel system.

    PubMed

    Basnak, Melanie Ailín; Pérez-Schuster, Verónica; Hermitte, Gabriela; Berón de Astrada, Martín

    2018-05-25

    Many animal species take advantage of polarization vision for vital tasks such as orientation, communication and contrast enhancement. Previous studies have suggested that decapod crustaceans use a two-channel polarization system for contrast enhancement. Here, we characterize the polarization contrast sensitivity in a grapsid crab. We estimated the polarization contrast sensitivity of the animals by quantifying both their escape response and changes in heart rate when presented with polarized motion stimuli. The motion stimulus consisted of an expanding disk with an 82 deg polarization difference between the object and the background. More than 90% of animals responded by freezing or trying to avoid the polarized stimulus. In addition, we co-rotated the electric vector (e-vector) orientation of the light from the object and background by increments of 30 deg and found that the animals' escape response varied periodically with a 90 deg period. Maximum escape responses were obtained for object and background e-vectors near the vertical and horizontal orientations. Changes in cardiac response showed parallel results but also a minimum response when e-vectors of object and background were shifted by 45 deg with respect to the maxima. These results are consistent with an orthogonal receptor arrangement for the detection of polarized light, in which two channels are aligned with the vertical and horizontal orientations. It has been hypothesized that animals with object-based polarization vision rely on a two-channel detection system analogous to that of color processing in dichromats. Our results, obtained by systematically varying the e-vectors of object and background, provide strong empirical support for this theoretical model of polarized object detection. © 2018. Published by The Company of Biologists Ltd.

  8. International Space Station Increment Operations Services

    NASA Astrophysics Data System (ADS)

    Michaelis, Horst; Sielaff, Christian

    2002-01-01

    The Industrial Operator (IO) has defined End-to-End services to perform efficiently all required operations tasks for the Manned Space Program (MSP), as agreed during the Ministerial Council in Edinburgh in November 2001. Those services are the result of a detailed task analysis based on the operations processes as derived from the Space Station Program Implementation Plans (SPIP) and defined in the Operations Processes Documents (OPD). These services are related to ISS Increment Operations and ATV Mission Operations. Each of these End-to-End services is typically characterised by the following properties: it has a clearly defined starting point, where all requirements on the end product are fixed and the associated performance metrics of the customer are well defined; it has a clearly defined ending point, when the product or service is delivered to the customer and accepted by him according to the performance metrics defined at the starting point; and the implementation of the process might be restricted by external boundary conditions and constraints mutually agreed with the customer. As far as those are respected, the IO has the free choice of methods and means of implementation. The ISS Increment Operations Service (IOS) activities required for the MSP Exploitation program cover the complete increment-specific cycle, starting with support to strategic planning and ending with the post-increment evaluation. These activities are divided into sub-services including the following tasks: ISS Planning Support, covering the support to strategic and tactical planning; Development & Payload Integration Support; ISS Increment Preparation; and ISS Increment Execution. These processes are tied together by Increment Integration Management, which provides the planning and scheduling of all activities as well as the technical management of the overall process. The paper describes the entire End-to-End ISS Increment Operations service and its implementation in support of the Columbus Flight 1E-related increment and subsequent ISS increments. Special attention is paid to the implications of long-term operations for hardware, software and operations personnel.

  9. Obstacle-avoiding navigation system

    DOEpatents

    Borenstein, Johann; Koren, Yoram; Levine, Simon P.

    1991-01-01

    A system for guiding an autonomous or semi-autonomous vehicle through a field of operation having obstacles thereon to be avoided employs a memory for containing data which defines an array of grid cells which correspond to respective subfields in the field of operation of the vehicle. Each grid cell in the memory contains a value which is indicative of the likelihood, or probability, that an obstacle is present in the respectively associated subfield. The values in the grid cells are incremented individually in response to each scan of the subfields, and precomputation and use of a look-up table avoids complex trigonometric functions. A further array of grid cells is fixed with respect to the vehicle to form a conceptual active window which overlies the incremented grid cells. Thus, when the cells in the active window overlie grid cells having values which are indicative of the presence of obstacles, the value therein is used as a multiplier of the precomputed vectorial values. The resulting plurality of vectorial values are summed vectorially in one embodiment of the invention to produce a virtual composite repulsive vector, which is then summed vectorially with a target-directed vector to produce a resultant vector for guiding the vehicle. In an alternative embodiment, a plurality of vectors surrounding the vehicle are computed, each having a value corresponding to obstacle density. In such an embodiment, target location information is used to select between alternative directions of travel having low associated obstacle densities.

  10. The G-Axis: a growth vector for the mandible.

    PubMed

    Braun, Stanley; Kittleson, Russell; Kim, Kyonghwan

    2004-06-01

    On the basis of the G-point, defined as the center of the largest circle that is tangent to the internal inferior, anterior, and lingual surfaces of the mandibular symphysis in the sagittal view, a growth axis and its direction are described for each gender from age six to 19.25 years. Incremental growth along the G-Axis, defined by Sella-G-point, is described by regression formulas with correlation coefficients of 0.673 for female subjects and 0.749 for male subjects. The vector (direction) of the growth axis, defined by the angle alpha ((G-Axis)-(S-N)) does not materially alter in the age range studied. At age six in female subjects the angle alpha is 67.16 degrees +/- 3.03 degrees and at age 19.25 it is 66.87 degrees +/- 3.03 degrees, whereas in male subjects it is 66.12 degrees +/- 4.00 degrees and 67.93 degrees +/- 4.00 degrees, respectively. These changes and gender differences are not clinically significant. The data is based on 444 serial lateral cephalograms of 24 female subjects and 24 male subjects. The G-Axis incremental growth change and its vector offer an improved means of quantifying complex mandibular growth in the sagittal plane by using cephalometric measurements relative to and correlated with other craniofacial structures.

  11. The power induced effects module: A FORTRAN code which estimates lift increments due to power induced effects for V/STOL flight

    NASA Technical Reports Server (NTRS)

    Sandlin, Doral R.; Howard, Kipp E.

    1991-01-01

    A user friendly FORTRAN code that can be used for preliminary design of V/STOL aircraft is described. The program estimates lift increments, due to power induced effects, encountered by aircraft in V/STOL flight. These lift increments are calculated using empirical relations developed from wind tunnel tests and are due to suckdown, fountain, ground vortex, jet wake, and the reaction control system. The code can be used as a preliminary design tool along with NASA Ames' Aircraft Synthesis design code or as a stand-alone program for V/STOL aircraft designers. The Power Induced Effects (PIE) module was validated using experimental data and data computed from lift increment routines. Results are presented for many flat plate models along with the McDonnell Aircraft Company's MFVT (mixed flow vectored thrust) V/STOL preliminary design and a 15 percent scale model of the YAV-8B Harrier V/STOL aircraft. Trends and magnitudes of lift increments versus aircraft height above the ground were predicted well by the PIE module. The code also provided good predictions of the magnitudes of lift increments versus aircraft forward velocity. More experimental results are needed to determine how well the code predicts lift increments as they vary with jet deflection angle and angle of attack. The FORTRAN code is provided in the appendix.

  12. Invariant-feature-based adaptive automatic target recognition in obscured 3D point clouds

    NASA Astrophysics Data System (ADS)

    Khuon, Timothy; Kershner, Charles; Mattei, Enrico; Alverio, Arnel; Rand, Robert

    2014-06-01

    Target recognition and classification in a 3D point cloud is a non-trivial process due to the nature of the data collected from a sensor system. The signal can be corrupted by noise from the environment, the electronic system, the A/D converter, etc. Therefore, an adaptive system with a desired tolerance is required to perform classification and recognition optimally. The feature-based pattern recognition algorithm architecture described below is particularly devised for solving single-sensor classification non-parametrically. A feature set is extracted from an input point cloud, normalized, and classified by a neural network classifier. For instance, automatic target recognition in an urban area would require different feature sets from one in a dense foliage area. The figure above (see manuscript) illustrates the architecture of the feature-based adaptive signature extraction of 3D point clouds, including LIDAR, RADAR, and electro-optical data. This network takes a 3D cluster and classifies it into a specific class. The algorithm is a supervised and adaptive classifier with two modes: the training mode and the performing mode. For the training mode, a number of novel patterns are selected from actual or artificial data. A particular 3D cluster is input to the network, as shown above, for the decision class output. The network consists of three sequential functional modules. The first module performs feature extraction, converting the input cluster into a set of singular value features, or a feature vector. The feature vector is then input to the feature normalization module to normalize and balance it before being fed to the neural net classifier for classification. The neural net can be trained on actual or artificial novel data until each trained output reaches the declared output within the defined tolerance. If new novel data are added after the neural net has been trained, training is resumed until the neural net has incrementally learned the new novel data. The associative memory capability of the neural net enables the incremental learning. The back-propagation algorithm or a support vector machine can be utilized for the classification and recognition.

  13. The assisted prediction modelling frame with hybridisation and ensemble for business risk forecasting and an implementation

    NASA Astrophysics Data System (ADS)

    Li, Hui; Hong, Lu-Yao; Zhou, Qing; Yu, Hai-Jie

    2015-08-01

    The business failure of numerous companies results in financial crises. The high social costs associated with such crises have led people to search for effective tools for business risk prediction, among which the support vector machine is very effective. Several modelling means, including single-technique modelling, hybrid modelling, and ensemble modelling, have been suggested for forecasting business risk with support vector machines. However, the existing literature seldom focuses on a general modelling frame for business risk prediction, and seldom investigates performance differences among different modelling means. We reviewed research on forecasting business risk with support vector machines, proposed the general assisted prediction modelling frame with hybridisation and ensemble (APMF-WHAE), and finally investigated the use of principal components analysis, support vector machines, random sampling, and group decision under the general frame in forecasting business risk. Under the APMF-WHAE frame with the support vector machine as the base predictive model, four specific predictive models were produced, namely, a pure support vector machine, a hybrid support vector machine involving principal components analysis, a support vector machine ensemble involving random sampling and group decision, and an ensemble of hybrid support vector machines using group decision to integrate various hybrid support vector machines built on variables produced from principal components analysis and samples from random sampling. The experimental results indicate that the hybrid support vector machine and the ensemble of hybrid support vector machines were able to produce better performance than the pure support vector machine and the support vector machine ensemble.
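
    A compact sketch of two of the model families named above, built from scikit-learn components (generic choices, not the paper's exact configuration): a hybrid model chains principal components analysis into an SVM, and an ensemble of hybrids combines bagged copies of that pipeline by voting.

    ```python
    from sklearn.decomposition import PCA
    from sklearn.ensemble import BaggingClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Hybrid model: PCA feature reduction feeding an RBF-kernel SVM.
    hybrid_svm = make_pipeline(StandardScaler(), PCA(n_components=10),
                               SVC(kernel="rbf", C=10.0))

    # Ensemble of hybrid SVMs: random sampling of the training cases (bagging)
    # with voting over the individual hybrid models ("group decision").
    # Note: the keyword is `base_estimator` in scikit-learn versions before 1.2.
    ensemble_of_hybrids = BaggingClassifier(estimator=hybrid_svm,
                                            n_estimators=15, max_samples=0.8,
                                            random_state=0)
    ```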

  14. Multiaxis control power from thrust vectoring for a supersonic fighter aircraft model at Mach 0.20 to 2.47

    NASA Technical Reports Server (NTRS)

    Capone, Francis J.; Bare, E. Ann

    1987-01-01

    The aeropropulsive characteristics of an advanced twin-engine fighter aircraft designed for supersonic cruise have been studied in the Langley 16-Foot Transonic Tunnel and the Lewis 10- by 10-Foot Supersonic Tunnel. The objective was to determine multiaxis control-power characteristics from thrust vectoring. A two-dimensional convergent-divergent nozzle was designed to provide yaw vector angles of 0, -10, and -20 deg combined with geometric pitch vector angles of 0 and 15 deg. Yaw thrust vectoring was provided by yaw flaps located in the nozzle sidewalls. Roll control was obtained from differential pitch vectoring. This investigation was conducted at Mach numbers from 0.20 to 2.47. Angle of attack was varied from 0 to about 19 deg, and nozzle pressure ratio was varied from about 1 (jet off) to 28, depending on Mach number. Increments in force or moment coefficient that result from pitch or yaw thrust vectoring remain essentially constant over the entire angle-of-attack range at all Mach numbers tested. There was no effect of pitch vectoring on the lateral aerodynamic forces and moments, and only very small effects of yaw vectoring on the longitudinal aerodynamic forces and moments. This result indicates little cross-coupling of control forces and moments for combined pitch-yaw vectoring.

  15. Spin wave modes in out-of-plane magnetized nanorings

    NASA Astrophysics Data System (ADS)

    Zhou, X.; Tartakovskaya, E. V.; Kakazei, G. N.; Adeyeye, A. O.

    2017-07-01

    We investigated the spin wave modes in flat circular permalloy rings with a canted external bias field using ferromagnetic resonance spectroscopy. The external magnetic field H was large enough to saturate the samples. For θ = 0° (perpendicular geometry), three distinct resonance peaks were observed experimentally. When the cylindrical symmetry was violated by inclining H from the normal to the ring plane (the inclination angle θ was varied in the 0°-6° range), splitting of all initial peaks appeared. The distance between neighboring split peaks increased with increasing θ. Unexpectedly, the biggest splitting was observed for the mode with the smallest radial wave vector. This special feature of the splitting behavior is determined by the topology of the ring shape. The developed analytical theory revealed that, in the perpendicular geometry, each observed peak is a combination of signals from a set of radially quantized spin wave excitations with almost the same radial wave vectors, radial profiles, and frequencies, but with different azimuthal dependencies. This degeneracy is a consequence of the circular symmetry of the system and can be removed by inclining H from the normal. Our findings were further supported by micromagnetic simulations.

  16. A RLS-SVM Aided Fusion Methodology for INS during GPS Outages

    PubMed Central

    Yao, Yiqing; Xu, Xiaosu

    2017-01-01

    In order to maintain a relatively high accuracy of navigation performance during global positioning system (GPS) outages, a novel robust least squares support vector machine (LS-SVM)-aided fusion methodology is explored to provide pseudo-GPS position information for the inertial navigation system (INS). The relationship between the yaw, specific force, velocity, and the position increment is modeled. Rather than sharing the same weight as in the traditional LS-SVM, the proposed algorithm allocates different weights to different data, which makes the system immune to outliers. Field test data were collected to evaluate the proposed algorithm. The comparison results indicate that the proposed algorithm can effectively provide position corrections for a standalone INS during a 300 s GPS outage, outperforming the traditional LS-SVM method. Historical information is also involved to better represent the vehicle dynamics. PMID:28245549

  17. A RLS-SVM Aided Fusion Methodology for INS during GPS Outages.

    PubMed

    Yao, Yiqing; Xu, Xiaosu

    2017-02-24

    In order to maintain a relatively high accuracy of navigation performance during global positioning system (GPS) outages, a novel robust least squares support vector machine (LS-SVM)-aided fusion methodology is explored to provide pseudo-GPS position information for the inertial navigation system (INS). The relationship between the yaw, specific force, velocity, and the position increment is modeled. Rather than sharing the same weight as in the traditional LS-SVM, the proposed algorithm allocates different weights to different data, which makes the system immune to outliers. Field test data were collected to evaluate the proposed algorithm. The comparison results indicate that the proposed algorithm can effectively provide position corrections for a standalone INS during a 300 s GPS outage, outperforming the traditional LS-SVM method. Historical information is also involved to better represent the vehicle dynamics.
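
    A generic sketch of the weighting idea above (an iteratively reweighted support vector regressor rather than the paper's weighted LS-SVM, and the weighting rule is an assumed Huber-style choice): samples with large residuals are down-weighted so that outliers influence the fitted regressor less.

    ```python
    import numpy as np
    from sklearn.svm import SVR

    def robust_weights(residuals, c=2.5):
        """Huber-style weights: 1 for small residuals, shrinking for outliers."""
        scale = np.median(np.abs(residuals)) / 0.6745 + 1e-12
        r = np.abs(residuals) / scale
        return np.clip(c / np.maximum(r, 1e-12), 0.0, 1.0)

    def robust_svr(X, y, n_iter=3):
        """Iteratively reweighted support vector regression."""
        model = SVR(kernel="rbf", C=10.0).fit(X, y)
        for _ in range(n_iter):
            w = robust_weights(y - model.predict(X))
            model = SVR(kernel="rbf", C=10.0).fit(X, y, sample_weight=w)
        return model
    ```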

  18. Development of the Nonstationary Incremental Analysis Update Algorithm for Sequential Data Assimilation System

    NASA Astrophysics Data System (ADS)

    Ham, Yoo-Geun; Song, Hyo-Jong; Jung, Jaehee; Lim, Gyu-Ho

    2017-04-01

    This study introduces an altered version of the incremental analysis update (IAU), called the nonstationary IAU (NIAU) method, to enhance the assimilation accuracy of the IAU while retaining the continuity of the analysis. Analogous to the IAU, the NIAU is designed to add analysis increments at every model time step to improve the continuity of the intermittent data assimilation. Unlike the IAU, however, the NIAU method applies time-evolved forcing, computed with the forward operator, as corrections to the model. In terms of the accuracy of the analysis field, the NIAU solution is better than that of the IAU, whose analysis is performed at the start of the time window over which the IAU forcing is added; this is because, in linear systems, the NIAU solution equals that of an intermittent data assimilation method at the end of the assimilation interval. To obtain the filtering property in the NIAU, the forward operator used to propagate the increment is reconstructed from only the dominant singular vectors. An illustration of these advantages of the NIAU is given using the simple 40-variable Lorenz model.
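
    The contrast described above can be written compactly (notation assumed here, not taken from the paper): with an analysis increment $\Delta x^{a}$ computed at the start of an assimilation window of $N$ model steps, the IAU adds a constant fraction of the increment at every step, whereas the NIAU first propagates the increment to the current step with a forward operator built from the dominant singular vectors:

    $$
    x_{k+1} = \mathcal{M}(x_k) + \frac{1}{N}\,\Delta x^{a} \quad \text{(IAU)}, \qquad
    x_{k+1} = \mathcal{M}(x_k) + \frac{1}{N}\,\mathbf{M}_{0 \to k}\,\Delta x^{a} \quad \text{(NIAU)},
    $$

    where $\mathcal{M}$ is the forecast model and $\mathbf{M}_{0 \to k}$ is the (reduced-rank) forward operator that evolves the increment from the start of the window to step $k$.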

  19. Research on bearing fault diagnosis of large machinery based on mathematical morphology

    NASA Astrophysics Data System (ADS)

    Wang, Yu

    2018-04-01

    To study automatic fault diagnosis of large machinery based on the support vector machine, four common faults of large machinery are considered and the support vector machine is used to classify and identify them. The extracted feature vectors are used as inputs, and the feature vectors are trained and identified with a multi-classification method. The optimal parameters of the support vector machine are found by trial and error and by cross-validation. The support vector machine is then compared with a BP neural network. The results show that the support vector machine requires less training time and achieves higher classification accuracy, which makes it more suitable for fault diagnosis research in large machinery. It can therefore be concluded that the training of support vector machines (SVM) is fast and their performance is good.
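
    The parameter search described above (trial and error plus cross-validation) is commonly implemented as a cross-validated grid search. The sketch below shows one such setup with scikit-learn; the feature matrix, the four fault labels, and the (C, gamma) grid are placeholders, not the paper's data or settings.

```python
# Hedged sketch: an RBF SVM classifier whose (C, gamma) are chosen by
# cross-validated grid search, then evaluated on held-out data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)
X = rng.standard_normal((400, 8))        # stand-in for extracted feature vectors
y = rng.integers(0, 4, size=400)         # four hypothetical fault classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1.0]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X_tr, y_tr)

print("best params:", search.best_params_)
print("test accuracy:", search.best_estimator_.score(X_te, y_te))
```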

  20. Engineering web maps with gradual content zoom based on streaming vector data

    NASA Astrophysics Data System (ADS)

    Huang, Lina; Meijers, Martijn; Šuba, Radan; van Oosterom, Peter

    2016-04-01

    Vario-scale data structures have been designed to support gradual content zoom and the progressive transfer of vector data, for use with arbitrary map scales. The focus to date has been on the server side, especially on how to convert geographic data into the proposed vario-scale structures by means of automated generalisation. This paper contributes to the ongoing vario-scale research by focusing on the client side and communication, particularly on how this works in a web-services setting. It is claimed that these functionalities are urgently needed, as many web-based applications, both desktop and mobile, require gradual content zoom, progressive transfer and a high performance level. The web-client prototypes developed in this paper make it possible to assess the behaviour of vario-scale data and to determine how users will actually see the interactions. Several different options of web-services communication architectures are possible in a vario-scale setting. These options are analysed and tested with various web-client prototypes, with respect to functionality, ease of implementation and performance (amount of transmitted data and response times). We show that the vario-scale data structure can fit in with current web-based architectures and efforts to standardise map distribution on the internet. However, to maximise the benefits of vario-scale data, a client needs to be aware of this structure. When a client needs a map to be refined (by means of a gradual content zoom operation), only the 'missing' data will be requested. This data will be sent incrementally to the client from a server. In this way, the amount of data transferred at one time is reduced, shortening the transmission time. In addition to these conceptual architecture aspects, there are many implementation and tooling design decisions at play. These will also be elaborated on in this paper. Based on the experiments conducted, we conclude that the vario-scale approach indeed supports gradual content zoom and the progressive web transfer of vector data. This is a big step forward in making vector data at arbitrary map scales available to larger user groups.

  1. Prediction of monthly rainfall in Victoria, Australia: Clusterwise linear regression approach

    NASA Astrophysics Data System (ADS)

    Bagirov, Adil M.; Mahmood, Arshad; Barton, Andrew

    2017-05-01

    This paper develops the Clusterwise Linear Regression (CLR) technique for the prediction of monthly rainfall. CLR is a combination of clustering and regression techniques. It is formulated as an optimization problem, and an incremental algorithm is designed to solve it. The algorithm is applied to predict monthly rainfall in Victoria, Australia, using rainfall data with five input meteorological variables over the period 1889-2014 from eight geographically diverse weather stations. The prediction performance of the CLR method is evaluated by comparing observed and predicted rainfall values using four measures of forecast accuracy. Using computational results, the proposed method is also compared with CLR based on the maximum likelihood framework via the expectation-maximization algorithm, multiple linear regression, artificial neural networks, and support vector machines for regression. The results demonstrate that the proposed algorithm outperforms the other methods in most locations.
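
    A minimal sketch of the clusterwise linear regression idea, alternating between assigning points to regression models and refitting each model by least squares, is given below. This generic k-plane style illustration is not the incremental algorithm of the paper; K, the synthetic data, and the iteration count are assumptions.

```python
# Clusterwise linear regression by alternating assignment and refitting.
import numpy as np

def clusterwise_linear_regression(X, y, K=2, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    Xb = np.hstack([X, np.ones((len(X), 1))])      # add intercept column
    labels = rng.integers(0, K, size=len(y))
    coefs = np.zeros((K, Xb.shape[1]))
    for _ in range(n_iter):
        # Refit one linear model per cluster.
        for k in range(K):
            mask = labels == k
            if mask.sum() >= Xb.shape[1]:
                coefs[k], *_ = np.linalg.lstsq(Xb[mask], y[mask], rcond=None)
        # Reassign each point to the model with the smallest squared residual.
        residuals = (Xb @ coefs.T - y[:, None]) ** 2
        labels = residuals.argmin(axis=1)
    return coefs, labels

# Toy usage: data drawn from two different linear regimes.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 1))
y = np.where(X[:, 0] < 0.5, 2 * X[:, 0], -3 * X[:, 0] + 4) + 0.05 * rng.standard_normal(200)
coefs, labels = clusterwise_linear_regression(X, y, K=2)
print(coefs)
```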

  2. A Compendium of Wind Statistics and Models for the NASA Space Shuttle and Other Aerospace Vehicle Programs

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Adelfang, S. I.

    1998-01-01

    The wind profile with all of its variations with respect to altitude has been, is now, and will continue to be important for aerospace vehicle design and operations. Wind profile databases and models are used for the vehicle ascent flight design for structural wind loading, flight control systems, performance analysis, and launch operations. This report presents the evolution of wind statistics and wind models from the empirical scalar wind profile model established for the Saturn Program through the development of the vector wind profile model used for the Space Shuttle design to the variations of this wind modeling concept for the X-33 program. Because wind is a vector quantity, the vector wind models use the rigorous mathematical probability properties of the multivariate normal probability distribution. When the vehicle ascent steering commands (ascent guidance) are wind biased to the wind profile measured on the day-of-launch, ascent structural wind loads are reduced and launch probability is increased. This wind load alleviation technique is recommended in the initial phase of vehicle development. The vehicle must fly through the largest load allowable versus altitude to achieve its mission. The Gumbel extreme value probability distribution is used to obtain the probability of exceeding (or not exceeding) the load allowable. The time conditional probability function is derived from the Gumbel bivariate extreme value distribution. This time conditional function is used for calculation of wind loads persistence increments using 3.5-hour Jimsphere wind pairs. These increments are used to protect the commit-to-launch decision. Other topics presented include the Shuttle load response to smoothed wind profiles, a new gust model, and advancements in wind profile measuring systems. From the lessons learned and knowledge gained from past vehicle programs, the development of future launch vehicles can be accelerated. However, new vehicle programs by their very nature will require specialized support for new databases and analyses for wind, atmospheric parameters (pressure, temperature, and density versus altitude), and weather. It is for this reason that project managers are encouraged to collaborate with natural environment specialists early in the conceptual design phase. Such action will give the lead time necessary to meet the natural environment design and operational requirements, and thus, reduce development costs.
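
    The Gumbel-based exceedance calculation mentioned above can be sketched as follows. The synthetic peak-load sample and the allowable value are fabricated, and scipy's gumbel_r is used only to illustrate fitting the distribution and evaluating the probability of exceeding a load allowable.

```python
# Fit a Gumbel (extreme value) distribution to peak loads and compute the
# probability of exceeding (or not exceeding) a load allowable.
import numpy as np
from scipy.stats import gumbel_r

peak_loads = gumbel_r.rvs(loc=100.0, scale=8.0, size=500, random_state=0)  # synthetic sample

loc, scale = gumbel_r.fit(peak_loads)       # fit Gumbel parameters
load_allowable = 125.0                      # hypothetical allowable

p_exceed = gumbel_r.sf(load_allowable, loc=loc, scale=scale)   # P(load > allowable)
print(f"P(exceed allowable)     = {p_exceed:.4f}")
print(f"P(not exceed allowable) = {1 - p_exceed:.4f}")
```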

  3. Progressive simplification and transmission of building polygons based on triangle meshes

    NASA Astrophysics Data System (ADS)

    Li, Hongsheng; Wang, Yingjie; Guo, Qingsheng; Han, Jiafu

    2010-11-01

    Digital earth is a virtual representation of our planet and a data integration platform which aims at harnessing multisource, multi-resolution, multi-format spatial data. This paper introduces a research framework integrating progressive cartographic generalization and transmission of vector data. Progressive cartographic generalization provides multiple-resolution data, from coarse to fine, as key scales and increments between them, which are not available in the traditional generalization framework. Based on the progressive simplification algorithm, the building polygons are triangulated into meshes and encoded according to the simplification sequence of two basic operations, edge collapse and vertex split. The map data at key scales and the encoded increments between them are stored in a multi-resolution file. As the client submits requests to the server, the coarsest map is transmitted first and then the increments. After data decoding and mesh refinement, the building polygons with more details are visualized. Progressive generalization and transmission of building polygons is demonstrated in the paper.

  4. TargetM6A: Identifying N6-Methyladenosine Sites From RNA Sequences via Position-Specific Nucleotide Propensities and a Support Vector Machine.

    PubMed

    Li, Guang-Qing; Liu, Zi; Shen, Hong-Bin; Yu, Dong-Jun

    2016-10-01

    As one of the most ubiquitous post-transcriptional modifications of RNA, N6-methyladenosine (m6A) plays an essential role in many vital biological processes. The identification of m6A sites in RNAs is significantly important for both basic biomedical research and practical drug development. In this study, we designed a computational method, called TargetM6A, to rapidly and accurately target m6A sites solely from the primary RNA sequences. Two new features, i.e., position-specific nucleotide/dinucleotide propensities (PSNP/PSDP), are introduced and combined with the traditional nucleotide composition (NC) feature to formulate RNA sequences. The extracted features are further optimized to obtain a much more compact and discriminative feature subset by applying an incremental feature selection (IFS) procedure. Based on the optimized feature subset, we trained TargetM6A on the training dataset with a support vector machine (SVM) as the prediction engine. We compared the proposed TargetM6A method with existing methods for predicting m6A sites by performing stringent jackknife tests and independent validation tests on benchmark datasets. The experimental results show that the proposed TargetM6A method outperformed the existing methods for predicting m6A sites and remarkably improved the prediction performances, with MCC = 0.526 and AUC = 0.818. We also provide a user-friendly web server for TargetM6A, which is publicly accessible for academic use at http://csbio.njust.edu.cn/bioinf/TargetM6A.
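
    One simplified reading of the position-specific nucleotide propensity (PSNP) feature is a per-position frequency difference between positive and negative training sequences, as sketched below. The exact definition used in TargetM6A may differ; the toy 7-nt windows and function names are assumptions.

```python
# Sketch of a PSNP-style feature: per-position nucleotide frequency
# differences between positive and negative sequence sets.
import numpy as np

NUCS = "ACGU"

def position_frequencies(seqs):
    """Return an L x 4 matrix of nucleotide frequencies per position."""
    L = len(seqs[0])
    freq = np.zeros((L, 4))
    for s in seqs:
        for i, ch in enumerate(s):
            freq[i, NUCS.index(ch)] += 1
    return freq / len(seqs)

def psnp_matrix(pos_seqs, neg_seqs):
    return position_frequencies(pos_seqs) - position_frequencies(neg_seqs)

def encode(seq, psnp):
    """Encode a sequence as the vector of propensities at each position."""
    return np.array([psnp[i, NUCS.index(ch)] for i, ch in enumerate(seq)])

# Toy usage with hypothetical 7-nt windows centred on a candidate adenosine.
pos = ["GGACUAA", "GAACUAC", "GGACAAA"]
neg = ["UUACGGC", "CCACUUU", "AGACCCA"]
psnp = psnp_matrix(pos, neg)
print(encode("GGACUAC", psnp))
```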

  5. Attachment, social support, and responses following the death of a companion animal.

    PubMed

    King, Loren C; Werner, Paul D

    This research tested hypotheses concerning attachment, social support, and grief responses to the loss of animal companionship. Participants whose companion cat or dog had recently died (N = 429) completed the Attachment Style Questionnaire, the Inventory of Complicated Grief, and the Multidimensional Health Profile-Psychosocial Functioning questionnaires. Both attachment anxiety and attachment avoidance were found to be positively associated with respondents' grief, depression, anxiety, and somatic symptoms. Social support was found to be negatively associated with these outcomes as well as with attachment anxiety and attachment avoidance. In multiple regression analyses, attachment anxiety incrementally predicted grief, anxiety and somatic symptoms, attachment avoidance incrementally predicted grief and depression, and social support incrementally predicted all outcomes. Interaction effects of attachment and social support in relation to outcomes were not found. The present study's implications and limitations are discussed, as are directions for future research.

  6. "Lollipop-shaped" high-sensitivity Microelectromechanical Systems vector hydrophone based on Parylene encapsulation

    NASA Astrophysics Data System (ADS)

    Liu, Yuan; Wang, Renxin; Zhang, Guojun; Du, Jin; Zhao, Long; Xue, Chenyang; Zhang, Wendong; Liu, Jun

    2015-07-01

    This paper presents methods of improving the sensitivity of a Microelectromechanical Systems (MEMS) vector hydrophone by increasing the sensing area of the cilium and by using a fully insulative Parylene membrane. First, a low-density sphere is integrated with the cilium to compose a "lollipop shape," which considerably increases the sensing area. A mathematical model of the sensitivity of the "lollipop-shaped" MEMS vector hydrophone is presented, and the influences of different structural parameters on the sensitivity are analyzed via simulation. Second, the MEMS vector hydrophone is encapsulated through the conformal deposition of an insulative Parylene membrane, which enables underwater acoustic monitoring without any sound-transparent encapsulation. Finally, the characterization results demonstrate that the sensitivity reaches up to -183 dB (at 500 Hz, 0 dB reference 1 V/μPa), an increase of more than 10 dB compared with the previous cilium-shaped MEMS vector hydrophone. In addition, the frequency response shows a sensitivity increment of 6 dB per octave. The working frequency band is 20-500 Hz and the concave point depth of the figure-8 directivity is beyond 30 dB, indicating that the hydrophone is promising for underwater acoustic applications.

  7. Personal Computer Transport Analysis Program

    NASA Technical Reports Server (NTRS)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
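
    Building the solution vector "in the order of their inlet dependency on other components" is essentially a topological ordering of the component network. The sketch below illustrates that with a simple Kahn-style sort; the component names and connections are invented, and the handling of flow loops is ignored.

```python
# Order components so that each one appears after everything that feeds it.
from collections import deque

def build_solution_vector(components, feeds):
    """components: list of names; feeds: dict mapping a component to the
    downstream components that receive its outlet flow."""
    indegree = {c: 0 for c in components}
    for src, dests in feeds.items():
        for d in dests:
            indegree[d] += 1
    queue = deque(c for c in components if indegree[c] == 0)
    order = []
    while queue:
        c = queue.popleft()
        order.append(c)
        for d in feeds.get(c, []):
            indegree[d] -= 1
            if indegree[d] == 0:
                queue.append(d)
    return order

# Hypothetical loop-free fragment of a thermal-fluid network.
components = ["pump", "cold_plate", "heat_exchanger", "scrubber", "tank"]
feeds = {"pump": ["cold_plate"], "cold_plate": ["heat_exchanger"],
         "heat_exchanger": ["scrubber"], "scrubber": ["tank"]}
solution_vector = build_solution_vector(components, feeds)
print(solution_vector)
# At each time step, components would then be updated in this order.
```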

  8. Fast Query-Optimized Kernel-Machine Classification

    NASA Technical Reports Server (NTRS)

    Mazzoni, Dominic; DeCoste, Dennis

    2004-01-01

    A recently developed algorithm performs kernel-machine classification via incremental approximate nearest support vectors. The algorithm implements support-vector machines (SVMs) at speeds 10 to 100 times those attainable by use of conventional SVM algorithms. The algorithm offers potential benefits for classification of images, recognition of speech, recognition of handwriting, and diverse other applications in which there are requirements to discern patterns in large sets of data. SVMs constitute a subset of kernel machines (KMs), which have become popular as models for machine learning and, more specifically, for automated classification of input data on the basis of labeled training data. While similar in many ways to k-nearest-neighbors (k-NN) models and artificial neural networks (ANNs), SVMs tend to be more accurate. Using representations that scale only linearly in the numbers of training examples, while exploring nonlinear (kernelized) feature spaces that are exponentially larger than the original input dimensionality, KMs elegantly and practically overcome the classic curse of dimensionality. However, the price that one must pay for the power of KMs is that query-time complexity scales linearly with the number of training examples, making KMs often orders of magnitude more computationally expensive than are ANNs, decision trees, and other popular machine learning alternatives. The present algorithm treats an SVM classifier as a special form of a k-NN. The algorithm is based partly on an empirical observation that one can often achieve the same classification as that of an exact KM by using only a small fraction of the nearest support vectors (SVs) of a query. The exact KM output is a weighted sum over the kernel values between the query and the SVs. In this algorithm, the KM output is approximated with a k-NN classifier, the output of which is a weighted sum only over the kernel values involving k selected SVs. Before query time, there are gathered statistics about how misleading the output of the k-NN model can be, relative to the outputs of the exact KM for a representative set of examples, for each possible k from 1 to the total number of SVs. From these statistics, there are derived upper and lower thresholds for each step k. These thresholds identify output levels for which the particular variant of the k-NN model already leans so strongly positively or negatively that a reversal in sign is unlikely, given the weaker SV neighbors still remaining. At query time, the partial output of each query is incrementally updated, stopping as soon as it exceeds the predetermined statistical thresholds of the current step. For an easy query, stopping can occur as early as step k = 1. For more difficult queries, stopping might not occur until nearly all SVs are touched. A key empirical observation is that this approach can tolerate very approximate nearest-neighbor orderings. In experiments, SVs and queries were projected to a subspace comprising the top few principal-component dimensions and neighbor orderings were computed in that subspace. This approach ensured that the overhead of the nearest-neighbor computations was insignificant, relative to that of the exact KM computation.
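
    The query-time early stopping described above can be sketched as a running sum over support vectors that halts once it crosses a per-step threshold. In the example below the kernel, thresholds, and data are placeholders rather than the statistics derived in the article.

```python
# Approximate kernel-machine query with early stopping on per-step thresholds.
import numpy as np

def rbf(x, z, gamma=0.5):
    return np.exp(-gamma * np.sum((x - z) ** 2))

def approx_svm_query(x, svs, weights, bias, upper_thr, lower_thr, gamma=0.5):
    """svs are assumed pre-ordered from (approximately) nearest to farthest."""
    partial = bias
    for k, (sv, w) in enumerate(zip(svs, weights)):
        partial += w * rbf(x, sv, gamma)
        # Stop early once the partial output leans strongly enough that the
        # remaining (weaker) support vectors are unlikely to flip the sign.
        if partial >= upper_thr[k]:
            return +1, k + 1
        if partial <= lower_thr[k]:
            return -1, k + 1
    return (1 if partial >= 0 else -1), len(svs)

# Toy usage with fabricated support vectors and loose thresholds.
rng = np.random.default_rng(0)
svs = rng.standard_normal((50, 3))
weights = rng.standard_normal(50)
upper = np.full(50, 2.0)      # in practice derived from validation statistics
lower = np.full(50, -2.0)
label, n_used = approx_svm_query(np.zeros(3), svs, weights, bias=0.1,
                                 upper_thr=upper, lower_thr=lower)
print(label, "support vectors touched:", n_used)
```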

  9. A study on the uniqueness of the plastic flow direction for granular assemblies of ductile particles using discrete finite-element simulations

    NASA Astrophysics Data System (ADS)

    Abdelmoula, Nouha; Harthong, Barthélémy; Imbault, Didier; Dorémus, Pierre

    2017-12-01

    The multi-particle finite element method involving assemblies of meshed particles interacting through finite-element contact conditions is adopted to study the plastic flow of a granular material with highly deformable elastic-plastic grains. In particular, it is investigated whether the flow rule postulate applies for such materials. Using a spherical stress probing method, the influence of incremental stress on plastic strain increment vectors was assessed for numerical samples compacted along two different loading paths up to different values of relative density. Results show that the numerical samples studied behave reasonably well according to an associated flow rule, except in the vicinity of the loading point where the influence of the stress increment proved to be very significant. A plausible explanation for the non-uniqueness of the direction of plastic flow is proposed, based on the idea that the resistance of the numerical sample to plastic straining can vary by an order of magnitude depending on the direction of the accumulated stress. The above-mentioned dependency of the direction of plastic flow on the direction of the stress increment was related to the difference in strength between shearing and normal stressing at the scale of contact surfaces between particles.

  10. Global Combat Support System - Joint Increment 8 (GCSS-J Inc 8)

    DTIC Science & Technology

    2016-03-01

    Acronyms: Acquisition Executive; DoD - Department of Defense; DoDAF - DoD Architecture Framework; FD - Full Deployment; FDD - Full Deployment Decision; FY - Fiscal... Schedule, estimate (or actual): Milestone B1, Mar 2014 / Mar 2014; Milestone C1, Mar 2014 / Mar 2014; Increment 8 FDD, Dec 2018 / Dec 2018; Increment 8 FD, TBD / TBD. Memo 1

  11. A Two-Layer Least Squares Support Vector Machine Approach to Credit Risk Assessment

    NASA Astrophysics Data System (ADS)

    Liu, Jingli; Li, Jianping; Xu, Weixuan; Shi, Yong

    Least squares support vector machine (LS-SVM) is a revised version of the support vector machine (SVM) and has been proved to be a useful tool for pattern recognition. LS-SVM has excellent generalization performance and low computational cost. In this paper, we propose a new method called the two-layer least squares support vector machine, which combines kernel principal component analysis (KPCA) and the linear programming form of the least squares support vector machine. With this method, sparseness and robustness are obtained while handling high-dimensional and large-scale databases. A U.S. commercial credit card database is used to test the efficiency of our method, and the result proved to be satisfactory.
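
    The two-layer structure can be illustrated, very loosely, as a kernel PCA feature layer followed by a linear max-margin classifier. In the sketch below, scikit-learn's LinearSVC stands in for the paper's linear-programming LS-SVM and the credit data are simulated, so it shows only the architecture, not the actual method.

```python
# Two-layer sketch: KPCA nonlinear feature layer + linear large-margin classifier.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import KernelPCA
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 12))                  # stand-in for applicant features
y = (X[:, 0] * X[:, 1] + 0.3 * rng.standard_normal(300) > 0).astype(int)  # synthetic labels

model = make_pipeline(
    KernelPCA(n_components=10, kernel="rbf", gamma=0.1),   # layer 1: nonlinear features
    LinearSVC(C=1.0, max_iter=5000),                       # layer 2: linear classifier
)
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```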

  12. Robust support vector regression networks for function approximation with outliers.

    PubMed

    Chuang, Chen-Chia; Su, Shun-Feng; Jeng, Jin-Tsong; Hsiao, Chih-Ching

    2002-01-01

    Support vector regression (SVR) employs the support vector machine (SVM) to tackle problems of function approximation and regression estimation. SVR has been shown to have good robustness against noise. However, when the parameters used in SVR are improperly selected, overfitting phenomena may still occur, and the selection of the various parameters is not straightforward. Besides, in SVR, outliers may also possibly be taken as support vectors. Such an inclusion of outliers in the support vectors may lead to serious overfitting phenomena. In this paper, a novel regression approach, termed the robust support vector regression (RSVR) network, is proposed to enhance the robust capability of SVR. In the approach, traditional robust learning approaches are employed to improve the learning performance for any selected parameters. From the simulation results, our RSVR can always improve the performance of the learned systems for all cases. Besides, it can be found that even when training lasted for a long period, the testing errors did not go up. In other words, the overfitting phenomenon is indeed suppressed.

  13. Fuzzy support vector machine: an efficient rule-based classification technique for microarrays.

    PubMed

    Hajiloo, Mohsen; Rabiee, Hamid R; Anooshahpour, Mahdi

    2013-01-01

    The abundance of gene expression microarray data has led to the development of machine learning algorithms applicable for tackling disease diagnosis, disease prognosis, and treatment selection problems. However, these algorithms often produce classifiers with weaknesses in terms of accuracy, robustness, and interpretability. This paper introduces the fuzzy support vector machine, a learning algorithm based on a combination of fuzzy classifiers and kernel machines, for microarray classification. Experimental results on public leukemia, prostate, and colon cancer datasets show that the fuzzy support vector machine, applied in combination with filter or wrapper feature selection methods, develops a robust model with higher accuracy than conventional microarray classification models such as the support vector machine, artificial neural network, decision trees, k nearest neighbors, and diagonal linear discriminant analysis. Furthermore, the interpretable rule base inferred from the fuzzy support vector machine helps extract biological knowledge from microarray data. The fuzzy support vector machine, as a new classification model with high generalization power, robustness, and good interpretability, seems to be a promising tool for gene expression microarray classification.

  14. Currency crisis indication by using ensembles of support vector machine classifiers

    NASA Astrophysics Data System (ADS)

    Ramli, Nor Azuana; Ismail, Mohd Tahir; Wooi, Hooy Chee

    2014-07-01

    There are many methods that have been tried in the analysis of currency crises. However, not all methods can provide accurate indications. This paper introduces an ensemble of classifiers using the Support Vector Machine, which has not previously been applied in currency crisis analysis, with the aim of increasing the indication accuracy. The proposed ensemble classifiers' performances are measured using the percentage of accuracy, the root mean squared error (RMSE), the area under the Receiver Operating Characteristic (ROC) curve, and the Type II error. The performance of the ensemble of Support Vector Machine classifiers is compared with that of the single Support Vector Machine classifier, and both classifiers are tested on a data set from 27 countries with 12 macroeconomic indicators for each country. From our analyses, the results show that the ensemble of Support Vector Machine classifiers outperforms the single Support Vector Machine classifier on the problem of indicating a currency crisis across a range of standard measures for comparing classifier performance.
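
    One concrete way to build such an ensemble is bootstrap aggregation of SVM classifiers, sketched below against a single SVM. The simulated indicators and crisis labels, and the choice of bagging as the ensembling scheme, are assumptions for illustration; the paper's exact ensemble construction may differ.

```python
# Compare a single SVM classifier with a bagged ensemble of SVM classifiers.
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 12))                       # 12 macroeconomic indicators
y = (X[:, :3].sum(axis=1) + 0.5 * rng.standard_normal(500) > 1.0).astype(int)  # synthetic crisis flag

single_svm = SVC(kernel="rbf", C=1.0, gamma="scale")
ensemble_svm = BaggingClassifier(SVC(kernel="rbf", C=1.0, gamma="scale"),
                                 n_estimators=25, random_state=0)

print("single SVM accuracy:  ", cross_val_score(single_svm, X, y, cv=5).mean())
print("SVM ensemble accuracy:", cross_val_score(ensemble_svm, X, y, cv=5).mean())
```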

  15. Prediction of rat protein subcellular localization with pseudo amino acid composition based on multiple sequential features.

    PubMed

    Shi, Ruijia; Xu, Cunshuan

    2011-06-01

    The study of rat proteins is an indispensable task in experimental medicine and drug development. The function of a rat protein is closely related to its subcellular location. Based on this concept, we construct a benchmark rat protein dataset and develop a combined approach for predicting the subcellular localization of rat proteins. From the protein primary sequence, multiple sequential features are obtained by using discrete Fourier analysis, a position conservation scoring function, and the increment of diversity, and these sequential features are selected as input parameters of the support vector machine. By the jackknife test, the overall success rate of prediction is 95.6% on the rat protein dataset. Our method is also applied to the apoptosis protein dataset and the Gram-negative bacterial protein dataset with the jackknife test; the overall success rates are 89.9% and 96.4%, respectively. The above results indicate that our proposed method is quite promising and may play a complementary role to the existing predictors in this area.

  16. SVR-based prediction of carbon emissions from energy consumption in Henan Province

    NASA Astrophysics Data System (ADS)

    Gou, Guohua

    2018-02-01

    This paper analyzes the advantages of support vector regression (SVR) in the prediction of carbon emissions and establishes an SVR-based carbon emission prediction model. The model is built using data on Henan's carbon emissions and influence factors from 1991 to 2016 for training and testing, and is then used to predict the carbon emissions from 2017 to 2021. The results show that, from the perspective of carbon emissions from energy consumption, emissions rose by 224.876 million tons of carbon dioxide from 1991 to 2016, and the predicted increment from 2017 to 2021 is 30.5563 million tons, with an average annual growth rate of 3%. From the perspective of growth rate, among the six factors related to carbon emissions, population, urbanization rate, per capita GDP, and energy consumption per unit of GDP influence the growth rate of carbon emissions less than the proportion of secondary industry and the coal consumption ratio. Finally, some suggestions are proposed for the carbon emission reduction of Henan Province.

  17. International Space Station Increment-4/5 Microgravity Environment Summary Report

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Hrovat, Kenneth; Kelly, Eric; McPherson, Kevin; Reckart, Timothy

    2003-01-01

    This summary report presents the results of some of the processed acceleration data measured aboard the International Space Station during the period of December 2001 to December 2002. Unlike the past two ISS Increment reports, which were increment specific, this summary report covers two increments: Increments 4 and 5, hereafter referred to as Increment-4/5. Two accelerometer systems were used to measure the acceleration levels for the activities that took place during Increment-4/5. Due to time constraints and the lack of precise timeline information regarding some payload operations and station activities, not all of the activities were analyzed for this report. The National Aeronautics and Space Administration sponsors the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System to support microgravity science experiments which require microgravity acceleration measurements. On April 19, 2001, both the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System units were launched on STS-100 from the Kennedy Space Center for installation on the International Space Station. The Microgravity Acceleration Measurement System supports science experiments requiring quasi-steady acceleration measurements, while the Space Acceleration Measurement System unit supports experiments requiring vibratory acceleration measurement. The International Space Station Increment-4/5 reduced gravity environment analysis presented in this report uses acceleration data collected by both sets of accelerometer systems: The Microgravity Acceleration Measurement System, which consists of two sensors: the low-frequency Orbital Acceleration Research Experiment Sensor Subsystem and the higher frequency High Resolution Accelerometer Package. The low frequency sensor measures up to 1 Hz, but is routinely trimmean filtered to yield much lower frequency acceleration data up to 0.01 Hz. This filtered data can be mapped to arbitrary locations for characterizing the quasi-steady environment for payloads and the vehicle. The high frequency sensor is used to characterize the vibratory environment up to 100 Hz at a single measurement location. The Space Acceleration Measurement System, which deploys high frequency sensors, measures vibratory acceleration data in the range of 0.01 to 400 Hz at multiple measurement locations. This summary report presents analysis of some selected quasi-steady and vibratory activities measured by these accelerometers during Increment-4/5 from December 2001 to December 2002.

  18. Incremental impact upon malaria transmission of supplementing pyrethroid-impregnated long-lasting insecticidal nets with indoor residual spraying using pyrethroids or the organophosphate, pirimiphos methyl.

    PubMed

    Hamainza, Busiku; Sikaala, Chadwick H; Moonga, Hawela B; Chanda, Javan; Chinula, Dingani; Mwenda, Mulenga; Kamuliwo, Mulakwa; Bennett, Adam; Seyoum, Aklilu; Killeen, Gerry F

    2016-02-18

    Long-lasting insecticidal nets (LLINs) and indoor residual spraying (IRS) are the most widely accepted and applied malaria vector control methods. However, evidence that incremental impact is achieved when they are combined remains limited and inconsistent. Fourteen population clusters of approximately 1000 residents each in Zambia's Luangwa and Nyimba districts, which had high pre-existing usage rates (81.7 %) of pyrethroid-impregnated LLINs, were quasi-randomly assigned to receive IRS with either of two pyrethroids, namely deltamethrin [wettable granules (WG)] and lambdacyhalothrin [capsule suspension (CS)], with an emulsifiable concentrate (EC) or CS formulation of the organophosphate pirimiphos methyl (PM), or with no supplementary vector control measure. Diagnostic positivity of patients tested for malaria by community health workers in these clusters was surveyed longitudinally over pre- and post-treatment periods spanning 29 months, over which the treatments were allocated and re-allocated in advance of three sequential rainy seasons. Supplementation of LLINs with PM CS offered the greatest initial level of protection against malaria in the first 3 months of application (incremental protective efficacy (IPE) [95 % confidence interval (CI)] = 0.63 [CI 0.57, 0.69], P < 0.001), followed by lambdacyhalothrin (IPE [95 % CI] = 0.31 [0.10, 0.47], P = 0.006) and PM EC (IPE, 0.23 [CI 0.15, 0.31], P < 0.001) and then by deltamethrin (IPE [95 % CI] = 0.19 [-0.01, 0.35], P = 0.064). Neither pyrethroid formulation provided protection beyond 3 months after spraying, but the protection provided by both PM formulations persisted undiminished for longer periods: 6 months for CS and 12 months for EC. The CS formulation of PM provided greater protection than the combined pyrethroid IRS formulations throughout its effective life (IPE [95 % CI] = 0.79 [0.75, 0.83] over 6 months). The EC formulation of PM provided incremental protection for the first 3 months (IPE [95 % CI] = 0.23 [0.15, 0.31]) that was approximately equivalent to the two pyrethroid formulations (lambdacyhalothrin, IPE [95 % CI] = 0.31 [0.10, 0.47] and deltamethrin, IPE [95 % CI] = 0.19 [-0.01, 0.35]), but the additional protection provided by the former apparently lasted an entire year. Where universal coverage targets for LLIN utilization have been achieved, supplementing LLINs with IRS using pyrethroids may reduce malaria transmission below levels achieved by LLIN use alone, even in settings where pyrethroid resistance occurs in the vector population. However, far greater reduction of transmission can be achieved under such conditions by supplementing LLINs with IRS using non-pyrethroid insecticide classes, such as organophosphates, so this is a viable approach to mitigating and managing pyrethroid resistance.

  19. Identification of high shears and compressive discontinuities in the inner heliosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greco, A.; Perri, S.

    2014-04-01

    Two techniques, the Partial Variance of Increments (PVI) and the Local Intermittency Measure (LIM), have been applied and compared using MESSENGER magnetic field data in the solar wind at a heliocentric distance of about 0.3 AU. The spatial properties of the turbulent field at different scales, spanning the whole inertial range of magnetic turbulence down toward the proton scales, have been studied. The LIM and PVI methodologies allow us to identify portions of an entire time series where magnetic energy is mostly accumulated, and regions of intermittent bursts in the magnetic field vector increments, respectively. A statistical analysis has revealed that at small time scales and for high levels of the threshold, the bursts present in the PVI and the LIM series correspond to regions of high shear stress and high magnetic field compressibility.
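
    The PVI measure itself is straightforward to compute: the magnitude of the vector field increment at a given lag, normalized by the square root of its mean squared value. The sketch below shows a direct implementation; the synthetic field and the threshold of 3 are only illustrative.

```python
# Partial Variance of Increments (PVI) for a vector time series.
import numpy as np

def pvi(b, lag):
    """b: (N, 3) array of magnetic field vectors; lag: increment in samples."""
    db = b[lag:] - b[:-lag]                      # vector increments at this lag
    mag = np.linalg.norm(db, axis=1)             # |delta b(t, tau)|
    return mag / np.sqrt(np.mean(mag ** 2))      # normalized PVI series

# Toy usage: correlated noise standing in for solar-wind magnetic field data.
rng = np.random.default_rng(0)
steps = rng.standard_normal((5000, 3))
b = np.cumsum(steps, axis=0) * 0.01
series = pvi(b, lag=10)
events = np.flatnonzero(series > 3.0)            # candidate coherent structures
print(len(events), "points above the PVI > 3 threshold")
```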

  20. TWSVR: Regression via Twin Support Vector Machine.

    PubMed

    Khemchandani, Reshma; Goyal, Keshav; Chandra, Suresh

    2016-02-01

    Taking motivation from Twin Support Vector Machine (TWSVM) formulation, Peng (2010) attempted to propose Twin Support Vector Regression (TSVR) where the regressor is obtained via solving a pair of quadratic programming problems (QPPs). In this paper we argue that TSVR formulation is not in the true spirit of TWSVM. Further, taking motivation from Bi and Bennett (2003), we propose an alternative approach to find a formulation for Twin Support Vector Regression (TWSVR) which is in the true spirit of TWSVM. We show that our proposed TWSVR can be derived from TWSVM for an appropriately constructed classification problem. To check the efficacy of our proposed TWSVR we compare its performance with TSVR and classical Support Vector Regression (SVR) on various regression datasets. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Multiscale asymmetric orthogonal wavelet kernel for linear programming support vector learning and nonlinear dynamic systems identification.

    PubMed

    Lu, Zhao; Sun, Jing; Butts, Kenneth

    2014-05-01

    Support vector regression for approximating nonlinear dynamic systems is more delicate than the approximation of indicator functions in support vector classification, particularly for systems that involve multitudes of time scales in their sampled data. The kernel used for support vector learning determines the class of functions from which a support vector machine can draw its solution, and the choice of kernel significantly influences the performance of a support vector machine. In this paper, to bridge the gap between wavelet multiresolution analysis and kernel learning, the closed-form orthogonal wavelet is exploited to construct new multiscale asymmetric orthogonal wavelet kernels for linear programming support vector learning. The closed-form multiscale orthogonal wavelet kernel provides a systematic framework to implement multiscale kernel learning via dyadic dilations and also enables us to represent complex nonlinear dynamics effectively. To demonstrate the superiority of the proposed multiscale wavelet kernel in identifying complex nonlinear dynamic systems, two case studies are presented that aim at building parallel models on benchmark datasets. The development of parallel models that address the long-term/mid-term prediction issue is more intricate and challenging than the identification of series-parallel models where only one-step ahead prediction is required. Simulation results illustrate the effectiveness of the proposed multiscale kernel learning.

  2. Incremental dynamical downscaling for probabilistic analysis based on multiple GCM projections

    NASA Astrophysics Data System (ADS)

    Wakazuki, Y.

    2015-12-01

    A dynamical downscaling method for probabilistic regional-scale climate change projections was developed to cover the uncertainty of multiple general circulation model (GCM) climate simulations. The climatological increments (future minus present climate states) estimated from the GCM simulation results were statistically analyzed using singular vector decomposition. Both positive and negative perturbations from the ensemble mean, with magnitudes of one standard deviation, were extracted and added to the ensemble mean of the climatological increments. The analyzed multiple modal increments were used to create multiple modal lateral boundary conditions for the future-climate regional climate model (RCM) simulations by adding them to an objective analysis dataset. This data handling can be regarded as an advanced version of the pseudo-global-warming (PGW) method previously developed by Kimura and Kitoh (2007). The incremental handling of the GCM simulations realizes approximate probabilistic climate change projections with a smaller number of RCM simulations. Three values of a climatological variable simulated by the RCMs for a given mode were used to estimate the response to the perturbation of that mode. For the probabilistic analysis, climatological variables of the RCMs were assumed to respond linearly to the multiple modal perturbations, although nonlinearity was seen for local-scale rainfall. The probability distribution of temperature could be estimated with two perturbation modes, for which the number of RCM simulations for the future climate is five. On the other hand, local-scale rainfall needed four modes, for which the number of RCM simulations is nine. The probabilistic method is expected to be used for regional-scale climate change impact assessment in the future.
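
    The modal decomposition of ensemble increments can be sketched with a plain SVD, as below: the ensemble-mean increment plus and minus one-standard-deviation perturbations along the leading modes give the small set of modal increments used as RCM boundary forcings. The random ensemble and the choice of two modes are placeholders.

```python
# Extract dominant perturbation modes from an ensemble of GCM increments.
import numpy as np

rng = np.random.default_rng(0)
n_models, n_grid = 12, 500
increments = rng.standard_normal((n_models, n_grid))      # future-minus-present fields

mean_inc = increments.mean(axis=0)
anom = increments - mean_inc                               # deviations from ensemble mean

# SVD of the anomaly matrix: rows of vt are spatial modes, s their amplitudes.
u, s, vt = np.linalg.svd(anom, full_matrices=False)

n_modes = 2
modal_increments = [mean_inc]                              # ensemble-mean increment
for k in range(n_modes):
    std_k = s[k] / np.sqrt(n_models - 1)                   # std dev along mode k
    modal_increments.append(mean_inc + std_k * vt[k])      # +1 sigma perturbation
    modal_increments.append(mean_inc - std_k * vt[k])      # -1 sigma perturbation

print(len(modal_increments), "modal increments for RCM boundary conditions")
```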

  3. A Code Generation Approach for Auto-Vectorization in the Spade Compiler

    NASA Astrophysics Data System (ADS)

    Wang, Huayong; Andrade, Henrique; Gedik, Buğra; Wu, Kun-Lung

    We describe an auto-vectorization approach for the Spade stream processing programming language, comprising two ideas. First, we provide support for vectors as a primitive data type. Second, we provide a C++ library with architecture-specific implementations of a large number of pre-vectorized operations as the means to support language extensions. We evaluate our approach with several stream processing operators, contrasting Spade's auto-vectorization with the native auto-vectorization provided by the GNU gcc and Intel icc compilers.

  4. Competition in Weapon Systems Acquisition: Cost Analyses of Some Issues

    DTIC Science & Technology

    1990-09-01

    10% increments, also known as the step-ladder bids) submitted by the contractor in the first year of dual source procurement. The triangles represent...savings by subtracting annual incremental government costs, stated in constant dollars, from (3). (5) Estimate nonrecurring start-up costs, stated in...constant dollars, by fiscal year. (6) Estimate incremental logistic support costs, stated in constant dollars, by fiscal year. (7) Calculate a net

  5. Signal detection using support vector machines in the presence of ultrasonic speckle

    NASA Astrophysics Data System (ADS)

    Kotropoulos, Constantine L.; Pitas, Ioannis

    2002-04-01

    Support Vector Machines are a general algorithm based on guaranteed risk bounds of statistical learning theory. They have found numerous applications, such as in classification of brain PET images, optical character recognition, object detection, face verification, text categorization and so on. In this paper we propose the use of support vector machines to segment lesions in ultrasound images and we assess thoroughly their lesion detection ability. We demonstrate that trained support vector machines with a Radial Basis Function kernel segment satisfactorily (unseen) ultrasound B-mode images as well as clinical ultrasonic images.

  6. Bayesian Hierarchical Model Characterization of Model Error in Ocean Data Assimilation and Forecasts

    DTIC Science & Technology

    2013-09-30

    proof-of-concept results comparing a BHM surface wind ensemble with the increments in the surface momentum flux control vector in a four-dimensional...[slide titles: surface momentum flux ensembles from summaries of BHM winds (Mediterranean), including the ocean current effect]...Bayesian Hierarchical Model to provide surface momentum flux ensembles. Figure 2: Domain of interest; squares indicate spatial locations where

  7. Bayesian Hierarchical Model Characterization of Model Error in Ocean Data Assimilation and Forecasts

    DTIC Science & Technology

    2013-09-30

    wind ensemble with the increments in the surface momentum flux control vector in a four-dimensional variational (4dvar) assimilation system. The...[slide titles: stability effects, surface stress, surface momentum flux ensembles from summaries of BHM winds (Mediterranean)]...surface wind speed given ensemble winds from a Bayesian Hierarchical Model to provide surface momentum flux ensembles. Figure 2: Domain of

  8. Support Vector Machines Model of Computed Tomography for Assessing Lymph Node Metastasis in Esophageal Cancer with Neoadjuvant Chemotherapy.

    PubMed

    Wang, Zhi-Long; Zhou, Zhi-Guo; Chen, Ying; Li, Xiao-Ting; Sun, Ying-Shi

    The aim of this study was to diagnose lymph node metastasis of esophageal cancer with a support vector machines model based on computed tomography. A total of 131 esophageal cancer patients with preoperative chemotherapy and radical surgery were included. Various indicators (tumor thickness, tumor length, tumor CT value, total number of lymph nodes, and long-axis and short-axis sizes of the largest lymph node) on CT images before and after neoadjuvant chemotherapy were recorded. A support vector machines model based on these CT indicators was built to predict lymph node metastasis. The support vector machines model diagnosed lymph node metastasis better than the preoperative short-axis size of the largest lymph node on CT; the areas under the receiver operating characteristic curves were 0.887 and 0.705, respectively. The support vector machine model of CT images can help diagnose lymph node metastasis in esophageal cancer with preoperative chemotherapy.

  9. The design of transfer trajectory for Ivar asteroid exploration mission

    NASA Astrophysics Data System (ADS)

    Qiao, Dong; Cui, Hutao; Cui, Pingyuan

    2009-12-01

    The growing demand for exploring small bodies, such as comets and asteroids, motivated the Chinese deep space exploration mission to the near-Earth asteroid Ivar. A design and optimization method for the transfer trajectory to asteroid Ivar is discussed in this paper. The transfer trajectory for rendezvous with asteroid Ivar is designed by means of Earth gravity assist with deep space maneuver (Delta-VEGA) technology. A Delta-VEGA transfer trajectory is realized by several trajectory segments, which connect the deep space maneuver and the swingby point. Each trajectory segment is found by solving the Lambert problem. By adjusting the deep space maneuver and the arrival time, the matching condition of the swingby is satisfied. To further reduce the total mission velocity increments, a procedure is developed which minimizes the total velocity increments for this transfer trajectory scheme for asteroid Ivar. The trajectory optimization problem is solved with a quasi-Newton algorithm utilizing analytic first derivatives, which are derived from the transversality conditions associated with the optimization formulation and primer vector theory. The simulation results show that the transfer trajectory scheme reduces C3 and the total velocity increments by 48.80% and 13.20%, respectively.

  10. Phase retrieval via incremental truncated amplitude flow algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Quanbing; Wang, Zhifa; Wang, Linjie; Cheng, Shichao

    2017-10-01

    This paper considers the phase retrieval problem of recovering an unknown signal from given quadratic measurements. A phase retrieval algorithm based on Incremental Truncated Amplitude Flow (ITAF), which combines the ITWF algorithm and the TAF algorithm, is proposed. The proposed ITAF algorithm enhances the initialization by performing both of the truncation methods used in ITWF and TAF, and improves the performance in the gradient stage by applying the incremental method proposed in ITWF to the loop stage of TAF. Moreover, the original sampling vector and measurements are preprocessed before initialization according to the variance of the sensing matrix. Simulation experiments verified the feasibility and validity of the proposed ITAF algorithm. The experimental results show that it can obtain a higher success rate and faster convergence speed compared with other algorithms. In particular, for noiseless random Gaussian signals, ITAF can accurately recover any real-valued signal from magnitude measurements whose number is about 2.5 times the signal length, which is close to the theoretical limit (about 2 times the signal length). It usually converges to the optimal solution within 20 iterations, far fewer than state-of-the-art algorithms require.

  11. EFFECT OF COHERENT STRUCTURES ON ENERGETIC PARTICLE INTENSITY IN THE SOLAR WIND AT 1 AU

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tessein, Jeffrey A.; Matthaeus, William H.; Wan, Minping

    2015-10-10

    We present results from an analysis of Advanced Composition Explorer (ACE) observations of energetic particles in the 0.047–4.78 MeV range associated with shocks and discontinuities in the solar wind. Previous work found a strong correlation between coherent structures and energetic particles measured by ACE/EPAM. Coherent structures are identified using the Partial Variance of Increments (PVI) method, which is essentially a normalized vector increment. The correlation was based on a superposed epoch analysis using over 12 years of data. Here, we examine many individual high-PVI events to better understand this association, emphasizing intervals selected from data with shock neighborhoods removed. We find that in many cases the local maximum in PVI is in a region of rising or falling energetic particle intensity, which suggests that magnetic discontinuities may act as barriers inhibiting the motion of energetic particles across them.

  12. Determination of the accuracy and operating constants in a digitally biased ring core magnetometer

    USGS Publications Warehouse

    Green, A.W.

    1990-01-01

    By using a very stable voltage reference and a high precision digital-to-analog converter to set bias in digital increments, the inherently high stability and accuracy of a ring core magnetometer can be significantly enhanced. In this case it becomes possible to measure not only variations about the bias level, but to measure the entire value of the field along each magnetometer sensing axis in a nearly absolute sense. To accomplish this, one must accurately determine the value of the digital bias increment for each axis, the zero field offset value for each axis, the scale values, and the transfer coefficients (or nonorthogonality angles) for pairs of axes. This determination can be carried out very simply, using only the Earth's field, a proton magnetometer, and a tripod-mounted fixture which is capable of rotations about two axes that are mutually perpendicular to the Earth's magnetic field vector. ?? 1990.

  13. VectorBase: an updated bioinformatics resource for invertebrate vectors and other organisms related with human diseases

    PubMed Central

    Giraldo-Calderón, Gloria I.; Emrich, Scott J.; MacCallum, Robert M.; Maslen, Gareth; Dialynas, Emmanuel; Topalis, Pantelis; Ho, Nicholas; Gesing, Sandra; Madey, Gregory; Collins, Frank H.; Lawson, Daniel

    2015-01-01

    VectorBase is a National Institute of Allergy and Infectious Diseases supported Bioinformatics Resource Center (BRC) for invertebrate vectors of human pathogens. Now in its 11th year, VectorBase currently hosts the genomes of 35 organisms including a number of non-vectors for comparative analysis. Hosted data range from genome assemblies with annotated gene features, transcript and protein expression data to population genetics including variation and insecticide-resistance phenotypes. Here we describe improvements to our resource and the set of tools available for interrogating and accessing BRC data including the integration of Web Apollo to facilitate community annotation and providing Galaxy to support user-based workflows. VectorBase also actively supports our community through hands-on workshops and online tutorials. All information and data are freely available from our website at https://www.vectorbase.org/. PMID:25510499

  14. Contracting for Agile Software Development in the Department of Defense: An Introduction

    DTIC Science & Technology

    2015-08-01

    Requirements are fixed at a more granular level; reviews of the work product happen more frequently and assess each individual increment rather than a “big bang”...boundaries than “big-bang” development. The implementation of incremental or progressive reviews enables just that—any issues identified at the time of the...the contract needs to support the delivery of deployable software at defined increments/intervals, rather than incentivizing “big-bang” efforts or

  15. Cost of Incremental Expansion of an Existing Family Medicine Residency Program.

    PubMed

    Ashkin, Evan A; Newton, Warren P; Toomey, Brian; Lingley, Ronald; Page, Cristen P

    2017-07-01

    Expanding residency training programs to address shortages in the primary care workforce is challenged by the present graduate medical education (GME) environment. The Medicare funding cap on new GME positions and reductions in the Health Resources and Services Administration (HRSA) Teaching Health Center (THC) GME program require innovative solutions to support primary care residency expansion. Sparse literature exists to assist in predicting the actual cost of incremental expansion of a family medicine residency program without federal or state GME support. In 2011 a collaboration to develop a community health center (CHC) academic medical partnership (CHAMP) was formed and created a THC as a training site for expansion of an existing family medicine residency program. The cost of expansion was a critical factor as no Federal GME funding or HRSA THC GME program support was available. Initial start-up costs were supported by a federal grant and local foundations. Careful financial analysis of the expansion has provided actual costs per resident of the incremental expansion of the residency. RESULTS: The CHAMP created a new THC and expanded the residency from eight to ten residents per year. The cost of expansion was approximately $72,000 per resident per year. The cost of incremental expansion of our residency program in the CHAMP model was more than 50% less than the recently reported cost of training in the HRSA THC GME program.

  16. Use of Attribute Driven Incremental Discretization and Logic Learning Machine to build a prognostic classifier for neuroblastoma patients.

    PubMed

    Cangelosi, Davide; Muselli, Marco; Parodi, Stefano; Blengio, Fabiola; Becherini, Pamela; Versteeg, Rogier; Conte, Massimo; Varesio, Luigi

    2014-01-01

    A cancer patient's outcome is written, in part, in the gene expression profile of the tumor. We previously identified a 62-probe-set signature (NB-hypo) to identify tissue hypoxia in neuroblastoma tumors and showed that NB-hypo stratified neuroblastoma patients into good and poor outcome 1. It was important to develop a prognostic classifier to cluster patients into risk groups benefiting from defined therapeutic approaches. Novel classification and data discretization approaches can be instrumental for the generation of accurate predictors and robust tools for clinical decision support. We explored the application to gene expression data of Rulex, a novel software suite including the Attribute Driven Incremental Discretization technique for transforming continuous variables into simplified discrete ones and the Logic Learning Machine model for intelligible rule generation. We applied Rulex components to the problem of predicting the outcome of neuroblastoma patients on the basis of the 62-probe-set NB-hypo gene expression signature. The resulting classifier consisted of 9 rules utilizing mainly two conditions on the relative expression of 11 probe sets. These rules were very effective predictors, as shown in an independent validation set, demonstrating the validity of the LLM algorithm applied to microarray data and patients' classification. The LLM performed as efficiently as Prediction Analysis of Microarray and Support Vector Machine, and outperformed other learning algorithms such as C4.5. Rulex carried out a feature selection by selecting a new signature (NB-hypo-II) of 11 probe sets that turned out to be the most relevant in predicting outcome among the 62 of the NB-hypo signature. Rules are easily interpretable as they involve only a few conditions. Our findings provided evidence that the application of Rulex to the expression values of the NB-hypo signature created a set of accurate, high quality, consistent and interpretable rules for the prediction of neuroblastoma patients' outcome. We identified the Rulex weighted classification as a flexible tool that can support clinical decisions. For these reasons, we consider Rulex to be a useful tool for cancer classification from microarray gene expression data.

  17. Gyroscope precession along bound equatorial plane orbits around a Kerr black hole

    NASA Astrophysics Data System (ADS)

    Bini, Donato; Geralico, Andrea; Jantzen, Robert T.

    2016-09-01

    The precession of a test gyroscope along stable bound equatorial plane orbits around a Kerr black hole is analyzed, and the precession angular velocity of the gyro's parallel-transported spin vector and the increment in the precession angle after one orbital period are evaluated. The parallel-transported Marck frame which enters this discussion is shown to have an elegant geometrical explanation in terms of the electric and magnetic parts of the Killing-Yano 2-form and a Wigner rotation effect.

  18. Characterization of the dengue outbreak in Nuevo Leon state, Mexico, 2010.

    PubMed

    Leduc-Galindo, D; Gloria-Herrera, U; Rincón-Herrera, U; Ramos-Jiménez, J; Garcia-Luna, S; Arellanos-Soto, D; Mendoza-Tavera, N; Tavitas-Aguilar, I; Garcia-Garcia, E; Galindo-Galindo, E; Villarreal-Perez, J; Fernandez-Salas, I; Santiago, G A; Muñoz-Jordan, J; Rivas-Estilla, A M

    2015-04-01

    We studied the circulating dengue virus (DENV) serotypes, the entomological Breteau index, the rainfall index, and the epidemiology of the affected groups during the 2010 outbreak in Nuevo Leon, Mexico. Of 2,271 positive cases, 94% were classic dengue and 6% dengue hemorrhagic fever; DENV-1 was the main serotype isolated (99%) (Central-American lineage of the American-African genotype). We found a correlation (p ≤ 0.05) between two environmental phenomena (the increment of rainfall and the vector indexes) and the epidemiological and clinical picture and the risk of ongoing DENV-1 transmission.

  19. Testing of the Support Vector Machine for Binary-Class Classification

    NASA Technical Reports Server (NTRS)

    Scholten, Matthew

    2011-01-01

    The Support Vector Machine is a powerful algorithm, useful in classifying data into species. The Support Vector Machines implemented in this research were used as classifiers for the final stage in a Multistage Autonomous Target Recognition system. A single-kernel SVM known as SVMlight, and a modified version known as a Support Vector Machine with K-Means Clustering, were used. These SVM algorithms were tested as classifiers under varying conditions: image noise levels varied, and the orientation of the targets changed. The classifiers were then optimized to demonstrate their maximum potential as classifiers. Results demonstrate the reliability of SVM as a method for classification. From trial to trial, SVM produces consistent results.
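
    For readers unfamiliar with the basic workflow, a minimal binary-class SVM can be trained and evaluated as in the sketch below (using scikit-learn). The synthetic data and the RBF kernel are stand-ins; this is not the SVMlight or K-Means-clustering variant used in the study.

        # Minimal binary-class SVM sketch (synthetic data; not the ATR pipeline).
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC
        from sklearn.metrics import accuracy_score

        X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                                   random_state=0)
        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                            random_state=0)

        clf = SVC(kernel="rbf", C=1.0, gamma="scale")   # single-kernel SVM
        clf.fit(X_train, y_train)
        print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))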

  20. A 400 MHz Wireless Neural Signal Processing IC With 625× On-Chip Data Reduction and Reconfigurable BFSK/QPSK Transmitter Based on Sequential Injection Locking.

    PubMed

    Teng, Kok-Hin; Wu, Tong; Liu, Xiayun; Yang, Zhi; Heng, Chun-Huat

    2017-06-01

    An 8-channel wireless neural signal processing IC, which can perform real-time spike detection, alignment, feature extraction, and wireless data transmission, is proposed. A reconfigurable BFSK/QPSK transmitter (TX) in the MICS/MedRadio band is incorporated to support different data rate requirements. By using an Exponential Component-Polynomial Component (EC-PC) spike processing unit with an incremental principal component analysis (IPCA) engine, the detection of neural spikes with poor SNR is possible while achieving 625× data reduction. For the TX, dual channels at 401 MHz and 403.8 MHz are supported by applying sequential injection locking techniques while attaining a phase noise of -102 dBc/Hz at 100 kHz offset. From the measurements, an error vector magnitude (EVM) of 4.60%/9.55% with a power amplifier (PA) output power of -15 dBm is achieved for QPSK at 8 Mbps and BFSK at 12.5 kbps. Fabricated in 65 nm CMOS with an active area of 1 mm², the design consumes a total current of 5 to 5.6 mA with a maximum energy efficiency of 0.7 nJ/b.

  1. Global Combat Support System Army Increment 1 (GCSS-A Inc 1)

    DTIC Science & Technology

    2016-03-01

    Report excerpts: acronym glossary (... Acquisition Executive; DoD - Department of Defense; DoDAF - DoD Architecture Framework; FD - Full Deployment; FDD - Full Deployment Decision; FY - Fiscal Year) ... Another economic analysis was completed on November 14, 2012, in advance of a successful FDD. The program is now in the O&S Phase. ... GCSS-A Inc 1 (2016) schedule fragment: Increment I, Feb 2011 / Aug 2011; Full Deployment Decision (FDD), Feb 2012 / Dec 2012; Full Deployment (FD), Sep 2017 / Mar 2018.

  2. A Determination of Military and Civilian Personnel Costs as Related to a Member of Technical Staff

    DTIC Science & Technology

    1992-06-01

    Report excerpts: contents fragment (Direct Total Manpower Budget Costs, 1992; Pay Raises 1985-1992; Support Costs; Internal Support Personnel; External Support ...) ... "Incremental Costs of Military and Civilian Manpower in the Military Services": this document provides the basis for this section; the report assesses ... References: MTS Workyear Cost Comparison, internal AFSC paper, 20 November 1990; Palmer, Adele R. and Osbaldeston, David J., Incremental Costs of Military ...

  3. Addressing System Reconfiguration and Incremental Integration within IMA Systems

    NASA Astrophysics Data System (ADS)

    Ferrero, F.; Rodríques, A. I.

    2009-05-01

    Recently, the space industry has been paying special attention to Integrated Modular Avionics (IMA) systems due to the benefits that modular concepts could bring to the development of space applications, especially in terms of interoperability, flexibility and software reuse. Two important IMA goals to be highlighted are system reconfiguration and the incremental integration of new functionalities into a pre-existing system. The purpose of this paper is to show how system reconfiguration is conducted based on Allied Standard Avionics Architecture Council (ASAAC) concepts for IMA systems. In addition, it provides a proposal for addressing the incremental integration concept, supported by the experience gained during the European Technology Acquisition Program (ETAP) TDP1.7 programme. All these topics are discussed taking into account safety issues, showing the blueprint as an appropriate technique to support these concepts.

  4. One Size Does Not Fit All: Managing Radical and Incremental Creativity

    ERIC Educational Resources Information Center

    Gilson, Lucy L.; Lim, Hyoun Sook; D'Innocenzo, Lauren; Moye, Neta

    2012-01-01

    This research extends creativity theory by re-conceptualizing creativity as a two-dimensional construct (radical and incremental) and examining the differential effects of intrinsic motivation, extrinsic rewards, and supportive supervision on perceptions of creativity. We hypothesize and find two distinct types of creativity that are associated…

  5. Teaching Programming to Novices: A Review of Approaches and Tools.

    ERIC Educational Resources Information Center

    Brusilovsky, P.; And Others

    Three different approaches to teaching introductory programming are reviewed: the incremental approach, the sub-language approach, and the mini-language approach. The paper analyzes all three approaches, providing a brief history of each and describing an example of a programming environment supporting this approach. In the incremental approach,…

  6. A Systematic Review of the Economic Evidence for Home Support Interventions in Dementia.

    PubMed

    Clarkson, Paul; Davies, Linda; Jasper, Rowan; Loynes, Niklas; Challis, David

    2017-09-01

    Recent evidence signals the need for effective forms of home support to people with dementia and their carers. The cost-effectiveness evidence of different approaches to support is scant. To appraise economic evidence on the cost-effectiveness of home support interventions for dementia to inform future evaluation. A systematic literature review of full and partial economic evaluations was performed using the British National Health Service Economic Evaluation Database supplemented by additional references. Study characteristics and findings, including incremental cost-effectiveness ratios, when available, were summarized narratively. Study quality was appraised using the National Health Service Economic Evaluation Database critical appraisal criteria and independent ratings, agreed by two reviewers. Studies were located on a permutation matrix describing their mix of incremental costs/effects to aid decision making. Of the 151 articles retrieved, 14 studies met the inclusion criteria: 8 concerning support to people with dementia and 6 to carers. Five studies were incremental cost-utility analyses, seven were cost-effectiveness analyses, and two were cost consequences analyses. Five studies expressed incremental cost-effectiveness ratios as cost per quality-adjusted life-year (£6,696-£207,942 per quality-adjusted life-year). In four studies, interventions were dominant over usual care. Two interventions were more costly but more beneficial and were favorable against current acceptability thresholds. Occupational therapy, home-based exercise, and a carers' coping intervention emerged as cost-effective approaches for which there was better evidence. These interventions used environmental modifications, behavior management, physical activity, and emotional support as active components. More robust evidence is needed to judge the value of these and other interventions across the dementia care pathway. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  7. Multiclass Reduced-Set Support Vector Machines

    NASA Technical Reports Server (NTRS)

    Tang, Benyang; Mazzoni, Dominic

    2006-01-01

    There are well-established methods for reducing the number of support vectors in a trained binary support vector machine, often with minimal impact on accuracy. We show how reduced-set methods can be applied to multiclass SVMs made up of several binary SVMs, with significantly better results than reducing each binary SVM independently. Our approach is based on Burges' approach that constructs each reduced-set vector as the pre-image of a vector in kernel space, but we extend this by recomputing the SVM weights and bias optimally using the original SVM objective function. This leads to greater accuracy for a binary reduced-set SVM, and also allows vectors to be 'shared' between multiple binary SVMs for greater multiclass accuracy with fewer reduced-set vectors. We also propose computing pre-images using differential evolution, which we have found to be more robust than gradient descent alone. We show experimental results on a variety of problems and find that this new approach is consistently better than previous multiclass reduced-set methods, sometimes with a dramatic difference.

  8. A Subdivision-Based Representation for Vector Image Editing.

    PubMed

    Liao, Zicheng; Hoppe, Hugues; Forsyth, David; Yu, Yizhou

    2012-11-01

    Vector graphics has been employed in a wide variety of applications due to its scalability and editability. Editability is a high priority for artists and designers who wish to produce vector-based graphical content with user interaction. In this paper, we introduce a new vector image representation based on piecewise smooth subdivision surfaces, which is a simple, unified and flexible framework that supports a variety of operations, including shape editing, color editing, image stylization, and vector image processing. These operations effectively create novel vector graphics by reusing and altering existing image vectorization results. Because image vectorization yields an abstraction of the original raster image, controlling the level of detail of this abstraction is highly desirable. To this end, we design a feature-oriented vector image pyramid that offers multiple levels of abstraction simultaneously. Our new vector image representation can be rasterized efficiently using GPU-accelerated subdivision. Experiments indicate that our vector image representation achieves high visual quality and better supports editing operations than existing representations.

  9. Chemical data visualization and analysis with incremental generative topographic mapping: big data challenge.

    PubMed

    Gaspar, Héléna A; Baskin, Igor I; Marcou, Gilles; Horvath, Dragos; Varnek, Alexandre

    2015-01-26

    This paper is devoted to the analysis and visualization in 2-dimensional space of large data sets of millions of compounds using the incremental version of generative topographic mapping (iGTM). The iGTM algorithm implemented in the in-house ISIDA-GTM program was applied to a database of more than 2 million compounds combining data sets of 36 chemicals suppliers and the NCI collection, encoded either by MOE descriptors or by MACCS keys. Taking advantage of the probabilistic nature of GTM, several approaches to data analysis were proposed. The chemical space coverage was evaluated using the normalized Shannon entropy. Different views of the data (property landscapes) were obtained by mapping various physical and chemical properties (molecular weight, aqueous solubility, LogP, etc.) onto the iGTM map. The superposition of these views helped to identify the regions in the chemical space populated by compounds with desirable physicochemical profiles and the suppliers providing them. The data sets similarity in the latent space was assessed by applying several metrics (Euclidean distance, Tanimoto and Bhattacharyya coefficients) to data probability distributions based on cumulated responsibility vectors. As a complementary approach, data sets were compared by considering them as individual objects on a meta-GTM map, built on cumulated responsibility vectors or property landscapes produced with iGTM. We believe that the iGTM methodology described in this article represents a fast and reliable way to analyze and visualize large chemical databases.

  10. International Space Station Increment-2 Quick Look Report

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Hrovat, Kenneth; Kelly, Eric

    2001-01-01

    The objective of this quick look report is to disseminate the International Space Station (ISS) Increment-2 reduced gravity environment preliminary analysis in a timely manner to the microgravity scientific community. This report is a quick look at the processed acceleration data collected by the Microgravity Acceleration Measurement System (MAMS) during the period of May 3 to June 8, 2001. The report is by no means an exhaustive examination of all the relevant activities, which occurred during the time span mentioned above for two reasons. First, the time span being considered in this report is rather short since the MAMS was not active throughout the time span being considered to allow a detailed characterization. Second, as the name of the report implied, it is a quick look at the acceleration data. Consequently, a more comprehensive report, the ISS Increment-2 report, will be published following the conclusion of the Increment-2 tour of duty. NASA sponsors the MAMS and the Space Acceleration Microgravity System (SAMS) to support microgravity science experiments, which require microgravity acceleration measurements. On April 19, 2001, both the MAMS and the SAMS units were launched on STS-100 from the Kennedy Space Center for installation on the ISS. The MAMS unit was flown to the station in support of science experiments requiring quasisteady acceleration data measurements, while the SAMS unit was flown to support experiments requiring vibratory acceleration data measurement. Both acceleration systems are also used in support of the vehicle microgravity requirements verification. The ISS reduced gravity environment analysis presented in this report uses mostly the MAMS acceleration data measurements (the Increment-2 report will cover both systems). The MAMS has two sensors. The MAMS Orbital Acceleration Research Experiment Sensor Subsystem, which is a low frequency range sensor (up to 1 Hz), is used to characterize the quasi-steady environment for payloads and vehicle. The MAMS High Resolution Acceleration Package is used to characterize the ISS vibratory environment up to 100 Hz. This quick look report presents some selected quasi-steady and vibratory activities recorded by the MAMS during the ongoing ISS Increment-2 tour of duty.

  11. Insect cell transformation vectors that support high level expression and promoter assessment in insect cell culture

    USDA-ARS?s Scientific Manuscript database

    A somatic transformation vector, pDP9, was constructed that provides a simplified means of producing permanently transformed cultured insect cells that support high levels of protein expression of foreign genes. The pDP9 plasmid vector incorporates DNA sequences from the Junonia coenia densovirus th...

  12. Fourier transform infrared spectroscopy microscopic imaging classification based on spatial-spectral features

    NASA Astrophysics Data System (ADS)

    Liu, Lian; Yang, Xiukun; Zhong, Mingliang; Liu, Yao; Jing, Xiaojun; Yang, Qin

    2018-04-01

    The discrete fractional Brownian incremental random (DFBIR) field is used to describe the irregular, random, and highly complex shapes of natural objects such as coastlines and biological tissues, for which traditional Euclidean geometry cannot be used. In this paper, an anisotropic variable window (AVW) directional operator based on the DFBIR field model is proposed for extracting spatial characteristics of Fourier transform infrared spectroscopy (FTIR) microscopic imaging. Probabilistic principal component analysis first extracts spectral features, and then the spatial features of the proposed AVW directional operator are combined with the former to construct a spatial-spectral structure, which increases feature-related information and helps a support vector machine classifier to obtain more efficient distribution-related information. Compared to Haralick’s grey-level co-occurrence matrix, Gabor filters, and local binary patterns (e.g. uniform LBPs, rotation-invariant LBPs, uniform rotation-invariant LBPs), experiments on three FTIR spectroscopy microscopic imaging datasets show that the proposed AVW directional operator is more advantageous in terms of classification accuracy, particularly for low-dimensional spaces of spatial characteristics.

  13. Role of Self-Efficacy in Rehabilitation Outcome among Chronic Low Back Pain Patients.

    ERIC Educational Resources Information Center

    Altmaier, Elizabeth M.; And Others

    1993-01-01

    Examined role of self-efficacy beliefs in rehabilitation of 45 low back pain patients participating in 3-week rehabilitation program. Increments in self-efficacy beliefs during program were not associated with improved patient functioning at discharge. However, in support of theorized role of self-efficacy in behavior change, increments in…

  14. The Trait Emotional Intelligence Questionnaire: Internal Structure, Convergent, Criterion, and Incremental Validity in an Italian Sample

    ERIC Educational Resources Information Center

    Andrei, Federica; Smith, Martin M.; Surcinelli, Paola; Baldaro, Bruno; Saklofske, Donald H.

    2016-01-01

    This study investigated the structure and validity of the Italian translation of the Trait Emotional Intelligence Questionnaire. Data were self-reported from 227 participants. Confirmatory factor analysis supported the four-factor structure of the scale. Hierarchical regressions also demonstrated its incremental validity beyond demographics, the…

  15. [Support vector machine-assisted diagnosis of human malignant gastric tissues based on dielectric properties].

    PubMed

    Zhang, Sa; Li, Zhou; Xin, Xue-Gang

    2017-12-20

    To achieve differential diagnosis of normal and malignant gastric tissues based on discrepancies in their dielectric properties using a support vector machine. The dielectric properties of normal and malignant gastric tissues at frequencies ranging from 42.58 to 500 MHz were measured by the coaxial probe method, and the Cole-Cole model was used to fit the measured data. Receiver-operating characteristic (ROC) curve analysis was used to evaluate the discrimination capability with respect to permittivity, conductivity, and Cole-Cole fitting parameters. A support vector machine was used for discriminating normal and malignant gastric tissues, and the discrimination accuracy was calculated using k-fold cross-validation. The area under the ROC curve was above 0.8 for permittivity at the 5 frequencies at the lower end of the measured frequency range. The combination of the support vector machine with the permittivity at all 5 of these frequencies achieved the highest discrimination accuracy of 84.38%, with a MATLAB runtime of 3.40 s. Support vector machine-assisted diagnosis of human malignant gastric tissues based on their dielectric properties is therefore feasible.
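
    A minimal sketch of this kind of evaluation, i.e. a support vector machine scored by k-fold cross-validation on a small feature matrix, is given below. The feature values are synthetic placeholders rather than measured permittivities, and the settings are illustrative only.

        # SVM with k-fold cross-validation on a small feature matrix
        # (synthetic stand-in for permittivity measured at a few frequencies).
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X_normal    = rng.normal(loc=50.0, scale=5.0, size=(40, 5))   # 5 "frequencies"
        X_malignant = rng.normal(loc=60.0, scale=5.0, size=(40, 5))
        X = np.vstack([X_normal, X_malignant])
        y = np.array([0] * 40 + [1] * 40)            # 0 = normal, 1 = malignant

        model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validation
        print("mean CV accuracy: %.3f" % scores.mean())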

  16. Research on intrusion detection based on Kohonen network and support vector machine

    NASA Astrophysics Data System (ADS)

    Shuai, Chunyan; Yang, Hengcheng; Gong, Zeweiyi

    2018-05-01

    The support vector machine suffers from low detection accuracy and long detection time when applied directly to network intrusion detection. Optimizing the SVM parameters can greatly improve the detection accuracy, but the long optimization time prevents application to high-speed networks. A method based on Kohonen neural network feature selection is therefore proposed to reduce the time needed to optimize the support vector machine parameters. First, the weights of the KDD99 network intrusion data are calculated by a Kohonen network and features are selected by weight. Then, after feature selection is completed, a genetic algorithm (GA) and the grid search method are used for parameter optimization to find appropriate parameters, and the data are classified by support vector machines. Comparative experiments show that feature selection can reduce the parameter optimization time while having little influence on classification accuracy. The experiments suggest that the support vector machine can be used in network intrusion detection systems and can reduce the miss rate.
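
    The parameter-optimization step described above can be sketched with scikit-learn's grid search; a generic univariate feature selector is used here in place of the Kohonen-network weighting, so this is only an approximation of the paper's pipeline on synthetic data.

        # Feature selection followed by grid search over SVM parameters
        # (generic selector used in place of the Kohonen-network weighting).
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.model_selection import GridSearchCV
        from sklearn.pipeline import Pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        X, y = make_classification(n_samples=1000, n_features=41, n_informative=10,
                                   random_state=0)   # 41 features, as in KDD99

        pipe = Pipeline([
            ("scale",  StandardScaler()),
            ("select", SelectKBest(f_classif, k=15)),   # keep the 15 highest-scoring features
            ("svm",    SVC()),
        ])
        param_grid = {"svm__C": [0.1, 1, 10, 100], "svm__gamma": [0.001, 0.01, 0.1]}

        search = GridSearchCV(pipe, param_grid, cv=3)
        search.fit(X, y)
        print(search.best_params_, search.best_score_)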

  17. Reducing Memory Cost of Exact Diagonalization using Singular Value Decomposition

    NASA Astrophysics Data System (ADS)

    Weinstein, Marvin; Chandra, Ravi; Auerbach, Assa

    2012-02-01

    We present a modified Lanczos algorithm to diagonalize lattice Hamiltonians with dramatically reduced memory requirements. In contrast to variational approaches and most implementations of DMRG, Lanczos rotations towards the ground state do not involve incremental minimizations (e.g. sweeping procedures), which may get stuck in false local minima. The lattice of size N is partitioned into two subclusters. At each iteration the rotating Lanczos vector is compressed into two sets of n_svd small subcluster vectors using singular value decomposition. For low entanglement entropy S_ee (satisfied by short range Hamiltonians), the truncation error is bounded by exp(-n_svd^(1/S_ee)). Convergence is tested for the Heisenberg model on Kagomé clusters of 24, 30 and 36 sites, with no lattice symmetries exploited, using less than 15 GB of dynamical memory. Generalization of the Lanczos-SVD algorithm to multiple partitioning is discussed, and comparisons to other techniques are given. Reference: arXiv:1105.0007

  18. A Power Transformers Fault Diagnosis Model Based on Three DGA Ratios and PSO Optimization SVM

    NASA Astrophysics Data System (ADS)

    Ma, Hongzhe; Zhang, Wei; Wu, Rongrong; Yang, Chunyan

    2018-03-01

    In order to make up for the shortcomings of existing transformer fault diagnosis methods in dissolved gas-in-oil analysis (DGA) feature selection and parameter optimization, a transformer fault diagnosis model based on three DGA ratios and a particle swarm optimization (PSO) optimized support vector machine (SVM) is proposed. The support vector machine is extended to a nonlinear, multi-classification SVM, particle swarm optimization is used to optimize the SVM multi-classification model, and transformer fault diagnosis is conducted in combination with the cross-validation principle. The fault diagnosis results show that the average accuracy of the tested method is better than that of the standard support vector machine and the genetic algorithm support vector machine, which proves that the proposed method can effectively improve the accuracy of transformer fault diagnosis.
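
    A minimal particle swarm optimization of SVM hyperparameters, in the spirit of the model described above, might look like the following sketch. Synthetic data stand in for the three DGA ratios, and the swarm settings are arbitrary rather than the authors' configuration.

        # Minimal PSO over (log10 C, log10 gamma) for an SVM, scored by cross-validation.
        # Synthetic data stand in for the three DGA ratios.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        X, y = make_classification(n_samples=300, n_features=3, n_informative=3,
                                   n_redundant=0, n_classes=3, n_clusters_per_class=1,
                                   random_state=0)

        def fitness(p):                       # p = (log10 C, log10 gamma)
            clf = SVC(C=10 ** p[0], gamma=10 ** p[1])
            return cross_val_score(clf, X, y, cv=3).mean()

        rng = np.random.default_rng(0)
        n_particles, n_iter = 10, 15
        pos = rng.uniform([-1, -3], [3, 1], size=(n_particles, 2))   # search box in log space
        vel = np.zeros_like(pos)
        pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
        gbest = pbest[pbest_val.argmax()].copy()

        for _ in range(n_iter):
            r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, [-1, -3], [3, 1])
            vals = np.array([fitness(p) for p in pos])
            improved = vals > pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
            gbest = pbest[pbest_val.argmax()].copy()

        print("best (log10 C, log10 gamma):", gbest, "CV accuracy:", pbest_val.max())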

  19. The Design of a Templated C++ Small Vector Class for Numerical Computing

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.

    2000-01-01

    We describe the design and implementation of a templated C++ class for vectors. The vector class is templated both for vector length and vector component type; the vector length is fixed at template instantiation time. The vector implementation is such that for a vector of N components of type T, the total number of bytes required by the vector is equal to N * sizeof(T), where sizeof is the built-in C operator. The property of having a size no bigger than that required by the components themselves is key in many numerical computing applications, where one may allocate very large arrays of small, fixed-length vectors. In addition to the design trade-offs motivating our fixed-length vector design choice, we review some of the C++ template features essential to an efficient, succinct implementation. In particular, we highlight some of the standard C++ features, such as partial template specialization, that are not currently supported by all compilers. This report provides an inventory listing the relevant support currently provided by some key compilers, as well as test code one can use to verify compiler capabilities.

  20. An Efficient Wait-Free Vector

    DOE PAGES

    Feldman, Steven; Valera-Leon, Carlos; Dechev, Damian

    2016-03-01

    The vector is a fundamental data structure, which provides constant-time access to a dynamically-resizable range of elements. Currently, there exist no wait-free vectors. The only non-blocking version supports only a subset of the sequential vector API and exhibits significant synchronization overhead caused by supporting opposing operations. Since many applications operate in phases of execution, wherein each phase only a subset of operations are used, this overhead is unnecessary for the majority of the application. To address the limitations of the non-blocking version, we present a new design that is wait-free, supports more of the operations provided by the sequential vector, and provides alternative implementations of key operations. These alternatives allow the developer to balance the performance and functionality of the vector as requirements change throughout execution. Compared to the known non-blocking version and the concurrent vector found in Intel’s TBB library, our design outperforms or provides comparable performance in the majority of tested scenarios. Over all tested scenarios, the presented design performs an average of 4.97 times more operations per second than the non-blocking vector and 1.54 more than the TBB vector. In a scenario designed to simulate the filling of a vector, performance improvement increases to 13.38 and 1.16 times. This work presents the first ABA-free non-blocking vector. Finally, unlike the other non-blocking approach, all operations are wait-free and bounds-checked and elements are stored contiguously in memory.

  1. The incremental impact of cardiac MRI on clinical decision-making.

    PubMed

    Rajwani, Adil; Stewart, Michael J; Richardson, James D; Child, Nicholas M; Maredia, Neil

    2016-01-01

    Despite a significant expansion in the use of cardiac MRI (CMR), there is inadequate evaluation of its incremental impact on clinical decision-making over and above other well-established modalities. We sought to determine the incremental utility of CMR in routine practice. 629 consecutive CMR studies referred by 44 clinicians from 9 institutions were evaluated. Pre-defined algorithms were used to determine the incremental influence on diagnostic thinking, influence on clinical management and thus the overall clinical utility. Studies were also subdivided and evaluated according to the indication for CMR. CMR provided incremental information to the clinician in 85% of cases, with incremental influence on diagnostic thinking in 85% of cases and incremental impact on management in 42% of cases. The overall incremental utility of CMR exceeded 90% in 7 out of the 13 indications, whereas in settings such as the evaluation of unexplained ventricular arrhythmia or mild left ventricular systolic dysfunction, this was <50%. CMR was frequently able to inform and influence decision-making in routine clinical practice, even with analyses that accepted only incremental clinical information and excluded a redundant duplication of imaging. Significant variations in yield were noted according to the indication for CMR. These data support a wider integration of CMR services into cardiac imaging departments. These data are the first to objectively evaluate the incremental value of a UK CMR service in clinical decision-making. Such data are essential when seeking justification for a CMR service.

  2. International Space Station Increment-2 Microgravity Environment Summary Report

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Hrovat, Kenneth; Kelly, Eric; McPherson, Kevin; Reckart, Timothy

    2002-01-01

    This summary report presents the results of some of the processed acceleration data, collected aboard the International Space Station during the period of May to August 2001, the Increment-2 phase of the station. Two accelerometer systems were used to measure the acceleration levels during activities that took place during the Increment-2 segment. However, not all of the activities were analyzed for this report due to time constraints, lack of precise information regarding some payload operations and other station activities. The National Aeronautics and Space Administration sponsors the Microgravity Acceleration Measurement System and the Space Acceleration Microgravity System to support microgravity science experiments, which require microgravity acceleration measurements. On April 19, 2001, both the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System units were launched on STS-100 from the Kennedy Space Center for installation on the International Space Station. The Microgravity Acceleration Measurement System unit was flown to the station in support of science experiments requiring quasi-steady acceleration measurements, while the Space Acceleration Measurement System unit was flown to support experiments requiring vibratory acceleration measurement. Both acceleration systems are also used in support of vehicle microgravity requirements verification. The International Space Station Increment-2 reduced gravity environment analysis presented in this report uses acceleration data collected by both sets of accelerometer systems: 1) The Microgravity Acceleration Measurement System, which consists of two sensors: the Orbital Acceleration Research Experiment Sensor Subsystem, a low frequency range sensor (up to 1 Hz), is used to characterize the quasi-steady environment for payloads and the vehicle, and the High Resolution Accelerometer Package, which is used to characterize the vibratory environment up to 100 Hz. 2) The Space Acceleration Measurement System, which is a high frequency sensor, measures vibratory acceleration data in the range of 0.01 to 300 Hz. This summary report presents analysis of some selected quasisteady and vibratory activities measured by these accelerometers during Increment-2 from May to August 20, 2001.

  3. Tax Increment Financing and Education Expenditures: The Case of Iowa

    ERIC Educational Resources Information Center

    Nguyen-Hoang, Phuong

    2014-01-01

    This is the first study to directly examine the relationship between tax increment financing (TIF) and education expenditures, using the state of Iowa as a case study. I find that greater use of TIF is associated with reduced education expenditures. I also find little evidence to support the commonly held proposition that school spending increases…

  4. Defense Agencies Initiative Increment 2 (DAI Inc 2)

    DTIC Science & Technology

    2016-03-01

    Report excerpts: 2016 Major Automated Information System Annual Report, Defense Agencies Initiative Increment 2 (DAI Inc 2), Defense Acquisition Management ... acronym glossary (MAIS - Major Automated Information System; MAIS OE - MAIS Original Estimate; MAR - MAIS Annual Report; MDA - Milestone Decision Authority; MDD - Materiel Development ...) ... management systems supporting diverse operational functions and the warfighter in decision making and financial reporting. These disparate, non ...

  5. Prediction and analysis of essential genes using the enrichments of gene ontology and KEGG pathways.

    PubMed

    Chen, Lei; Zhang, Yu-Hang; Wang, ShaoPeng; Zhang, YunHua; Huang, Tao; Cai, Yu-Dong

    2017-01-01

    Identifying essential genes in a given organism is important for research on their fundamental roles in organism survival. Furthermore, if possible, uncovering the links between core functions or pathways and these essential genes will further help us obtain deep insight into the key roles of these genes. In this study, we investigated the essential and non-essential genes reported in a previous study and extracted gene ontology (GO) terms and biological pathways that are important for the determination of essential genes. Through the enrichment theory of GO and KEGG pathways, we encoded each essential/non-essential gene into a vector in which each component represented the relationship between the gene and one GO term or KEGG pathway. To analyze these relationships, the maximum relevance minimum redundancy (mRMR) method was adopted. Then, incremental feature selection (IFS) and a support vector machine (SVM) were employed to extract important GO terms and KEGG pathways. A prediction model was built simultaneously using the extracted GO terms and KEGG pathways, which yielded nearly perfect performance, with a Matthews correlation coefficient of 0.951, for distinguishing essential and non-essential genes. To fully investigate the key factors influencing the fundamental roles of essential genes, the 21 most important GO terms and three KEGG pathways were analyzed in detail. In addition, several genes that were predicted to be essential by our model are also reported in this study. We suggest that this study provides more functional and pathway information on the essential genes and provides a new way to investigate related problems.
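
    The feature ranking plus incremental feature selection (IFS) procedure described above can be sketched roughly as follows. Mutual information ranking is used here as a simple stand-in for mRMR, and the data are synthetic rather than GO/KEGG enrichment scores.

        # Rank features, then add them one at a time (incremental feature selection)
        # and keep the subset with the best cross-validated SVM performance.
        # Mutual information is a stand-in for mRMR; data are synthetic.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        X, y = make_classification(n_samples=400, n_features=30, n_informative=6,
                                   random_state=0)

        ranking = np.argsort(mutual_info_classif(X, y, random_state=0))[::-1]

        best_score, best_k = 0.0, 0
        for k in range(1, len(ranking) + 1):
            subset = ranking[:k]
            score = cross_val_score(SVC(kernel="rbf"), X[:, subset], y, cv=5).mean()
            if score > best_score:
                best_score, best_k = score, k

        print("best subset size:", best_k, "cross-validated accuracy: %.3f" % best_score)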

  6. What Is Engagement? Proactivity as the Missing Link in the HEXACO Model of Personality.

    PubMed

    de Vries, Reinout E; Wawoe, Kilian W; Holtrop, Djurre

    2016-04-01

    We tested the hypothesis that proactivity represents the engagement vector in the HEXACO model of personality. Questionnaire data were obtained in five studies, three of which consisted (mostly) of students: Study 1 (N = 188, mean age = 20.0, 89.4% women), Study 3 (N = 315, mean age = 20.4, 80.6% women), and Study 4 (N = 309 self-ratings, mean age = 20.0, 78.3% women; N = 307 other-ratings, mean age = 24.5, 62.2% women). Participants in the other two studies came from an ISO-certified representative community panel: Study 2 (N = 525, mean age = 51.2, 52.0% women) and Study 5 (N = 736, mean age = 42.2, 48.0% women). Proactive Personality and Proactivity were positively related to Extraversion, Conscientiousness, and Openness to Experience, but only weakly related or unrelated to Honesty-Humility, Emotionality, and Agreeableness, supporting the alignment of Proactive Personality/Proactivity with the hypothesized HEXACO engagement vector. Additionally, Proactivity explained incremental variance in self-rated job performance on top of the HEXACO facets that were most closely associated with Proactive Personality/Proactivity, that is, Social Boldness (an Extraversion facet), Diligence (a Conscientiousness facet), and Creativity (an Openness to Experience facet), but not in entrepreneurship and intrapreneurship. Proactivity is the missing engagement link in the HEXACO model of personality. The results are discussed in light of higher-order factors (e.g., general factor of personality and Alpha and Beta) of personality and bandwidth-fidelity controversies. © 2014 Wiley Periodicals, Inc.

  7. Soft-sensing model of temperature for aluminum reduction cell on improved twin support vector regression

    NASA Astrophysics Data System (ADS)

    Li, Tao

    2018-06-01

    The complexity of the aluminum electrolysis process makes the temperature of aluminum reduction cells hard to measure directly. However, temperature is central to the control of aluminum production. To solve this problem, combining practice data from an aluminum plant, this paper presents a soft-sensing model of temperature for the aluminum electrolysis process based on Improved Twin Support Vector Regression (ITSVR). ITSVR eliminates the slow learning speed of Support Vector Regression (SVR) and the over-fitting risk of Twin Support Vector Regression (TSVR) by introducing a regularization term into the objective function of TSVR, which ensures the structural risk minimization principle and lower computational complexity. Finally, the model, with other process parameters as auxiliary variables, predicts the temperature by ITSVR. The simulation results show that the soft-sensing model based on ITSVR is less time-consuming and generalizes better.
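
    Since ITSVR itself is not available in common libraries, a standard epsilon-SVR can illustrate the soft-sensing idea of predicting a hard-to-measure temperature from auxiliary process variables. The variables and data below are invented placeholders, not plant data.

        # Soft-sensor sketch: predict a hard-to-measure temperature from auxiliary
        # process variables with standard epsilon-SVR (stand-in for ITSVR).
        import numpy as np
        from sklearn.svm import SVR
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_absolute_error

        rng = np.random.default_rng(0)
        n = 500
        X = rng.normal(size=(n, 4))   # e.g. voltage, current, feed rate, bath ratio
        temp = 955 + 8 * X[:, 0] - 5 * X[:, 1] + 3 * X[:, 2] * X[:, 3] + rng.normal(0, 1, n)

        X_tr, X_te, y_tr, y_te = train_test_split(X, temp, test_size=0.3, random_state=0)
        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100, epsilon=0.5))
        model.fit(X_tr, y_tr)
        print("MAE (degrees C): %.2f" % mean_absolute_error(y_te, model.predict(X_te)))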

  8. Novel Nonreplicating Vaccinia Virus Vector Enhances Expression of Heterologous Genes and Suppresses Synthesis of Endogenous Viral Proteins.

    PubMed

    Wyatt, Linda S; Xiao, Wei; Americo, Jeffrey L; Earl, Patricia L; Moss, Bernard

    2017-06-06

    Viruses are used as expression vectors for protein synthesis, immunology research, vaccines, and therapeutics. Advantages of poxvirus vectors include the accommodation of large amounts of heterologous DNA, the presence of a cytoplasmic site of transcription, and high expression levels. On the other hand, competition of approximately 200 viral genes with the target gene for expression and immune recognition may be disadvantageous. We describe a vaccinia virus (VACV) vector that uses an early promoter to express the bacteriophage T7 RNA polymerase; has the A23R intermediate transcription factor gene deleted, thereby restricting virus replication to complementing cells; and has a heterologous gene regulated by a T7 promoter. In noncomplementing cells, viral early gene expression and DNA replication occurred normally but synthesis of intermediate and late proteins was prevented. Nevertheless, the progeny viral DNA provided templates for abundant expression of heterologous genes regulated by a T7 promoter. Selective expression of the Escherichia coli lac repressor gene from an intermediate promoter reduced transcription of the heterologous gene specifically in complementing cells, where large amounts might adversely impact VACV replication. Expression of heterologous proteins mediated by the A23R deletion vector equaled that of a replicating VACV, was higher than that of a nonreplicating modified vaccinia virus Ankara (MVA) vector used for candidate vaccines in vitro and in vivo , and was similarly immunogenic in mice. Unlike the MVA vector, the A23R deletion vector still expresses numerous early genes that can restrict immunogenicity as demonstrated here by the failure of the prototype vector to induce interferon alpha. By deleting immunomodulatory genes, we anticipate further improvements in the system. IMPORTANCE Vaccines provide an efficient and effective way of preventing infectious diseases. Nevertheless, new and better vaccines are needed. Vaccinia virus, which was used successfully as a live vaccine to eradicate smallpox, has been further attenuated and adapted as a recombinant vector for immunization against other pathogens. However, since the initial description of this vector system, only incremental improvements largely related to safety have been implemented. Here we described novel modifications of the platform that increased expression of the heterologous target gene and decreased expression of endogenous vaccinia virus genes while providing safety by preventing replication of the candidate vaccine except in complementing cells used for vector propagation. Copyright © 2017 Wyatt et al.

  10. A comparative study of surface EMG classification by fuzzy relevance vector machine and fuzzy support vector machine.

    PubMed

    Xie, Hong-Bo; Huang, Hu; Wu, Jianhua; Liu, Lei

    2015-02-01

    We present a multiclass fuzzy relevance vector machine (FRVM) learning mechanism and evaluate its performance in classifying multiple hand motions using surface electromyographic (sEMG) signals. The relevance vector machine (RVM) is a sparse Bayesian kernel method which avoids some limitations of the support vector machine (SVM). However, RVM still suffers from the difficulty of possible unclassifiable regions in multiclass problems. We propose two fuzzy membership function-based FRVM algorithms to solve such problems, based on experiments conducted on seven healthy subjects and two amputees with six hand motions. Two feature sets, namely AR model coefficients with root mean square value (AR-RMS), and wavelet transform (WT) features, are extracted from the recorded sEMG signals. Fuzzy support vector machine (FSVM) analysis was also conducted for wide comparison in terms of accuracy, sparsity, training and testing time, as well as the effect of training sample sizes. FRVM yielded comparable classification accuracy with dramatically fewer support vectors in comparison with FSVM. Furthermore, the processing delay of FRVM was much less than that of FSVM, whilst the training of FSVM was much faster than that of FRVM. The results indicate that an FRVM classifier trained using sufficient samples can achieve generalization capability comparable to FSVM with significant sparsity in multi-channel sEMG classification, which is more suitable for sEMG-based real-time control applications.

  11. Impact of nutrition support on clinical outcome and cost-effectiveness analysis in patients at nutritional risk: A prospective cohort study with propensity score matching.

    PubMed

    Zhang, Hui; Wang, Yang; Jiang, Zhu-Ming; Kondrup, Jens; Fang, Hai; Andrews, Martha; Nolan, Marie T; Mu, Shao-Yu; Zhang, Jun; Yu, Kang; Lu, Qian; Kang, Wei-Ming

    2017-05-01

    There is a lack of evidence regarding the economic effects of nutrition support in patients at nutritional risk. The aim of this study was to perform a cost-effectiveness analysis by comparing an adequate nutrition support cohort with a no-support cohort. A prospective observational study was performed in the surgical and medical gastroenterology wards. We identified patients at nutritional risk and the provision of nutrition support by the staff, unaware of the risk status, was recorded. Cost data were obtained from each patient's statement of accounts, and effectiveness was measured by the rate of infectious complication. To control for potential confounding variables, the propensity score method with matching was carried out. The incremental cost-effectiveness ratio was calculated based on the matched population. We screened 3791 patients, and 440 were recruited for the analysis. Patients in the nutrition support cohort had a lower incidence of infectious complications than those in the no-support cohort (9.1 versus 18.1%; P = 0.007). This result was similar in the 149 propensity matched pairs (9.4 versus 24.2%; P < 0.001). The median hospital length of stay was significantly reduced among the matched nutrition support patients (13 versus 15 d; P < 0.001). The total costs were similar among the matched pairs (US $6219 versus $6161). The incremental cost-effectiveness analysis suggested that nutrition support cost US $392 per patient prevented from having infectious complications. Nutrition support was associated with fewer infectious complications and shorter length of stay in patients at nutritional risk. The incremental cost-effectiveness ratio indicated that nutrition support had not increased costs significantly. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. A new method for the prediction of chatter stability lobes based on dynamic cutting force simulation model and support vector machine

    NASA Astrophysics Data System (ADS)

    Peng, Chong; Wang, Lun; Liao, T. Warren

    2015-10-01

    Chatter has become a critical factor hindering machining quality and productivity in machining processes. To avoid cutting chatter, a new method based on a dynamic cutting force simulation model and a support vector machine (SVM) is presented for the prediction of chatter stability lobes. The cutting force is selected as the monitoring signal, and wavelet energy entropy theory is used to extract the feature vectors. A support vector machine is constructed using the MATLAB LIBSVM toolbox for pattern classification based on the feature vectors derived from the experimental cutting data. Then, combining this with the dynamic cutting force simulation model, the stability lobes diagram (SLD) can be estimated. Finally, the predicted results are compared with existing methods such as the zero-order analytical (ZOA) and semi-discretization (SD) methods as well as actual cutting experimental results to confirm the validity of this new method.
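
    A rough sketch of extracting wavelet energy entropy features from 1-D force signals and feeding them to an SVM is shown below (using PyWavelets and scikit-learn). The simulated signals stand in for measured cutting forces, and the feature definition is one common variant, not necessarily the authors' exact formulation.

        # Wavelet energy entropy features from 1-D force signals, classified by an SVM.
        # Simulated signals stand in for measured cutting forces.
        import numpy as np
        import pywt
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        t = np.linspace(0, 1, 1024)

        def make_signal(chatter):
            base = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.normal(size=t.size)
            if chatter:                                   # add a strong "chatter" component
                base += 1.5 * np.sin(2 * np.pi * 180 * t + rng.uniform(0, np.pi))
            return base

        def features(sig, wavelet="db4", level=4):
            coeffs = pywt.wavedec(sig, wavelet, level=level)      # [cA4, cD4, ..., cD1]
            energies = np.array([np.sum(c ** 2) for c in coeffs])
            p = energies / energies.sum()
            entropy = -np.sum(p * np.log(p + 1e-12))              # wavelet energy entropy
            return np.append(p, entropy)

        X = np.array([features(make_signal(chatter=i % 2 == 1)) for i in range(200)])
        y = np.arange(200) % 2                                    # 0 = stable, 1 = chatter
        print("CV accuracy:", cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean())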

  13. A Software Package for Neural Network Applications Development

    NASA Technical Reports Server (NTRS)

    Baran, Robert H.

    1993-01-01

    Original Backprop (Version 1.2) is an MS-DOS package of four stand-alone C-language programs that enable users to develop neural network solutions to a variety of practical problems. Original Backprop generates three-layer, feed-forward (series-coupled) networks which map fixed-length input vectors into fixed-length output vectors through an intermediate (hidden) layer of binary threshold units. Version 1.2 can handle up to 200 input vectors at a time, each having up to 128 real-valued components. The first subprogram, TSET, appends a number (up to 16) of classification bits to each input, thus creating a training set of input-output pairs. The second subprogram, BACKPROP, creates a trilayer network to do the prescribed mapping and modifies the weights of its connections incrementally until the training set is learned. The learning algorithm is the 'back-propagating error correction' procedure first described by F. Rosenblatt in 1961. The third subprogram, VIEWNET, lets the trained network be examined, tested, and 'pruned' (by the deletion of unnecessary hidden units). The fourth subprogram, DONET, makes a TSR routine by which the finished product of the neural net design-and-training exercise can be consulted under other MS-DOS applications.
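
    As a rough modern analogue of the trilayer network described above, a minimal NumPy implementation of a three-layer feed-forward network trained by back-propagation is sketched below. It uses differentiable sigmoid hidden units rather than the binary threshold units of the original package, and the toy data are invented.

        # Minimal three-layer feed-forward network trained by back-propagation.
        # Sigmoid hidden units are used instead of the package's binary threshold units.
        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 8))                 # 200 input vectors, 8 components each
        y = (X[:, 0] + X[:, 1] * X[:, 2] > 0).astype(float).reshape(-1, 1)   # toy labels

        n_hidden, lr = 16, 0.5
        W1 = rng.normal(scale=0.5, size=(8, n_hidden)); b1 = np.zeros(n_hidden)
        W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)

        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

        for epoch in range(2000):
            # forward pass
            h = sigmoid(X @ W1 + b1)
            out = sigmoid(h @ W2 + b2)
            # backward pass (squared-error loss, incremental weight corrections)
            d_out = (out - y) * out * (1 - out)
            d_h = (d_out @ W2.T) * h * (1 - h)
            W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
            W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

        print("training accuracy:", ((out > 0.5) == y).mean())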

  14. International Space Station Increment-6/8 Microgravity Environment Summary Report November 2002 to April 2004

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Hrovat, Kenneth; Kelly, Eric; Reckart, Timothy

    2006-01-01

    This summary report presents the analysis results of some of the processed acceleration data measured aboard the International Space Station during the period of November 2002 to April 2004. Two accelerometer systems were used to measure the acceleration levels for the activities that took place during Increment-6/8. However, not all of the activities during that period were analyzed in order to keep the size of the report manageable. The National Aeronautics and Space Administration sponsors the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System to support microgravity science experiments that require microgravity acceleration measurements. On April 19, 2001, both the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System units were launched on STS-100 from the Kennedy Space Center for installation on the International Space Station. The Microgravity Acceleration Measurement System unit was flown to the station in support of science experiments requiring quasi-steady acceleration measurements, while the Space Acceleration Measurement System unit was flown to support experiments requiring vibratory acceleration measurement. Both acceleration systems are also used in support of the vehicle microgravity requirements verification as well as in support of the International Space Station support cadre. The International Space Station Increment-6/8 reduced gravity environment analysis presented in this report uses acceleration data collected by both sets of accelerometer systems: 1. The Microgravity Acceleration Measurement System, which consists of two sensors: the Orbital Acceleration Research Experiment Sensor Subsystem, a low frequency range sensor (up to 1 Hz), is used to characterize the quasi-steady environment for payloads and vehicle, and the High Resolution Accelerometer Package, which is used to characterize the vibratory environment up to 100 Hz. 2. The Space Acceleration Measurement System measures vibratory acceleration data in the range of 0.01 to 400 Hz. This summary report presents analysis of some selected quasi-steady and vibratory activities measured by these accelerometers during Increment-6/8 from November 2002 to April 2004.

  15. Community detection in complex networks using proximate support vector clustering

    NASA Astrophysics Data System (ADS)

    Wang, Feifan; Zhang, Baihai; Chai, Senchun; Xia, Yuanqing

    2018-03-01

    Community structure, one of the most attention-attracting properties in complex networks, has been a cornerstone in advances of various scientific branches. A number of tools have been involved in recent studies concentrating on community detection algorithms. In this paper, we propose a support vector clustering method based on a proximity graph, owing to which the introduced algorithm surpasses the traditional support vector approach both in accuracy and complexity. Results of extensive experiments undertaken on computer-generated networks and real-world data sets illustrate competent performance in comparison with the other counterparts.

  16. A Wavelet Support Vector Machine Combination Model for Singapore Tourist Arrival to Malaysia

    NASA Astrophysics Data System (ADS)

    Rafidah, A.; Shabri, Ani; Nurulhuda, A.; Suhaila, Y.

    2017-08-01

    In this study, a wavelet support vector machine (WSVM) model is proposed and applied to the prediction of the monthly Singapore tourist arrival time series. The WSVM model is a combination of wavelet analysis and the support vector machine (SVM). The study has two parts: in the first part we compare kernel functions, and in the second part we compare the developed model with the single SVM model. The results show that the linear kernel function performs better than the RBF kernel, while the WSVM outperforms the single SVM model in forecasting monthly Singapore tourist arrivals to Malaysia.

  17. Support System for Solar Receivers

    NASA Technical Reports Server (NTRS)

    Kiceniuk, T.

    1985-01-01

    Hinged split-ring mounts ensure safe support of heavy receivers. In addition to safer operation and damage-free mounting, the system provides more accurate focusing, and small incremental adjustments of the ring are more easily made.

  18. Predicting primary progressive aphasias with support vector machine approaches in structural MRI data.

    PubMed

    Bisenius, Sandrine; Mueller, Karsten; Diehl-Schmid, Janine; Fassbender, Klaus; Grimmer, Timo; Jessen, Frank; Kassubek, Jan; Kornhuber, Johannes; Landwehrmeyer, Bernhard; Ludolph, Albert; Schneider, Anja; Anderl-Straub, Sarah; Stuke, Katharina; Danek, Adrian; Otto, Markus; Schroeter, Matthias L

    2017-01-01

    Primary progressive aphasia (PPA) encompasses the three subtypes nonfluent/agrammatic variant PPA, semantic variant PPA, and the logopenic variant PPA, which are characterized by distinct patterns of language difficulties and regional brain atrophy. To validate the potential of structural magnetic resonance imaging data for early individual diagnosis, we used support vector machine classification on grey matter density maps obtained by voxel-based morphometry analysis to discriminate PPA subtypes (44 patients: 16 nonfluent/agrammatic variant PPA, 17 semantic variant PPA, 11 logopenic variant PPA) from 20 healthy controls (matched for sample size, age, and gender) in the cohort of the multi-center study of the German consortium for frontotemporal lobar degeneration. Here, we compared a whole-brain with a meta-analysis-based disease-specific regions-of-interest approach for support vector machine classification. We also used support vector machine classification to discriminate the three PPA subtypes from each other. Whole brain support vector machine classification enabled a very high accuracy between 91 and 97% for identifying specific PPA subtypes vs. healthy controls, and 78/95% for the discrimination between semantic variant vs. nonfluent/agrammatic or logopenic PPA variants. Only for the discrimination between nonfluent/agrammatic and logopenic PPA variants accuracy was low with 55%. Interestingly, the regions that contributed the most to the support vector machine classification of patients corresponded largely to the regions that were atrophic in these patients as revealed by group comparisons. Although the whole brain approach took also into account regions that were not covered in the regions-of-interest approach, both approaches showed similar accuracies due to the disease-specificity of the selected networks. Conclusion, support vector machine classification of multi-center structural magnetic resonance imaging data enables prediction of PPA subtypes with a very high accuracy paving the road for its application in clinical settings.

  19. A Fast Reduced Kernel Extreme Learning Machine.

    PubMed

    Deng, Wan-Yu; Ong, Yew-Soon; Zheng, Qing-Hua

    2016-04-01

    In this paper, we present a fast and accurate kernel-based supervised algorithm referred to as the Reduced Kernel Extreme Learning Machine (RKELM). In contrast to the work on Support Vector Machine (SVM) or Least Square SVM (LS-SVM), which identifies the support vectors or weight vectors iteratively, the proposed RKELM randomly selects a subset of the available data samples as support vectors (or mapping samples). By avoiding the iterative steps of SVM, significant cost savings in the training process can be readily attained, especially on Big datasets. RKELM is established based on the rigorous proof of universal learning involving reduced kernel-based SLFN. In particular, we prove that RKELM can approximate any nonlinear functions accurately under the condition of support vectors sufficiency. Experimental results on a wide variety of real world small instance size and large instance size applications in the context of binary classification, multi-class problem and regression are then reported to show that RKELM can perform at competitive level of generalized performance as the SVM/LS-SVM at only a fraction of the computational effort incurred. Copyright © 2015 Elsevier Ltd. All rights reserved.
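
    The core idea of RKELM, i.e. randomly picking a subset of training samples as kernel centres and solving a regularized least-squares problem for the output weights, can be sketched as follows. The kernel, regularization and data below are illustrative choices, not the authors' exact configuration.

        # Reduced kernel extreme learning machine sketch: random kernel centres +
        # regularized least-squares output weights (illustrative settings).
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split

        def rbf_kernel(A, B, gamma=0.1):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                                   random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        rng = np.random.default_rng(0)
        centres = X_tr[rng.choice(len(X_tr), size=100, replace=False)]   # random "support vectors"

        K = rbf_kernel(X_tr, centres)                 # n_train x 100 kernel mapping
        T = np.where(y_tr == 1, 1.0, -1.0)            # +/-1 targets
        C = 1.0                                       # regularization strength
        beta = np.linalg.solve(K.T @ K + np.eye(K.shape[1]) / C, K.T @ T)   # output weights

        pred = (rbf_kernel(X_te, centres) @ beta > 0).astype(int)
        print("test accuracy:", (pred == y_te).mean())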

  20. Support vector machines

    NASA Technical Reports Server (NTRS)

    Garay, Michael J.; Mazzoni, Dominic; Davies, Roger; Wagstaff, Kiri

    2004-01-01

    Support Vector Machines (SVMs) are a type of supervised learning algorithm; other examples are Artificial Neural Networks (ANNs), Decision Trees, and Naive Bayesian Classifiers. Supervised learning algorithms are used to classify objects labeled by a 'supervisor' - typically a human 'expert.'

  1. Lysine acetylation sites prediction using an ensemble of support vector machine classifiers.

    PubMed

    Xu, Yan; Wang, Xiao-Bo; Ding, Jun; Wu, Ling-Yun; Deng, Nai-Yang

    2010-05-07

    Lysine acetylation is an essentially reversible and highly regulated post-translational modification which regulates diverse protein properties. Experimental identification of acetylation sites is laborious and expensive. Hence, there is significant interest in the development of computational methods for reliable prediction of acetylation sites from amino acid sequences. In this paper we use an ensemble of support vector machine classifiers to perform this work. The experimentally determined lysine acetylation sites are extracted from the Swiss-Prot database and the scientific literature. Experimental results show that an ensemble of support vector machine classifiers outperforms a single support vector machine classifier and other computational methods such as PAIL and LysAcet on the problem of predicting lysine acetylation sites. The resulting method has been implemented in EnsemblePail, a web server for lysine acetylation site prediction available at http://www.aporc.org/EnsemblePail/. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
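
    A simple ensemble of SVM classifiers in the spirit described above can be sketched as follows, training each member on a bootstrap sample and combining predictions by majority vote. The synthetic data stand in for encoded sequence windows; the ensemble size and kernel settings are arbitrary.

        # Ensemble of SVM classifiers: bootstrap training sets, majority-vote prediction.
        # Synthetic data stand in for encoded lysine-site sequence windows.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        X, y = make_classification(n_samples=600, n_features=40, n_informative=12,
                                   weights=[0.8, 0.2], random_state=0)   # imbalanced classes
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        rng = np.random.default_rng(0)
        members = []
        for _ in range(11):                          # odd number avoids vote ties
            idx = rng.choice(len(X_tr), size=len(X_tr), replace=True)   # bootstrap sample
            members.append(SVC(kernel="rbf", C=1.0).fit(X_tr[idx], y_tr[idx]))

        votes = np.array([m.predict(X_te) for m in members])
        pred = (votes.mean(axis=0) > 0.5).astype(int)   # majority vote
        print("ensemble test accuracy:", (pred == y_te).mean())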

  2. Path planning for assembly of strut-based structures. Thesis

    NASA Technical Reports Server (NTRS)

    Muenger, Rolf

    1991-01-01

    A path planning method with collision avoidance for a general single-chain nonredundant or redundant robot is proposed. Joint range boundary overruns are also avoided. The result is a sequence of joint vectors which are passed to a trajectory planner. A potential field algorithm in joint space computes incremental joint vectors Δq = Δq_a + Δq_c + Δq_r. Adding Δq to the robot's current joint vector leads to the next step in the path. Δq_a is obtained by computing the minimum-norm solution of the underdetermined linear system J Δq_a = x_a, where x_a is a translational and rotational force vector that attracts the robot to its goal position and orientation and J is the manipulator Jacobian. Δq_c is a collision avoidance term encompassing collisions between the robot (links and payload) and obstacles in the environment as well as collisions among links and payload of the robot themselves; it is obtained in joint space directly. Δq_r is a function of the current joint vector and avoids joint range overruns. A higher-level discrete search over candidate safe positions is used to provide alternatives in case the potential field algorithm encounters a local minimum and thus fails to reach the goal. The best-first search algorithm A* is used for graph search. Symmetry properties of the payload and equivalent rotations are exploited to further enlarge the number of alternatives passed to the potential field algorithm.
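
    A brief sketch of the attractive term described above, assuming the manipulator Jacobian J and the task-space attraction vector x_a are given; the pseudoinverse yields the minimum-norm solution of J Δq_a = x_a, while the collision-avoidance and joint-range terms appear only as hypothetical placeholder helpers.

    ```python
    # Sketch: one potential-field step, dq = dq_a + dq_c + dq_r, with placeholder
    # collision-avoidance and joint-range terms (hypothetical helpers).
    import numpy as np

    def attractive_step(J, x_a):
        # Minimum-norm solution of the underdetermined system J dq_a = x_a.
        return np.linalg.pinv(J) @ x_a

    def collision_term(q):
        # Placeholder: repulsive joint-space increment derived from obstacle distances.
        return np.zeros_like(q)

    def joint_range_term(q, q_min, q_max, margin=0.1, gain=0.05):
        # Push any joint that comes within `margin` of a limit back toward its range.
        dq = np.zeros_like(q)
        dq[q > q_max - margin] = -gain
        dq[q < q_min + margin] = +gain
        return dq

    def potential_field_step(q, J, x_a, q_min, q_max):
        return q + attractive_step(J, x_a) + collision_term(q) + joint_range_term(q, q_min, q_max)

    # Toy usage: 7-joint redundant arm, 6-D task-space attraction, random Jacobian.
    rng = np.random.default_rng(0)
    q = np.zeros(7)
    J = rng.normal(size=(6, 7))           # 6-D task space -> underdetermined system
    x_a = 0.01 * rng.normal(size=6)
    q = potential_field_step(q, J, x_a, q_min=-np.pi * np.ones(7), q_max=np.pi * np.ones(7))
    ```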

  3. Examining the ethnoracial invariance of a bifactor model of anxiety sensitivity and the incremental validity of the physical domain-specific factor in a primary-care patient sample.

    PubMed

    Fergus, Thomas A; Kelley, Lance P; Griggs, Jackson O

    2017-10-01

    There is growing support for a bifactor conceptualization of the Anxiety Sensitivity Index-3 (ASI-3; Taylor et al., 2007), consisting of a General factor and 3 domain-specific factors (i.e., Physical, Cognitive, Social). Earlier studies supporting a bifactor model of the ASI-3 used samples that consisted of predominantly White respondents. In addition, extant research has yet to support the incremental validity of the Physical domain-specific factor while controlling for the General factor. The present study is an examination of a bifactor model of the ASI-3 and the measurement invariance of that model among an ethnoracially diverse sample of primary-care patients (N = 533). Results from multiple-group confirmatory factor analysis supported the configural and metric/scalar invariance of the bifactor model of the ASI-3 across self-identifying Black, Latino, and White respondents. The Physical domain-specific factor accounted for unique variance in an index of health anxiety beyond the General factor. These results provide support for the generalizability of a bifactor model of the ASI-3 across 3 ethnoracial groups, as well as indication of the incremental explanatory power of the Physical domain-specific factor. Study implications are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Statistically Controlling for Confounding Constructs Is Harder than You Think

    PubMed Central

    Westfall, Jacob; Yarkoni, Tal

    2016-01-01

    Social scientists often seek to demonstrate that a construct has incremental validity over and above other related constructs. However, these claims are typically supported by measurement-level models that fail to consider the effects of measurement (un)reliability. We use intuitive examples, Monte Carlo simulations, and a novel analytical framework to demonstrate that common strategies for establishing incremental construct validity using multiple regression analysis exhibit extremely high Type I error rates under parameter regimes common in many psychological domains. Counterintuitively, we find that error rates are highest—in some cases approaching 100%—when sample sizes are large and reliability is moderate. Our findings suggest that a potentially large proportion of incremental validity claims made in the literature are spurious. We present a web application (http://jakewestfall.org/ivy/) that readers can use to explore the statistical properties of these and other incremental validity arguments. We conclude by reviewing SEM-based statistical approaches that appropriately control the Type I error rate when attempting to establish incremental validity. PMID:27031707

  5. Volatilities, Traded Volumes, and Price Increments in Derivative Securities

    NASA Astrophysics Data System (ADS)

    Kim, Kyungsik; Lim, Gyuchang; Kim, Soo Yong; Scalas, Enrico

    2007-03-01

    We apply the detrended fluctuation analysis (DFA) to the statistics of Korean treasury bond (KTB) futures, from which the logarithmic increments, volatilities, and traded volumes are estimated over a specific time lag. In our case, the logarithmic increment of futures prices has no long-memory property, while the volatility and the traded volume exhibit long-memory properties. To determine whether the volatility clustering is due to an inherent higher-order correlation not detected by directly applying the DFA to the logarithmic increments of the KTB futures, it is important to shuffle the original tick data of futures prices and to generate a geometric Brownian random walk with the same mean and standard deviation. A comparison of the three tick data sets shows that the higher-order correlation inherent in the logarithmic increments produces the volatility clustering. In particular, the result of the DFA on volatilities and traded volumes may be supported by the hypothesis of price changes.

  6. Volatilities, traded volumes, and the hypothesis of price increments in derivative securities

    NASA Astrophysics Data System (ADS)

    Lim, Gyuchang; Kim, SooYong; Scalas, Enrico; Kim, Kyungsik

    2007-08-01

    A detrended fluctuation analysis (DFA) is applied to the statistics of Korean treasury bond (KTB) futures, from which the logarithmic increments, volatilities, and traded volumes are estimated over a specific time lag. In this study, the logarithmic increment of futures prices has no long-memory property, while the volatility and the traded volume exhibit long-memory properties. To determine whether the volatility clustering is due to an inherent higher-order correlation not detected by the direct application of the DFA to the logarithmic increments of the KTB futures, it is important to shuffle the original tick data of futures prices and to generate a geometric Brownian random walk with the same mean and standard deviation. It was found from a comparison of the three tick data sets that the higher-order correlation inherent in the logarithmic increments leads to volatility clustering. In particular, the result of the DFA on volatilities and traded volumes can be supported by the hypothesis of price changes.
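
    Since both of the records above rest on the same detrended fluctuation analysis, a compact sketch of DFA on a one-dimensional increment series follows; the window sizes and first-order (linear) detrending are illustrative choices rather than the papers' exact settings.

    ```python
    # Sketch: detrended fluctuation analysis (DFA) of a 1-D increment series.
    import numpy as np

    def dfa(x, window_sizes):
        y = np.cumsum(x - np.mean(x))                          # integrated "profile" of the series
        F = []
        for n in window_sizes:
            sq = []
            for i in range(len(y) // n):
                seg = y[i * n:(i + 1) * n]
                t = np.arange(n)
                trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                sq.append(np.mean((seg - trend) ** 2))
            F.append(np.sqrt(np.mean(sq)))
        # Scaling exponent alpha = slope of log F(n) vs log n; alpha ~ 0.5 means no long memory.
        alpha = np.polyfit(np.log(window_sizes), np.log(F), 1)[0]
        return alpha, np.array(F)

    # White-noise increments (no long memory) should give alpha close to 0.5.
    increments = np.random.default_rng(0).normal(size=5000)
    alpha, _ = dfa(increments, window_sizes=[16, 32, 64, 128, 256])
    print(f"DFA scaling exponent: {alpha:.2f}")
    ```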

  7. Brief report: The Brief Alcohol Social Density Assessment (BASDA): convergent, criterion-related, and incremental validity.

    PubMed

    MacKillop, James; Acker, John D; Bollinger, Jared; Clifton, Allan; Miller, Joshua D; Campbell, W Keith; Goodie, Adam S

    2013-09-01

    Alcohol misuse is substantially influenced by social factors, but systematic assessments of social network drinking are typically lengthy. The goal of the present study was to provide further validation of a brief measure of social network alcohol use, the Brief Alcohol Social Density Assessment (BASDA), in a sample of emerging adults. Specifically, the study sought to examine the BASDA's convergent, criterion, and incremental validity in relation to well-established measures of drinking motives and problematic drinking. Participants were 354 undergraduates who were assessed using the BASDA, the Alcohol Use Disorders Identification Test (AUDIT), and the Drinking Motives Questionnaire. Significant associations were observed between the BASDA index of alcohol-related social density and alcohol misuse, social motives, and conformity motives, supporting convergent validity. Criterion-related validity was supported by evidence that significantly greater alcohol involvement was present in the social networks of individuals scoring at or above an AUDIT score of 8, a validated criterion for hazardous drinking. Finally, the BASDA index was significantly associated with alcohol misuse above and beyond drinking motives in relation to AUDIT scores, supporting incremental validity. Taken together, these findings provide further support for the BASDA as an efficient measure of drinking in an individual's social network. Methodological considerations as well as recommendations for future investigations in this area are discussed.

  8. Incremental short daily home hemodialysis: a case series.

    PubMed

    Toth-Manikowski, Stephanie M; Mullangi, Surekha; Hwang, Seungyoung; Shafi, Tariq

    2017-07-05

    Patients starting dialysis often have substantial residual kidney function. Incremental hemodialysis provides a hemodialysis prescription that supplements patients' residual kidney function while maintaining total (residual + dialysis) urea clearance (standard Kt/Vurea) targets. We describe our experience with incremental hemodialysis in patients using NxStage System One for home hemodialysis. From 2011 to 2015, we initiated 5 incident hemodialysis patients on an incremental home hemodialysis regimen. The biochemical parameters of all patients remained stable on the incremental hemodialysis regimen and they consistently achieved standard Kt/Vurea targets. Of the two patients with follow-up >6 months, residual kidney function was preserved for ≥2 years. Importantly, the patients were able to transition to home hemodialysis without automatically requiring 5 sessions per week at the outset and gradually increased the number of treatments and/or dialysate volume as the residual kidney function declined. An incremental home hemodialysis regimen can be safely prescribed and may improve acceptability of home hemodialysis. Reducing hemodialysis frequency by even one treatment per week can reduce the number of fistula or graft cannulations or catheter connections by >100 per year, an important consideration for patient well-being, access longevity, and access-related infections. The incremental hemodialysis approach, supported by national guidelines, can be considered for all home hemodialysis patients with residual kidney function.

  9. SAIL: Summation-bAsed Incremental Learning for Information-Theoretic Text Clustering.

    PubMed

    Cao, Jie; Wu, Zhiang; Wu, Junjie; Xiong, Hui

    2013-04-01

    Information-theoretic clustering aims to exploit information-theoretic measures as the clustering criteria. A common practice on this topic is the so-called Info-Kmeans, which performs K-means clustering with KL-divergence as the proximity function. While expert efforts on Info-Kmeans have shown promising results, a remaining challenge is to deal with high-dimensional sparse data such as text corpora. Indeed, it is possible that the centroids contain many zero-value features for high-dimensional text vectors, which leads to infinite KL-divergence values and creates a dilemma in assigning objects to centroids during the iteration process of Info-Kmeans. To meet this challenge, in this paper, we propose a Summation-bAsed Incremental Learning (SAIL) algorithm for Info-Kmeans clustering. Specifically, by using an equivalent objective function, SAIL replaces the computation of KL-divergence by the incremental computation of Shannon entropy. This can avoid the zero-feature dilemma caused by the use of KL-divergence. To improve the clustering quality, we further introduce the variable neighborhood search scheme and propose the V-SAIL algorithm, which is then accelerated by a multithreaded scheme in PV-SAIL. Our experimental results on various real-world text collections have shown that, with SAIL as a booster, the clustering performance of Info-Kmeans can be significantly improved. Also, V-SAIL and PV-SAIL indeed help improve the clustering quality at a lower cost of computation.
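
    Because the total within-cluster KL divergence of probability-normalised documents differs from the weighted sum of centroid Shannon entropies, Σ_k n_k·H(centroid_k), only by a data-dependent constant (a standard Bregman-clustering identity), an assignment sweep can score clusters through that entropy form and update cluster sums incrementally, which sidesteps infinite KL values for zero-valued centroid features. The sketch below illustrates only this reformulation idea under those assumptions; it is not the SAIL algorithm itself, and the variable neighborhood search and multithreaded variants are not reproduced.

    ```python
    # Sketch: Info-Kmeans-style sweep scored by the entropy form of the objective,
    # with cluster term-frequency sums maintained incrementally.
    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def objective(sums, counts):
        # Sum over clusters of n_k * H(centroid_k), centroid_k = sums_k / sums_k.sum().
        return sum(n * entropy(s / s.sum()) for s, n in zip(sums, counts) if n > 0)

    def entropy_form_sweep(X, labels, k, n_iter=10):
        sums = [X[labels == j].sum(axis=0) for j in range(k)]
        counts = [int((labels == j).sum()) for j in range(k)]
        for _ in range(n_iter):
            for i, x in enumerate(X):
                j_old = labels[i]
                sums[j_old] = sums[j_old] - x        # incremental removal of the document
                counts[j_old] -= 1
                best, best_obj = j_old, np.inf
                for j in range(k):                   # tentatively place the document in each cluster
                    sums[j] = sums[j] + x; counts[j] += 1
                    obj = objective(sums, counts)
                    sums[j] = sums[j] - x; counts[j] -= 1
                    if obj < best_obj:
                        best, best_obj = j, obj
                sums[best] = sums[best] + x
                counts[best] += 1
                labels[i] = best
        return labels

    # Toy usage: rows of X are L1-normalised term-frequency vectors.
    rng = np.random.default_rng(0)
    X = rng.random((60, 20)); X /= X.sum(axis=1, keepdims=True)
    labels = entropy_form_sweep(X, rng.integers(0, 3, size=60), k=3)
    ```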

  10. Costs of dengue prevention and incremental cost of dengue outbreak control in Guantanamo, Cuba.

    PubMed

    Baly, Alberto; Toledo, Maria E; Rodriguez, Karina; Benitez, Juan R; Rodriguez, Maritza; Boelaert, Marleen; Vanlerberghe, Veerle; Van der Stuyft, Patrick

    2012-01-01

    To assess the economic cost of routine Aedes aegypti control in an at-risk environment without dengue endemicity and the incremental costs incurred during a sporadic outbreak. The study was conducted in 2006 in the city of Guantanamo, Cuba. We took a societal perspective to calculate costs in months without dengue transmission (January-July) and during an outbreak (August-December). Data sources were bookkeeping records, direct observations and interviews. The total economic cost per inhabitant (p.i.) per month (p.m.) increased from 2.76 USD in months without dengue transmission to 6.05 USD during an outbreak. In months without transmission, the routine Aedes control programme cost 1.67 USD p.i. p.m. Incremental costs during the outbreak were incurred mainly by the population and the primary/secondary level of the healthcare system, and hardly at all by the vector control programme (1.64, 1.44 and 0.21 USD increments p.i. p.m., respectively). The total cost for managing a hospitalized suspected dengue case was 296.60 USD (62.0% direct medical, 9.0% direct non-medical and 29.0% indirect costs). In both periods, the main cost drivers for the Aedes control programme, the healthcare system and the community were the value of personnel and volunteer time or productivity losses. Intensive efforts to keep A. aegypti infestation low entail important economic costs for society. When a dengue outbreak does eventually occur, costs increase sharply. In-depth studies should assess which mix of activities and actors could maximize the effectiveness and cost-effectiveness of routine Aedes control and dengue prevention. © 2011 Blackwell Publishing Ltd.

  11. SAMS Acceleration Measurements on Mir from May 1997 to June 1998 (NASA Increments 5, 6, and 7)

    NASA Technical Reports Server (NTRS)

    DeLombard, Richard

    1999-01-01

    During NASA Increments 5, 6, and 7 (May 1997 to June 1998), about eight gigabytes of acceleration data were collected by the Space Acceleration Measurement System (SAMS) onboard the Russian Space Station Mir. The data were recorded on twenty-seven optical disks which were returned to Earth on Orbiter missions STS-86, STS-89, and STS-91. During these increments, SAMS data were collected in the Priroda module to support various microgravity experiments. This report points out some of the salient features of the microgravity acceleration environment to which the experiments were exposed. This report presents an overview of the SAMS acceleration measurements recorded by 10 Hz and 100 Hz sensor heads. The analyses included herein complement those presented in previous Mir increment summary reports prepared by the Principal Investigator Microgravity Services project.

  12. Incremental Validity of the Trait Emotional Intelligence Questionnaire-Short Form (TEIQue-SF).

    PubMed

    Siegling, A B; Vesely, Ashley K; Petrides, K V; Saklofske, Donald H

    2015-01-01

    This study examined the incremental validity of the adult short form of the Trait Emotional Intelligence Questionnaire (TEIQue-SF) in predicting 7 construct-relevant criteria beyond the variance explained by the Five-factor model and coping strategies. Additionally, the relative contributions of the questionnaire's 4 subscales were assessed. Two samples of Canadian university students completed the TEIQue-SF, along with measures of the Big Five, coping strategies (Sample 1 only), and emotion-laden criteria. The TEIQue-SF showed consistent incremental effects beyond the Big Five or the Big Five and coping strategies, predicting all 7 criteria examined across the 2 samples. Furthermore, 2 of the 4 TEIQue-SF subscales accounted for the measure's incremental validity. Although the findings provide good support for the validity and utility of the TEIQue-SF, directions for further research are emphasized.

  13. Analysis and forecast experiments incorporating satellite soundings and cloud and water vapor drift wind information

    NASA Technical Reports Server (NTRS)

    Goodman, Brian M.; Diak, George R.; Mills, Graham A.

    1986-01-01

    A system for assimilating conventional meteorological data and satellite-derived data in order to produce four-dimensional gridded data sets of the primary atmospheric variables used for updating limited area forecast models is described. The basic principles of a data assimilation scheme as proposed by Lorenc (1984) are discussed. The design of the system and its incremental assimilation cycles are schematically presented. The assimilation system was tested using radiosonde, buoy, VAS temperature, dew point, gradient wind data, cloud drift, and water vapor motion data. The rms vector errors for the data are analyzed.

  14. Vector-model-supported approach in prostate plan optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Eva Sau Fan; Department of Health Technology and Informatics, The Hong Kong Polytechnic University; Wu, Vincent Wing Cheung

    Lengthy time consumed in traditional manual plan optimization can limit the use of step-and-shoot intensity-modulated radiotherapy/volumetric-modulated radiotherapy (S&S IMRT/VMAT). A vector model for retrieving similar radiotherapy cases was developed with respect to the structural and physiologic features extracted from the Digital Imaging and Communications in Medicine (DICOM) files. Planning parameters were retrieved from the selected similar reference case and applied to the test case to bypass the gradual adjustment of planning parameters. Therefore, the planning time spent on the traditional trial-and-error manual optimization approach in the beginning of optimization could be reduced. Each S&S IMRT/VMAT prostate reference database comprised 100 previously treated cases. Prostate cases were replanned with both traditional optimization and vector-model-supported optimization based on the oncologists' clinical dose prescriptions. A total of 360 plans, which consisted of 30 cases of S&S IMRT, 30 cases of 1-arc VMAT, and 30 cases of 2-arc VMAT plans including first optimization and final optimization with/without vector-model-supported optimization, were compared using the 2-sided t-test and paired Wilcoxon signed rank test, with a significance level of 0.05 and a false discovery rate of less than 0.05. For S&S IMRT, 1-arc VMAT, and 2-arc VMAT prostate plans, there was a significant reduction in the planning time and iterations with vector-model-supported optimization, by almost 50%. When the first optimization plans were compared, 2-arc VMAT prostate plans had better plan quality than 1-arc VMAT plans. The volume receiving 35 Gy in the femoral head for 2-arc VMAT plans was reduced with the vector-model-supported optimization compared with the traditional manual optimization approach. Otherwise, the quality of plans from both approaches was comparable. Vector-model-supported optimization was shown to offer a much shortened planning time and iteration number without compromising plan quality.
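
    A small sketch of the case-retrieval idea that underlies vector-model-supported optimization, assuming each case is summarised by a numeric feature vector derived from its DICOM structures; the cosine-similarity measure, feature count, and stored parameter format are illustrative assumptions rather than the study's actual model.

    ```python
    # Sketch: retrieve the most similar previously planned case by feature-vector similarity,
    # then reuse its stored optimization parameters as the starting point for the new plan.
    import numpy as np

    def most_similar_case(query, reference_features, reference_params):
        # Cosine similarity between the query case and every reference case.
        q = query / np.linalg.norm(query)
        R = reference_features / np.linalg.norm(reference_features, axis=1, keepdims=True)
        best = int(np.argmax(R @ q))
        return best, reference_params[best]

    # Hypothetical feature vectors (e.g. target volume, overlap with rectum/bladder, ...).
    rng = np.random.default_rng(0)
    reference_features = rng.random((100, 6))                      # 100 previously treated cases
    reference_params = [f"plan_params_{i}" for i in range(100)]    # stand-ins for stored settings
    idx, params = most_similar_case(rng.random(6), reference_features, reference_params)
    print(f"reuse optimization parameters from reference case {idx}: {params}")
    ```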

  15. A helper virus-free HSV-1 vector containing the vesicular glutamate transporter-1 promoter supports expression preferentially in VGLUT1-containing glutamatergic neurons.

    PubMed

    Zhang, Guo-rong; Geller, Alfred I

    2010-05-17

    Multiple potential uses of direct gene transfer into neurons require restricting expression to specific classes of glutamatergic neurons. Thus, it is desirable to develop vectors containing glutamatergic class-specific promoters. The three vesicular glutamate transporters (VGLUTs) are expressed in distinct populations of neurons, and VGLUT1 is the predominant VGLUT in the neocortex, hippocampus, and cerebellar cortex. We previously reported a plasmid (amplicon) Herpes Simplex Virus (HSV-1) vector that placed the Lac Z gene under the regulation of the VGLUT1 promoter (pVGLUT1lac). Using helper virus-free vector stocks, we showed that this vector supported approximately 90% glutamatergic neuron-specific expression in postrhinal (POR) cortex, in rats sacrificed at either 4 days or 2 months after gene transfer. We now show that pVGLUT1lac supports expression preferentially in VGLUT1-containing glutamatergic neurons. pVGLUT1lac vector stock was injected into either POR cortex, which contains primarily VGLUT1-containing glutamatergic neurons, or into the ventral medial hypothalamus (VMH), which contains predominantly VGLUT2-containing glutamatergic neurons. Rats were sacrificed at 4 days after gene transfer, and the types of cells expressing β-galactosidase were determined by immunofluorescent costaining. Cell counts showed that pVGLUT1lac supported expression in approximately 10-fold more cells in POR cortex than in the VMH, whereas a control vector supported expression in similar numbers of cells in these two areas. Further, in POR cortex, pVGLUT1lac supported expression predominantly in VGLUT1-containing neurons, and, in the VMH, pVGLUT1lac showed an approximately 10-fold preference for the rare VGLUT1-containing neurons. VGLUT1-specific expression may benefit specific experiments on learning or specific gene therapy approaches, particularly in the neocortex. Copyright 2010 Elsevier B.V. All rights reserved.

  16. Automatic event detection in low SNR microseismic signals based on multi-scale permutation entropy and a support vector machine

    NASA Astrophysics Data System (ADS)

    Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming

    2017-07-01

    Microseismic monitoring is an effective means of providing early warning of rock or coal dynamic disasters, and its first step is microseismic event detection; however, low-SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper presents a method based on multi-scale permutation entropy and a support vector machine to detect low-SNR microseismic events. First, an extraction method for signal features based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal permutation entropy. Second, a detection model for low-SNR microseismic events based on the least squares support vector machine is built by performing a multi-scale permutation entropy calculation for the collected vibration signals and constructing a feature vector set of the signals. Finally, a comparative analysis of the microseismic events and noise signals in the experiment proves that the different characteristics of the two can be fully expressed by using multi-scale permutation entropy. The detection model combining multi-scale permutation entropy with the support vector machine, which offers high classification accuracy and a fast real-time algorithm, can meet the requirements of online, real-time extraction of microseismic events.
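
    A condensed sketch of the feature-extraction side of this approach, assuming coarse-graining for the multi-scale part and an embedding dimension of 3; the paper's least squares support vector machine is approximated here by a standard SVC, and all signals and parameter values are illustrative.

    ```python
    # Sketch: multi-scale permutation entropy features for a vibration signal, fed to an SVM.
    import numpy as np
    from itertools import permutations
    from sklearn.svm import SVC

    def permutation_entropy(x, m=3, delay=1):
        # Frequency of ordinal patterns of length m, then normalised Shannon entropy.
        patterns = list(permutations(range(m)))
        counts = np.zeros(len(patterns))
        for i in range(len(x) - (m - 1) * delay):
            pattern = tuple(np.argsort(x[i:i + m * delay:delay]).tolist())
            counts[patterns.index(pattern)] += 1
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log(p)) / np.log(len(patterns))

    def coarse_grain(x, scale):
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).mean(axis=1)

    def mpe_features(x, scales=(1, 2, 3, 4, 5), m=3):
        return np.array([permutation_entropy(coarse_grain(x, s), m) for s in scales])

    # Toy usage: pure-noise signals vs. signals with an embedded weak (low-SNR) event.
    rng = np.random.default_rng(0)
    signals = [rng.normal(size=1024) for _ in range(40)]
    labels = np.array([0] * 20 + [1] * 20)
    for s in signals[20:]:
        s[400:500] += 0.5 * np.sin(np.linspace(0, 20 * np.pi, 100))
    X = np.vstack([mpe_features(s) for s in signals])
    clf = SVC(kernel="rbf").fit(X, labels)
    ```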

  17. Turbo-Satori: a neurofeedback and brain-computer interface toolbox for real-time functional near-infrared spectroscopy.

    PubMed

    Lührs, Michael; Goebel, Rainer

    2017-10-01

    Turbo-Satori is a neurofeedback and brain-computer interface (BCI) toolbox for real-time functional near-infrared spectroscopy (fNIRS). It incorporates multiple pipelines from real-time preprocessing and analysis to neurofeedback and BCI applications. The toolbox is designed with a focus on usability, enabling fast setup and execution of real-time experiments. Turbo-Satori uses an incremental recursive least-squares procedure for real-time general linear model calculation and support vector machine classifiers for advanced BCI applications. It communicates directly with common NIRx fNIRS hardware and was tested extensively to ensure that the calculations can be performed in real time, without a significant change in calculation times for any sampling interval, during ongoing experiments of up to 6 h of recording. Enabling immediate access to advanced processing features also allows the toolbox to be used by students and nonexperts in the field of fNIRS data acquisition and processing. Flexible network interfaces allow third-party stimulus applications to access the processed data and calculated statistics in real time so that this information can be easily incorporated into neurofeedback or BCI presentations.
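
    A bare-bones sketch of an incremental recursive least-squares update of general linear model coefficients as new samples arrive, which is the generic recursion behind this kind of real-time GLM fitting; the class name, initialisation, and toy data are assumptions, and this is not Turbo-Satori's implementation.

    ```python
    # Sketch: recursive least-squares update of GLM coefficients beta as samples arrive.
    import numpy as np

    class RecursiveGLM:
        def __init__(self, n_regressors, init=1e3):
            self.beta = np.zeros(n_regressors)
            self.P = init * np.eye(n_regressors)   # running estimate of (X'X)^-1

        def update(self, x, y):
            # x: design-matrix row for this sample, y: measured signal value.
            Px = self.P @ x
            gain = Px / (1.0 + x @ Px)
            self.beta += gain * (y - x @ self.beta)
            self.P -= np.outer(gain, Px)
            return self.beta

    # Toy usage: recover beta = [2, -1, 0.5] from streaming samples.
    rng = np.random.default_rng(0)
    true_beta = np.array([2.0, -1.0, 0.5])
    glm = RecursiveGLM(3)
    for _ in range(2000):
        x = rng.normal(size=3)
        y = x @ true_beta + 0.1 * rng.normal()
        glm.update(x, y)
    print(glm.beta.round(2))
    ```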

  18. Simulation comparison of a decoupled longitudinal control system and a velocity vector control wheel steering system during landings in wind shear

    NASA Technical Reports Server (NTRS)

    Kimball, G., Jr.

    1980-01-01

    A simulator comparison of the velocity vector control wheel steering (VCWS) system and a decoupled longitudinal control system is presented. The piloting task was to use the electronic attitude direction indicator (EADI) to capture and maintain a 3 degree glide slope in the presence of wind shear and to complete the landing using the perspective runway included on the EADI. The decoupled control system used constant prefilter and feedback gains to provide steady state decoupling of flight path angle, pitch angle, and forward velocity. The decoupled control system improved the pilots' ability to control airspeed and flight path angle during the final stages of an approach made in severe wind shear. The system also improved their ability to complete safe landings. The pilots preferred the decoupled control system in severe winds and, on a pilot rating scale, rated the approach and landing task with the decoupled control system as much as 3 to 4 increments better than use of the VCWS system.

  19. A hybrid approach to select features and classify diseases based on medical data

    NASA Astrophysics Data System (ADS)

    AbdelLatif, Hisham; Luo, Jiawei

    2018-03-01

    Feature selection is a popular problem in the classification of diseases in clinical medicine. Here, we develop a hybrid methodology to classify diseases based on three medical datasets: the Arrhythmia, Breast Cancer, and Hepatitis datasets. This methodology, called k-means ANOVA Support Vector Machine (K-ANOVA-SVM), uses k-means clustering together with the ANOVA statistic to preprocess the data and select the significant features, and support vector machines in the classification process. To compare and evaluate the performance, we chose three classification algorithms (decision tree, naïve Bayes, and support vector machines) and applied the medical datasets directly to these algorithms. Our methodology gave much better classification accuracy, 98% on the Arrhythmia dataset, 92% on the Breast Cancer dataset, and 88% on the Hepatitis dataset, compared to using the medical data directly with decision tree, naïve Bayes, and support vector machine classifiers. The ROC curve and precision achieved with K-ANOVA-SVM were also better than with the other algorithms.
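
    A compact sketch of the feature-selection-plus-SVM portion of such a pipeline, using ANOVA F-scores to keep the most significant features before classification; the k-means preprocessing step is not reproduced, and the dataset, number of retained features, and kernel are illustrative assumptions.

    ```python
    # Sketch: ANOVA-based feature selection followed by an SVM classifier.
    from sklearn.datasets import load_breast_cancer
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)   # stand-in for the medical datasets
    pipeline = make_pipeline(
        StandardScaler(),
        SelectKBest(f_classif, k=10),   # keep the 10 most significant features (ANOVA F-test)
        SVC(kernel="rbf", C=1.0),
    )
    print("cross-validated accuracy:", cross_val_score(pipeline, X, y, cv=5).mean().round(3))
    ```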

  20. Alpharetroviral Self-inactivating Vectors: Long-term Transgene Expression in Murine Hematopoietic Cells and Low Genotoxicity

    PubMed Central

    Suerth, Julia D; Maetzig, Tobias; Brugman, Martijn H; Heinz, Niels; Appelt, Jens-Uwe; Kaufmann, Kerstin B; Schmidt, Manfred; Grez, Manuel; Modlich, Ute; Baum, Christopher; Schambach, Axel

    2012-01-01

    Comparative integrome analyses have highlighted alpharetroviral vectors with a relatively neutral, and thus favorable, integration spectrum. However, previous studies used alpharetroviral vectors harboring viral coding sequences and intact long-terminal repeats (LTRs). We recently developed self-inactivating (SIN) alpharetroviral vectors with an advanced split-packaging design. In a murine bone marrow (BM) transplantation model we now compared alpharetroviral, gammaretroviral, and lentiviral SIN vectors and showed that all vectors transduced hematopoietic stem cells (HSCs), leading to comparable, sustained multilineage transgene expression in primary and secondary transplanted mice. Alpharetroviral integrations were decreased near transcription start sites, CpG islands, and potential cancer genes compared with gammaretroviral, and decreased in genes compared with lentiviral integrations. Analyzing the transcriptome and intragenic integrations in engrafting cells, we observed stronger correlations between in-gene integration targeting and transcriptional activity for gammaretroviral and lentiviral vectors than for alpharetroviral vectors. Importantly, the relatively “extragenic” alpharetroviral integration pattern still supported long-term transgene expression upon serial transplantation. Furthermore, sensitive genotoxicity studies revealed a decreased immortalization incidence compared with gammaretroviral and lentiviral SIN vectors. We conclude that alpharetroviral SIN vectors have a favorable integration pattern which lowers the risk of insertional mutagenesis while supporting long-term transgene expression in the progeny of transplanted HSCs. PMID:22334016

  1. Are financial incentives cost-effective to support smoking cessation during pregnancy?

    PubMed

    Boyd, Kathleen A; Briggs, Andrew H; Bauld, Linda; Sinclair, Lesley; Tappin, David

    2016-02-01

    To investigate the cost-effectiveness of up to £400 worth of financial incentives for smoking cessation in pregnancy as an adjunct to routine health care. Cost-effectiveness analysis based on a Phase II randomized controlled trial (RCT) and a cost-utility analysis using a life-time Markov model. The RCT was undertaken in Glasgow, Scotland. The economic analysis was undertaken from the UK National Health Service (NHS) perspective. A total of 612 pregnant women randomized to receive usual cessation support plus or minus financial incentives of up to £400 in vouchers (US $609), contingent upon smoking cessation. Comparison of usual support and incentive interventions in terms of cotinine-validated quitters, quality-adjusted life years (QALYs) and direct costs to the NHS. The incremental cost per quitter at 34-38 weeks pregnant was £1127 ($1716). This is similar to the standard look-up value derived from Stapleton & West's published ICER tables, £1390 per quitter, obtained by looking up the Cessation in Pregnancy Incentives Trial (CIPT) incremental cost (£157) and incremental 6-month quit outcome (0.14). The life-time model resulted in an incremental cost of £17 [95% confidence interval (CI) = -£93, £107] and a gain of 0.04 QALYs (95% CI = -0.058, 0.145), giving an ICER of £482/QALY ($734/QALY). Probabilistic sensitivity analysis indicates uncertainty in these results, particularly regarding relapse after birth. The expected value of perfect information was £30 million (at a willingness to pay of £30 000/QALY), so given current uncertainty, additional research is potentially worthwhile. Financial incentives for smoking cessation in pregnancy are highly cost-effective, with an incremental cost per quality-adjusted life year of £482, which is well below recommended decision thresholds. © 2015 Society for the Study of Addiction.

  2. Summary of the Science performed onboard the International Space Station during Increments 12 and 13

    NASA Technical Reports Server (NTRS)

    Jules, Kenol

    2007-01-01

    By September of 2007, continuous human presence on the International Space Station will reach a milestone of eighty months. The many astronauts and cosmonauts who lived onboard the station during the fourteen Increments over that time span spent their time building the station as well as performing science on a daily basis. Over those eighty months, the U.S. astronaut crew members logged over 2954 hours of research time. Far more research time has been accumulated by experiments controlled by investigators on the ground. The U.S. astronauts conducted over one hundred and twenty-six (126) science investigations, many of which operated across multiple Increments. The crew also installed, activated and operated nine (9) science racks that supported six science disciplines ranging from material sciences to life science. By the end of Increment 14, a total of 5083 kg of research rack mass had been ferried to the station, as well as 5021 kg of research mass. The objectives of this paper are three-fold: (1) to briefly review the science conducted on the International Space Station during the previous eleven Increments; (2) to discuss in detail the science investigations that were conducted on the station during Increments 12 and 13, focusing mainly on the primary objectives of each investigation and the associated hypotheses, with preliminary science results discussed for each investigation as their availability permits; and (3) to briefly touch on what the planned science complement was and what was actually accomplished, given real-time science implementation and challenges during these two Increments, to illustrate the difficulty of daily science activity while the science platform is under construction. Finally, the paper briefly discusses the science research complements for the following two Increments, 14 and 15, to preview how much science might be accomplished during them.

  3. A Conceptual Model of Structured Support in Physical Education

    ERIC Educational Resources Information Center

    Hinton, Vanessa; Buchanan, Alice M.; Rudisill, Mary

    2016-01-01

    Schools implement Positive Behavior Intervention and Supports (PBIS) as a way of meeting students' needs in classrooms. PBIS focuses on tiered instruction. Tiered instruction is a teaching strategy in which the educator implements incremental changes that increase supports based on students' needs--academic or behavioral. Yet, tiered instruction…

  4. Identifying saltcedar with hyperspectral data and support vector machines

    USDA-ARS?s Scientific Manuscript database

    Saltcedar (Tamarix spp.) are a group of dense phreatophytic shrubs and trees that are invasive to riparian areas throughout the United States. This study determined the feasibility of using hyperspectral data and a support vector machine (SVM) classifier to discriminate saltcedar from other cover t...

  5. Single and multiple object tracking using log-euclidean Riemannian subspace and block-division appearance model.

    PubMed

    Hu, Weiming; Li, Xi; Luo, Wenhan; Zhang, Xiaoqin; Maybank, Stephen; Zhang, Zhongfei

    2012-12-01

    Object appearance modeling is crucial for tracking objects, especially in videos captured by nonstationary cameras and for reasoning about occlusions between multiple moving objects. Based on the log-euclidean Riemannian metric on symmetric positive definite matrices, we propose an incremental log-euclidean Riemannian subspace learning algorithm in which covariance matrices of image features are mapped into a vector space with the log-euclidean Riemannian metric. Based on the subspace learning algorithm, we develop a log-euclidean block-division appearance model which captures both the global and local spatial layout information about object appearances. Single object tracking and multi-object tracking with occlusion reasoning are then achieved by particle filtering-based Bayesian state inference. During tracking, incremental updating of the log-euclidean block-division appearance model captures changes in object appearance. For multi-object tracking, the appearance models of the objects can be updated even in the presence of occlusions. Experimental results demonstrate that the proposed tracking algorithm obtains more accurate results than six state-of-the-art tracking algorithms.
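
    The sketch below illustrates only the log-Euclidean step named above: a symmetric positive definite covariance descriptor is mapped into a flat vector space through the matrix logarithm, after which ordinary (incremental) subspace learning can operate. The feature choice, regularisation, and helper names are illustrative assumptions, and the block-division appearance model and tracking machinery are not shown.

    ```python
    # Sketch: log-Euclidean embedding of SPD covariance matrices into a vector space.
    import numpy as np
    from scipy.linalg import logm

    def covariance_descriptor(features):
        # features: n_pixels x d matrix of per-pixel image features for a region.
        return np.cov(features, rowvar=False) + 1e-6 * np.eye(features.shape[1])

    def log_euclidean_vector(C):
        L = logm(C).real                       # matrix logarithm of the SPD matrix
        d = L.shape[0]
        iu = np.triu_indices(d, k=1)
        # Keep the diagonal plus sqrt(2)-weighted upper triangle so that the
        # Euclidean norm of the vector equals the Frobenius norm of log(C).
        return np.concatenate([np.diag(L), np.sqrt(2.0) * L[iu]])

    rng = np.random.default_rng(0)
    region_features = rng.normal(size=(500, 5))     # e.g. intensity, gradients, coordinates
    v = log_euclidean_vector(covariance_descriptor(region_features))
    # Vectors like `v` from successive frames can feed an incremental PCA / subspace model.
    ```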

  6. Comparing machine learning and logistic regression methods for predicting hypertension using a combination of gene expression and next-generation sequencing data.

    PubMed

    Held, Elizabeth; Cape, Joshua; Tintle, Nathan

    2016-01-01

    Machine learning methods continue to show promise in the analysis of data from genetic association studies because of the high number of variables relative to the number of observations. However, few best practices exist for the application of these methods. We extend a recently proposed supervised machine learning approach for predicting disease risk by genotypes to be able to incorporate gene expression data and rare variants. We then apply 2 different versions of the approach (radial and linear support vector machines) to simulated data from Genetic Analysis Workshop 19 and compare performance to logistic regression. Method performance was not radically different across the 3 methods, although the linear support vector machine tended to show small gains in predictive ability relative to a radial support vector machine and logistic regression. Importantly, as the number of genes in the models was increased, even when those genes contained causal rare variants, model predictive ability showed a statistically significant decrease in performance for both the radial support vector machine and logistic regression. The linear support vector machine showed more robust performance to the inclusion of additional genes. Further work is needed to evaluate machine learning approaches on larger samples and to evaluate the relative improvement in model prediction from the incorporation of gene expression data.

  7. Applying spectral unmixing and support vector machine to airborne hyperspectral imagery for detecting giant reed

    USDA-ARS?s Scientific Manuscript database

    This study evaluated linear spectral unmixing (LSU), mixture tuned matched filtering (MTMF) and support vector machine (SVM) techniques for detecting and mapping giant reed (Arundo donax L.), an invasive weed that presents a severe threat to agroecosystems and riparian areas throughout the southern ...

  8. Support vector machines classifiers of physical activities in preschoolers

    USDA-ARS?s Scientific Manuscript database

    The goal of this study is to develop, test, and compare multinomial logistic regression (MLR) and support vector machines (SVM) in classifying preschool-aged children physical activity data acquired from an accelerometer. In this study, 69 children aged 3-5 years old were asked to participate in a s...

  9. Fabric wrinkle characterization and classification using modified wavelet coefficients and optimized support-vector-machine classifier

    USDA-ARS?s Scientific Manuscript database

    This paper presents a novel wrinkle evaluation method that uses modified wavelet coefficients and an optimized support-vector-machine (SVM) classification scheme to characterize and classify wrinkle appearance of fabric. Fabric images were decomposed with the wavelet transform (WT), and five parame...

  10. Comparison of Support Vector Machine, Neural Network, and CART Algorithms for the Land-Cover Classification Using Limited Training Data Points

    EPA Science Inventory

    Support vector machine (SVM) was applied for land-cover characterization using MODIS time-series data. Classification performance was examined with respect to training sample size, sample variability, and landscape homogeneity (purity). The results were compared to two convention...

  11. Entomological efficacy of durable wall lining with reduced wall surface coverage for strengthening visceral leishmaniasis vector control in Bangladesh, India and Nepal.

    PubMed

    Huda, M Mamun; Kumar, Vijay; Das, Murari Lal; Ghosh, Debashis; Priyanka, Jyoti; Das, Pradeep; Alim, Abdul; Matlashewski, Greg; Kroeger, Axel; Alfonso-Sierra, Eduardo; Mondal, Dinesh

    2016-10-06

    New methods for controlling sand flies are highly desired by the Visceral Leishmaniasis (VL) elimination program of Bangladesh, India and Nepal for its consolidation and maintenance phases. To support the program we investigated the safety, efficacy and cost of Durable Wall Lining (DWL) for sand fly control. This multicentre randomized controlled study in Bangladesh, India and Nepal included two randomized intervention clusters and one control cluster. Each cluster had 50 households, except the full wall surface coverage (DWL-FWSC) cluster in Nepal, which had 46 households. Ten of the 50 households were randomly selected for entomological activities, except in India, where 6 households were selected. The interventions were DWL-FWSC and reduced wall surface coverage (DWL-RWSC), with the DWL covering 1.8 m and 1.5 m above the floor, respectively. Efficacy was measured by the reduction in sand fly density following intervention and by sand fly mortality assessed with the WHO cone bioassay test at 1 month after intervention. Trained field research assistants interviewed household heads for socio-demographic information, knowledge and practice about VL and vector control, and their experience following the intervention. Cost data were collected using a cost data collection tool designed for this study. Statistical analysis included difference-in-differences estimates, bivariate analysis, a Poisson regression model and an incremental cost-efficacy ratio calculation. The mean sand fly density reduction with DWL-FWSC and DWL-RWSC was -4.96 (95 % CI, -4.54, -5.38) and -5.38 (95 % CI, -4.89, -5.88), respectively. The sand fly density reductions attributed to both interventions were statistically significant after adjusting for covariates (IRR = 0.277, p < 0.001 for DWL-RWSC and IRR = 0.371, p < 0.001 for DWL-FWSC). The efficacy of DWL-RWSC and DWL-FWSC on sand fly density reduction was statistically comparable (p = 0.214). The acceptability of both interventions was high. Transient burning sensations, flushing of the face and itching were the most common adverse events and were observed mostly at the Indian site. There was no serious adverse event. DWL-RWSC is cost-saving compared to DWL-FWSC. The incremental cost-efficacy ratio was -6.36, whereby DWL-RWSC dominates DWL-FWSC. The DWL-RWSC intervention is safe, efficacious, cost-saving and cost-effective in reducing indoor sand fly density. The VL elimination program in the Indian sub-continent may consider DWL-RWSC for sand fly control in its consolidation and maintenance phases.

  12. Simulation of Ventricular, Cavo-Pulmonary, and Biventricular Ventricular Assist Devices in Failing Fontan.

    PubMed

    Di Molfetta, Arianna; Amodeo, Antonio; Fresiello, Libera; Trivella, Maria Giovanna; Iacobelli, Roberta; Pilati, Mara; Ferrari, Gianfranco

    2015-07-01

    Considering the lack of donors, ventricular assist devices (VADs) could be an alternative to heart transplantation for failing Fontan patients, in spite of the lack of experience and the complex anatomy and physiopathology of these patients. Considering the high number of variables that play an important role, such as the type of Fontan failure, type of VAD connection, and setting (right VAD [RVAD], left VAD [LVAD], or biventricular VAD [BIVAD]), a numerical model could be useful to support clinical decisions. The aim of this article is to develop and test a lumped parameter model of the cardiovascular system simulating and comparing the VAD effects on failing Fontan. Hemodynamic and echocardiographic data of 10 Fontan patients were used to simulate the baseline patients' condition using a dedicated lumped parameter model. Starting from the simulated baseline and for each patient, a systolic dysfunction, a diastolic dysfunction, and an increment of the pulmonary vascular resistance were simulated. Then, for each patient and for each pathology, the RVAD, LVAD, and BIVAD implantations were simulated. The model reproduced the patients' baseline well. In the case of systolic dysfunction, the LVAD unloads the single ventricle and increases the cardiac output (CO) (35%) and the arterial systemic pressure (Pas) (25%). With an RVAD, a decrement of inferior vena cava pressure (Pvci) (39%) was observed with a 34% increment of CO, but an increment of the single ventricle external work (SVEW). With the BIVAD, an increment of Pas (29%) and CO (37%) was observed. In the case of diastolic dysfunction, the LVAD increases CO (42%) and the RVAD decreases the Pvci, while both increase the SVEW. In the case of a pulmonary vascular resistance increment, the highest CO (50%) and Pas (28%) increments are obtained with an RVAD, with the highest decrement of Pvci (53%) and an increment of the SVEW but with the lowest VAD power consumption. The use of numerical models could be helpful in this innovative field to evaluate the effect of VAD implantation on Fontan patients, supporting patient and VAD type selection and personalizing the assistance. Copyright © 2015 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  13. Vector-model-supported optimization in volumetric-modulated arc stereotactic radiotherapy planning for brain metastasis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Eva Sau Fan; Department of Health Technology and Informatics, The Hong Kong Polytechnic University; Wu, Vincent Wing Cheung

    Long planning time in volumetric-modulated arc stereotactic radiotherapy (VMA-SRT) cases can limit its clinical efficiency and use. A vector model could retrieve previously successful radiotherapy cases that share various common anatomic features with the current case. The present study aimed to develop a vector model that could reduce planning time by applying the optimization parameters from those retrieved reference cases. Thirty-six VMA-SRT cases of brain metastasis (gender, male [n = 23], female [n = 13]; age range, 32 to 81 years old) were collected and used as a reference database. Another 10 VMA-SRT cases were planned with both conventional optimization and vector-model-supported optimization, following the oncologists' clinical dose prescriptions. Planning time and plan quality measures were compared using the 2-sided paired Wilcoxon signed rank test with a significance level of 0.05, with a positive false discovery rate (pFDR) of less than 0.05. With vector-model-supported optimization, there was a significant reduction in the median planning time, a 40% reduction from 3.7 to 2.2 hours (p = 0.002, pFDR = 0.032), and in the number of iterations, a 30% reduction from 8.5 to 6.0 (p = 0.006, pFDR = 0.047). The quality of plans from both approaches was comparable. From these preliminary results, vector-model-supported optimization can expedite the optimization of VMA-SRT for brain metastasis while maintaining plan quality.

  14. PlasmoGEM, a database supporting a community resource for large-scale experimental genetics in malaria parasites.

    PubMed

    Schwach, Frank; Bushell, Ellen; Gomes, Ana Rita; Anar, Burcu; Girling, Gareth; Herd, Colin; Rayner, Julian C; Billker, Oliver

    2015-01-01

    The Plasmodium Genetic Modification (PlasmoGEM) database (http://plasmogem.sanger.ac.uk) provides access to a resource of modular, versatile and adaptable vectors for genome modification of Plasmodium spp. parasites. PlasmoGEM currently consists of >2000 plasmids designed to modify the genome of Plasmodium berghei, a malaria parasite of rodents, which can be requested by non-profit research organisations free of charge. PlasmoGEM vectors are designed with long homology arms for efficient genome integration and carry gene specific barcodes to identify individual mutants. They can be used for a wide array of applications, including protein localisation, gene interaction studies and high-throughput genetic screens. The vector production pipeline is supported by a custom software suite that automates both the vector design process and quality control by full-length sequencing of the finished vectors. The PlasmoGEM web interface allows users to search a database of finished knock-out and gene tagging vectors, view details of their designs, download vector sequence in different formats and view available quality control data as well as suggested genotyping strategies. We also make gDNA library clones and intermediate vectors available for researchers to produce vectors for themselves. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  15. 76 FR 40707 - 36(b)(1) Arms Sales Notification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-11

    ... training and training equipment, support equipment, U.S. Government and contractor engineering, logistics... training equipment, support equipment, U.S. Government and contractor engineering, logistics, and technical... access to SSEE Increment F services via standard Service Oriented Architecture (SOA) interfaces via...

  16. Data Assimilation of Lightning using 1D+3D/4D WRF Var Assimilation Schemes with Non-Linear Observation Operators

    NASA Astrophysics Data System (ADS)

    Navon, M. I.; Stefanescu, R.; Fuelberg, H. E.; Marchand, M.

    2012-12-01

    NASA's launch of the GOES-R Lightning Mapper (GLM) in 2015 will provide continuous, full disc, high resolution total lightning (IC + CG) data. The data will be available at a horizontal resolution of approximately 9 km. Compared to other types of data, the assimilation of lightning data into operational numerical models has received relatively little attention. Previous efforts at lightning assimilation have mostly employed nudging. This paper describes the implementation of 1D+3D/4D-Var assimilation schemes for existing ground-based WTLN (Worldwide Total Lightning Network) lightning observations using non-linear observation operators in the incremental WRFDA system. To mimic the expected output of GLM, the WTLN data were used to generate lightning super-observations characterized by flash rates per 81 km2 per 20 min. A major difficulty associated with variational approaches is the complexity of the observation operator that defines the model equivalent of lightning. We use Convective Available Potential Energy (CAPE) as a proxy between lightning data and model variables. This operator is highly nonlinear. Marecal and Mahfouf (2003) have shown that nonlinearities can prevent direct assimilation of rainfall rates in the ECMWF 4D-Var (using the incremental formulation proposed by Courtier et al. (1994)) from being successful. Using data from the 2011 Tuscaloosa, AL tornado outbreak, we have proved that the direct assimilation of lightning data into the WRF 3D/4D-Var systems is limited by this incremental approach: severe threshold limits must be imposed on the innovation vectors to obtain an improved analysis. We have implemented 1D+3D/4D-Var schemes to assimilate lightning observations into the WRF model. Their use avoids innovation vector constraints that would otherwise prevent the inclusion of a greater number of lightning observations, and it minimizes the problem that nonlinearities in the moist convective scheme can introduce discontinuities in the cost function between the inner and outer loops of the incremental 3D/4D-Var minimization. The first part of this paper describes the methodology and performance analysis of the 1D-Var retrieval scheme that adjusts the WRF temperature profiles closer to an observed value, as in Mahfouf et al. (2005). The second part shows the positive impact of these 1D-Var pseudo-temperature observations on both 3D/4D-Var WRF analyses and short-range forecasts for three cases: the Tuscaloosa tornado outbreak (April 27, 2011) with intense but localized lightning, a second severe storm outbreak with more widespread but less intense lightning (June 27, 2011), and a northeaster containing much less lightning.

  17. 1-norm support vector novelty detection and its sparseness.

    PubMed

    Zhang, Li; Zhou, WeiDa

    2013-12-01

    This paper proposes a 1-norm support vector novelty detection (SVND) method and discusses its sparseness. 1-norm SVND is formulated as a linear programming problem and uses two techniques for inducing sparseness, namely 1-norm regularization and the hinge loss function. We also find two upper bounds on the sparseness of 1-norm SVND, namely the exact support vector (ESV) bound and the kernel Gram matrix rank bound. The ESV bound indicates that 1-norm SVND has a sparser representation model than SVND. The kernel Gram matrix rank bound can loosely estimate the sparseness of 1-norm SVND. Experimental results show that 1-norm SVND is feasible and effective. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. ℓ(p)-Norm multikernel learning approach for stock market price forecasting.

    PubMed

    Shao, Xigao; Wu, Kun; Liao, Bifeng

    2012-01-01

    Linear multiple kernel learning model has been used for predicting financial time series. However, ℓ(1)-norm multiple support vector regression is rarely observed to outperform trivial baselines in practical applications. To allow for robust kernel mixtures that generalize well, we adopt ℓ(p)-norm multiple kernel support vector regression (1 ≤ p < ∞) as a stock price prediction model. The optimization problem is decomposed into smaller subproblems, and the interleaved optimization strategy is employed to solve the regression model. The model is evaluated on forecasting the daily stock closing prices of Shanghai Stock Index in China. Experimental results show that our proposed model performs better than ℓ(1)-norm multiple support vector regression model.

  19. Applications of Support Vector Machine (SVM) Learning in Cancer Genomics

    PubMed Central

    HUANG, SHUJUN; CAI, NIANGUANG; PACHECO, PEDRO PENZUTI; NARANDES, SHAVIRA; WANG, YANG; XU, WAYNE

    2017-01-01

    Machine learning with maximization (support) of separating margin (vector), called support vector machine (SVM) learning, is a powerful classification tool that has been used for cancer genomic classification or subtyping. Today, as advancements in high-throughput technologies lead to production of large amounts of genomic and epigenomic data, the classification feature of SVMs is expanding its use in cancer genomics, leading to the discovery of new biomarkers, new drug targets, and a better understanding of cancer driver genes. Herein we reviewed the recent progress of SVMs in cancer genomic studies. We intend to comprehend the strength of the SVM learning and its future perspective in cancer genomic applications. PMID:29275361

  20. A Threshold Model of Social Support, Adjustment, and Distress after Breast Cancer Treatment

    ERIC Educational Resources Information Center

    Mallinckrodt, Brent; Armer, Jane M.; Heppner, P. Paul

    2012-01-01

    This study examined a threshold model that proposes that social support exhibits a curvilinear association with adjustment and distress, such that support in excess of a critical threshold level has decreasing incremental benefits. Women diagnosed with a first occurrence of breast cancer (N = 154) completed survey measures of perceived support…

  1. Support vector machine applied to predict the zoonotic potential of E. coli O157 cattle isolates

    USDA-ARS?s Scientific Manuscript database

    Methods based on sequence data analysis facilitate the tracking of disease outbreaks, allow relationships between strains to be reconstructed and virulence factors to be identified. However, these methods are used postfactum after an outbreak has happened. Here, we show that support vector machine a...

  2. Prediction of Spirometric Forced Expiratory Volume (FEV1) Data Using Support Vector Regression

    NASA Astrophysics Data System (ADS)

    Kavitha, A.; Sujatha, C. M.; Ramakrishnan, S.

    2010-01-01

    In this work, prediction of forced expiratory volume in 1 second (FEV1) in pulmonary function test is carried out using the spirometer and support vector regression analysis. Pulmonary function data are measured with flow volume spirometer from volunteers (N=175) using a standard data acquisition protocol. The acquired data are then used to predict FEV1. Support vector machines with polynomial kernel function with four different orders were employed to predict the values of FEV1. The performance is evaluated by computing the average prediction accuracy for normal and abnormal cases. Results show that support vector machines are capable of predicting FEV1 in both normal and abnormal cases and the average prediction accuracy for normal subjects was higher than that of abnormal subjects. Accuracy in prediction was found to be high for a regularization constant of C=10. Since FEV1 is the most significant parameter in the analysis of spirometric data, it appears that this method of assessment is useful in diagnosing the pulmonary abnormalities with incomplete data and data with poor recording.
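
    A minimal sketch of polynomial-kernel support vector regression in the spirit of the study, using the reported regularization constant C = 10 and looping over four polynomial orders; the synthetic spirometric features and the scikit-learn pipeline are assumptions for illustration.

    ```python
    # Sketch: polynomial-kernel SVR for predicting FEV1 from spirometric features.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 175
    X = rng.normal(size=(n, 4))                       # stand-ins for age, height, FVC, PEF, ...
    fev1 = 2.5 + 0.8 * X[:, 2] + 0.3 * X[:, 1] + 0.1 * rng.normal(size=n)  # synthetic target

    for degree in (1, 2, 3, 4):                       # four polynomial orders compared
        model = make_pipeline(StandardScaler(), SVR(kernel="poly", degree=degree, C=10.0))
        score = cross_val_score(model, X, fev1, cv=5, scoring="r2").mean()
        print(f"degree {degree}: cross-validated R^2 = {score:.2f}")
    ```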

  3. Quantum Support Vector Machine for Big Data Classification

    NASA Astrophysics Data System (ADS)

    Rebentrost, Patrick; Mohseni, Masoud; Lloyd, Seth

    2014-09-01

    Supervised machine learning is the classification of new data based on already classified training examples. In this work, we show that the support vector machine, an optimized binary classifier, can be implemented on a quantum computer, with complexity logarithmic in the size of the vectors and the number of training examples. In cases where classical sampling algorithms require polynomial time, an exponential speedup is obtained. At the core of this quantum big data algorithm is a nonsparse matrix exponentiation technique for efficiently performing a matrix inversion of the training data inner-product (kernel) matrix.

  4. Correlation techniques and measurements of wave-height statistics

    NASA Technical Reports Server (NTRS)

    Guthart, H.; Taylor, W. C.; Graf, K. A.; Douglas, D. G.

    1972-01-01

    Statistical measurements of wave height fluctuations have been made in a wind wave tank. The power spectral density function of temporal wave height fluctuations evidenced second-harmonic components and an f^-5 power-law decay beyond the second harmonic. The observations of second harmonic effects agreed very well with a theoretical prediction. From the wave statistics, surface drift currents were inferred and compared to experimental measurements with satisfactory agreement. Measurements were made of the two dimensional correlation coefficient at 15 deg increments in angle with respect to the wind vector. An estimate of the two-dimensional spatial power spectral density function was also made.
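
    For readers unfamiliar with the quantities mentioned above, the sketch below estimates a power spectral density with Welch's method and a correlation coefficient for a synthetic wave-height record. The sampling rate, harmonic content, and spatial lag are assumptions for illustration, not the tank's instrumentation.

    ```python
    # Sketch: spectral and correlation analysis of a wave-height record.
    # A synthetic signal with a fundamental and a weaker second harmonic stands in
    # for the wind-wave-tank measurements described in the abstract.
    import numpy as np
    from scipy.signal import welch

    fs = 100.0                              # sampling rate in Hz (assumed)
    t = np.arange(0, 60, 1 / fs)
    eta = np.sin(2 * np.pi * 2.0 * t) + 0.2 * np.sin(2 * np.pi * 4.0 * t)  # wave height + 2nd harmonic
    eta += 0.05 * np.random.default_rng(1).normal(size=t.size)

    f, pxx = welch(eta, fs=fs, nperseg=1024)   # power spectral density estimate
    print(f"dominant frequency ~ {f[np.argmax(pxx)]:.2f} Hz")

    # Correlation coefficient between two probe records (synthetic spatial lag).
    eta_lagged = np.roll(eta, 10)
    rho = np.corrcoef(eta, eta_lagged)[0, 1]
    print(f"correlation coefficient: {rho:.3f}")
    ```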

  5. An Inquiry into the Cost of Post Deployment Software Support (PDSS)

    DTIC Science & Technology

    1989-09-01

    The increasing cost of software maintenance is taking a larger share of the military budget each year... increments as needed (3:59). The second page of the Form 75 starts with a section stating how the hours, and consequently the funds, will be allocated to...length of time required, the timeline can be in hourly, weekly, monthly, or quarterly increments. Some milestones included are formal approval, test

  6. Small Diameter Bomb Increment II (SDB II)

    DTIC Science & Technology

    2015-12-01

    Selected Acquisition Report (SAR), RCS: DD-A&T(Q&A)823-439, Small Diameter Bomb Increment II (SDB II), as of FY 2017 President's Budget. Defense Acquisition Management Information Retrieval (DAMIR), December 2015 SAR. Abbreviations: OSD - Office of the Secretary of Defense; O&S - Operating and Support; PAUC - Program Acquisition Unit Cost.

  7. Predicting complications of percutaneous coronary intervention using a novel support vector method.

    PubMed

    Lee, Gyemin; Gurm, Hitinder S; Syed, Zeeshan

    2013-01-01

    To explore the feasibility of a novel approach using an augmented one-class learning algorithm to model in-laboratory complications of percutaneous coronary intervention (PCI). Data from the Blue Cross Blue Shield of Michigan Cardiovascular Consortium (BMC2) multicenter registry for the years 2007 and 2008 (n=41 016) were used to train models to predict 13 different in-laboratory PCI complications using a novel one-plus-class support vector machine (OP-SVM) algorithm. The performance of these models in terms of discrimination and calibration was compared to the performance of models trained using the following classification algorithms on BMC2 data from 2009 (n=20 289): logistic regression (LR), one-class support vector machine classification (OC-SVM), and two-class support vector machine classification (TC-SVM). For the OP-SVM and TC-SVM approaches, variants of the algorithms with cost-sensitive weighting were also considered. The OP-SVM algorithm and its cost-sensitive variant achieved the highest area under the receiver operating characteristic curve for the majority of the PCI complications studied (eight cases). Similar improvements were observed for the Hosmer-Lemeshow χ2 value (seven cases) and the mean cross-entropy error (eight cases). The OP-SVM algorithm based on an augmented one-class learning problem improved discrimination and calibration across different PCI complications relative to LR and traditional support vector machine classification. Such an approach may have value in a broader range of clinical domains.
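
    The OP-SVM algorithm itself is not available in standard libraries, so the sketch below only illustrates the comparison baselines named above (LR, OC-SVM, and a cost-sensitive TC-SVM) on synthetic, heavily imbalanced data. All data and parameter choices are assumptions for illustration, not the PCI registry or the paper's settings.

    ```python
    # Sketch of the comparison baselines named in the abstract (LR, OC-SVM, and a
    # cost-sensitive TC-SVM); the OP-SVM algorithm itself is not reproduced here.
    # Synthetic imbalanced data stands in for the PCI registry.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import OneClassSVM, SVC
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=5000, n_features=20, weights=[0.98, 0.02], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("LR AUC:", roc_auc_score(y_te, lr.predict_proba(X_te)[:, 1]))

    # One-class SVM trained on the majority (complication-free) class only.
    ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_tr[y_tr == 0])
    print("OC-SVM AUC:", roc_auc_score(y_te, -ocsvm.decision_function(X_te)))

    # Two-class SVM with cost-sensitive weighting for the rare complication class.
    tcsvm = SVC(kernel="rbf", class_weight="balanced").fit(X_tr, y_tr)
    print("TC-SVM AUC:", roc_auc_score(y_te, tcsvm.decision_function(X_te)))
    ```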

  8. Predicting complications of percutaneous coronary intervention using a novel support vector method

    PubMed Central

    Lee, Gyemin; Gurm, Hitinder S; Syed, Zeeshan

    2013-01-01

    Objective To explore the feasibility of a novel approach using an augmented one-class learning algorithm to model in-laboratory complications of percutaneous coronary intervention (PCI). Materials and methods Data from the Blue Cross Blue Shield of Michigan Cardiovascular Consortium (BMC2) multicenter registry for the years 2007 and 2008 (n=41 016) were used to train models to predict 13 different in-laboratory PCI complications using a novel one-plus-class support vector machine (OP-SVM) algorithm. The performance of these models in terms of discrimination and calibration was compared to the performance of models trained using the following classification algorithms on BMC2 data from 2009 (n=20 289): logistic regression (LR), one-class support vector machine classification (OC-SVM), and two-class support vector machine classification (TC-SVM). For the OP-SVM and TC-SVM approaches, variants of the algorithms with cost-sensitive weighting were also considered. Results The OP-SVM algorithm and its cost-sensitive variant achieved the highest area under the receiver operating characteristic curve for the majority of the PCI complications studied (eight cases). Similar improvements were observed for the Hosmer–Lemeshow χ2 value (seven cases) and the mean cross-entropy error (eight cases). Conclusions The OP-SVM algorithm based on an augmented one-class learning problem improved discrimination and calibration across different PCI complications relative to LR and traditional support vector machine classification. Such an approach may have value in a broader range of clinical domains. PMID:23599229

  9. International Space Station Increment-3 Microgravity Environment Summary Report

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Hrovat, Kenneth; Kelly, Eric; McPherson, Kevin; Reckart, Timothy; Grodsinksy, Carlos

    2002-01-01

    This summary report presents the results of some of the processed acceleration data measured aboard the International Space Station during the period of August to December 2001. Two accelerometer systems were used to measure the acceleration levels for the activities that took place during Increment-3. However, not all of the activities were analyzed for this report due to time constraints and lack of precise timeline information regarding some payload operations and station activities. The National Aeronautics and Space Administration sponsors the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System to support microgravity science experiments which require microgravity acceleration measurements. On April 19, 2001, both the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System units were launched on STS-100 from the Kennedy Space Center for installation on the International Space Station. The Microgravity Acceleration Measurement System unit was flown to the station in support of science experiments requiring quasi-steady acceleration measurements, while the Space Acceleration Measurement System unit was flown to support experiments requiring vibratory acceleration measurement. Both acceleration systems are also used in support of the vehicle microgravity requirements verification. The International Space Station Increment-3 reduced gravity environment analysis presented in this report uses acceleration data collected by both sets of accelerometer systems: (1) The Microgravity Acceleration Measurement System, which consists of two sensors: the Orbital Acceleration Research Experiment Sensor Subsystem, a low frequency range sensor (up to 1 Hz), is used to characterize the quasi-steady environment for payloads and vehicle, and the High Resolution Accelerometer Package, which is used to characterize the vibratory environment up to 100 Hz. (2) The Space Acceleration Measurement System, which is a high frequency sensor, measures vibratory acceleration data in the range of 0.01 to 400 Hz. This summary report presents analysis of some selected quasi-steady and vibratory activities measured by these accelerometers during Increment-3 from August to December, 2001.

  10. A support vector machine approach for classification of welding defects from ultrasonic signals

    NASA Astrophysics Data System (ADS)

    Chen, Yuan; Ma, Hong-Wei; Zhang, Guang-Ming

    2014-07-01

    Defect classification is an important issue in ultrasonic non-destructive evaluation. A layered multi-class support vector machine (LMSVM) classification system, which combines multiple SVM classifiers through a layered architecture, is proposed in this paper. The proposed LMSVM classification system is applied to the classification of welding defects from ultrasonic test signals. The measured ultrasonic defect echo signals are first decomposed into wavelet coefficients by the wavelet packet transform. The energy of the wavelet coefficients at different frequency channels are used to construct the feature vectors. The bees algorithm (BA) is then used for feature selection and SVM parameter optimisation for the LMSVM classification system. The BA-based feature selection optimises the energy feature vectors. The optimised feature vectors are input to the LMSVM classification system for training and testing. Experimental results of classifying welding defects demonstrate that the proposed technique is highly robust, precise and reliable for ultrasonic defect classification.
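
    To make the feature-extraction step above concrete, the sketch below computes wavelet-packet energy features with PyWavelets and feeds them to an SVM. The bees-algorithm feature selection and parameter optimisation are omitted, and the synthetic echo signals, wavelet choice, and decomposition level are assumptions rather than the paper's setup.

    ```python
    # Sketch: wavelet-packet energy features from ultrasonic-style echoes feeding an SVM.
    # The bees-algorithm optimisation from the paper is omitted; synthetic signals
    # stand in for measured weld-defect echoes.
    import numpy as np
    import pywt
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def wp_energy_features(signal, wavelet="db4", level=3):
        """Energy of each terminal wavelet-packet node, used as the feature vector."""
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
        nodes = wp.get_level(level, order="freq")
        return np.array([np.sum(node.data ** 2) for node in nodes])

    rng = np.random.default_rng(0)
    signals, labels = [], []
    for cls in (0, 1):                      # two defect classes (placeholder)
        for _ in range(40):
            t = np.linspace(0, 1, 256)
            f0 = 20 if cls == 0 else 45     # different echo frequency content per class
            s = np.sin(2 * np.pi * f0 * t) * np.exp(-5 * t) + 0.1 * rng.normal(size=t.size)
            signals.append(wp_energy_features(s))
            labels.append(cls)

    X, y = np.vstack(signals), np.array(labels)
    scores = cross_val_score(SVC(kernel="rbf", gamma="scale"), X, y, cv=5)
    print("cross-validated accuracy:", scores.mean())
    ```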

  11. Support vector machine based decision for mechanical fault condition monitoring in induction motor using an advanced Hilbert-Park transform.

    PubMed

    Ben Salem, Samira; Bacha, Khmais; Chaari, Abdelkader

    2012-09-01

    In this work we suggest an original fault signature based on an improved combination of Hilbert and Park transforms. Starting from this combination we can create two fault signatures: Hilbert modulus current space vector (HMCSV) and Hilbert phase current space vector (HPCSV). These two fault signatures are subsequently analysed using the classical fast Fourier transform (FFT). The effects of mechanical faults on the HMCSV and HPCSV spectrums are described, and the related frequencies are determined. The magnitudes of spectral components, relative to the studied faults (air-gap eccentricity and outer raceway ball bearing defect), are extracted in order to develop the input vector necessary for learning and testing the support vector machine with an aim of classifying automatically the various states of the induction motor. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
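
    The sketch below gives a hedged, simplified version of the signal chain described above: a Concordia/Park-style current space vector, a Hilbert-envelope modulus, FFT magnitude features, and an SVM classifier. The three-phase currents, fault modulation frequency, and helper names are illustrative assumptions, not the paper's measured signals.

    ```python
    # Sketch: Hilbert modulus of a Park/Concordia current space vector, FFT magnitude
    # features, and an SVM classifier. Synthetic three-phase currents stand in for
    # measured induction-motor signals; the fault frequencies are illustrative only.
    import numpy as np
    from scipy.signal import hilbert
    from sklearn.svm import SVC

    def space_vector(ia, ib, ic):
        """Concordia transform of three-phase currents into a complex space vector."""
        i_alpha = np.sqrt(2 / 3) * (ia - 0.5 * ib - 0.5 * ic)
        i_beta = np.sqrt(2 / 3) * (np.sqrt(3) / 2) * (ib - ic)
        return i_alpha + 1j * i_beta

    def hmcsv_spectrum(ia, ib, ic, n_bins=32):
        """FFT magnitude of the Hilbert envelope of the space-vector modulus (HMCSV-like)."""
        sv = space_vector(ia, ib, ic)
        envelope = np.abs(hilbert(np.abs(sv)))
        return np.abs(np.fft.rfft(envelope))[:n_bins]

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 2000)
    X, y = [], []
    for label, fault_amp in ((0, 0.0), (1, 0.2)):   # healthy vs. eccentricity-like modulation
        for _ in range(30):
            mod = 1 + fault_amp * np.sin(2 * np.pi * 7 * t)     # fault-related modulation (assumed)
            ia = mod * np.sin(2 * np.pi * 50 * t) + 0.02 * rng.normal(size=t.size)
            ib = mod * np.sin(2 * np.pi * 50 * t - 2 * np.pi / 3) + 0.02 * rng.normal(size=t.size)
            ic = mod * np.sin(2 * np.pi * 50 * t + 2 * np.pi / 3) + 0.02 * rng.normal(size=t.size)
            X.append(hmcsv_spectrum(ia, ib, ic))
            y.append(label)

    clf = SVC(kernel="rbf", gamma="scale").fit(np.vstack(X), np.array(y))
    print("training accuracy:", clf.score(np.vstack(X), np.array(y)))
    ```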

  12. Incremental Validity of the Durand Adaptive Psychopathic Traits Questionnaire Above Self-Report Psychopathy Measures in Community Samples.

    PubMed

    Durand, Guillaume

    2018-05-03

    Although highly debated, the notion of the existence of an adaptive side to psychopathy is supported by some researchers. Currently, 2 instruments assessing psychopathic traits include an adaptive component, which might not cover the full spectrum of adaptive psychopathic traits. The Durand Adaptive Psychopathic Traits Questionnaire (DAPTQ; Durand, 2017 ) is a 41-item self-reported instrument assessing adaptive traits known to correlate with the psychopathic personality. In this study, I investigated in 2 samples (N = 263 and N = 262) the incremental validity of the DAPTQ over the Psychopathic Personality Inventory-Short Form (PPI-SF) and the Triarchic Psychopathy Measure (TriPM) using multiple criterion measures. Results showed that the DAPTQ significantly increased the predictive validity over the PPI-SF on 5 factors of the HEXACO. Additionally, the DAPTQ provided incremental validity over both the PPI-SF and the TriPM on measures of communication adaptability, perceived stress, and trait anxiety. Overall, these results support the validity of the DAPTQ in community samples. Directions for future studies to further validate the DAPTQ are discussed.
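
    Incremental validity of the kind reported above is commonly quantified as the gain in explained variance when the new measure is added to a baseline predictor. The sketch below shows that arithmetic on synthetic data; the variable names and effect sizes are placeholders, not the study's instruments or results.

    ```python
    # Sketch: incremental validity as the gain in explained variance (Delta R^2) when
    # a DAPTQ-style score is added to a baseline psychopathy measure. All data and
    # variable names here are synthetic placeholders.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n = 263
    baseline_score = rng.normal(size=n)                     # e.g. a PPI-SF-like total (placeholder)
    adaptive_score = 0.4 * baseline_score + rng.normal(size=n)   # partially overlapping new measure
    criterion = 0.3 * baseline_score + 0.5 * adaptive_score + rng.normal(size=n)  # e.g. trait anxiety

    r2_base = LinearRegression().fit(baseline_score[:, None], criterion).score(baseline_score[:, None], criterion)
    X_full = np.column_stack([baseline_score, adaptive_score])
    r2_full = LinearRegression().fit(X_full, criterion).score(X_full, criterion)
    print(f"Delta R^2 from adding the second measure: {r2_full - r2_base:.3f}")
    ```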

  13. Validation of Growth Layer Group (GLG) depositional rate using daily incremental growth lines in the dentin of beluga (Delphinapterus leucas (Pallas, 1776)) teeth

    PubMed Central

    Suydam, Robert S.; Ortiz, Joseph D.; Thewissen, J. G. M.

    2018-01-01

    Counts of Growth Layer Groups (GLGs) in the dentin of marine mammal teeth are widely used as indicators of age. In most marine mammals, observations document that GLGs are deposited yearly, but in beluga whales, some studies have supported the view that two GLGs are deposited each year. Our understanding of beluga life-history differs substantially depending on assumptions regarding the timing of GLG deposition; therefore, resolving this issue has important considerations for population assessments. In this study, we used incremental lines that represent daily pulses of dentin mineralization to test the hypothesis that GLGs in beluga dentin are deposited on a yearly basis. Our estimate of the number of daily growth lines within one GLG is remarkably close to 365 days within error, supporting the hypothesis that GLGs are deposited annually in beluga. We show that measurement of daily growth increments can be used to validate the time represented by GLGs in beluga. Furthermore, we believe this methodology may have broader applications to age estimation in other taxa. PMID:29338011

  14. Validation of Growth Layer Group (GLG) depositional rate using daily incremental growth lines in the dentin of beluga (Delphinapterus leucas (Pallas, 1776)) teeth.

    PubMed

    Waugh, David A; Suydam, Robert S; Ortiz, Joseph D; Thewissen, J G M

    2018-01-01

    Counts of Growth Layer Groups (GLGs) in the dentin of marine mammal teeth are widely used as indicators of age. In most marine mammals, observations document that GLGs are deposited yearly, but in beluga whales, some studies have supported the view that two GLGs are deposited each year. Our understanding of beluga life-history differs substantially depending on assumptions regarding the timing of GLG deposition; therefore, resolving this issue has important considerations for population assessments. In this study, we used incremental lines that represent daily pulses of dentin mineralization to test the hypothesis that GLGs in beluga dentin are deposited on a yearly basis. Our estimate of the number of daily growth lines within one GLG is remarkably close to 365 days within error, supporting the hypothesis that GLGs are deposited annually in beluga. We show that measurement of daily growth increments can be used to validate the time represented by GLGs in beluga. Furthermore, we believe this methodology may have broader applications to age estimation in other taxa.

  15. Bayesian data assimilation provides rapid decision support for vector-borne diseases.

    PubMed

    Jewell, Chris P; Brown, Richard G

    2015-07-06

    Predicting the spread of vector-borne diseases in response to incursions requires knowledge of both host and vector demographics in advance of an outbreak. Although host population data are typically available, for novel disease introductions there is a high chance of the pathogen using a vector for which data are unavailable. This presents a barrier to estimating the parameters of dynamical models representing host-vector-pathogen interaction, and hence limits their ability to provide quantitative risk forecasts. The Theileria orientalis (Ikeda) outbreak in New Zealand cattle demonstrates this problem: even though the vector has received extensive laboratory study, a high degree of uncertainty persists over its national demographic distribution. Addressing this, we develop a Bayesian data assimilation approach whereby indirect observations of vector activity inform a seasonal spatio-temporal risk surface within a stochastic epidemic model. We provide quantitative predictions for the future spread of the epidemic, quantifying uncertainty in the model parameters, case infection times and the disease status of undetected infections. Importantly, we demonstrate how our model learns sequentially as the epidemic unfolds and provide evidence for changing epidemic dynamics through time. Our approach therefore provides a significant advance in rapid decision support for novel vector-borne disease outbreaks. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  16. Support Vector Machines: Relevance Feedback and Information Retrieval.

    ERIC Educational Resources Information Center

    Drucker, Harris; Shahrary, Behzad; Gibbon, David C.

    2002-01-01

    Compares support vector machines (SVMs) to Rocchio, Ide regular and Ide dec-hi algorithms in information retrieval (IR) of text documents using relevancy feedback. If the preliminary search is so poor that one has to search through many documents to find at least one relevant document, then SVM is preferred. Includes nine tables. (Contains 24…

  17. Subpixel urban land cover estimation: comparing cubist, random forests, and support vector regression

    Treesearch

    Jeffrey T. Walton

    2008-01-01

    Three machine learning subpixel estimation methods (Cubist, Random Forests, and support vector regression) were applied to estimate urban cover. Urban forest canopy cover and impervious surface cover were estimated from Landsat-7 ETM+ imagery using a higher resolution cover map resampled to 30 m as training and reference data. Three different band combinations (...
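
    As a rough illustration of the comparison described above, the sketch below cross-validates two of the named regressors (random forests and support vector regression) on synthetic band-like features; Cubist is omitted because it has no standard scikit-learn implementation, and the data are placeholders rather than Landsat imagery.

    ```python
    # Sketch: comparing two of the subpixel-cover regressors named in the abstract
    # (random forests and support vector regression) on synthetic "band" features.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    bands = rng.uniform(0, 1, size=(500, 6))                 # placeholder ETM+-like band values
    canopy = np.clip(0.8 * bands[:, 3] - 0.3 * bands[:, 2] + 0.1 * rng.normal(size=500), 0, 1)

    for name, model in [("random forest", RandomForestRegressor(n_estimators=200, random_state=0)),
                        ("SVR (rbf)", SVR(kernel="rbf", C=10.0, epsilon=0.01))]:
        r2 = cross_val_score(model, bands, canopy, cv=5, scoring="r2").mean()
        print(f"{name}: mean cross-validated R^2 = {r2:.3f}")
    ```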

  18. Applications of Support Vector Machine (SVM) Learning in Cancer Genomics.

    PubMed

    Huang, Shujun; Cai, Nianguang; Pacheco, Pedro Penzuti; Narrandes, Shavira; Wang, Yang; Xu, Wayne

    2018-01-01

    Machine learning with maximization (support) of separating margin (vector), called support vector machine (SVM) learning, is a powerful classification tool that has been used for cancer genomic classification or subtyping. Today, as advancements in high-throughput technologies lead to production of large amounts of genomic and epigenomic data, the classification feature of SVMs is expanding its use in cancer genomics, leading to the discovery of new biomarkers, new drug targets, and a better understanding of cancer driver genes. Herein we reviewed the recent progress of SVMs in cancer genomic studies. We intend to comprehend the strength of the SVM learning and its future perspective in cancer genomic applications. Copyright© 2018, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  19. Dual linear structured support vector machine tracking method via scale correlation filter

    NASA Astrophysics Data System (ADS)

    Li, Weisheng; Chen, Yanquan; Xiao, Bin; Feng, Chen

    2018-01-01

    Adaptive tracking-by-detection methods based on the structured support vector machine (SVM) have performed well on recent visual tracking benchmarks. However, these methods do not adopt an effective strategy for object scale estimation, which limits overall tracking performance. We present a tracking method based on a dual linear structured support vector machine (DLSSVM) with a discriminative scale correlation filter. The collaborative tracker, comprising a DLSSVM model and a scale correlation filter, obtains good results in tracking the target position and estimating its scale. The fast Fourier transform is applied for detection. Extensive experiments show that our tracking approach outperforms many popular top-ranking trackers. On a benchmark of 100 challenging video sequences, the average precision of the proposed method is 82.8%.

  20. Object recognition of ladar with support vector machine

    NASA Astrophysics Data System (ADS)

    Sun, Jian-Feng; Li, Qi; Wang, Qi

    2005-01-01

    Intensity, range, and Doppler images can be obtained using laser radar. Laser radar can capture far more information about an object than other sensors, such as passive infrared imaging and synthetic aperture radar (SAR), so it is well suited as a sensor for object recognition. The traditional approach to laser radar object recognition is to extract target features, which can be corrupted by noise. In this paper, a laser radar recognition method based on the Support Vector Machine (SVM) is introduced. The SVM is a prominent direction in recognition research following neural networks and has performed well on handwritten digit and face recognition. Two series of SVM experiments, on preprocessed and non-preprocessed samples, are performed on real laser radar images, and the results are compared.

  1. nu-Anomica: A Fast Support Vector Based Novelty Detection Technique

    NASA Technical Reports Server (NTRS)

    Das, Santanu; Bhaduri, Kanishka; Oza, Nikunj C.; Srivastava, Ashok N.

    2009-01-01

    In this paper we propose nu-Anomica, a novel anomaly detection technique that can be trained on huge data sets with much reduced running time compared to the benchmark one-class Support Vector Machines algorithm. In nu-Anomica, the idea is to train the machine such that it can provide a close approximation to the exact decision plane using fewer training points and without losing much of the generalization performance of the classical approach. We have tested the proposed algorithm on a variety of continuous data sets under different conditions. We show that under all test conditions the developed procedure closely preserves the accuracy of standard one-class Support Vector Machines while reducing both the training time and the test time by 5-20 times.
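
    nu-Anomica itself is not reproduced here, but the benchmark it is compared against, a nu-parameterized one-class SVM, is readily available. The sketch below trains it on "normal" data only and flags injected outliers; the data, nu value, and anomaly offset are assumptions for illustration.

    ```python
    # Sketch: the benchmark one-class SVM that nu-Anomica is compared against.
    # Training uses only "normal" points; test points far from that distribution
    # should be flagged as novelties.
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    X_train = rng.normal(loc=0.0, scale=1.0, size=(1000, 4))      # normal operating data
    X_test = np.vstack([rng.normal(size=(50, 4)),                 # more normal points
                        rng.normal(loc=6.0, size=(10, 4))])       # injected anomalies

    clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_train)
    pred = clf.predict(X_test)          # +1 = inlier, -1 = novelty
    print("flagged as novelties:", int((pred == -1).sum()), "of", len(X_test))
    ```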

  2. ℓ p-Norm Multikernel Learning Approach for Stock Market Price Forecasting

    PubMed Central

    Shao, Xigao; Wu, Kun; Liao, Bifeng

    2012-01-01

    Linear multiple kernel learning model has been used for predicting financial time series. However, ℓ1-norm multiple support vector regression is rarely observed to outperform trivial baselines in practical applications. To allow for robust kernel mixtures that generalize well, we adopt ℓp-norm multiple kernel support vector regression (1 ≤ p < ∞) as a stock price prediction model. The optimization problem is decomposed into smaller subproblems, and the interleaved optimization strategy is employed to solve the regression model. The model is evaluated on forecasting the daily stock closing prices of Shanghai Stock Index in China. Experimental results show that our proposed model performs better than ℓ1-norm multiple support vector regression model. PMID:23365561
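
    The interleaved ℓp-norm optimisation described above is not reproduced here; the sketch below only shows the core idea of mixing several base kernels with weights normalised on an ℓp-ball and passing the mixture to a standard SVR via a precomputed kernel. The fixed weights, features, and target series are illustrative assumptions, not the paper's learned solution or stock data.

    ```python
    # Sketch of the core idea behind lp-norm multiple kernel SVR: several base kernels
    # are mixed with lp-normalised weights and the mixture is passed to an SVR with a
    # precomputed kernel. Fixed weights stand in for the interleaved optimisation.
    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel, linear_kernel
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))                       # lagged-return style features (placeholder)
    y = X @ np.array([0.5, -0.2, 0.1, 0.0, 0.3]) + 0.05 * rng.normal(size=200)
    X_tr, X_te, y_tr, y_te = X[:160], X[160:], y[:160], y[160:]

    def mixed_kernel(A, B, weights, p=2.0):
        """Weighted sum of base kernels with weights normalised to unit lp-norm."""
        w = np.asarray(weights, dtype=float)
        w = w / np.linalg.norm(w, ord=p)
        ks = [rbf_kernel(A, B, gamma=0.1), polynomial_kernel(A, B, degree=2), linear_kernel(A, B)]
        return sum(wi * k for wi, k in zip(w, ks))

    weights = [1.0, 1.0, 1.0]                           # would normally be learned
    svr = SVR(kernel="precomputed", C=10.0).fit(mixed_kernel(X_tr, X_tr, weights), y_tr)
    print("test R^2:", svr.score(mixed_kernel(X_te, X_tr, weights), y_te))
    ```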

  3. Support vector machine for automatic pain recognition

    NASA Astrophysics Data System (ADS)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion, and the interpretation of such expressions is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects the face in a stored video frame using a skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural-network-based and eigenimage-based automatic pain recognition systems. The experimental results indicate that using a support vector machine as the classifier can clearly improve the performance of an automatic pain recognition system.

  4. Design of 2D time-varying vector fields.

    PubMed

    Chen, Guoning; Kwatra, Vivek; Wei, Li-Yi; Hansen, Charles D; Zhang, Eugene

    2012-10-01

    Design of time-varying vector fields, i.e., vector fields that can change over time, has a wide variety of important applications in computer graphics. Existing vector field design techniques do not address time-varying vector fields. In this paper, we present a framework for the design of time-varying vector fields, both for planar domains as well as manifold surfaces. Our system supports the creation and modification of various time-varying vector fields with desired spatial and temporal characteristics through several design metaphors, including streamlines, pathlines, singularity paths, and bifurcations. These design metaphors are integrated into an element-based design to generate the time-varying vector fields via a sequence of basis field summations or spatial constrained optimizations at the sampled times. The key-frame design and field deformation are also introduced to support other user design scenarios. Accordingly, a spatial-temporal constrained optimization and the time-varying transformation are employed to generate the desired fields for these two design scenarios, respectively. We apply the time-varying vector fields generated using our design system to a number of important computer graphics applications that require controllable dynamic effects, such as evolving surface appearance, dynamic scene design, steerable crowd movement, and painterly animation. Many of these are difficult or impossible to achieve via prior simulation-based methods. In these applications, the time-varying vector fields have been applied as either orientation fields or advection fields to control the instantaneous appearance or evolving trajectories of the dynamic effects.

  5. Robot-based additive manufacturing for flexible die-modelling in incremental sheet forming

    NASA Astrophysics Data System (ADS)

    Rieger, Michael; Störkle, Denis Daniel; Thyssen, Lars; Kuhlenkötter, Bernd

    2017-10-01

    The paper describes the application concept of additive manufactured dies to support the robot-based incremental sheet metal forming process (`Roboforming') for the production of sheet metal components in small batch sizes. Compared to the dieless kinematic-based generation of a shape by means of two cooperating industrial robots, the supporting robot models a die on the back of the metal sheet by using the robot-based fused layer manufacturing process (FLM). This tool chain is software-defined and preserves the high geometrical form flexibility of Roboforming while flexibly generating support structures adapted to the final part's geometry. Test series serve to confirm the feasibility of the concept by investigating the process challenges of the adhesion to the sheet surface and the general stability as well as the influence on the geometric accuracy compared to the well-known forming strategies.

  6. Techniques utilized in the simulated altitude testing of a 2D-CD vectoring and reversing nozzle

    NASA Technical Reports Server (NTRS)

    Block, H. Bruce; Bryant, Lively; Dicus, John H.; Moore, Allan S.; Burns, Maureen E.; Solomon, Robert F.; Sheer, Irving

    1988-01-01

    Simulated altitude testing of a two-dimensional, convergent-divergent, thrust vectoring and reversing exhaust nozzle was accomplished. An important objective of this test was to develop test hardware and techniques to properly operate a vectoring and reversing nozzle within the confines of an altitude test facility. This report presents detailed information on the major test support systems utilized, the operational performance of the systems and the problems encountered, and test equipment improvements recommended for future tests. The most challenging support systems included the multi-axis thrust measurement system, vectored and reverse exhaust gas collection systems, and infrared temperature measurement systems used to evaluate and monitor the nozzle. The feasibility of testing a vectoring and reversing nozzle of this type in an altitude chamber was successfully demonstrated. Supporting systems performed as required. During reverser operation, engine exhaust gases were successfully captured and turned downstream. However, a small amount of exhaust gas spilled out the collector ducts' inlet openings when the reverser was opened more than 60 percent. The spillage did not affect engine or nozzle performance. The three infrared systems which viewed the nozzle through the exhaust collection system worked remarkably well considering the harsh environment.

  7. Adenovirus Vectors Target Several Cell Subtypes of Mammalian Inner Ear In Vivo

    PubMed Central

    Li, Wenyan; Shen, Jun

    2016-01-01

    Mammalian inner ear harbors diverse cell types that are essential for hearing and balance. Adenovirus is one of the major vectors used to deliver genes into the inner ear for functional studies and hair cell regeneration. To identify adenovirus vectors that target specific cell subtypes in the inner ear, we studied three adenovirus vectors, carrying a reporter gene encoding green fluorescent protein (GFP) from two vendors or the genome-editing gene Cre recombinase (Cre), by injection into the cochleae of postnatal day 0 (P0) and day 4 (P4) mice through the scala media by cochleostomy in vivo. We found the three adenovirus vectors transduced mouse inner ear cells with different specificities and expression levels, depending on the type of adenoviral vector and the age of the mice. The most frequently targeted region was the cochlear sensory epithelium, including auditory hair cells and supporting cells. Adenovirus with GFP transduced utricular supporting cells as well. This study shows that adenovirus vectors are capable of efficiently and specifically transducing different cell types in the mammalian inner ear and provides useful tools to study inner ear gene function and to evaluate gene therapy to treat hearing loss and vestibular dysfunction. PMID:28116172

  8. Bayesian data assimilation provides rapid decision support for vector-borne diseases

    PubMed Central

    Jewell, Chris P.; Brown, Richard G.

    2015-01-01

    Predicting the spread of vector-borne diseases in response to incursions requires knowledge of both host and vector demographics in advance of an outbreak. Although host population data are typically available, for novel disease introductions there is a high chance of the pathogen using a vector for which data are unavailable. This presents a barrier to estimating the parameters of dynamical models representing host–vector–pathogen interaction, and hence limits their ability to provide quantitative risk forecasts. The Theileria orientalis (Ikeda) outbreak in New Zealand cattle demonstrates this problem: even though the vector has received extensive laboratory study, a high degree of uncertainty persists over its national demographic distribution. Addressing this, we develop a Bayesian data assimilation approach whereby indirect observations of vector activity inform a seasonal spatio-temporal risk surface within a stochastic epidemic model. We provide quantitative predictions for the future spread of the epidemic, quantifying uncertainty in the model parameters, case infection times and the disease status of undetected infections. Importantly, we demonstrate how our model learns sequentially as the epidemic unfolds and provide evidence for changing epidemic dynamics through time. Our approach therefore provides a significant advance in rapid decision support for novel vector-borne disease outbreaks. PMID:26136225

  9. Incremental comprehension of spoken quantifier sentences: Evidence from brain potentials.

    PubMed

    Freunberger, Dominik; Nieuwland, Mante S

    2016-09-01

    Do people incrementally incorporate the meaning of quantifier expressions to understand an unfolding sentence? Most previous studies concluded that quantifiers do not immediately influence how a sentence is understood based on the observation that online N400-effects differed from offline plausibility judgments. Those studies, however, used serial visual presentation (SVP), which involves unnatural reading. In the current ERP-experiment, we presented spoken positive and negative quantifier sentences ("Practically all/practically no postmen prefer delivering mail, when the weather is good/bad during the day"). Different from results obtained in a previously reported SVP-study (Nieuwland, 2016) sentence truth-value N400 effects occurred in positive and negative quantifier sentences alike, reflecting fully incremental quantifier comprehension. This suggests that the prosodic information available during spoken language comprehension supports the generation of online predictions for upcoming words and that, at least for quantifier sentences, comprehension of spoken language may proceed more incrementally than comprehension during SVP reading. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  10. Stable Local Volatility Calibration Using Kernel Splines

    NASA Astrophysics Data System (ADS)

    Coleman, Thomas F.; Li, Yuying; Wang, Cheng

    2010-09-01

    We propose an optimization formulation using L1 norm to ensure accuracy and stability in calibrating a local volatility function for option pricing. Using a regularization parameter, the proposed objective function balances the calibration accuracy with the model complexity. Motivated by the support vector machine learning, the unknown local volatility function is represented by a kernel function generating splines and the model complexity is controlled by minimizing the 1-norm of the kernel coefficient vector. In the context of the support vector regression for function estimation based on a finite set of observations, this corresponds to minimizing the number of support vectors for predictability. We illustrate the ability of the proposed approach to reconstruct the local volatility function in a synthetic market. In addition, based on S&P 500 market index option data, we demonstrate that the calibrated local volatility surface is simple and resembles the observed implied volatility surface in shape. Stability is illustrated by calibrating local volatility functions using market option data from different dates.

  11. Public Key Infrastructure (PKI) Increment 2 Root Cause Analysis (RCA) for Performance Assessments and Root Cause Analyses (PARCA)

    DTIC Science & Technology

    2015-05-01

    for issuing this critical change: Inability to achieve PKI Increment 2 Full Deployment Decision (FDD) within five years of program initiation...March 1, 2014 deadline), and Delay of over one year in the original FDD estimate provided to the Congress (1 March 2014 deadline). The proximate...to support a 1 March 2014 FDD." The Director, Performance Assessments and Root Cause Analyses (PARCA), asked the Institute for Defense Analyses

  12. Scaling Up of an Innovative Intervention to Reduce Risk of Dengue, Chikungunya, and Zika Transmission in Uruguay in the Framework of an Intersectoral Approach with and without Community Participation

    PubMed Central

    Basso, César; García da Rosa, Elsa; Lairihoy, Rosario; Caffera, Ruben M.; Roche, Ingrid; González, Cristina; da Rosa, Ricardo; Gularte, Alexis; Alfonso-Sierra, Eduardo; Petzold, Max; Kroeger, Axel; Sommerfeld, Johannes

    2017-01-01

    To contribute to the prevention of dengue, chikungunya, and Zika, a process of scaling up an innovative intervention to reduce Aedes aegypti habitats, was carried out in the city of Salto (Uruguay) based on a transdisciplinary analysis of the eco-bio-social determinants. The intervention in one-third of the city included the distributions of plastic bags for all households to collect all discarded water containers that were recollected by the Ministry of Health and the Municipality vector control services. The results were evaluated in 20 randomly assigned clusters of 100 households each, in the intervention and control arm. The intervention resulted in a significantly larger decrease in the number of pupae per person index (as a proxy for adult vector abundance) than the corresponding decrease in the control areas (both areas decreased by winter effects). The reduction of intervention costs (“incremental costs”) in relation to routine vector control activities was 46%. Community participation increased the collaboration with the intervention program considerably (from 48% of bags handed back out of the total of bags delivered to 59% of bags handed back). Although the costs increased by 26% compared with intervention without community participation, the acceptability of actions by residents increased from 66% to 78%. PMID:28820690

  13. Scaling Up of an Innovative Intervention to Reduce Risk of Dengue, Chikungunya, and Zika Transmission in Uruguay in the Framework of an Intersectoral Approach with and without Community Participation.

    PubMed

    Basso, César; García da Rosa, Elsa; Lairihoy, Rosario; Caffera, Ruben M; Roche, Ingrid; González, Cristina; da Rosa, Ricardo; Gularte, Alexis; Alfonso-Sierra, Eduardo; Petzold, Max; Kroeger, Axel; Sommerfeld, Johannes

    2017-11-01

    To contribute to the prevention of dengue, chikungunya, and Zika, a process of scaling up an innovative intervention to reduce Aedes aegypti habitats, was carried out in the city of Salto (Uruguay) based on a transdisciplinary analysis of the eco-bio-social determinants. The intervention in one-third of the city included the distributions of plastic bags for all households to collect all discarded water containers that were recollected by the Ministry of Health and the Municipality vector control services. The results were evaluated in 20 randomly assigned clusters of 100 households each, in the intervention and control arm. The intervention resulted in a significantly larger decrease in the number of pupae per person index (as a proxy for adult vector abundance) than the corresponding decrease in the control areas (both areas decreased by winter effects). The reduction of intervention costs ("incremental costs") in relation to routine vector control activities was 46%. Community participation increased the collaboration with the intervention program considerably (from 48% of bags handed back out of the total of bags delivered to 59% of bags handed back). Although the costs increased by 26% compared with intervention without community participation, the acceptability of actions by residents increased from 66% to 78%.

  14. Estimation of Teacher Practices Based on Text Transcripts of Teacher Speech Using a Support Vector Machine Algorithm

    ERIC Educational Resources Information Center

    Araya, Roberto; Plana, Francisco; Dartnell, Pablo; Soto-Andrade, Jorge; Luci, Gina; Salinas, Elena; Araya, Marylen

    2012-01-01

    Teacher practice is normally assessed by observers who watch classes or videos of classes. Here, we analyse an alternative strategy that uses text transcripts and a support vector machine classifier. For each one of the 710 videos of mathematics classes from the 2005 Chilean National Teacher Assessment Programme, a single 4-minute slice was…

  15. A new technique for the characterization of chaff elements

    NASA Astrophysics Data System (ADS)

    Scholfield, David; Myat, Maung; Dauby, Jason; Fesler, Jonathon; Bright, Jonathan

    2011-07-01

    A new technique for the experimental characterization of electromagnetic chaff based on Inverse Synthetic Aperture Radar is presented. This technique allows for the characterization of as few as one filament of chaff in a controlled anechoic environment, providing stability and repeatability of experimental results. This approach enables a deeper understanding of the fundamental phenomena of electromagnetic scattering from chaff through an incremental analysis approach. Chaff analysis can now begin with a single element and progress through the build-up of particles into pseudo-cloud structures. This controlled incremental approach is supported by an identical incremental modeling and validation process. Additionally, this technique has the potential to produce considerable savings in financial and schedule cost and provides a stable and repeatable experiment to aid model validation.

  16. Application of Machine Learning in Postural Control Kinematics for the Diagnosis of Alzheimer's Disease

    PubMed Central

    Yelshyna, Darya; Bicho, Estela

    2016-01-01

    The use of wearable devices to study gait and postural control is a growing field on neurodegenerative disorders such as Alzheimer's disease (AD). In this paper, we investigate if machine-learning classifiers offer the discriminative power for the diagnosis of AD based on postural control kinematics. We compared Support Vector Machines (SVMs), Multiple Layer Perceptrons (MLPs), Radial Basis Function Neural Networks (RBNs), and Deep Belief Networks (DBNs) on 72 participants (36 AD patients and 36 healthy subjects) exposed to seven increasingly difficult postural tasks. The decisional space was composed of 18 kinematic variables (adjusted for age, education, height, and weight), with or without neuropsychological evaluation (Montreal cognitive assessment (MoCA) score), top ranked in an error incremental analysis. Classification results were based on threefold cross validation of 50 independent and randomized runs sets: training (50%), test (40%), and validation (10%). Having a decisional space relying solely on postural kinematics, accuracy of AD diagnosis ranged from 71.7 to 86.1%. Adding the MoCA variable, the accuracy ranged between 91 and 96.6%. MLP classifier achieved top performance in both decisional spaces. Having comprehended the interdynamic interaction between postural stability and cognitive performance, our results endorse machine-learning models as a useful tool for computer-aided diagnosis of AD based on postural control kinematics. PMID:28074090
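
    A hedged sketch of the classifier comparison described above follows: a threefold cross-validated SVM versus MLP on synthetic "kinematic" features. RBNs and DBNs are omitted because they have no standard scikit-learn implementation, and the data dimensions are taken from the abstract while the features themselves are placeholders.

    ```python
    # Sketch: threefold cross-validated comparison of an SVM and an MLP, in the spirit
    # of the paper's classifier comparison. Synthetic features stand in for the
    # postural-control kinematic variables.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=72, n_features=18, n_informative=8, random_state=0)

    models = {
        "SVM (rbf)": make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale")),
        "MLP": make_pipeline(StandardScaler(),
                             MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)),
    }
    for name, model in models.items():
        acc = cross_val_score(model, X, y, cv=3).mean()
        print(f"{name}: mean 3-fold accuracy = {acc:.3f}")
    ```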

  17. Application of Machine Learning in Postural Control Kinematics for the Diagnosis of Alzheimer's Disease.

    PubMed

    Costa, Luís; Gago, Miguel F; Yelshyna, Darya; Ferreira, Jaime; David Silva, Hélder; Rocha, Luís; Sousa, Nuno; Bicho, Estela

    2016-01-01

    The use of wearable devices to study gait and postural control is a growing field on neurodegenerative disorders such as Alzheimer's disease (AD). In this paper, we investigate if machine-learning classifiers offer the discriminative power for the diagnosis of AD based on postural control kinematics. We compared Support Vector Machines (SVMs), Multiple Layer Perceptrons (MLPs), Radial Basis Function Neural Networks (RBNs), and Deep Belief Networks (DBNs) on 72 participants (36 AD patients and 36 healthy subjects) exposed to seven increasingly difficult postural tasks. The decisional space was composed of 18 kinematic variables (adjusted for age, education, height, and weight), with or without neuropsychological evaluation (Montreal cognitive assessment (MoCA) score), top ranked in an error incremental analysis. Classification results were based on threefold cross validation of 50 independent and randomized runs sets: training (50%), test (40%), and validation (10%). Having a decisional space relying solely on postural kinematics, accuracy of AD diagnosis ranged from 71.7 to 86.1%. Adding the MoCA variable, the accuracy ranged between 91 and 96.6%. MLP classifier achieved top performance in both decisional spaces. Having comprehended the interdynamic interaction between postural stability and cognitive performance, our results endorse machine-learning models as a useful tool for computer-aided diagnosis of AD based on postural control kinematics.

  18. Prediction of lysine glutarylation sites by maximum relevance minimum redundancy feature selection.

    PubMed

    Ju, Zhe; He, Jian-Jun

    2018-06-01

    Lysine glutarylation is a new type of protein acylation modification in both prokaryotes and eukaryotes. To better understand the molecular mechanism of glutarylation, it is important to identify glutarylated substrates and their corresponding glutarylation sites accurately. In this study, a novel bioinformatics tool named GlutPred is developed to predict glutarylation sites by using multiple feature extraction and maximum relevance minimum redundancy feature selection. On the one hand, amino acid factors, binary encoding, and the composition of k-spaced amino acid pairs features are incorporated to encode glutarylation sites, and the maximum relevance minimum redundancy method and the incremental feature selection algorithm are adopted to remove redundant features. On the other hand, a biased support vector machine algorithm is used to handle the imbalanced problem in the glutarylation sites training dataset. As illustrated by 10-fold cross-validation, GlutPred achieves a satisfactory performance with a Sensitivity of 64.80%, a Specificity of 76.60%, an Accuracy of 74.90% and a Matthews correlation coefficient of 0.3194. Feature analysis shows that some k-spaced amino acid pair features play the most important roles in the prediction of glutarylation sites. The conclusions derived from this study might provide some clues for understanding the molecular mechanisms of glutarylation. Copyright © 2018 Elsevier Inc. All rights reserved.
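
    The biased-SVM idea above is commonly approximated with class-weighted SVMs. The sketch below pairs such a classifier with a simple univariate feature ranking that stands in for the mRMR + incremental feature selection step; the data, class weights, and number of selected features are assumptions, not GlutPred's encoding or tuning.

    ```python
    # Sketch: handling an imbalanced glutarylation-style training set with a class-weighted
    # ("biased") SVM after a simple feature-selection step. Univariate ranking stands in
    # for mRMR + incremental feature selection; data are synthetic.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score
    from sklearn.metrics import make_scorer, matthews_corrcoef

    X, y = make_classification(n_samples=2000, n_features=200, n_informative=20,
                               weights=[0.9, 0.1], random_state=0)

    model = make_pipeline(SelectKBest(f_classif, k=40),
                          SVC(kernel="rbf", gamma="scale", class_weight={0: 1.0, 1: 5.0}))
    mcc = cross_val_score(model, X, y, cv=10, scoring=make_scorer(matthews_corrcoef)).mean()
    print("10-fold mean Matthews correlation coefficient:", round(mcc, 3))
    ```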

  19. 49 CFR Appendix A to Part 611 - Description of Measures Used for Project Evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... within 1/2-mile of boarding points associated with the proposed system increment. (b) Environmental... in British Thermal Units (BTU), compared to the baseline alternative; and (3) Current Environmental...) Transit-supportive corridor policies; (5) Supportive zoning regulations near transit stations; (6) Tools...

  20. Time management situation assessment (TMSA)

    NASA Technical Reports Server (NTRS)

    Richardson, Michael B.; Ricci, Mark J.

    1992-01-01

    TMSA is a concept prototype developed to support NASA Test Directors (NTDs) in schedule execution monitoring during the later stages of a Shuttle countdown. The program detects qualitative and quantitative constraint violations in near real-time. The next version will support incremental rescheduling and reason over a substantially larger number of scheduled events.

  1. Fast support vector data descriptions for novelty detection.

    PubMed

    Liu, Yi-Hung; Liu, Yan-Chen; Chen, Yen-Jen

    2010-08-01

    Support vector data description (SVDD) has become a very attractive kernel method due to its good results in many novelty detection problems. However, the decision function of SVDD is expressed in terms of the kernel expansion, which results in a run-time complexity linear in the number of support vectors. For applications where fast real-time response is needed, how to speed up the decision function is crucial. This paper aims at dealing with the issue of reducing the testing time complexity of SVDD. A method called fast SVDD (F-SVDD) is proposed. Unlike the traditional methods which all try to compress a kernel expansion into one with fewer terms, the proposed F-SVDD directly finds the preimage of a feature vector, and then uses a simple relationship between this feature vector and the SVDD sphere center to re-express the center with a single vector. The decision function of F-SVDD contains only one kernel term, and thus the decision boundary of F-SVDD is only spherical in the original space. Hence, the run-time complexity of the F-SVDD decision function is no longer linear in the number of support vectors, but is a constant, no matter how large the training set size is. In this paper, we also propose a novel direct preimage-finding method, which is noniterative and involves no free parameters. The unique preimage can be obtained in real time by the proposed direct method without trial-and-error. For demonstration, several real-world data sets and a large-scale data set, the extended MIT face data set, are used in experiments. In addition, a practical industry example regarding liquid crystal display micro-defect inspection is also used to compare the applicability of SVDD and our proposed F-SVDD when faced with mass data input. The results are very encouraging.
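
    F-SVDD itself is not available in standard libraries; as a hedged point of reference, the sketch below uses an RBF one-class SVM (closely related to SVDD for the RBF kernel) and shows the kernel-expansion cost that F-SVDD collapses to a single term. The data and nu value are illustrative assumptions.

    ```python
    # Sketch: an RBF one-class SVM as a stand-in for SVDD. Evaluating its decision
    # function costs one kernel evaluation per support vector -- the per-test-point
    # cost that F-SVDD reduces to a single kernel term.
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(2000, 10))
    clf = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale").fit(X_train)
    print("support vectors in the kernel expansion:", clf.support_vectors_.shape[0])

    x_new = rng.normal(size=(1, 10))
    print("decision value for a new point:", float(clf.decision_function(x_new)))
    ```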

  2. Support Vector Machine-Based Endmember Extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Filippi, Anthony M; Archibald, Richard K

    Introduced in this paper is the utilization of Support Vector Machines (SVMs) to automatically perform endmember extraction from hyperspectral data. The strengths of SVM are exploited to provide a fast and accurate calculated representation of high-dimensional data sets that may consist of multiple distributions. Once this representation is computed, the number of distributions can be determined without prior knowledge. For each distribution, an optimal transform can be determined that preserves informational content while reducing the data dimensionality, and hence, the computational cost. Finally, endmember extraction for the whole data set is accomplished. Results indicate that this Support Vector Machine-Based Endmember Extraction (SVM-BEE) algorithm has the capability of autonomously determining endmembers from multiple clusters with computational speed and accuracy, while maintaining a robust tolerance to noise.

  3. Number needed to treat and costs per responder among biologic treatments for moderate-to-severe psoriasis: a network meta-analysis.

    PubMed

    Armstrong, April W; Betts, Keith A; Signorovitch, James E; Sundaram, Murali; Li, Junlong; Ganguli, Arijit X; Wu, Eric Q

    2018-04-23

    The clinical benefits of biologic therapies for moderate-to-severe psoriasis are well established, but wide variations exist in patient response. To determine the number needed to treat (NNT) to achieve a 75% and 90% reduction in the Psoriasis Area and Severity Index (PASI-75/90) with FDA-approved agents and evaluate the incremental cost per PASI-75 or PASI-90 responder. The relative probabilities of achieving PASI-75 and PASI-90, as well as NNTs, were estimated using a network meta-analysis. Costs (2017 USD) included drug acquisition and administration. The incremental cost per PASI-75 or PASI-90 responder for each treatment was estimated for the clinical trial period, and annually. Compared with supportive care, the NNT to achieve PASI-75 was 1.18 for ixekizumab, 1.29 for secukinumab 300 mg, 1.37 for infliximab, 1.48 for adalimumab, 1.53 for secukinumab 150 mg, 1.58 for ustekinumab, 2.25 for etanercept, and 3.71 for apremilast. The one-year incremental cost per PASI-75 responder relative to supportive care was $59,830 for infliximab, $88,775 for secukinumab 300 mg, $91,837 for adalimumab, $95,898 for ixekizumab, $97,363 for ustekinumab, $105,131 for secukinumab 150 mg, $129,665 for apremilast, and $159,328 for etanercept. Results were similar for PASI-90. The NNT and incremental cost per responder are meaningful ways to assess comparative effectiveness and cost effectiveness among psoriasis treatments.
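
    The two quantities reported above follow from simple arithmetic: the NNT is the reciprocal of the difference in response probabilities, and the incremental cost per responder is the per-patient incremental cost multiplied by the NNT. The sketch below shows that calculation; the response rates and annual cost are illustrative placeholders, not the study's inputs.

    ```python
    # Sketch of the arithmetic behind the abstract's two quantities: the number needed
    # to treat (NNT) relative to supportive care, and the incremental cost per responder.
    # The response rates and annual cost below are hypothetical placeholders.
    def nnt(p_treatment: float, p_control: float) -> float:
        """Number needed to treat for one additional responder."""
        return 1.0 / (p_treatment - p_control)

    def cost_per_responder(incremental_cost_per_patient: float, p_treatment: float, p_control: float) -> float:
        """Incremental cost per additional responder = per-patient cost x NNT."""
        return incremental_cost_per_patient * nnt(p_treatment, p_control)

    p_drug, p_supportive = 0.85, 0.04          # hypothetical PASI-75 response probabilities
    annual_cost_difference = 50_000.0          # hypothetical incremental drug + administration cost

    print(f"NNT: {nnt(p_drug, p_supportive):.2f}")
    print(f"cost per PASI-75 responder: ${cost_per_responder(annual_cost_difference, p_drug, p_supportive):,.0f}")
    ```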

  4. Estimating the costs of supporting safety-net transformation into patient-centered medical homes in post-Katrina New Orleans.

    PubMed

    Shao, Hui; Brown, Lisanne; Diana, Mark L; Schmidt, Laura A; Mason, Karen; Oronce, Carlos Irwin; Shi, Lizheng

    2016-09-01

    There is a need to understand the costs associated with supporting, implementing, and maintaining the system redesign of small and medium-sized safety-net clinics. The authors aimed to understand the characteristics of clinics that transformed into patient-centered medical homes and the incremental cost for transformation. The sample was 74 clinics in Greater New Orleans that received funds from the Primary Care Access and Stabilization Grant program between 2007 and 2010 to support their transformation. The study period was divided into baseline (September 21, 2007-March 21, 2008), transformation (March 22, 2008-March 21, 2009), and maintenance (March 22, 2009-September 20, 2010) periods, and data were collected at 6-month intervals. Baseline characteristics for the clinics that transformed were compared to those that did not. Fixed-effect models were conducted for cost estimation, controlling for baseline differences, using propensity score weights. Half of the 74 primary care clinics achieved transformation by the end of the study period. The clinics that transformed had higher total cost, more clinic visits, and a larger female patient proportion at baseline. The estimated incremental cost for clinics that underwent transformation was $37.61 per visit per 6 months, and overall it cost $24.86 per visit per 6 months in grant funds to support a clinic's transformation. Larger-sized clinics and those with a higher female proportion were more likely to transform. The Primary Care Access and Stabilization Grant program provided approximately $24.86 per visit over the 2 and 1/2 years. This estimated incremental cost could be used to guide policy recommendations to support primary care transformation in the United States.

  5. Estimating the costs of supporting safety-net transformation into patient-centered medical homes in post-Katrina New Orleans

    PubMed Central

    Shao, Hui; Brown, Lisanne; Diana, Mark L.; Schmidt, Laura A.; Mason, Karen; Oronce, Carlos Irwin; Shi, Lizheng

    2016-01-01

    There is a need to understand the costs associated with supporting, implementing, and maintaining the system redesign of small and medium-sized safety-net clinics. The authors aimed to understand the characteristics of clinics that transformed into patient-centered medical homes and the incremental cost for transformation. The sample was 74 clinics in Greater New Orleans that received funds from the Primary Care Access and Stabilization Grant program between 2007 and 2010 to support their transformation. The study period was divided into baseline (September 21, 2007–March 21, 2008), transformation (March 22, 2008–March 21, 2009), and maintenance (March 22, 2009–September 20, 2010) periods, and data were collected at 6-month intervals. Baseline characteristics for the clinics that transformed were compared to those that did not. Fixed-effect models were conducted for cost estimation, controlling for baseline differences, using propensity score weights. Half of the 74 primary care clinics achieved transformation by the end of the study period. The clinics that transformed had higher total cost, more clinic visits, and a larger female patient proportion at baseline. The estimated incremental cost for clinics that underwent transformation was $37.61 per visit per 6 months, and overall it cost $24.86 per visit per 6 months in grant funds to support a clinic's transformation. Larger-sized clinics and those with a higher female proportion were more likely to transform. The Primary Care Access and Stabilization Grant program provided approximately $24.86 per visit over the 2 and 1/2 years. This estimated incremental cost could be used to guide policy recommendations to support primary care transformation in the United States. PMID:27684855

  6. Predicting domain-domain interaction based on domain profiles with feature selection and support vector machines

    PubMed Central

    2010-01-01

    Background Protein-protein interaction (PPI) plays essential roles in cellular functions. The cost, time and other limitations associated with the current experimental methods have motivated the development of computational methods for predicting PPIs. As protein interactions generally occur via domains instead of the whole molecules, predicting domain-domain interaction (DDI) is an important step toward PPI prediction. Computational methods developed so far have utilized information from various sources at different levels, from primary sequences, to molecular structures, to evolutionary profiles. Results In this paper, we propose a computational method to predict DDI using support vector machines (SVMs), based on domains represented as interaction profile hidden Markov models (ipHMM) where interacting residues in domains are explicitly modeled according to the three dimensional structural information available at the Protein Data Bank (PDB). Features about the domains are extracted first as the Fisher scores derived from the ipHMM and then selected using singular value decomposition (SVD). Domain pairs are represented by concatenating their selected feature vectors, and classified by a support vector machine trained on these feature vectors. The method is tested by leave-one-out cross validation experiments with a set of interacting protein pairs adopted from the 3DID database. The prediction accuracy has shown significant improvement as compared to InterPreTS (Interaction Prediction through Tertiary Structure), an existing method for PPI prediction that also uses the sequences and complexes of known 3D structure. Conclusions We show that domain-domain interaction prediction can be significantly enhanced by exploiting information inherent in the domain profiles via feature selection based on Fisher scores, singular value decomposition and supervised learning based on support vector machines. Datasets and source code are freely available on the web at http://liao.cis.udel.edu/pub/svdsvm. Implemented in Matlab and supported on Linux and MS Windows. PMID:21034480
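
    The overall pipeline shape described above, high-dimensional profile-derived features reduced by an SVD step and classified with an SVM under leave-one-out cross-validation, is sketched below. Random vectors stand in for the Fisher scores derived from interaction-profile HMMs, and the component count is an assumption rather than the paper's setting.

    ```python
    # Sketch of the pipeline shape in the abstract: high-dimensional profile features
    # reduced by (truncated) SVD and classified with an SVM under leave-one-out CV.
    # Random vectors stand in for ipHMM-derived Fisher scores of domain pairs.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.decomposition import TruncatedSVD
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    # Each row represents a concatenated feature vector for a candidate domain pair.
    X, y = make_classification(n_samples=120, n_features=500, n_informative=30, random_state=0)

    model = make_pipeline(TruncatedSVD(n_components=30, random_state=0),
                          SVC(kernel="rbf", gamma="scale"))
    acc = cross_val_score(model, X, y, cv=LeaveOneOut()).mean()
    print("leave-one-out accuracy:", round(acc, 3))
    ```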

  7. 50 CFR 403.02 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... largest supportable within the ecosystem to the population level that results in maximum net productivity. Maximum net productivity is the greatest net annual increment in population numbers or biomass resulting...

  8. Constraining primordial vector mode from B-mode polarization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saga, Shohei; Ichiki, Kiyotomo; Shiraishi, Maresuke, E-mail: saga.shohei@nagoya-u.jp, E-mail: maresuke.shiraishi@pd.infn.it, E-mail: ichiki@a.phys.nagoya-u.ac.jp

    The B-mode polarization spectrum of the Cosmic Microwave Background (CMB) may be the smoking gun of not only the primordial tensor mode but also of the primordial vector mode. If there exist nonzero vector-mode metric perturbations in the early Universe, they are known to be supported by anisotropic stress fluctuations of free-streaming particles such as neutrinos, and to create characteristic signatures on both the CMB temperature, E-mode, and B-mode polarization anisotropies. We place constraints on the properties of the primordial vector mode characterized by the vector-to-scalar ratio r_v and the spectral index n_v of the vector-shear power spectrum, from the Planck and BICEP2 B-mode data. We find that, for scale-invariant initial spectra, the ΛCDM model including the vector mode fits the data better than the model including the tensor mode. The difference in χ² between the vector and tensor models is Δχ² = 3.294, because, on large scales, the vector mode generates smaller temperature fluctuations than the tensor mode, which is preferred for the data. In contrast, the tensor mode can fit the data set equally well if we allow a significantly blue-tilted spectrum. We find that the best-fitting tensor mode has a large blue tilt and leads to an indistinct reionization bump on larger angular scales. The slightly red-tilted vector mode supported by the current data set can also create O(10⁻²²)-Gauss magnetic fields at cosmological recombination. Our constraints should motivate research that considers models of the early Universe that involve the vector mode.

  9. International Space Station (ISS) Node 1 Environmental Control and Life Support (ECLS) System Keep Out Zone On-Orbit Problems

    NASA Technical Reports Server (NTRS)

    Williams, David E.

    2004-01-01

    The International Space Station (ISS) Environmental Control and Life Support (ECLS) system performance can be impacted by operations on ISS. This is especially important for the Temperature and Humidity Control (THC) and for the Fire Detection and Suppression (FDS) subsystems. It is also more important for Node 1 since it has become a convenient area for many crew tasks and for stowing hardware prior to Shuttle arrival. This paper will discuss the current requirements for ECLS keep out zones in Node 1; the issues with stowage in Node 1 during Increment 7 and how they impacted the keep out zone requirements; and the solution during Increment 7 and 8 for maintaining the keep out zones in Node 1.

  10. Curriculum Assessment Using Artificial Neural Network and Support Vector Machine Modeling Approaches: A Case Study. IR Applications. Volume 29

    ERIC Educational Resources Information Center

    Chen, Chau-Kuang

    2010-01-01

    Artificial Neural Network (ANN) and Support Vector Machine (SVM) approaches have been on the cutting edge of science and technology for pattern recognition and data classification. In the ANN model, classification accuracy can be achieved by using the feed-forward of inputs, back-propagation of errors, and the adjustment of connection weights. In…

  11. Progressive Classification Using Support Vector Machines

    NASA Technical Reports Server (NTRS)

    Wagstaff, Kiri; Kocurek, Michael

    2009-01-01

    An algorithm for progressive classification of data, analogous to progressive rendering of images, makes it possible to compromise between speed and accuracy. This algorithm uses support vector machines (SVMs) to classify data. An SVM is a machine learning algorithm that builds a mathematical model of the desired classification concept by identifying the critical data points, called support vectors. Coarse approximations to the concept require only a few support vectors, while precise, highly accurate models require far more support vectors. Once the model has been constructed, the SVM can be applied to new observations. The cost of classifying a new observation is proportional to the number of support vectors in the model. When computational resources are limited, an SVM of the appropriate complexity can be produced. However, if the constraints are not known when the model is constructed, or if they can change over time, a method for adaptively responding to the current resource constraints is required. This capability is particularly relevant for spacecraft (or any other real-time systems) that perform onboard data analysis. The new algorithm enables the fast, interactive application of an SVM classifier to a new set of data. The classification process achieved by this algorithm is characterized as progressive because a coarse approximation to the true classification is generated rapidly and thereafter iteratively refined. The algorithm uses two SVMs: (1) a fast, approximate one and (2) slow, highly accurate one. New data are initially classified by the fast SVM, producing a baseline approximate classification. For each classified data point, the algorithm calculates a confidence index that indicates the likelihood that it was classified correctly in the first pass. Next, the data points are sorted by their confidence indices and progressively reclassified by the slower, more accurate SVM, starting with the items most likely to be incorrectly classified. The user can halt this reclassification process at any point, thereby obtaining the best possible result for a given amount of computation time. Alternatively, the results can be displayed as they are generated, providing the user with real-time feedback about the current accuracy of classification.
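    A minimal sketch of the progressive scheme described above is given below, assuming scikit-learn as the SVM library; the fast/slow model choices, confidence measure, and refinement budget are illustrative assumptions, not the NASA implementation.

```python
# Illustrative sketch (not the reported implementation): classify with a fast,
# coarse SVM first, then re-classify items in order of increasing confidence
# with a slower, more accurate SVM, so results can be refined progressively.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC, SVC

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, y_train, X_new = X[:1500], y[:1500], X[1500:]

fast = LinearSVC(C=1.0).fit(X_train, y_train)                  # coarse model
slow = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)  # accurate model

# First pass: approximate labels plus a confidence index
# (here, distance to the decision boundary).
labels = fast.predict(X_new)
confidence = np.abs(fast.decision_function(X_new))

# Progressive refinement: revisit the least confident items first; the loop
# can be halted at any point and the current `labels` returned.
budget = 100                                   # e.g. an available time budget
for idx in np.argsort(confidence)[:budget]:
    labels[idx] = slow.predict(X_new[idx:idx + 1])[0]
```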

  12. Automated image segmentation using support vector machines

    NASA Astrophysics Data System (ADS)

    Powell, Stephanie; Magnotta, Vincent A.; Andreasen, Nancy C.

    2007-03-01

    Neurodegenerative and neurodevelopmental diseases demonstrate problems associated with brain maturation and aging. Automated methods to delineate brain structures of interest are required to analyze large amounts of imaging data like that being collected in several on going multi-center studies. We have previously reported on using artificial neural networks (ANN) to define subcortical brain structures including the thalamus (0.88), caudate (0.85) and the putamen (0.81). In this work, apriori probability information was generated using Thirion's demons registration algorithm. The input vector consisted of apriori probability, spherical coordinates, and an iris of surrounding signal intensity values. We have applied the support vector machine (SVM) machine learning algorithm to automatically segment subcortical and cerebellar regions using the same input vector information. SVM architecture was derived from the ANN framework. Training was completed using a radial-basis function kernel with gamma equal to 5.5. Training was performed using 15,000 vectors collected from 15 training images in approximately 10 minutes. The resulting support vectors were applied to delineate 10 images not part of the training set. Relative overlap calculated for the subcortical structures was 0.87 for the thalamus, 0.84 for the caudate, 0.84 for the putamen, and 0.72 for the hippocampus. Relative overlap for the cerebellar lobes ranged from 0.76 to 0.86. The reliability of the SVM based algorithm was similar to the inter-rater reliability between manual raters and can be achieved without rater intervention.
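    A hedged sketch of the classification step alone follows, using the RBF kernel with gamma = 5.5 reported above; the per-voxel feature vectors are random placeholders for the (prior probability, spherical coordinate, intensity neighbourhood) inputs and the sample size is reduced for brevity.

```python
# Sketch only: an RBF-kernel SVM with gamma = 5.5 trained on placeholder
# per-voxel feature vectors (the study used ~15,000 training vectors).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X_train = rng.normal(size=(2000, 10))        # hypothetical voxel features
y_train = rng.integers(0, 2, size=2000)      # 1 = voxel inside the structure

clf = SVC(kernel="rbf", gamma=5.5, C=1.0)
clf.fit(X_train, y_train)

X_test = rng.normal(size=(500, 10))
inside = clf.predict(X_test)                 # binary voxel labels
```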

  13. Plutons: Simmer between 350° and 500°C for 10 million years, then serve cold (Invited)

    NASA Astrophysics Data System (ADS)

    Coleman, D. S.; Davis, J.

    2009-12-01

    The growing recognition that continental plutons are assembled incrementally over millions of years requires reexamination of the thermal histories of intrusive rocks. With the exception of the suggestion that pluton magma chambers can be revitalized by mafic input at their deepest structural levels, most aspects of modern pluton petrology are built on the underlying assumption that silicic plutons intrude as discrete thermal packages that undergo subsequent monotonic decay back to a steady-state geothermal gradient. The recognition that homogeneous silicic plutons are constructed over timescales too great to be single events necessitates rethinking pluton intrusion mechanisms, textures, thermochronology, chemical evolution and links to volcanic rocks. Three-dimensional thermal modeling of sheeted (horizontal and vertical) incremental pluton assembly (using HEAT3D by Wohletz, 2007) yields several results that are largely independent of intrusive geometry and may help understand bothersome field and laboratory results from plutonic rocks. 1) All increments cool quickly below hornblende closure temperature. However, late increments are emplaced into walls warmed by earlier increments, and they cycle between hornblende and biotite closure temperatures, a range in which fluid-rich melts are likely to be present. These conditions persist until the increments are far from the region of new magma flux, or the addition of increments stops. These observations are supported by Ar thermochronology and may explain why heterogeneous early marginal intrusive phases often grade into younger homogeneous interior map units. 2) Early increments become the contact metamorphic wall rocks of later increments. This observation suggests that much of the contact metamorphism associated with a given volume of plutonic rock is “lost” via textural modification of early increments during intrusion of later increments. Johnson and Glazner (CMP, in press) argue that mappable variations in pluton texture can result from textural modification during thermal cycling associated with incremental assembly. 3) The thermal structure of the model pluton evolves toward roughly spheroidal isotherms even though the pluton is assembled from thin tabular sheets. The zone of melt-bearing rock and the shape of intrapluton contact metamorphic isograds bear little resemblance to the increments from which the pluton was built. Consequently, pluton contacts mapped by variations in texture that reflect the thermal cycling inherent to incremental assembly will inevitably be “blob” or diapir-like, but will yield little insight into magma intrusion geometry. 4) Although models yield large regions of melt-bearing rock, the melt fraction is low and the melt-bearing volume at any time is small compared to the total volume of the pluton. This observation raises doubts about the connections between zoned silicic plutons and large ignimbrite eruptions.

  14. Coherent Doppler Lidar for Boundary Layer Studies and Wind Energy

    NASA Astrophysics Data System (ADS)

    Choukulkar, Aditya

    This thesis outlines the development of a vector retrieval technique, based on data assimilation, for a coherent Doppler LIDAR (Light Detection and Ranging). A detailed analysis of the Optimal Interpolation (OI) technique for vector retrieval is presented. Through several modifications to the OI technique, it is shown that the modified technique results in significant improvement in velocity retrieval accuracy. These modifications include changes to innovation covariance partitioning, covariance binning, and analysis increment calculation. It is observed that the modified technique is able to make retrievals with better accuracy, preserves local information better, and compares well with tower measurements. In order to study the error of representativeness and vector retrieval error, a lidar simulator was constructed. Using the lidar simulator, a thorough sensitivity analysis of the lidar measurement process and vector retrieval is carried out. The error of representativeness as a function of scales of motion and the sensitivity of vector retrieval to look angle are quantified. Using the modified OI technique, a study of nocturnal flow in Owens Valley, CA was carried out to identify and understand uncharacteristic events on the night of March 27, 2006. Observations from 1030 UTC to 1230 UTC (0230 hr local time to 0430 hr local time) on March 27, 2006 are presented. Lidar observations show complex and uncharacteristic flows such as sudden bursts of westerly cross-valley wind mixing with the dominant up-valley wind. Model results from the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS RTM) and other in situ instrumentation are used to corroborate and complement these observations. The modified OI technique is used to identify uncharacteristic and extreme flow events at a wind development site. Estimates of turbulence and shear from this technique are compared to tower measurements. A formulation for equivalent wind speed in the presence of variations in wind speed and direction, combined with shear, is developed and used to determine the wind energy content in the presence of turbulence.
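    For reference, the textbook form of the OI analysis increment is reproduced below; the thesis modifies how the covariances are partitioned, binned, and how the increment is computed, so this is the baseline formulation rather than the modified one.

```latex
% Standard optimal-interpolation (OI) analysis increment (generic notation).
\[
  \mathbf{x}_a = \mathbf{x}_b + \mathbf{K}\left(\mathbf{y} - H\,\mathbf{x}_b\right),
  \qquad
  \mathbf{K} = \mathbf{B}H^{\mathsf{T}}\left(H\mathbf{B}H^{\mathsf{T}} + \mathbf{R}\right)^{-1},
\]
% where $\mathbf{x}_b$ is the background wind field, $\mathbf{y}$ the lidar
% radial-velocity observations, $H$ the observation operator, and
% $\mathbf{B}$, $\mathbf{R}$ the background- and observation-error covariances.
```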

  15. Evolutionary-driven support vector machines for determining the degree of liver fibrosis in chronic hepatitis C.

    PubMed

    Stoean, Ruxandra; Stoean, Catalin; Lupsor, Monica; Stefanescu, Horia; Badea, Radu

    2011-01-01

    Hepatic fibrosis, the principal pointer to the development of a liver disease within chronic hepatitis C, can be measured through several stages. The correct evaluation of its degree, based on recent different non-invasive procedures, is of current major concern. The latest methodology for assessing it is the Fibroscan and the effect of its employment is impressive. However, the complex interaction between its stiffness indicator and the other biochemical and clinical examinations towards a respective degree of liver fibrosis is hard to be manually discovered. In this respect, the novel, well-performing evolutionary-powered support vector machines are proposed towards an automated learning of the relationship between medical attributes and fibrosis levels. The traditional support vector machines have been an often choice for addressing hepatic fibrosis, while the evolutionary option has been validated on many real-world tasks and proven flexibility and good performance. The evolutionary approach is simple and direct, resulting from the hybridization of the learning component within support vector machines and the optimization engine of evolutionary algorithms. It discovers the optimal coefficients of surfaces that separate instances of distinct classes. Apart from a detached manner of establishing the fibrosis degree for new cases, a resulting formula also offers insight upon the correspondence between the medical factors and the respective outcome. What is more, a feature selection genetic algorithm can be further embedded into the method structure, in order to dynamically concentrate search only on the most relevant attributes. The data set refers 722 patients with chronic hepatitis C infection and 24 indicators. The five possible degrees of fibrosis range from F0 (no fibrosis) to F4 (cirrhosis). Since the standard support vector machines are among the most frequently used methods in recent artificial intelligence studies for hepatic fibrosis staging, the evolutionary method is viewed in comparison to the traditional one. The multifaceted discrimination into all five degrees of fibrosis and the slightly less difficult common separation into solely three related stages are both investigated. The resulting performance proves the superiority over the standard support vector classification and the attained formula is helpful in providing an immediate calculation of the liver stage for new cases, while establishing the presence/absence and comprehending the weight of each medical factor with respect to a certain fibrosis level. The use of the evolutionary technique for fibrosis degree prediction triggers simplicity and offers a direct expression of the influence of dynamically selected indicators on the corresponding stage. Perhaps most importantly, it significantly surpasses the classical support vector machines, which are both widely used and technically sound. All these therefore confirm the promise of the new methodology towards a dependable support within the medical decision-making. Copyright © 2010 Elsevier B.V. All rights reserved.

  16. Interpreting linear support vector machine models with heat map molecule coloring

    PubMed Central

    2011-01-01

    Background Model-based virtual screening plays an important role in the early drug discovery stage. The outcomes of high-throughput screenings are a valuable source for machine learning algorithms to infer such models. Besides a strong performance, the interpretability of a machine learning model is a desired property to guide the optimization of a compound in later drug discovery stages. Linear support vector machines showed to have a convincing performance on large-scale data sets. The goal of this study is to present a heat map molecule coloring technique to interpret linear support vector machine models. Based on the weights of a linear model, the visualization approach colors each atom and bond of a compound according to its importance for activity. Results We evaluated our approach on a toxicity data set, a chromosome aberration data set, and the maximum unbiased validation data sets. The experiments show that our method sensibly visualizes structure-property and structure-activity relationships of a linear support vector machine model. The coloring of ligands in the binding pocket of several crystal structures of a maximum unbiased validation data set target indicates that our approach assists to determine the correct ligand orientation in the binding pocket. Additionally, the heat map coloring enables the identification of substructures important for the binding of an inhibitor. Conclusions In combination with heat map coloring, linear support vector machine models can help to guide the modification of a compound in later stages of drug discovery. Particularly substructures identified as important by our method might be a starting point for optimization of a lead compound. The heat map coloring should be considered as complementary to structure based modeling approaches. As such, it helps to get a better understanding of the binding mode of an inhibitor. PMID:21439031

  17. Output-only modal parameter estimator of linear time-varying structural systems based on vector TAR model and least squares support vector machine

    NASA Astrophysics Data System (ADS)

    Zhou, Si-Da; Ma, Yuan-Chen; Liu, Li; Kang, Jie; Ma, Zhi-Sai; Yu, Lei

    2018-01-01

    Identification of time-varying modal parameters contributes to the structural health monitoring, fault detection, vibration control, etc. of operational time-varying structural systems. However, it is a challenging task because no more information is available for identifying time-varying systems than for time-invariant systems. This paper presents a modal parameter estimator for linear time-varying structural systems, based on a vector time-dependent autoregressive model and a least squares support vector machine, for the case of output-only measurements. To reduce the computational cost, Wendland's compactly supported radial basis function is used to achieve sparsity of the Gram matrix. A Gamma-test-based non-parametric approach for selecting the regularization factor is adapted for the proposed estimator to replace time-consuming n-fold cross validation. A series of numerical examples illustrates the advantages of the proposed modal parameter estimator in suppressing overestimation and in handling short data records. A laboratory experiment further validates the proposed estimator.

  18. Prediction of hourly PM2.5 using a space-time support vector regression model

    NASA Astrophysics Data System (ADS)

    Yang, Wentao; Deng, Min; Xu, Feng; Wang, Hang

    2018-05-01

    Real-time air quality prediction has been an active field of research in atmospheric environmental science. The existing methods of machine learning are widely used to predict pollutant concentrations because of their enhanced ability to handle complex non-linear relationships. However, because pollutant concentration data, as typical geospatial data, also exhibit spatial heterogeneity and spatial dependence, they may violate the assumptions of independent and identically distributed random variables in most of the machine learning methods. As a result, a space-time support vector regression model is proposed to predict hourly PM2.5 concentrations. First, to address spatial heterogeneity, spatial clustering is executed to divide the study area into several homogeneous or quasi-homogeneous subareas. To handle spatial dependence, a Gauss vector weight function is then developed to determine spatial autocorrelation variables as part of the input features. Finally, a local support vector regression model with spatial autocorrelation variables is established for each subarea. Experimental data on PM2.5 concentrations in Beijing are used to verify whether the results of the proposed model are superior to those of other methods.
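    A simplified sketch of the two ideas described above (spatial clustering into subareas, then a local support vector regression per subarea with a spatially weighted neighbour average as an extra input) is given below. The clustering choice, Gaussian weight scale, lag structure, and data are all assumptions for illustration, not the authors' model or the Beijing data.

```python
# Hedged sketch of a space-time SVR workflow (assumed, not the paper's code).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n_sites, n_hours = 30, 500
coords = rng.uniform(0, 50, size=(n_sites, 2))          # site locations (km)
pm25 = rng.gamma(2.0, 30.0, size=(n_sites, n_hours))    # hypothetical PM2.5

# (1) spatial clustering into quasi-homogeneous subareas
subarea = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(coords)

# (2) Gaussian distance weights provide a spatial-autocorrelation feature
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
w = np.exp(-(d / 10.0) ** 2)
np.fill_diagonal(w, 0.0)
neighbour_avg = (w @ pm25) / w.sum(axis=1, keepdims=True)

models = {}
for k in np.unique(subarea):
    sites = np.where(subarea == k)[0]
    # features: previous-hour value and neighbour average; target: next hour
    X = np.column_stack([pm25[sites, :-1].ravel(),
                         neighbour_avg[sites, :-1].ravel()])
    y = pm25[sites, 1:].ravel()
    models[k] = SVR(kernel="rbf", C=10.0, epsilon=1.0).fit(X, y)
```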

  19. Support vector machine for the diagnosis of malignant mesothelioma

    NASA Astrophysics Data System (ADS)

    Ushasukhanya, S.; Nithyakalyani, A.; Sivakumar, V.

    2018-04-01

    Malignant mesothelioma is a disease in which malignant (cancer) cells form in the lining of the chest or abdomen. Exposure to asbestos can increase the risk of malignant mesothelioma. Signs and symptoms of malignant mesothelioma include shortness of breath and pain under the rib cage. Tests that examine the inside of the chest and abdomen are used to detect (find) and diagnose malignant mesothelioma. Certain factors affect prognosis (chance of recovery) and treatment options. In this study, support vector machine (SVM) classifiers were used for mesothelioma disease diagnosis, and the SVM outputs were compared on the same dataset. The support vector machine algorithm gives 92.5% accuracy obtained by means of 3-fold cross-validation. The mesothelioma disease dataset was taken from institutional reports from Turkey.
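    A minimal sketch of the evaluation protocol described above follows; the data here are random placeholders (not the Turkish clinical records), and the scaling and kernel settings are assumptions.

```python
# Sketch only: SVM diagnosis evaluated with 3-fold cross-validation.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(324, 34))        # placeholder clinical feature matrix
y = rng.integers(0, 2, size=324)      # 1 = mesothelioma diagnosis

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
scores = cross_val_score(model, X, y, cv=3)       # 3-fold cross-validation
print(f"mean accuracy: {scores.mean():.3f}")
```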

  20. An implementation of support vector machine on sentiment classification of movie reviews

    NASA Astrophysics Data System (ADS)

    Yulietha, I. M.; Faraby, S. A.; Adiwijaya; Widyaningtyas, W. C.

    2018-03-01

    With technological advances, information about movies is widely available on the internet; if this information is processed properly, useful insight can be extracted from it. This research proposes to classify the sentiment of movie review documents. It uses the Support Vector Machine (SVM) method because SVM can handle the high-dimensional text data used in this research. The Support Vector Machine is a popular machine learning technique for text classification because it learns from a collection of previously labelled documents and can provide good results. With respect to the dataset split, the 90-10 composition gives the best result, 85.6%. With respect to the SVM kernel, a linear kernel with constant 1 gives the best result, 84.9%.
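    A hedged sketch of this setup is shown below: a linear-kernel SVM with C = 1 and a 90/10 train-test split. The TF-IDF weighting and the toy corpus are assumptions added for illustration; the abstract does not specify the text representation.

```python
# Sketch of sentiment classification with a linear SVM and a 90/10 split.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

reviews = ["a moving, beautifully acted film", "dull plot and wooden dialogue",
           "an instant classic", "a tedious, forgettable mess"]   # toy corpus
labels = [1, 0, 1, 0]                                             # 1 = positive

X_tr, X_te, y_tr, y_te = train_test_split(reviews, labels,
                                           test_size=0.10, random_state=0)
model = make_pipeline(TfidfVectorizer(), SVC(kernel="linear", C=1))
model.fit(X_tr, y_tr)
print(accuracy_score(y_te, model.predict(X_te)))
```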

  1. Combined empirical mode decomposition and texture features for skin lesion classification using quadratic support vector machine.

    PubMed

    Wahba, Maram A; Ashour, Amira S; Napoleon, Sameh A; Abd Elnaby, Mustafa M; Guo, Yanhui

    2017-12-01

    Basal cell carcinoma is one of the most common malignant skin lesions. Automated lesion identification and classification using image processing techniques is highly required to reduce the diagnosis errors. In this study, a novel technique is applied to classify skin lesion images into two classes, namely the malignant Basal cell carcinoma and the benign nevus. A hybrid combination of bi-dimensional empirical mode decomposition and gray-level difference method features is proposed after hair removal. The combined features are further classified using quadratic support vector machine (Q-SVM). The proposed system has achieved outstanding performance of 100% accuracy, sensitivity and specificity compared to other support vector machine procedures as well as with different extracted features. Basal Cell Carcinoma is effectively classified using Q-SVM with the proposed combined features.
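    For illustration, a "quadratic SVM" can be realised as an SVM with a degree-2 polynomial kernel, as sketched below. The feature vectors are random placeholders; the paper's actual features come from bi-dimensional empirical mode decomposition and gray-level difference statistics, which are not reproduced here.

```python
# Sketch only: degree-2 polynomial-kernel SVM on placeholder lesion features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 12))        # hypothetical per-lesion feature vectors
y = rng.integers(0, 2, size=120)      # 1 = basal cell carcinoma, 0 = nevus

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
qsvm = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=2, C=1.0))
qsvm.fit(X_tr, y_tr)
print(qsvm.score(X_te, y_te))
```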

  2. The optimal selection of micro-motion feature based on Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Li, Bo; Ren, Hongmei; Xiao, Zhi-he; Sheng, Jing

    2017-11-01

    Targets exhibit multiple micro-motion forms, and different micro-motion forms are easily modulated together, which makes feature extraction and recognition difficult. Aiming at feature extraction for cone-shaped objects with different micro-motion forms, this paper proposes an optimal selection method for micro-motion features based on the support vector machine. After computing the time-frequency distribution of the radar echoes and comparing the time-frequency spectra of objects with different micro-motion forms, features are extracted from the differences between the instantaneous frequency variations of the different micro-motions. The features are then evaluated with an SVM (Support Vector Machine) based method, and the best features are selected. Finally, the results show that the method proposed in this paper is feasible under test conditions with a given signal-to-noise ratio (SNR).

  3. Vaxvec: The first web-based recombinant vaccine vector database and its data analysis

    PubMed Central

    Deng, Shunzhou; Martin, Carly; Patil, Rasika; Zhu, Felix; Zhao, Bin; Xiang, Zuoshuang; He, Yongqun

    2015-01-01

    A recombinant vector vaccine uses an attenuated virus, bacterium, or parasite as the carrier to express a heterologous antigen(s). Many recombinant vaccine vectors and related vaccines have been developed and extensively investigated. To compare and better understand recombinant vectors and vaccines, we have generated Vaxvec (http://www.violinet.org/vaxvec), the first web-based database that stores various recombinant vaccine vectors and those experimentally verified vaccines that use these vectors. Vaxvec has now included 59 vaccine vectors that have been used in 196 recombinant vector vaccines against 66 pathogens and cancers. These vectors are classified to 41 viral vectors, 15 bacterial vectors, 1 parasitic vector, and 1 fungal vector. The most commonly used viral vaccine vectors are double-stranded DNA viruses, including herpesviruses, adenoviruses, and poxviruses. For example, Vaxvec includes 63 poxvirus-based recombinant vaccines for over 20 pathogens and cancers. Vaxvec collects 30 recombinant vector influenza vaccines that use 17 recombinant vectors and were experimentally tested in 7 animal models. In addition, over 60 protective antigens used in recombinant vector vaccines are annotated and analyzed. User-friendly web-interfaces are available for querying various data in Vaxvec. To support data exchange, the information of vaccine vectors, vaccines, and related information is stored in the Vaccine Ontology (VO). Vaxvec is a timely and vital source of vaccine vector database and facilitates efficient vaccine vector research and development. PMID:26403370

  4. A low cost implementation of multi-parameter patient monitor using intersection kernel support vector machine classifier

    NASA Astrophysics Data System (ADS)

    Mohan, Dhanya; Kumar, C. Santhosh

    2016-03-01

    Predicting the physiological condition (normal/abnormal) of a patient is highly desirable to enhance the quality of health care. Multi-parameter patient monitors (MPMs) using heart rate, arterial blood pressure, respiration rate and oxygen saturation (SpO2) as input parameters were developed to monitor the condition of patients, with minimum human resource utilization. The support vector machine (SVM), an advanced machine learning approach popularly used for classification and regression, is used for the realization of MPMs. To make MPMs cost effective, we experiment on the hardware implementation of the MPM using a support vector machine classifier. The training of the system is done in the MATLAB environment and the detection of the alarm/no-alarm condition is implemented in hardware. We used different kernels for SVM classification and note that the best performance was obtained using the intersection kernel SVM (IKSVM). The intersection kernel support vector machine classifier MPM has outperformed the best known MPM using a radial basis function kernel by an absolute improvement of 2.74% in accuracy, 1.86% in sensitivity and 3.01% in specificity. The hardware model was developed based on the improved performance system using the Verilog Hardware Description Language and was implemented on an Altera Cyclone-II development board.
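    The software side of the intersection kernel can be sketched as below (the paper's contribution is the hardware realisation, which is not shown). The histogram-intersection kernel K(x, z) = Σᵢ min(xᵢ, zᵢ) is passed to scikit-learn as a callable; the vital-sign features are placeholders.

```python
# Illustrative sketch: SVM with a histogram-intersection kernel.
import numpy as np
from sklearn.svm import SVC

def intersection_kernel(X, Z):
    # Gram matrix of pairwise intersection-kernel values, K(x, z) = sum_i min(x_i, z_i).
    return np.minimum(X[:, None, :], Z[None, :, :]).sum(axis=-1)

rng = np.random.default_rng(0)
# Hypothetical non-negative vital-sign features: HR, ABP, resp. rate, SpO2.
X = np.abs(rng.normal(size=(300, 4)))
y = rng.integers(0, 2, size=300)          # 1 = alarm, 0 = no alarm

clf = SVC(kernel=intersection_kernel, C=1.0)
clf.fit(X, y)
print(clf.predict(X[:5]))
```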

  5. On the sparseness of 1-norm support vector machines.

    PubMed

    Zhang, Li; Zhou, Weida

    2010-04-01

    There is some empirical evidence available showing that 1-norm Support Vector Machines (1-norm SVMs) have good sparseness; however, both how good sparseness 1-norm SVMs can reach and whether they have a sparser representation than that of standard SVMs are not clear. In this paper we take into account the sparseness of 1-norm SVMs. Two upper bounds on the number of nonzero coefficients in the decision function of 1-norm SVMs are presented. First, the number of nonzero coefficients in 1-norm SVMs is at most equal to the number of only the exact support vectors lying on the +1 and -1 discriminating surfaces, while that in standard SVMs is equal to the number of support vectors, which implies that 1-norm SVMs have better sparseness than that of standard SVMs. Second, the number of nonzero coefficients is at most equal to the rank of the sample matrix. A brief review of the geometry of linear programming and the primal steepest edge pricing simplex method are given, which allows us to provide the proof of the two upper bounds and evaluate their tightness by experiments. Experimental results on toy data sets and the UCI data sets illustrate our analysis. Copyright 2009 Elsevier Ltd. All rights reserved.
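    The notion of sparseness can be illustrated with the small sketch below, which compares an L1-penalised linear SVM against a standard L2-penalised one by counting non-zero weights. This is a related scikit-learn formulation, not the exact 1-norm (linear-programming) SVM analysed in the paper, whose bounds concern the dual coefficients.

```python
# Sketch: count non-zero weight coefficients of L1- vs L2-penalised linear SVMs.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=50, n_informative=5,
                           random_state=0)

l1_svm = LinearSVC(penalty="l1", loss="squared_hinge", dual=False, C=0.1)
l2_svm = LinearSVC(penalty="l2", dual=False, C=0.1)

print("non-zero weights, 1-norm SVM:", np.sum(l1_svm.fit(X, y).coef_ != 0))
print("non-zero weights, 2-norm SVM:", np.sum(l2_svm.fit(X, y).coef_ != 0))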

  6. Weighted K-means support vector machine for cancer prediction.

    PubMed

    Kim, SungHwan

    2016-01-01

    To date, the support vector machine (SVM) has been widely applied to diverse bio-medical fields to address disease subtype identification and pathogenicity of genetic variants. In this paper, I propose the weighted K-means support vector machine (wKM-SVM) and weighted support vector machine (wSVM), for which I allow the SVM to impose weights to the loss term. Besides, I demonstrate the numerical relations between the objective function of the SVM and weights. Motivated by general ensemble techniques, which are known to improve accuracy, I directly adopt the boosting algorithm to the newly proposed weighted KM-SVM (and wSVM). For predictive performance, a range of simulation studies demonstrate that the weighted KM-SVM (and wSVM) with boosting outperforms the standard KM-SVM (and SVM) including but not limited to many popular classification rules. I applied the proposed methods to simulated data and two large-scale real applications in the TCGA pan-cancer methylation data of breast and kidney cancer. In conclusion, the weighted KM-SVM (and wSVM) increases accuracy of the classification model, and will facilitate disease diagnosis and clinical treatment decisions to benefit patients. A software package (wSVM) is publicly available at the R-project webpage (https://www.r-project.org).
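    The general mechanism of weighting the SVM loss term can be sketched as follows. This is a hedged approximation, not the paper's wKM-SVM or its boosting step: the weights here are derived, purely for illustration, from K-means cluster sizes so that samples in small clusters count more.

```python
# Sketch: per-sample weights entering the SVM loss via `sample_weight`.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=10, random_state=0)

clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
cluster_size = np.bincount(clusters)
weights = 1.0 / cluster_size[clusters]          # heavier weight, smaller cluster
weights *= len(weights) / weights.sum()         # normalise to mean 1

clf = SVC(kernel="rbf", gamma="scale", C=1.0)
clf.fit(X, y, sample_weight=weights)            # weighted hinge-loss terms
```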

  7. A collaborative framework for Distributed Privacy-Preserving Support Vector Machine learning.

    PubMed

    Que, Jialan; Jiang, Xiaoqian; Ohno-Machado, Lucila

    2012-01-01

    A Support Vector Machine (SVM) is a popular tool for decision support. The traditional way to build an SVM model is to estimate parameters based on a centralized repository of data. However, in the field of biomedicine, patient data are sometimes stored in local repositories or institutions where they were collected, and may not be easily shared due to privacy concerns. This creates a substantial barrier for researchers to effectively learn from the distributed data using machine learning tools like SVMs. To overcome this difficulty and promote efficient information exchange without sharing sensitive raw data, we developed a Distributed Privacy Preserving Support Vector Machine (DPP-SVM). The DPP-SVM enables privacy-preserving collaborative learning, in which a trusted server integrates "privacy-insensitive" intermediary results. The globally learned model is guaranteed to be exactly the same as learned from combined data. We also provide a free web-service (http://privacy.ucsd.edu:8080/ppsvm/) for multiple participants to collaborate and complete the SVM-learning task in an efficient and privacy-preserving manner.

  8. Percolator: Scalable Pattern Discovery in Dynamic Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhury, Sutanay; Purohit, Sumit; Lin, Peng

    We demonstrate Percolator, a distributed system for graph pattern discovery in dynamic graphs. In contrast to conventional mining systems, Percolator advocates efficient pattern mining schemes that (1) support pattern detection with keywords; (2) integrate incremental and parallel pattern mining; and (3) support analytical queries such as trend analysis. The core idea of Percolator is to dynamically decide and verify a small fraction of patterns and their instances that must be inspected in response to buffered updates in dynamic graphs, with a total mining cost independent of graph size. We demonstrate a) the feasibility of incremental pattern mining by walking through each component of Percolator, b) the efficiency and scalability of Percolator over the sheer size of real-world dynamic graphs, and c) how the user-friendly GUI of Percolator interacts with users to support keyword-based queries that detect, browse and inspect trending patterns. We also demonstrate two user cases of Percolator, in social media trend analysis and academic collaboration analysis, respectively.

  9. Analysis of DISMS (Defense Integrated Subsistence Management System) Increment 4

    DTIC Science & Technology

    1988-12-01

    response data entry; and rationale supporting an on-line system based on real time management information needs. Keywords: Automated systems; Subsistence; Workload capacity; Bid response; Contract administration; Computer systems.

  10. Modelling the cost-effectiveness of mass screening and treatment for reducing Plasmodium falciparum malaria burden.

    PubMed

    Crowell, Valerie; Briët, Olivier J T; Hardy, Diggory; Chitnis, Nakul; Maire, Nicolas; Di Pasquale, Aurelio; Smith, Thomas A

    2013-01-03

    Past experience and modelling suggest that, in most cases, mass treatment strategies are not likely to succeed in interrupting Plasmodium falciparum malaria transmission. However, this does not preclude their use to reduce disease burden. Mass screening and treatment (MSAT) is preferred to mass drug administration (MDA), as the latter involves massive over-use of drugs. This paper reports simulations of the incremental cost-effectiveness of well-conducted MSAT campaigns as a strategy for P. falciparum malaria disease-burden reduction in settings with varying receptivity (ability of the combined vector population in a setting to transmit disease) and access to case management. MSAT incremental cost-effectiveness ratios (ICERs) were estimated in different sub-Saharan African settings using simulation models of the dynamics of malaria and a literature-based MSAT cost estimate. Imported infections were simulated at a rate of two per 1,000 population per annum. These estimates were compared to the ICERs of scaling up case management or insecticide-treated net (ITN) coverage in each baseline health system, in the absence of MSAT. MSAT averted most episodes, and resulted in the lowest ICERs, in settings with a moderate level of disease burden. At a low pre-intervention entomological inoculation rate (EIR) of two infectious bites per adult per annum (IBPAPA) MSAT was never more cost-effective than scaling up ITNs or case management coverage. However, at pre-intervention entomological inoculation rates (EIRs) of 20 and 50 IBPAPA and ITN coverage levels of 40 or 60%, respectively, the ICER of MSAT was similar to that of scaling up ITN coverage further. In all the transmission settings considered, achieving a minimal level of ITN coverage is a "best buy". At low transmission, MSAT probably is not worth considering. Instead, MSAT may be suitable at medium to high levels of transmission and at moderate ITN coverage. If undertaken as a burden-reducing intervention, MSAT should be continued indefinitely and should complement, not replace, case management and vector control interventions.
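    For reference, the conventional definition of the incremental cost-effectiveness ratio used in comparisons like the one above is reproduced below; the notation is generic and not taken from the paper.

```latex
% Incremental cost-effectiveness ratio (ICER) of MSAT against a comparator
% strategy (e.g. scaled-up case management or ITN coverage).
\[
  \mathrm{ICER} \;=\;
  \frac{C_{\text{MSAT}} - C_{\text{comparator}}}
       {E_{\text{MSAT}} - E_{\text{comparator}}},
\]
% where $C$ is the cost of each strategy and $E$ its health effect
% (e.g. DALYs averted or episodes prevented).
```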

  11. T-ray relevant frequencies for osteosarcoma classification

    NASA Astrophysics Data System (ADS)

    Withayachumnankul, W.; Ferguson, B.; Rainsford, T.; Findlay, D.; Mickan, S. P.; Abbott, D.

    2006-01-01

    We investigate the classification of the T-ray response of normal human bone cells and human osteosarcoma cells, grown in culture. Given the magnitude and phase responses within a reliable spectral range as features for the input vectors, a trained support vector machine can correctly classify the two cell types to some extent. The performance of the support vector machine is degraded by the curse of dimensionality, resulting from the comparatively large number of features in the input vectors. Feature subset selection methods are used to select only an optimal number of relevant features as inputs. As a result, an improvement in generalization performance is attainable, and the selected frequencies can be used to further describe the different mechanisms by which the cells respond to T-rays. We demonstrate a consistent classification accuracy of 89.6%, while only one fifth of the original features are retained in the data set.
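    One common feature-subset-selection scheme, recursive feature elimination with a linear SVM, is sketched below as an illustration of the workflow; the paper compares several selection methods, and the spectra here are random placeholders for the T-ray magnitude/phase features.

```python
# Sketch: keep roughly one fifth of the spectral features, then classify.
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 100))        # 80 measurements x 100 spectral features
y = rng.integers(0, 2, size=80)       # 1 = osteosarcoma, 0 = normal bone cells

selector = RFE(SVC(kernel="linear", C=1.0), n_features_to_select=20, step=5)
model = make_pipeline(selector, SVC(kernel="rbf", gamma="scale"))
print(cross_val_score(model, X, y, cv=5).mean())
```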

  12. Recombinase-Mediated Cassette Exchange Using Adenoviral Vectors.

    PubMed

    Kolb, Andreas F; Knowles, Christopher; Pultinevicius, Patrikas; Harbottle, Jennifer A; Petrie, Linda; Robinson, Claire; Sorrell, David A

    2017-01-01

    Site-specific recombinases are important tools for the modification of mammalian genomes. In conjunction with viral vectors, they can be utilized to mediate site-specific gene insertions in animals and in cell lines which are difficult to transfect. Here we describe a method for the generation and analysis of an adenovirus vector supporting a recombinase-mediated cassette exchange reaction and discuss the advantages and limitations of this approach.

  13. Use of ILTV Control Laws for LaNCETS Flight Research

    NASA Technical Reports Server (NTRS)

    Moua, Cheng

    2010-01-01

    A report discusses the Lift and Nozzle Change Effects on Tail Shock (LaNCETS) test to investigate the effects of lift distribution and nozzle-area ratio changes on tail shock strength of an F-15 aircraft. Specific research objectives are to obtain inflight shock strength for multiple combinations of nozzle-area ratio and lift distribution; compare results with preflight prediction tools; and update predictive tools with flight results. The objectives from a stability and control perspective are to ensure adequate aircraft stability for the changes in lift distribution and plume shape, and ensure manageable transient from engaging and disengaging the ILTV research control laws. In order to change the lift distribution and plume shape of the F-15 aircraft, a decade-old Inner Loop Thrust Vectoring (ILTV) research control law was used. Flight envelope expansion was performed for the test configuration and flight conditions prior to the probing test points. The approach for achieving the research objectives was to utilize the unique capabilities of NASA's NF-15B-837 aircraft to allow the adjustment of the nozzle-area ratio and/or canard positions by engaging the ILTV research control laws. The ILTV control laws provide the ability to add trim command biases to canard positions, nozzle area ratios, and thrust vectoring through the use of datasets. Datasets consist of programmed test inputs (PTIs) that define trims to change the nozzle-area ratio and/or canard positions. The trims are applied as increments to the normally commanded positions. A LaNCETS non-linear, six-degrees-of-freedom simulation capable of realtime pilot-in-the-loop, hardware-in-the-loop, and non-real-time batch support was developed and validated. Prior to first flight, extensive simulation analyses were performed to show adequate stability margins with the changes in lift distribution and plume shape. Additionally, engagement/disengagement transient analysis was also performed to show manageable transients.

  14. Vaxvec: The first web-based recombinant vaccine vector database and its data analysis.

    PubMed

    Deng, Shunzhou; Martin, Carly; Patil, Rasika; Zhu, Felix; Zhao, Bin; Xiang, Zuoshuang; He, Yongqun

    2015-11-27

    A recombinant vector vaccine uses an attenuated virus, bacterium, or parasite as the carrier to express a heterologous antigen(s). Many recombinant vaccine vectors and related vaccines have been developed and extensively investigated. To compare and better understand recombinant vectors and vaccines, we have generated Vaxvec (http://www.violinet.org/vaxvec), the first web-based database that stores various recombinant vaccine vectors and those experimentally verified vaccines that use these vectors. Vaxvec has now included 59 vaccine vectors that have been used in 196 recombinant vector vaccines against 66 pathogens and cancers. These vectors are classified to 41 viral vectors, 15 bacterial vectors, 1 parasitic vector, and 1 fungal vector. The most commonly used viral vaccine vectors are double-stranded DNA viruses, including herpesviruses, adenoviruses, and poxviruses. For example, Vaxvec includes 63 poxvirus-based recombinant vaccines for over 20 pathogens and cancers. Vaxvec collects 30 recombinant vector influenza vaccines that use 17 recombinant vectors and were experimentally tested in 7 animal models. In addition, over 60 protective antigens used in recombinant vector vaccines are annotated and analyzed. User-friendly web-interfaces are available for querying various data in Vaxvec. To support data exchange, the information of vaccine vectors, vaccines, and related information is stored in the Vaccine Ontology (VO). Vaxvec is a timely and vital source of vaccine vector database and facilitates efficient vaccine vector research and development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Fuzzy support vector machines for adaptive Morse code recognition.

    PubMed

    Yang, Cheng-Hong; Jin, Li-Cheng; Chuang, Li-Yeh

    2006-11-01

    Morse code is now being harnessed for use in rehabilitation applications of augmentative-alternative communication and assistive technology, facilitating mobility, environmental control and adapted worksite access. In this paper, Morse code is selected as a communication adaptive device for persons who suffer from muscle atrophy, cerebral palsy or other severe handicaps. A stable typing rate is strictly required for Morse code to be effective as a communication tool. Therefore, an adaptive automatic recognition method with a high recognition rate is needed. The proposed system uses both fuzzy support vector machines and the variable-degree variable-step-size least-mean-square algorithm to achieve these objectives. We apply fuzzy memberships to each point, and provide different contributions to the decision learning function for support vector machines. Statistical analyses demonstrated that the proposed method elicited a higher recognition rate than other algorithms in the literature.
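    The fuzzy-SVM idea of weighting each training point's contribution to the decision learning function can be approximated as below. This is a generic, hedged sketch, not the paper's Morse-code system or adaptive LMS component: memberships are computed here from each point's distance to its class centroid and passed as sample weights.

```python
# Hedged sketch: fuzzy memberships scale each point's SVM loss contribution.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=6, random_state=0)

membership = np.empty(len(y))
for c in np.unique(y):
    idx = np.where(y == c)[0]
    centre = X[idx].mean(axis=0)
    dist = np.linalg.norm(X[idx] - centre, axis=1)
    # closer to the class centre -> membership nearer 1; outliers are discounted
    membership[idx] = 1.0 - dist / (dist.max() + 1e-6)

fuzzy_svm = SVC(kernel="rbf", gamma="scale", C=1.0)
fuzzy_svm.fit(X, y, sample_weight=membership)
```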

  16. A novel representation for apoptosis protein subcellular localization prediction using support vector machine.

    PubMed

    Zhang, Li; Liao, Bo; Li, Dachao; Zhu, Wen

    2009-07-21

    Apoptosis, or programmed cell death, plays an important role in development of an organism. Obtaining information on subcellular location of apoptosis proteins is very helpful to understand the apoptosis mechanism. In this paper, based on the concept that the position distribution information of amino acids is closely related with the structure and function of proteins, we introduce the concept of distance frequency [Matsuda, S., Vert, J.P., Ueda, N., Toh, H., Akutsu, T., 2005. A novel representation of protein sequences for prediction of subcellular location using support vector machines. Protein Sci. 14, 2804-2813] and propose a novel way to calculate distance frequencies. In order to calculate the local features, each protein sequence is separated into p parts with the same length in our paper. Then we use the novel representation of protein sequences and adopt support vector machine to predict subcellular location. The overall prediction accuracy is significantly improved by jackknife test.

  17. Product demand forecasts using wavelet kernel support vector machine and particle swarm optimization in manufacture system

    NASA Astrophysics Data System (ADS)

    Wu, Qi

    2010-03-01

    Demand forecasts play a crucial role in supply chain management, and the future demand for a certain product is the basis of the respective replenishment system. For demand series with small samples, seasonality, nonlinearity, randomness and fuzziness, the existing support vector kernels do not approximate the random curve of the sales time series well in the square-integrable (L2) space. In this paper, we present a hybrid intelligent system combining the wavelet kernel support vector machine and particle swarm optimization for demand forecasting. Application to car sales series forecasting shows that the forecasting approach based on the hybrid PSOWv-SVM model is effective and feasible. A comparison between the method proposed in this paper and other approaches is also given, showing that, for the discussed example, this method is better than the hybrid PSOv-SVM and other traditional methods.
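    A sketch of a wavelet-kernel support vector regression is given below, assuming the translation-invariant wavelet kernel commonly used in the literature, K(x, z) = Πᵢ cos(1.75 uᵢ) exp(-uᵢ²/2) with uᵢ = (xᵢ - zᵢ)/a; the sales series is synthetic and the hyperparameters are fixed here rather than tuned by particle swarm optimization as in the paper.

```python
# Sketch: wavelet-kernel SVR on a synthetic seasonal demand series.
import numpy as np
from sklearn.svm import SVR

def wavelet_kernel(X, Z, a=10.0):
    # Translation-invariant wavelet kernel; the dilation a is fixed, not PSO-tuned.
    u = (X[:, None, :] - Z[None, :, :]) / a
    return np.prod(np.cos(1.75 * u) * np.exp(-0.5 * u ** 2), axis=-1)

rng = np.random.default_rng(0)
t = np.arange(60, dtype=float)
sales = 100 + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=2.0, size=60)

lags = 3                                   # previous three months as inputs
X = np.column_stack([sales[i:len(sales) - lags + i] for i in range(lags)])
y = sales[lags:]

model = SVR(kernel=wavelet_kernel, C=10.0, epsilon=0.5)
model.fit(X, y)
print(model.predict(X[-1:]))               # one-step-ahead forecast
```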

  18. Evaluation and recognition of skin images with aging by support vector machine

    NASA Astrophysics Data System (ADS)

    Hu, Liangjun; Wu, Shulian; Li, Hui

    2016-10-01

    Aging is a very important issue not only in dermatology but also in cosmetic science. Cutaneous aging involves both chronological aging and photoaging processes. The evaluation and classification of aging is an important issue for medical cosmetology workers nowadays. The purpose of this study is to assess chronological-age-related and photo-age-related changes of human skin. Texture features of the skin surface, such as coarseness and contrast, were analyzed by Fourier transform and Tamura texture analysis, with the aim of detecting structure hidden in the skin texture of differently aged skin. A support vector machine was then trained on the texture features, and the different aging states were distinguished by the support vector machine (SVM) classifier. The results help us to further understand the mechanism of skin aging from texture features and to distinguish the different aging states.

  19. Scattering transform and LSPTSVM based fault diagnosis of rotating machinery

    NASA Astrophysics Data System (ADS)

    Ma, Shangjun; Cheng, Bo; Shang, Zhaowei; Liu, Geng

    2018-05-01

    This paper proposes an algorithm for fault diagnosis of rotating machinery to overcome the shortcomings of classical techniques which are noise sensitive in feature extraction and time consuming for training. Based on the scattering transform and the least squares recursive projection twin support vector machine (LSPTSVM), the method has the advantages of high efficiency and insensitivity for noise signal. Using the energy of the scattering coefficients in each sub-band, the features of the vibration signals are obtained. Then, an LSPTSVM classifier is used for fault diagnosis. The new method is compared with other common methods including the proximal support vector machine, the standard support vector machine and multi-scale theory by using fault data for two systems, a motor bearing and a gear box. The results show that the new method proposed in this study is more effective for fault diagnosis of rotating machinery.

  20. Classification of Stellar Spectra with Fuzzy Minimum Within-Class Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Zhong-bao, Liu; Wen-ai, Song; Jing, Zhang; Wen-juan, Zhao

    2017-06-01

    Classification is one of the important tasks in astronomy, especially in spectral analysis. The Support Vector Machine (SVM) is a typical classification method that is widely used in spectra classification. Although it performs well in practice, its classification accuracy cannot be greatly improved because of two limitations: it does not take the distribution of the classes into consideration, and it is sensitive to noise. In order to solve these problems, inspired by the maximization criterion of Fisher's Discriminant Analysis (FDA) and the SVM separability constraints, a fuzzy minimum within-class support vector machine (FMWSVM) is proposed in this paper. In FMWSVM, the distribution of the classes is reflected by the within-class scatter of FDA, and a fuzzy membership function is introduced to decrease the influence of noise. Comparative experiments with the SVM on SDSS datasets verify the effectiveness of the proposed classifier FMWSVM.

  1. Geographical traceability of Marsdenia tenacissima by Fourier transform infrared spectroscopy and chemometrics

    NASA Astrophysics Data System (ADS)

    Li, Chao; Yang, Sheng-Chao; Guo, Qiao-Sheng; Zheng, Kai-Yan; Wang, Ping-Li; Meng, Zhen-Gui

    2016-01-01

    A combination of Fourier transform infrared spectroscopy with chemometrics tools provided an approach for studying Marsdenia tenacissima according to its geographical origin. A total of 128 M. tenacissima samples from four provinces in China were analyzed with FTIR spectroscopy. Six pattern recognition methods were used to construct the discrimination models: support vector machine-genetic algorithms, support vector machine-particle swarm optimization, K-nearest neighbors, radial basis function neural network, random forest and support vector machine-grid search. Experimental results showed that K-nearest neighbors was superior to other mathematical algorithms after data were preprocessed with wavelet de-noising, with a discrimination rate of 100% in both the training and prediction sets. This study demonstrated that FTIR spectroscopy coupled with K-nearest neighbors could be successfully applied to determine the geographical origins of M. tenacissima samples, thereby providing reliable authentication in a rapid, cheap and noninvasive way.

  2. Object recognition of real targets using modelled SAR images

    NASA Astrophysics Data System (ADS)

    Zherdev, D. A.

    2017-12-01

    In this work, the problem of object recognition is studied using SAR images. The recognition algorithm is based on the computation of conjugation indices with the vectors of each class. The support subspaces for each class are constructed by excluding the most and the least correlated vectors within a class. In the study, we examine the possibility of a significant reduction in feature vector size, which leads to a decrease in recognition time. The target images form feature vectors that are transformed using a pre-trained convolutional neural network (CNN).

  3. Taking the Long Road: A Faculty Model for Incremental Change towards Standards-Based Support for Sessional Teachers in Higher Education

    ERIC Educational Resources Information Center

    Savage, Julia; Pollard, Vikki

    2016-01-01

    Despite decades of dependence on sessional teaching staff, universities in Australia and internationally still find it difficult to support the teaching work of this large, casual workforce. A significant consequence of casually-employed teaching staff is risk; sessional academics' professional identity is compromised, quality assurance of…

  4. Neighbourhood walkability and home neighbourhood-based physical activity: an observational study of adults with type 2 diabetes.

    PubMed

    Hajna, Samantha; Kestens, Yan; Daskalopoulou, Stella S; Joseph, Lawrence; Thierry, Benoit; Sherman, Mark; Trudeau, Luc; Rabasa-Lhoret, Rémi; Meissner, Leslie; Bacon, Simon L; Gauvin, Lise; Ross, Nancy A; Dasgupta, Kaberi

    2016-09-09

    Converging international evidence suggests that diabetes incidence is lower among adults living in more walkable neighbourhoods. The association between walkability and physical activity (PA), the presumed mediator of this relationship, has not been carefully examined in adults with type 2 diabetes. We investigated the associations of walkability with total PA occurring within home neighbourhoods and overall PA, irrespective of location. Participants (n = 97; 59.5 ± 10.5 years) were recruited through clinics in Montreal (QC, Canada) and wore a GPS-accelerometer device for 7 days. Total PA was expressed as the total Vector of the Dynamic Body Acceleration. PA location was determined using a Global Positioning System (GPS) device (SIRF IV chip). Walkability (street connectivity, land use mix, population density) was assessed using Geographical Information Systems software. The cross-sectional associations between walkability and location-based PA were estimated using robust linear regressions adjusted for age, body mass index, sex, university education, season, car access, residential self-selection, and wear-time. A one standard deviation (SD) increment in walkability was associated with 10.4 % of a SD increment in neighbourhood-based PA (95 % confidence interval (CI) 1.2, 19.7) - equivalent to 165 more steps/day (95 % 19, 312). Car access emerged as an important predictor of neighbourhood-based PA (Not having car access: 38.6 % of a SD increment in neighbourhood-based PA, 95 % CI 17.9, 59.3). Neither walkability nor car access were conclusively associated with overall PA. Higher neighbourhood walkability is associated with higher home neighbourhood-based PA but not with higher overall PA. Other factors will need to be leveraged to facilitate meaningful increases in overall PA among adults with type 2 diabetes.

  5. Joint power and kinematics coordination in load carriage running: Implications for performance and injury.

    PubMed

    Liew, Bernard X W; Morris, Susan; Netto, Kevin

    2016-06-01

    Investigating the impact of incremental load magnitude on running joint power and kinematics is important for understanding the energy cost burden and potential injury-causative mechanisms associated with load carriage. It was hypothesized that incremental load magnitude would result in phase-specific, joint power and kinematic changes within the stance phase of running, and that these relationships would vary at different running velocities. Thirty-one participants performed running while carrying three load magnitudes (0%, 10%, 20% body weight), at three velocities (3, 4, 5m/s). Lower limb trajectories and ground reaction forces were captured, and global optimization was used to derive the variables. The relationships between load magnitude and joint power and angle vectors, at each running velocity, were analyzed using Statistical Parametric Mapping Canonical Correlation Analysis. Incremental load magnitude was positively correlated to joint power in the second half of stance. Increasing load magnitude was also positively correlated with alterations in three dimensional ankle angles during mid-stance (4.0 and 5.0m/s), knee angles at mid-stance (at 5.0m/s), and hip angles during toe-off (at all velocities). Post hoc analyses indicated that at faster running velocities (4.0 and 5.0m/s), increasing load magnitude appeared to alter power contribution in a distal-to-proximal (ankle→hip) joint sequence from mid-stance to toe-off. In addition, kinematic changes due to increasing load influenced both sagittal and non-sagittal plane lower limb joint angles. This study provides a list of plausible factors that may influence running energy cost and injury risk during load carriage running. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. On nonstationarity and antipersistency in global temperature series

    NASA Astrophysics Data System (ADS)

    KäRner, O.

    2002-10-01

    Statistical analysis is carried out for satellite-based global daily tropospheric and stratospheric temperature anomaly and solar irradiance data sets. The behavior of the series appears to be nonstationary with stationary daily increments. Estimating long-range dependence between the increments reveals a remarkable difference between the two temperature series. The global average tropospheric temperature anomaly behaves similarly to the solar irradiance anomaly. Their daily increments show antipersistency for scales longer than 2 months. This property points to a cumulative negative feedback in the Earth climate system governing the tropospheric variability during the last 22 years. The result emphasizes a dominating role of solar irradiance variability in variations of the tropospheric temperature and gives no support to the theory of anthropogenic climate change. The global average stratospheric temperature anomaly proceeds like a 1-dim random walk at least up to 11 years, allowing a good representation by means of autoregressive integrated moving average (ARIMA) models for the monthly series.
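
    The "random walk with stationary increments" description can be illustrated with a minimal sketch; the data below are a synthetic stand-in (22 years of monthly values), and the ARIMA order is an assumption for illustration, not the order fitted in the paper.

    ```python
    # Minimal sketch: fit an ARIMA model with one order of differencing, so the
    # model describes the (stationary) increments of the anomaly series.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.tsa.stattools import adfuller

    rng = np.random.default_rng(0)
    months = pd.date_range("1979-01", periods=264, freq="MS")            # 22 years
    anomaly = pd.Series(np.cumsum(rng.normal(size=264)), index=months)   # random-walk stand-in

    fit = ARIMA(anomaly, order=(0, 1, 1)).fit()   # assumed order, for illustration
    print(fit.summary())

    # Stationarity of the increments can be checked separately, e.g. with an ADF test.
    print("ADF p-value on increments:", adfuller(anomaly.diff().dropna())[1])
    ```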

  7. Systems Engineering and Integration (SE and I)

    NASA Technical Reports Server (NTRS)

    Chevers, ED; Haley, Sam

    1990-01-01

    The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements include interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; an enhanced automated code generation system slightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project data bases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer aided software engineering; computer systems and software safety; system testability; advanced avionics laboratories; and rapid prototyping. This presentation is represented by viewgraphs only.

  8. The political contradictions of incremental innovation: lessons from pharmaceutical patent examination in Brazil.

    PubMed

    Shadlen, Kenneth C

    2011-01-01

    Neodevelopmental patent regimes aim to facilitate local actors’ access to knowledge and also to encourage incremental innovations. The case of pharmaceutical patent examination in Brazil illustrates political contradictions between these objectives. Brazil’s patent law includes the Ministry of Health in the examination of pharmaceutical patent applications. Though this arrangement is widely celebrated as a health-oriented policy, the Brazilian experience has become fraught with tensions and subject to decreasing levels of both stability and enforcement. I show how one pillar of the neodevelopmental regime, the array of initiatives to encourage incremental innovations, has fostered the acquisition of innovative capabilities in the Brazilian pharmaceutical sector, and how these new capabilities have altered actors’ policy preferences and thus contributed to the erosion of the coalition in support of the other pillar of the neodevelopmental regime, the health-oriented approach to examining pharmaceutical patents. The analysis of capability-derived preference formation points to an endogenous process of coalitional change.

  9. Start-Up and Ongoing Practice Expenses of Behavioral Health and Primary Care Integration Interventions in the Advancing Care Together (ACT) Program.

    PubMed

    Wallace, Neal T; Cohen, Deborah J; Gunn, Rose; Beck, Arne; Melek, Steve; Bechtold, Donald; Green, Larry A

    2015-01-01

    Provide credible estimates of the start-up and ongoing effort and incremental practice expenses for the Advancing Care Together (ACT) behavioral health and primary care integration interventions. Expenditure data were collected from 10 practice intervention sites using an instrument with a standardized general format that could accommodate the unique elements of each intervention. Average start-up effort expenses were $44,076 and monthly ongoing effort expenses per patient were $40.39. Incremental expenses averaged $20,788 for start-up and $4.58 per patient for monthly ongoing activities. Variations in expenditures across practices reflect the differences in intervention specifics and organizational settings. Differences between effort and incremental expenditures reflect the extensive use of existing resources in implementing the interventions. ACT program incremental expenses suggest that widespread adoption would likely have a relatively modest effect on overall health system expenditures. Practice effort expenses are not trivial and may pose barriers to adoption. Payers and purchasers interested in attaining widespread adoption of integrated care must consider external support to practices that accounts for both incremental and effort expense levels. Existing knowledge transfer mechanisms should be employed to minimize developmental start-up expenses, and payment reform focused toward value-based, Triple Aim-oriented reimbursement and purchasing mechanisms is likely needed. © Copyright 2015 by the American Board of Family Medicine.

  10. Impact of chemical plant start-up emissions on ambient ozone concentration

    NASA Astrophysics Data System (ADS)

    Ge, Sijie; Wang, Sujing; Xu, Qiang; Ho, Thomas

    2017-09-01

    Flare emissions, especially start-up flare emissions, during chemical plant operations generate large amounts of ozone precursors that may cause highly localized and transient ground-level ozone increments. Such an adverse ozone impact could be aggravated by the synergies of multiple plant start-ups in an industrial zone. In this paper, a systematic study on ozone increment superposition due to chemical plant start-up emissions has been performed. It employs dynamic flaring profiles of two olefin plants' start-ups to investigate the superposition of the regional 1-hr ozone increment. It also summarizes the superposition trend obtained by manipulating the starting time (00:00-10:00) of plant start-up operations and the plant distance (4-32 km). The study indicates that the ozone increment induced by simultaneous start-up emissions from multiple chemical plants generally does not follow the linear superposition of the ozone increments induced by individual plant start-ups. Meanwhile, the trend of such nonlinear superposition related to the temporal (starting time and operating hours of plant start-ups) and spatial (plant distance) factors is also disclosed. This paper couples dynamic simulations of chemical plant start-up operations with air-quality modeling and statistical methods to examine the regional ozone impact. It can provide technical decision support for cost-effective air-quality and industrial flare emission controls.

  11. Railroad decision support tools for track maintenance.

    DOT National Transportation Integrated Search

    2016-01-01

    North American railroads spend billions of dollars each year on track maintenance. With : expenditures of this level, incremental improvements in planning or execution of maintenance projects can result in either substantial savings or the ability to...

  12. Interpreting support vector machine models for multivariate group wise analysis in neuroimaging

    PubMed Central

    Gaonkar, Bilwaj; Shinohara, Russell T; Davatzikos, Christos

    2015-01-01

    Machine learning based classification algorithms like support vector machines (SVMs) have shown great promise for turning high-dimensional neuroimaging data into clinically useful decision criteria. However, tracing imaging-based patterns that contribute significantly to classifier decisions remains an open problem. This is an issue of critical importance in imaging studies seeking to determine which anatomical or physiological imaging features contribute to the classifier’s decision, thereby allowing users to critically evaluate the findings of such machine learning methods and to understand disease mechanisms. The majority of published work addresses the question of statistical inference for support vector classification using permutation tests based on SVM weight vectors. Such permutation testing ignores the SVM margin, which is critical in SVM theory. In this work we emphasize the use of a statistic that explicitly accounts for the SVM margin and show that the null distributions associated with this statistic are asymptotically normal. Further, our experiments show that this statistic is considerably less conservative than weight-based permutation tests, yet specific enough to tease out multivariate patterns in the data. Thus, we can better understand the multivariate patterns that the SVM uses for neuroimaging-based classification. PMID:26210913
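
    For context, the sketch below shows the weight-based permutation test that the abstract argues is overly conservative; the paper's margin-aware statistic is not reproduced here, and the data are synthetic stand-ins for neuroimaging features.

    ```python
    # Weight-based permutation test on a linear SVM (the baseline being critiqued).
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 200))      # 60 subjects x 200 imaging features (synthetic)
    y = rng.integers(0, 2, size=60)
    X[y == 1, :5] += 0.8                # make a few features informative

    def svm_weights(X, y):
        return SVC(kernel="linear", C=1.0).fit(X, y).coef_.ravel()

    observed = svm_weights(X, y)
    null = np.array([svm_weights(X, rng.permutation(y)) for _ in range(1000)])

    # Per-feature p-value: fraction of permuted |weights| at least as large as observed.
    p = (np.abs(null) >= np.abs(observed)).mean(axis=0)
    print("features with p < 0.05:", np.where(p < 0.05)[0])
    ```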

  13. Sparse kernel methods for high-dimensional survival data.

    PubMed

    Evers, Ludger; Messow, Claudia-Martina

    2008-07-15

    Sparse kernel methods like support vector machines (SVM) have been applied with great success to classification and (standard) regression settings. Existing support vector classification and regression techniques, however, are not suitable for partly censored survival data, which are typically analysed using Cox's proportional hazards model. As the partial likelihood of the proportional hazards model only depends on the covariates through inner products, it can be 'kernelized'. The kernelized proportional hazards model, however, yields a solution that is dense, i.e. the solution depends on all observations. One of the key features of an SVM is that it yields a sparse solution, depending only on a small fraction of the training data. We propose two methods. One is based on a geometric idea, where, akin to support vector classification, the margin between the failed observation and the observations currently at risk is maximised. The other approach is based on obtaining a sparse model by adding observations one after another, akin to the Import Vector Machine (IVM). The data examples studied suggest that both methods can outperform competing approaches. Software is available under the GNU Public License as an R package and can be obtained from the first author's website http://www.maths.bris.ac.uk/~maxle/software.html.

  14. Intelligent Design of Metal Oxide Gas Sensor Arrays Using Reciprocal Kernel Support Vector Regression

    NASA Astrophysics Data System (ADS)

    Dougherty, Andrew W.

    Metal oxides are a staple of the sensor industry. The combination of their sensitivity to a number of gases and the electrical nature of their sensing mechanism makes them particularly attractive in solid state devices. The high temperature stability of the ceramic material also makes them ideal for detecting combustion byproducts where exhaust temperatures can be high. However, problems do exist with metal oxide sensors. They are not very selective, as they all tend to be sensitive to a number of reduction and oxidation reactions on the oxide's surface. This makes arrays with large numbers of sensors interesting to study as a method for introducing orthogonality to the system. Also, the sensors tend to suffer from long term drift for a number of reasons. In this thesis I will develop a system for intelligently modeling metal oxide sensors and determining their suitability for use in large arrays designed to analyze exhaust gas streams. It will introduce prior knowledge of the metal oxide sensors' response mechanisms in order to produce a response function for each sensor from sparse training data. The system will use the same technique to model and remove any long term drift from the sensor response. It will also provide an efficient means for determining the orthogonality of the sensors to determine whether they are useful in gas sensing arrays. The system is based on least squares support vector regression using the reciprocal kernel. The reciprocal kernel is introduced along with a method of optimizing the free parameters of the reciprocal kernel support vector machine. The reciprocal kernel is shown to be simpler and to perform better than an earlier kernel, the modified reciprocal kernel. Least squares support vector regression is chosen as it uses all of the training points, and an emphasis was placed throughout this research on extracting the maximum information from very sparse data. The reciprocal kernel is shown to be effective in modeling the sensor responses in the time, gas and temperature domains, and the dual representation of the support vector regression solution is shown to provide insight into the sensor's sensitivity and potential orthogonality. Finally, the dual weights of the support vector regression solution to the sensor's response are suggested as a fitness function for a genetic algorithm, or some other method for efficiently searching large parameter spaces.

  15. Non-linear non-local molecular electrodynamics with nano-optical fields.

    PubMed

    Chernyak, Vladimir Y; Saurabh, Prasoon; Mukamel, Shaul

    2015-10-28

    The interaction of optical fields sculpted on the nano-scale with matter may not be described by the dipole approximation since the fields may vary appreciably across the molecular length scale. Rather than incrementally adding higher multipoles, it is advantageous and more physically transparent to describe the optical process using non-local response functions that intrinsically include all multipoles. We present a semi-classical approach for calculating non-local response functions based on the minimal coupling Hamiltonian. The first, second, and third order response functions are expressed in terms of correlation functions of the charge and the current densities. This approach is based on the gauge invariant current rather than the polarization, and on the vector potential rather than the electric and magnetic fields.

  16. Translation Optics for 30 cm Ion Engine Thrust Vector Control

    NASA Technical Reports Server (NTRS)

    Haag, Thomas

    2002-01-01

    Data were obtained from a 30 cm xenon ion thruster in which the accelerator grid was translated in the radial plane. The thruster was operated at three different throttle power levels, and the accelerator grid was incrementally translated in the X, Y, and azimuthal directions. Plume data were obtained downstream from the thruster using a Faraday probe mounted to a positioning system. Successive probe sweeps revealed variations in the plume direction. Thruster perveance, electron backstreaming limit, accelerator current, and plume deflection angle were taken at each power level and for each accelerator grid position. Results showed that the thruster plume could easily be deflected up to six degrees without a prohibitive increase in accelerator impingement current. Results were similar in both the X and Y directions.

  17. Vectoring of parallel synthetic jets: A parametric study

    NASA Astrophysics Data System (ADS)

    Berk, Tim; Gomit, Guillaume; Ganapathisubramani, Bharathram

    2016-11-01

    The vectoring of a pair of parallel synthetic jets can be described using five dimensionless parameters: the aspect ratio of the slots, the Strouhal number, the Reynolds number, the phase difference between the jets and the spacing between the slots. In the present study, the influence of the latter four on the vectoring behaviour of the jets is examined experimentally using particle image velocimetry. Time-averaged velocity maps are used to study the variations in vectoring behaviour for a parametric sweep of each of the four parameters independently. A topological map is constructed for the full four-dimensional parameter space. The vectoring behaviour is described both qualitatively and quantitatively. A vectoring mechanism is proposed, based on measured vortex positions. We acknowledge the financial support from the European Research Council (ERC Grant Agreement No. 277472).

  18. Determination of Anti-Adeno-Associated Virus Vector Neutralizing Antibody Titer with an In Vitro Reporter System

    PubMed Central

    Meliani, Amine; Leborgne, Christian; Triffault, Sabrina; Jeanson-Leh, Laurence; Veron, Philippe

    2015-01-01

    Adeno-associated virus (AAV) vectors are a platform of choice for in vivo gene transfer applications. However, neutralizing antibodies (NAb) to AAV can be found in humans and some animal species as a result of exposure to the wild-type virus, and high-titer NAb develop following AAV vector administration. In some conditions, anti-AAV NAb can block transduction with AAV vectors even when present at low titers, thus requiring prescreening before vector administration. Here we describe an improved in vitro, cell-based assay for the determination of NAb titer in serum or plasma samples. The assay is easy to set up, is sensitive and, depending on the purpose, can be validated to support clinical development of gene therapy products based on AAV vectors. PMID:25819687

  19. Simulating Flaring Events via an Intelligent Cellular Automata Mechanism

    NASA Astrophysics Data System (ADS)

    Dimitropoulou, M.; Vlahos, L.; Isliker, H.; Georgoulis, M.

    2010-07-01

    We simulate flaring events through a Cellular Automaton (CA) model in which, for the first time, we use observed vector magnetograms as initial conditions. After non-linear force-free extrapolation of the magnetic field from the vector magnetograms, we identify magnetic discontinuities using two alternative criteria: (1) the average magnetic field gradient, or (2) the normalized magnetic field curl (i.e. the current). Magnetic discontinuities are identified at the grid sites where the magnetic field gradient or curl exceeds a specified threshold. We then relax the magnetic discontinuities according to the rules of Lu and Hamilton (1991) or Lu et al. (1993), i.e. we redistribute the magnetic field locally so that the discontinuities disappear. In order to simulate the flaring events, we consider several alternative scenarios with regard to: (1) the threshold above which magnetic discontinuities are identified (applying low, high, and height-dependent threshold values); (2) the driving process that occasionally causes new discontinuities (at randomly chosen grid sites, magnetic field increments are added that are perpendicular, or possibly also parallel, to the existing magnetic field). We address the question whether the coronal active region magnetic fields can indeed be considered to be in a state of self-organized criticality (SOC).

  20. Dynamic RSA: Examining parasympathetic regulatory dynamics via vector-autoregressive modeling of time-varying RSA and heart period.

    PubMed

    Fisher, Aaron J; Reeves, Jonathan W; Chi, Cyrus

    2016-07-01

    Expanding on recently published methods, the current study presents an approach to estimating the dynamic, regulatory effect of the parasympathetic nervous system on heart period on a moment-to-moment basis. We estimated second-to-second variation in respiratory sinus arrhythmia (RSA) in order to estimate the contemporaneous and time-lagged relationships among RSA, interbeat interval (IBI), and respiration rate via vector autoregression. Moreover, we modeled these relationships at lags of 1 s to 10 s, in order to evaluate the optimal latency for estimating dynamic RSA effects. The IBI(t) on RSA(t-n) regression parameter was extracted from individual models as an operationalization of the regulatory effect of RSA on IBI, referred to as dynamic RSA (dRSA). Dynamic RSA correlated positively with standard averages of heart rate and negatively with standard averages of RSA. We propose that dRSA reflects the active downregulation of heart period by the parasympathetic nervous system and thus represents a novel metric that provides incremental validity in the measurement of autonomic cardiac control; specifically, a method by which parasympathetic regulatory effects can be measured in process. © 2016 Society for Psychophysiological Research.
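
    A minimal sketch of the vector-autoregressive step, assuming second-by-second series and hypothetical column names; the synthetic data and the chosen lag are placeholders for the paper's RSA, interbeat interval and respiration estimates.

    ```python
    # Extract the IBI(t)-on-RSA(t-n) coefficient ("dynamic RSA") from a VAR model.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(0)
    n = 300  # seconds of second-by-second estimates (synthetic stand-ins)
    df = pd.DataFrame({"rsa": rng.normal(size=n),
                       "ibi": rng.normal(size=n),
                       "resp": rng.normal(size=n)})

    lag = 4                       # candidate latency; the paper scans 1 s to 10 s
    res = VAR(df).fit(lag)

    # res.coefs[l, i, j] is the effect of variable j at lag l+1 on equation i.
    ibi_eq = list(df.columns).index("ibi")
    rsa_var = list(df.columns).index("rsa")
    print("dRSA (IBI(t) on RSA(t-%d)):" % lag, res.coefs[lag - 1, ibi_eq, rsa_var])
    ```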

  1. CPT1{alpha} over-expression increases long-chain fatty acid oxidation and reduces cell viability with incremental palmitic acid concentration in 293T cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jambor de Sousa, Ulrike L.; Koss, Michael D.; Fillies, Marion

    2005-12-16

    To test the cellular response to an increased fatty acid oxidation, we generated a vector for inducible expression of the rate-limiting enzyme carnitine palmitoyl-transferase 1{alpha} (CPT1{alpha}). Human embryonic 293T kidney cells were transiently transfected, and expression of the CPT1{alpha} transgene in the tet-on vector was activated with doxycycline. Fatty acid oxidation was measured by determining the conversion of supplemented, synthetic cis-10-heptadecenoic acid (C17:1n-7) to C15:1n-7. CPT1{alpha} over-expression increased mitochondrial long-chain fatty acid oxidation about 6-fold. Addition of palmitic acid (PA) decreased viability of CPT1{alpha} over-expressing cells in a concentration-dependent manner. Both PA and CPT1{alpha} over-expression increased cell death. Interestingly, PA reduced total cell number only in cells over-expressing CPT1{alpha}, suggesting an effect on cell proliferation that requires PA translocation across the mitochondrial inner membrane. This inducible expression system should be well suited to study the roles of CPT1 and fatty acid oxidation in lipotoxicity and metabolism in vivo.

  2. Matrix Multiplication Algorithm Selection with Support Vector Machines

    DTIC Science & Technology

    2015-05-01

    libraries that could intelligently choose the optimal algorithm for a particular set of inputs. Users would be oblivious to the underlying algorithmic...SAT.” J. Artif . Intell. Res.(JAIR), vol. 32, pp. 565–606, 2008. [9] M. G. Lagoudakis and M. L. Littman, “Algorithm selection using reinforcement...Artificial Intelligence , vol. 21, no. 05, pp. 961–976, 2007. [15] C.-C. Chang and C.-J. Lin, “LIBSVM: A library for support vector machines,” ACM

  3. Support Vector Data Description Model to Map Specific Land Cover with Optimal Parameters Determined from a Window-Based Validation Set.

    PubMed

    Zhang, Jinshui; Yuan, Zhoumiqi; Shuai, Guanyuan; Pan, Yaozhong; Zhu, Xiufang

    2017-04-26

    This paper developed an approach, the window-based validation set for support vector data description (WVS-SVDD), to determine optimal parameters for the support vector data description (SVDD) model to map specific land cover by integrating training and window-based validation sets. Compared to the conventional approach, where the validation set included target and outlier pixels selected visually and randomly, the validation set derived from WVS-SVDD constructed a tightened hypersphere because of the compact constraint imposed by the outlier pixels, which were located neighbouring the target class in the spectral feature space. The overall accuracies achieved for wheat and bare land were as high as 89.25% and 83.65%, respectively. However, the target class was underestimated because the validation set covered only a small fraction of the heterogeneous spectra of the target class. Different window sizes were then tested to acquire more wheat pixels for the validation set. The results showed that classification accuracy increased with increasing window size, and the overall accuracies were higher than 88% at all window sizes. Moreover, WVS-SVDD showed much less sensitivity to the untrained classes than the multi-class support vector machine (SVM) method. Therefore, the developed method showed its merits in using the optimal parameters, the tradeoff coefficient (C) and kernel width (s), in mapping a homogeneous specific land cover.
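
    A minimal sketch of the parameter-selection idea, assuming hypothetical pixel arrays; scikit-learn's one-class SVM (closely related to SVDD when an RBF kernel is used) stands in for the SVDD implementation, and the grids over the trade-off and kernel-width parameters are illustrative.

    ```python
    # Choose (nu, gamma) by accuracy on a validation set of target and outlier pixels.
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    X_train = rng.normal(0.0, size=(200, 6))                 # target-class training spectra
    X_val = np.vstack([rng.normal(0.0, size=(50, 6)),        # target validation pixels
                       rng.normal(2.0, size=(50, 6))])       # neighbouring outlier pixels
    y_val = np.r_[np.ones(50), -np.ones(50)]                 # +1 target, -1 outlier

    best = None
    for nu in (0.01, 0.05, 0.1, 0.2):          # plays the role of the trade-off coefficient C
        for gamma in (0.01, 0.1, 1.0, 10.0):   # plays the role of the kernel width s
            acc = (OneClassSVM(kernel="rbf", nu=nu, gamma=gamma)
                   .fit(X_train).predict(X_val) == y_val).mean()
            if best is None or acc > best[0]:
                best = (acc, nu, gamma)

    print("best (accuracy, nu, gamma):", best)
    ```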

  4. Analysis of the horizontal structure of a measurement and control geodetic network based on entropy

    NASA Astrophysics Data System (ADS)

    Mrówczyńska, Maria

    2013-06-01

    The paper attempts to determine an optimum structure of a directional measurement and control network intended for investigating horizontal displacements. For this purpose it uses the notion of entropy as a logarithmic measure of the probability of the state of a particular observation system. An optimum number of observations results from the difference in the entropy of the vector of parameters, ΔH_X̂(x), corresponding to one extra observation. An increment of entropy, interpreted as an increment of the amount of information about the state of the system, determines the adoption or rejection of another extra observation.

  5. Geometrically Nonlinear Finite Element Analysis of a Composite Space Reflector

    NASA Technical Reports Server (NTRS)

    Lee, Kee-Joo; Leet, Sung W.; Clark, Greg; Broduer, Steve (Technical Monitor)

    2001-01-01

    Lightweight aerospace structures, such as low areal density composite space reflectors, are highly flexible and may undergo large deflection under applied loading, especially during the launch phase. Accordingly, geometrically nonlinear analysis that takes into account the effect of finite rotation may be needed to determine the deformed shape for a clearance check and the stress and strain state to ensure structural integrity. In this study, deformation of the space reflector is determined under static conditions using a geometrically nonlinear solid shell finite element model. For the solid shell element formulation, the kinematics of deformation is described by six variables that are purely vector components. Because rotational angles are not used, this approach is free of the limitations of small angle increments. This also allows easy connections between substructures and large load increments with respect to the conventional shell formulation using rotational parameters. Geometrically nonlinear analyses were carried out for three cases of static point loads applied at selected points. A chart shows results for a case when the load is applied at the center point of the reflector dish. The computed results capture the nonlinear behavior of the composite reflector as the applied load increases. Also, they are in good agreement with the data obtained by experiments.

  6. Multi-resolution extension for transmission of geodata in a mobile context

    NASA Astrophysics Data System (ADS)

    Follin, Jean-Michel; Bouju, Alain; Bertrand, Frédéric; Boursier, Patrice

    2005-03-01

    A solution is proposed for the management of multi-resolution vector data in a mobile spatial information visualization system. The client-server architecture and the models of data and transfer of the system are presented first. The aim of this system is to reduce data exchanged between client and server by reusing data already present on the client side. Then, an extension of this system to multi-resolution data is proposed. Our solution is based on the use of increments in a multi-scale database. A database architecture where data sets for different predefined scales are precomputed and stored on the server side is adopted. In this model, each object representing the same real world entities at different levels of detail has to be linked beforehand. Increments correspond to the difference between two datasets with different levels of detail. They are transmitted in order to increase (or decrease) the detail to the client upon request. They include generalization and refinement operators allowing transitions between the different levels. Finally, a framework suited to the transfer of multi-resolution data in a mobile context is presented. This allows reuse of data locally available at different levels of detail and, in this way, reduces the amount of data transferred between client and server.

  7. Vectoring of parallel synthetic jets

    NASA Astrophysics Data System (ADS)

    Berk, Tim; Ganapathisubramani, Bharathram; Gomit, Guillaume

    2015-11-01

    A pair of parallel synthetic jets can be vectored by applying a phase difference between the two driving signals. The resulting jet can be merged or bifurcated and either vectored towards the actuator leading in phase or the actuator lagging in phase. In the present study, the influence of phase difference and Strouhal number on the vectoring behaviour is examined experimentally. Phase-locked vorticity fields, measured using Particle Image Velocimetry (PIV), are used to track vortex pairs. The physical mechanisms that explain the diversity in vectoring behaviour are observed based on the vortex trajectories. For a fixed phase difference, the vectoring behaviour is shown to be primarily influenced by pinch-off time of vortex rings generated by the synthetic jets. Beyond a certain formation number, the pinch-off timescale becomes invariant. In this region, the vectoring behaviour is determined by the distance between subsequent vortex rings. We acknowledge the financial support from the European Research Council (ERC grant agreement no. 277472).

  8. Power line identification of millimeter wave radar based on PCA-GS-SVM

    NASA Astrophysics Data System (ADS)

    Fang, Fang; Zhang, Guifeng; Cheng, Yansheng

    2017-12-01

    To address the safety risk that power lines pose to ultra-low-altitude UAV flight, which existing detection methods cannot effectively resolve, a power line recognition method based on grid search (GS) with principal component analysis and support vector machine (PCA-SVM) is proposed. Firstly, the candidate lines from the Hough transform are reduced by PCA, and the main features of the candidate lines are extracted. Then, the support vector machine (SVM) is optimized by the grid search (GS) method. Finally, the candidate lines are classified using the support vector machine classifier with the optimized parameters. MATLAB simulation results show that this method can effectively distinguish power lines from noise, and has high recognition accuracy and algorithm efficiency.
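
    A minimal sketch of the PCA-GS-SVM pipeline on synthetic stand-in features; the feature dimensions, parameter grids and labels are assumptions for illustration, not values from the paper.

    ```python
    # PCA reduces the Hough-candidate features; grid search tunes the SVM classifier.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import Pipeline
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 12))        # features of candidate lines (synthetic)
    y = rng.integers(0, 2, size=300)      # 1 = power line, 0 = noise

    pipe = Pipeline([("pca", PCA(n_components=4)), ("svm", SVC(kernel="rbf"))])
    grid = GridSearchCV(pipe,
                        param_grid={"svm__C": [0.1, 1, 10, 100],
                                    "svm__gamma": [0.01, 0.1, 1.0]},
                        cv=5)
    grid.fit(X, y)
    print("best parameters:", grid.best_params_)
    print("cross-validated accuracy:", grid.best_score_)
    ```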

  9. Experimental Investigation for Fault Diagnosis Based on a Hybrid Approach Using Wavelet Packet and Support Vector Classification

    PubMed Central

    Li, Pengfei; Jiang, Yongying; Xiang, Jiawei

    2014-01-01

    To deal with the difficulty of obtaining a large number of fault samples under practical conditions for mechanical fault diagnosis, a hybrid method that combines wavelet packet decomposition and support vector classification (SVC) is proposed. The wavelet packet is employed to decompose the vibration signal to obtain the energy ratio in each frequency band. Taking the energy ratios as feature vectors, the pattern recognition results are obtained by the SVC. The rolling bearing and gear fault diagnostic results on a typical experimental platform show that the present approach is robust to noise, has higher classification accuracy and, thus, provides a better way to diagnose mechanical faults under the condition of small fault samples. PMID:24688361
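
    A minimal sketch of the feature-extraction step, assuming a wavelet family, decomposition level and synthetic vibration records that are not those of the paper; each signal is reduced to its wavelet-packet energy ratio per frequency band before SVC training.

    ```python
    import numpy as np
    import pywt
    from sklearn.svm import SVC

    def energy_ratio_features(signal, wavelet="db4", level=3):
        # Wavelet-packet decomposition; one node per frequency band at the chosen level.
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
        nodes = wp.get_level(level, order="freq")
        energies = np.array([np.sum(np.square(node.data)) for node in nodes])
        return energies / energies.sum()          # energy ratio in each band

    rng = np.random.default_rng(0)
    signals = rng.normal(size=(40, 2048))         # stand-in vibration records
    labels = rng.integers(0, 2, size=40)          # e.g. normal vs. faulty bearing

    X = np.array([energy_ratio_features(s) for s in signals])
    clf = SVC(kernel="rbf", C=10).fit(X, labels)
    print("training accuracy:", clf.score(X, labels))
    ```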

  10. Feature selection using a one dimensional naïve Bayes' classifier increases the accuracy of support vector machine classification of CDR3 repertoires.

    PubMed

    Cinelli, Mattia; Sun, Yuxin; Best, Katharine; Heather, James M; Reich-Zeliger, Shlomit; Shifrut, Eric; Friedman, Nir; Shawe-Taylor, John; Chain, Benny

    2017-04-01

    Somatic DNA recombination, the hallmark of vertebrate adaptive immunity, has the potential to generate a vast diversity of antigen receptor sequences. How this diversity captures antigen specificity remains incompletely understood. In this study we use high throughput sequencing to compare the global changes in T cell receptor β chain complementarity determining region 3 (CDR3β) sequences following immunization with ovalbumin administered with complete Freund's adjuvant (CFA) or CFA alone. The CDR3β sequences were deconstructed into short stretches of overlapping contiguous amino acids. The resulting motifs were ranked according to a one-dimensional Bayesian classifier score comparing their frequency in the repertoires of the two immunization classes. The top-ranking motifs were selected and used to create feature vectors which were used to train a support vector machine. The support vector machine achieved high classification scores in a leave-one-out validation test, reaching >90% in some cases. The study describes a novel two-stage classification strategy combining a one-dimensional Bayesian classifier with a support vector machine. Using this approach we demonstrate that the frequency of a small number of linear motifs three amino acids in length can accurately identify a CD4 T cell response to ovalbumin against a background response to the complex mixture of antigens which characterize complete Freund's adjuvant. The sequence data are available at www.ncbi.nlm.nih.gov/sra/?term=SRP075893. The Decombinator package is available at github.com/innate2adaptive/Decombinator. The R package e1071 is available at the CRAN repository https://cran.r-project.org/web/packages/e1071/index.html. b.chain@ucl.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
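
    A minimal sketch of the two-stage idea on toy data: 3-mer frequency features are ranked by a univariate score and the top-ranked motifs feed a linear SVM. The sequences are made up, and a standard F-score stands in for the paper's one-dimensional Bayesian classifier score.

    ```python
    from collections import Counter
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.svm import SVC

    def kmer_counts(seq, k=3):
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    # Hypothetical CDR3 sequences; 1 = OVA + CFA, 0 = CFA alone.
    repertoire = ["CASSLGQGAYEQYF", "CASSPDRGNTLYF", "CASSQETQYF",
                  "CASSLRTGELFF", "CASSLGETQYF", "CASSPGQGNTLYF"]
    labels = np.array([1, 1, 1, 0, 0, 0])

    vocab = sorted({m for s in repertoire for m in kmer_counts(s)})
    X = np.array([[kmer_counts(s).get(m, 0) for m in vocab] for s in repertoire], float)

    selector = SelectKBest(f_classif, k=min(10, X.shape[1])).fit(X, labels)
    clf = SVC(kernel="linear").fit(selector.transform(X), labels)
    print("top-ranked motifs:", [vocab[i] for i in selector.get_support(indices=True)])
    ```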

  11. Earth observation in support of malaria control and epidemiology: MALAREO monitoring approaches.

    PubMed

    Franke, Jonas; Gebreslasie, Michael; Bauwens, Ides; Deleu, Julie; Siegert, Florian

    2015-06-03

    Malaria affects about half of the world's population, with the vast majority of cases occurring in Africa. National malaria control programmes aim to reduce the burden of malaria and its negative socioeconomic effects by using various control strategies (e.g. vector control, environmental management and case tracking). Vector control is the most effective transmission prevention strategy, while environmental factors are the key parameters affecting transmission. Geographic information systems (GIS), earth observation (EO) and spatial modelling are increasingly being recognised as valuable tools for effective management and malaria vector control. Issues previously inhibiting the use of EO in epidemiology and malaria control, such as poor satellite sensor performance, high costs and long turnaround times, have since been resolved through modern technology. The core goal of this study was to develop and implement the capabilities of EO data for national malaria control programmes in South Africa, Swaziland and Mozambique. High- and very high resolution (HR and VHR) land cover and wetland maps were generated for the identification of potential vector habitats and human activities, as well as geoinformation on distance to wetlands for malaria risk modelling, population density maps, habitat foci maps and VHR household maps. These products were further used for modelling malaria incidence and analysing the environmental factors that favour vector breeding. Geoproducts were also transferred to the staff of national malaria control programmes in seven African countries to demonstrate how EO data and GIS can support vector control strategy planning and monitoring. The transferred EO products support a better epidemiological understanding of environmental factors related to malaria transmission, and allow for spatio-temporal targeting of malaria control interventions, thereby improving the cost-effectiveness of interventions.

  12. Seminal quality prediction using data mining methods.

    PubMed

    Sahoo, Anoop J; Kumar, Yugal

    2014-01-01

    Nowadays, new classes of diseases known as lifestyle diseases have come into existence. The main reasons behind these diseases are changes in people's lifestyles, such as alcohol consumption, smoking and food habits. Among the various lifestyle diseases, it has been found that fertility rates (sperm quantity) in men have decreased considerably over the last two decades. Lifestyle factors as well as environmental factors are mainly responsible for the change in semen quality. The objective of this paper is to identify the lifestyle and environmental features that affect seminal quality, and hence the fertility rate in men, using data mining methods. Five artificial intelligence techniques, namely multilayer perceptron (MLP), decision tree (DT), naïve Bayes (kernel), support vector machine plus particle swarm optimization (SVM+PSO) and support vector machine (SVM), have been applied to a fertility dataset to evaluate seminal quality and to predict whether a person is normal or has an altered fertility rate. Eight feature selection techniques, namely support vector machine (SVM), neural network (NN), evolutionary logistic regression (LR), SVM+PSO, principal component analysis (PCA), the chi-square test, correlation and the T-test, have been used to identify the features that most affect seminal quality. These techniques were applied to a fertility dataset containing 100 instances with nine attributes and two classes. The experimental results show that SVM+PSO provides higher accuracy and area under the curve (AUC) (94% and 0.932) than multilayer perceptron (92% and 0.728), support vector machine (91% and 0.758), naïve Bayes (kernel) (89% and 0.850) and decision tree (89% and 0.735) for some of the seminal parameters. This paper also focuses on the feature selection process, i.e. how to select the features that are most important for predicting fertility rate. The results show that the childish diseases (0.079) and high fever (0.057) features have less impact on fertility rate, while the age (0.8685), season (0.843), surgical intervention (0.7683), alcohol consumption (0.5992), smoking habit (0.575), number of hours spent sitting (0.4366) and accident (0.5973) features have more impact. It is also observed that feature selection increases the accuracy of the above techniques (multilayer perceptron 92%, support vector machine 91%, SVM+PSO 94%, naïve Bayes (kernel) 89% and decision tree 89%) compared with no feature selection (multilayer perceptron 86%, support vector machine 86%, SVM+PSO 85%, naïve Bayes (kernel) 83% and decision tree 84%), which shows the applicability of feature selection in prediction. This paper highlights the application of artificial intelligence techniques in the medical domain. It can be concluded that data mining methods can be used to predict whether a person has normal or altered fertility based on environmental and lifestyle parameters, rather than requiring various medical tests. Of the five data mining techniques used to predict fertility rate, SVM+PSO provides more accurate results than support vector machine and decision tree.

  13. A Collaborative Framework for Distributed Privacy-Preserving Support Vector Machine Learning

    PubMed Central

    Que, Jialan; Jiang, Xiaoqian; Ohno-Machado, Lucila

    2012-01-01

    A Support Vector Machine (SVM) is a popular tool for decision support. The traditional way to build an SVM model is to estimate parameters based on a centralized repository of data. However, in the field of biomedicine, patient data are sometimes stored in local repositories or institutions where they were collected, and may not be easily shared due to privacy concerns. This creates a substantial barrier for researchers to effectively learn from the distributed data using machine learning tools like SVMs. To overcome this difficulty and promote efficient information exchange without sharing sensitive raw data, we developed a Distributed Privacy Preserving Support Vector Machine (DPP-SVM). The DPP-SVM enables privacy-preserving collaborative learning, in which a trusted server integrates “privacy-insensitive” intermediary results. The globally learned model is guaranteed to be exactly the same as learned from combined data. We also provide a free web-service (http://privacy.ucsd.edu:8080/ppsvm/) for multiple participants to collaborate and complete the SVM-learning task in an efficient and privacy-preserving manner. PMID:23304414

  14. Gender bias in leader evaluations: merging implicit theories and role congruity perspectives.

    PubMed

    Hoyt, Crystal L; Burnette, Jeni L

    2013-10-01

    This research extends our understanding of gender bias in leader evaluations by merging role congruity and implicit theory perspectives. We tested and found support for the prediction that the link between people's attitudes regarding women in authority and their subsequent gender-biased leader evaluations is significantly stronger for entity theorists (those who believe attributes are fixed) relative to incremental theorists (those who believe attributes are malleable). In Study 1, 147 participants evaluated male and female gubernatorial candidates. Results supported predictions, demonstrating that traditional attitudes toward women in authority significantly predicted a pro-male gender bias in leader evaluations (and progressive attitudes predicted a pro-female gender bias) with an especially strong effect for those with more entity-oriented, relative to incrementally oriented person theories. Study 2 (119 participants) replicated these findings and demonstrated the mediating role of these attitudes in linking gender stereotypes and leader role expectations to biased evaluations.

  15. Differentiation of Enhancing Glioma and Primary Central Nervous System Lymphoma by Texture-Based Machine Learning.

    PubMed

    Alcaide-Leon, P; Dufort, P; Geraldo, A F; Alshafai, L; Maralani, P J; Spears, J; Bharatha, A

    2017-06-01

    Accurate preoperative differentiation of primary central nervous system lymphoma and enhancing glioma is essential to avoid unnecessary neurosurgical resection in patients with primary central nervous system lymphoma. The purpose of the study was to evaluate the diagnostic performance of a machine-learning algorithm by using texture analysis of contrast-enhanced T1-weighted images for differentiation of primary central nervous system lymphoma and enhancing glioma. Seventy-one adult patients with enhancing gliomas and 35 adult patients with primary central nervous system lymphomas were included. The tumors were manually contoured on contrast-enhanced T1WI, and the resulting volumes of interest were mined for textural features and subjected to a support vector machine-based machine-learning protocol. Three readers classified the tumors independently on contrast-enhanced T1WI. Areas under the receiver operating characteristic curves were estimated for each reader and for the support vector machine classifier. A noninferiority test for diagnostic accuracy based on paired areas under the receiver operating characteristic curve was performed with a noninferiority margin of 0.15. The mean areas under the receiver operating characteristic curve were 0.877 (95% CI, 0.798-0.955) for the support vector machine classifier; 0.878 (95% CI, 0.807-0.949) for reader 1; 0.899 (95% CI, 0.833-0.966) for reader 2; and 0.845 (95% CI, 0.757-0.933) for reader 3. The mean area under the receiver operating characteristic curve of the support vector machine classifier was significantly noninferior to the mean area under the curve of reader 1 ( P = .021), reader 2 ( P = .035), and reader 3 ( P = .007). Support vector machine classification based on textural features of contrast-enhanced T1WI is noninferior to expert human evaluation in the differentiation of primary central nervous system lymphoma and enhancing glioma. © 2017 by American Journal of Neuroradiology.

  16. Preparation for a first-in-man lentivirus trial in patients with cystic fibrosis

    PubMed Central

    Alton, Eric W F W; Beekman, Jeffery M; Boyd, A Christopher; Brand, June; Carlon, Marianne S; Connolly, Mary M; Chan, Mario; Conlon, Sinead; Davidson, Heather E; Davies, Jane C; Davies, Lee A; Dekkers, Johanna F; Doherty, Ann; Gea-Sorli, Sabrina; Gill, Deborah R; Griesenbach, Uta; Hasegawa, Mamoru; Higgins, Tracy E; Hironaka, Takashi; Hyndman, Laura; McLachlan, Gerry; Inoue, Makoto; Hyde, Stephen C; Innes, J Alastair; Maher, Toby M; Moran, Caroline; Meng, Cuixiang; Paul-Smith, Michael C; Pringle, Ian A; Pytel, Kamila M; Rodriguez-Martinez, Andrea; Schmidt, Alexander C; Stevenson, Barbara J; Sumner-Jones, Stephanie G; Toshner, Richard; Tsugumine, Shu; Wasowicz, Marguerite W; Zhu, Jie

    2017-01-01

    We have recently shown that non-viral gene therapy can stabilise the decline of lung function in patients with cystic fibrosis (CF). However, the effect was modest, and more potent gene transfer agents are still required. Fusion protein (F)/hemagglutinin-neuraminidase protein (HN)-pseudotyped lentiviral vectors are more efficient for lung gene transfer than non-viral vectors in preclinical models. In preparation for a first-in-man CF trial using the lentiviral vector, we have undertaken key translational preclinical studies. Regulatory-compliant vectors carrying a range of promoter/enhancer elements were assessed in mice and human air–liquid interface (ALI) cultures to select the lead candidate; cystic fibrosis transmembrane conductance regulator (CFTR) expression and function were assessed in CF models using this lead candidate vector. Toxicity was assessed and ‘benchmarked’ against the leading non-viral formulation recently used in a Phase IIb clinical trial. Integration site profiles were mapped and transduction efficiency determined to inform clinical trial dose-ranging. The impact of pre-existing and acquired immunity against the vector, and vector stability in several clinically relevant delivery devices, were assessed. A hybrid, cytosine guanine dinucleotide (CpG)-free promoter (hCEF), consisting of the elongation factor 1α promoter and the cytomegalovirus enhancer, was most efficacious in both murine lungs and human ALI cultures (both at least 2-log orders above background). The efficacy (at least 14% of airway cells transduced), toxicity and integration site profile support further progression towards a clinical trial, and pre-existing and acquired immune responses do not interfere with vector efficacy. The lead rSIV.F/HN candidate expresses functional CFTR, and the vector retains 90–100% transduction efficiency in clinically relevant delivery devices. The data support the progression of the F/HN-pseudotyped lentiviral vector into a first-in-man CF trial in 2017. PMID:27852956

  17. A new approach of objective quality evaluation on JPEG2000 lossy-compressed lung cancer CT images

    NASA Astrophysics Data System (ADS)

    Cai, Weihua; Tan, Yongqiang; Zhang, Jianguo

    2007-03-01

    Image compression has been used to increase communication efficiency and storage capacity. JPEG 2000 compression, based on the wavelet transformation, has advantages compared to other compression methods, such as ROI coding, error resilience, adaptive binary arithmetic coding and an embedded bit-stream. However, it remains difficult to find an objective method to evaluate the image quality of lossy-compressed medical images. In this paper, we present an approach to evaluate image quality by using a computer aided diagnosis (CAD) system. We selected 77 cases of CT images, bearing benign and malignant lung nodules with confirmed pathology, from our clinical Picture Archiving and Communication System (PACS). We have developed a prototype CAD system to classify these images into benign and malignant ones, the performance of which was evaluated by receiver operating characteristic (ROC) curves. We first used JPEG 2000 to compress these images at different compression ratios, from lossless to lossy, used the CAD system to classify the cases at each compression ratio, and then compared the ROC curves from the CAD classification results. Support vector machine (SVM) and neural networks (NN) were used to classify the malignancy of input nodules. In each approach, we found that the area under the ROC curve (AUC) decreased with increasing compression ratio, with small fluctuations.

  18. Recognition and Classification of Road Condition on the Basis of Friction Force by Using a Mobile Robot

    NASA Astrophysics Data System (ADS)

    Watanabe, Tatsuhito; Katsura, Seiichiro

    A person operating a mobile robot in a remote environment receives realistic visual feedback about the condition of the road on which the robot is moving. Categorization of the road condition is necessary to evaluate the conditions for safe and comfortable driving. For this purpose, the mobile robot should be capable of recognizing and classifying the condition of road surfaces. This paper proposes a method for recognizing the type of road surface on the basis of the friction between the mobile robot and the road surface. This friction is estimated by a disturbance observer, and a support vector machine is used to classify the surfaces. The support vector machine identifies the type of road surface using a feature vector composed of the arithmetic average and variance of the torque values. Further, these feature vectors are mapped onto a higher dimensional space by using a kernel function. The validity of the proposed method is confirmed by experimental results.
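
    A minimal sketch of the classification step, with synthetic torque windows standing in for the disturbance-observer estimates; each window is reduced to its mean and variance before the SVM assigns a surface class.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def window_features(torque_window):
        # Feature vector: arithmetic average and variance of the torque values.
        return np.array([np.mean(torque_window), np.var(torque_window)])

    rng = np.random.default_rng(0)
    windows = rng.normal(loc=1.0, scale=0.2, size=(120, 250))  # torque samples per window (synthetic)
    labels = rng.integers(0, 3, size=120)                      # e.g. asphalt / gravel / carpet

    X = np.array([window_features(w) for w in windows])
    clf = SVC(kernel="rbf", C=10).fit(X, labels)               # RBF provides the kernel mapping
    print("training accuracy:", clf.score(X, labels))
    ```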

  19. Protein Kinase Classification with 2866 Hidden Markov Models and One Support Vector Machine

    NASA Technical Reports Server (NTRS)

    Weber, Ryan; New, Michael H.; Fonda, Mark (Technical Monitor)

    2002-01-01

    The main application considered in this paper is predicting true kinases from randomly permuted kinases that share the same length and amino acid distributions as the true kinases. Numerous methods already exist for this classification task, such as HMMs, motif-matchers, and sequence comparison algorithms. We build on some of these efforts by creating a vector from the output of thousands of structurally based HMMs, created offline with Pfam-A seed alignments using SAM-T99, which then must be combined into an overall classification for the protein. We then use a Support Vector Machine for classifying this large ensemble Pfam-Vector, with polynomial and chi-squared kernels. In particular, the chi-squared kernel SVM performs better in some respects than the HMMs and the BLAST pairwise comparisons when predicting true from false kinases, but no one algorithm is best for all purposes or in all instances, so we consider the particular strengths and weaknesses of each.
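
    A minimal sketch of an SVM with a chi-squared kernel over a vector of per-HMM scores; the non-negative synthetic features stand in for the Pfam-Vector described above, and scikit-learn's exponential chi-squared kernel is used as the callable kernel.

    ```python
    import numpy as np
    from sklearn.metrics.pairwise import chi2_kernel
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.random(size=(100, 50))        # 100 sequences x 50 HMM scores (non-negative stand-ins)
    y = rng.integers(0, 2, size=100)      # 1 = true kinase, 0 = permuted decoy

    # chi2_kernel(A, B) returns the Gram matrix, so it can be passed directly to SVC.
    clf = SVC(kernel=chi2_kernel, C=1.0).fit(X, y)
    print("training accuracy:", clf.score(X, y))
    ```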

  20. Salient Feature Identification and Analysis using Kernel-Based Classification Techniques for Synthetic Aperture Radar Automatic Target Recognition

    DTIC Science & Technology

    2014-03-27

    and machine learning for a range of research including such topics as medical imaging [10] and handwriting recognition [11]. The type of feature...1989. [11] C. Bahlmann, B. Haasdonk, and H. Burkhardt, “Online handwriting recognition with support vector machines-a kernel approach,” in Eighth...International Workshop on Frontiers in Handwriting Recognition, pp. 49–54, IEEE, 2002. [12] C. Cortes and V. Vapnik, “Support-vector networks,” Machine

  1. Algorithm for detection the QRS complexes based on support vector machine

    NASA Astrophysics Data System (ADS)

    Van, G. V.; Podmasteryev, K. V.

    2017-11-01

    The efficiency of computer ECG analysis depends on the accurate detection of QRS complexes. This paper presents an algorithm for QRS complex detection based on a support vector machine (SVM). The proposed algorithm is evaluated on annotated standard databases such as the MIT-BIH Arrhythmia database. The QRS detector obtained a sensitivity Se = 98.32% and specificity Sp = 95.46% on the MIT-BIH Arrhythmia database. This algorithm can be used as the basis for software to diagnose the electrical activity of the heart.
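
    A minimal sketch of framing QRS detection as window classification with an SVM; the window length, the three summary features and the synthetic ECG segments are illustrative assumptions, not the features used in the paper.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def window_features(win):
        d = np.diff(win)
        return np.array([win.max() - win.min(),   # peak-to-peak amplitude
                         np.sum(d ** 2),          # derivative energy (QRS has steep slopes)
                         np.std(win)])            # overall spread

    rng = np.random.default_rng(0)
    windows = rng.normal(size=(500, 72))          # hypothetical 200 ms windows at 360 Hz
    labels = rng.integers(0, 2, size=500)         # 1 = window contains a QRS complex

    X = np.array([window_features(w) for w in windows])
    clf = SVC(kernel="rbf", C=10).fit(X, labels)

    pred = clf.predict(X)
    se = ((pred == 1) & (labels == 1)).sum() / max((labels == 1).sum(), 1)
    sp = ((pred == 0) & (labels == 0)).sum() / max((labels == 0).sum(), 1)
    print(f"sensitivity {se:.3f}, specificity {sp:.3f}")
    ```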

  2. A portable approach for PIC on emerging architectures

    NASA Astrophysics Data System (ADS)

    Decyk, Viktor

    2016-03-01

    A portable approach for designing Particle-in-Cell (PIC) algorithms on emerging exascale computers is based on the recognition that three distinct programming paradigms are needed. They are: low level vector (SIMD) processing, middle level shared memory parallel programming, and high level distributed memory programming. In addition, there is a memory hierarchy associated with each level. Such algorithms can be initially developed using vectorizing compilers, OpenMP, and MPI. This is the approach recommended by Intel for the Phi processor. These algorithms can then be translated and possibly specialized to other programming models and languages, as needed. For example, the vector processing and shared memory programming might be done with CUDA instead of vectorizing compilers and OpenMP, but generally the algorithm itself is not greatly changed. The UCLA PICKSC web site at http://www.idre.ucla.edu/ contains example open source skeleton codes (mini-apps) illustrating each of these three programming models, individually and in combination. Fortran2003 now supports abstract data types, and design patterns can be used to support a variety of implementations within the same code base. Fortran2003 also supports interoperability with C so that implementations in C languages are also easy to use. Finally, main codes can be translated into dynamic environments such as Python, while still taking advantage of high performing compiled languages. Parallel languages are still evolving, with interesting developments in Co-Array Fortran, UPC, and OpenACC, among others, and these can also be supported within the same software architecture. Work supported by NSF and DOE Grants.

  3. Multifractal Analysis of Velocity Vector Fields and a Continuous In-Scale Cascade Model

    NASA Astrophysics Data System (ADS)

    Fitton, G.; Tchiguirinskaia, I.; Schertzer, D.; Lovejoy, S.

    2012-04-01

    In this study we have compared the multifractal analyses of small-scale surface-layer wind velocities from two different datasets. The first dataset consists of six months of wind velocity and temperature measurements at heights of 22, 23 and 43 m. The measurements came from 3D sonic anemometers with a 10 Hz data output rate, positioned on a mast in a wind farm test site subject to wake turbulence effects. The location of the test site (Corsica, France) meant the large-scale structures were subject to topography effects that therefore possibly caused buoyancy effects. The second dataset (Germany) consists of 300 twenty-minute samples of horizontal wind velocity magnitudes simultaneously recorded at several positions on two masts. There are eight propeller anemometers on each mast, recording velocity magnitude data at 2.5 Hz. The positioning of the anemometers is such that there are effectively two grids: one of 3 rows by 4 columns and a second of 5 rows by 2 columns. The ranges of temporal scale over which the analyses were done were from 1 to 10³ seconds for both datasets. Under the universal multifractal framework we found that both datasets exhibit parameters α ≈ 1.5 and C1 ≈ 0.1. The parameters α and C1 measure, respectively, the multifractality and mean intermittency of the scaling field. A third parameter, H, quantifies the divergence from conservation of the field (e.g. H = 0 for the turbulent energy flux density). To estimate the parameters we used the ratio of the scaling moment functions of the energy flux and of the velocity increments. This method was particularly useful when estimating the parameter α over larger scales; in fact it was not possible to obtain a reasonable estimate of α using the usual double trace moment method. For each dataset the scaling behaviour of the wind was almost isotropic when the scale ranges remained close to the sphero-scale. For the Corsica dataset this could be seen in the agreement of the spectral exponents, of the order of 1.5, for all three components. For the Germany dataset, where only the horizontal wind components are available over a grid, the comparable probability distributions of horizontal and vertical velocity increments show that the field is isotropic. The Germany dataset also allows us to compare the spatial velocity increments with the temporal ones. We briefly mentioned above that the winds in Corsica were subject to vertical forcing effects over large scales. This means the velocity field scaled with an exponent of 11/5, i.e. Bolgiano-Obukhov rather than Kolmogorov scaling. To test this we were required to invoke Taylor's frozen turbulence hypothesis, since the data were one-point measurements. Having vertical and horizontal velocity increments means we can further justify the claims of an 11/5 scaling law for vertical shears of the velocity and test the validity of Taylor's hypothesis.

  4. Cost-effectiveness of Lung Cancer Screening in Canada.

    PubMed

    Goffin, John R; Flanagan, William M; Miller, Anthony B; Fitzgerald, Natalie R; Memon, Saima; Wolfson, Michael C; Evans, William K

    2015-09-01

    The US National Lung Screening Trial supports screening for lung cancer among smokers using low-dose computed tomographic (LDCT) scans. The cost-effectiveness of screening in a publicly funded health care system remains a concern. To assess the cost-effectiveness of LDCT scan screening for lung cancer within the Canadian health care system, the Cancer Risk Management Model (CRMM) simulated individual lives within the Canadian population from 2014 to 2034, incorporating cancer risk, disease management, outcome, and cost data. Smokers and former smokers eligible for lung cancer screening (30 pack-year smoking history, ages 55-74 years, for the reference scenario) were modeled, and performance parameters were calibrated to the National Lung Screening Trial (NLST). The reference screening scenario assumes annual scans to age 75 years, 60% participation by 10 years, 70% adherence to screening, and unchanged smoking rates. The CRMM outputs are aggregated, and costs (2008 Canadian dollars) and life-years are discounted 3% annually. The main outcome measure was the incremental cost-effectiveness ratio. Compared with no screening, the reference scenario saved 51,000 quality-adjusted life-years (QALYs) and had an incremental cost-effectiveness ratio of CaD $52,000/QALY. If smoking history is modeled for 20 or 40 pack-years, incremental cost-effectiveness ratios of CaD $62,000 and CaD $43,000/QALY, respectively, were generated. Changes in participation rates altered the life-years saved but not the incremental cost-effectiveness ratio, whereas the ratio was sensitive to changes in adherence. An adjunct smoking cessation program improving the quit rate by 22.5% improves the incremental cost-effectiveness ratio to CaD $24,000/QALY. Lung cancer screening with LDCT appears cost-effective in the publicly funded Canadian health care system. An adjunct smoking cessation program has the potential to improve outcomes.
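
    The study's headline figures hinge on the incremental cost-effectiveness ratio, i.e. the extra cost of a scenario divided by the extra health benefit it yields relative to the comparator. A minimal sketch of that arithmetic, with placeholder numbers rather than CRMM outputs:

      # Minimal sketch of the incremental cost-effectiveness ratio (ICER) arithmetic
      # used when comparing a screening scenario against no screening.
      # The numbers below are placeholders, not values from the CRMM study.

      def icer(cost_new, cost_ref, qaly_new, qaly_ref):
          """Incremental cost per quality-adjusted life-year gained."""
          delta_cost = cost_new - cost_ref
          delta_qaly = qaly_new - qaly_ref
          if delta_qaly == 0:
              raise ValueError("No incremental effectiveness; ICER is undefined.")
          return delta_cost / delta_qaly

      # Hypothetical example: screening costs 2.6 billion more and saves 50,000 QALYs.
      print(icer(cost_new=2.6e9, cost_ref=0.0, qaly_new=50_000, qaly_ref=0.0))  # 52,000 per QALY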

  5. Summary of Recent Research Accomplishment Onboard the International Space Station—Within the United States Orbital Segment

    NASA Astrophysics Data System (ADS)

    Jules, Kenol; Istasse, Eric; Stenuit, Hilde; Murakami, Keiji; Yoshizaki, Izumi; Johnson-Green, Perry

    2011-06-01

    November 20, 2010, marked a significant milestone in the annals of human endeavors in space, since it was the twelfth anniversary of one of the most challenging and complex construction projects ever attempted by humans away from our planet: the construction of the International Space Station. On November 20, 1998, the Zarya Control Module was launched. With this simple, almost unnoticed launch in the science community, the construction of a continuously staffed research platform in Low Earth Orbit was underway. This paper discusses the research that was performed by many occupants of this research platform during the year celebrating its twelfth anniversary. The main objectives of this paper are fourfold: (1) to discuss the integrated manner in which science planning/replanning and prioritization during the execution phase of an increment is carried out across the United States Orbital Segment, since that segment is made up of four independent space agencies; (2) to discuss and summarize the research that was performed during increments 16 and 17 (October 2007 to October 2008). The discussion for these two increments is primarily focused on the main objectives of each investigation and its associated hypotheses that were investigated. Whenever available and approved, preliminary research results are also discussed for each of the investigations performed during these two increments; (3) to compare the planned research portfolio for these two increments versus what was actually accomplished during the execution phase in order to discuss the challenges associated with planning and performing research in a space laboratory located over 240 miles up in space, away from the ground support team; (4) to briefly touch on the research portfolio of increments 18 and 19/20 as the International Space Station begins its next decade in Low Earth Orbit.

  6. The specificity of parenting effects: Differential relations of parent praise and criticism to children's theories of intelligence and learning goals.

    PubMed

    Gunderson, Elizabeth A; Donnellan, M Brent; Robins, Richard W; Trzesniewski, Kali H

    2018-04-24

    Individuals who believe that intelligence can be improved with effort (an incremental theory of intelligence) and who approach challenges with the goal of improving their understanding (a learning goal) tend to have higher academic achievement. Furthermore, parent praise is associated with children's incremental theories and learning goals. However, the influences of parental criticism, as well as different forms of praise and criticism (e.g., process vs. person), have received less attention. We examine these associations by analyzing two existing datasets (Study 1: N = 317 first to eighth graders; Study 2: N = 282 fifth and eighth graders). In both studies, older children held more incremental theories of intelligence, but lower learning goals, than younger children. Unexpectedly, the relation between theories of intelligence and learning goals was nonsignificant and did not vary with children's grade level. In both studies, overall perceived parent praise positively related to children's learning goals, whereas perceived parent criticism negatively related to incremental theories of intelligence. In Study 2, perceived parent process praise was the only significant (positive) predictor of children's learning goals, whereas perceived parent person criticism was the only significant (negative) predictor of incremental theories of intelligence. Finally, Study 2 provided some support for our hypothesis that age-related differences in perceived parent praise and criticism can explain age-related differences in children's learning goals. Results suggest that incremental theories of intelligence and learning goals might not be strongly related during childhood and that perceived parent praise and criticism have important, but distinct, relations with each motivational construct. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Significance of the Human Being as an Element in an Information System: WWII Forward Air Controllers and Close Air Support

    DTIC Science & Technology

    2002-03-01

    the doctrine and the people involved as they related to the forward air control-close air support information system. Other areas that will be...discussed as they relate to the development of close air support include: incremental vs. radical change, organizational culture and change, and the...dynamic nature of current and future operations as they relate to information systems. The primary research objective is to explore

  8. When Children Learn Programming: Antecedents, Concepts and Outcomes.

    ERIC Educational Resources Information Center

    Shneiderman, Ben

    1985-01-01

    Discusses components of an educational plan which supports acquisition of computer programming skills by elementary school children, including antecedent knowledge required (sequencing, similarity, character recognition, part/whole relationships, conditional forms, repetition, and incrementation); initial programming concepts; and outcomes valuable…

  9. First experience of vectorizing electromagnetic physics models for detector simulation

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Bianchini, C.; Bitzes, G.; Brun, R.; Canal, P.; Carminati, F.; de Fine Licht, J.; Duhem, L.; Elvira, D.; Gheata, A.; Jun, S. Y.; Lima, G.; Novak, M.; Presbyterian, M.; Shadura, O.; Seghal, R.; Wenzel, S.

    2015-12-01

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. The GeantV vector prototype for detector simulations has been designed to exploit both the vector capability of mainstream CPUs and the multi-threading capabilities of coprocessors, including NVidia GPUs and the Intel Xeon Phi. The characteristics of these architectures are very different in terms of vectorization depth, the parallelism needed to achieve optimal performance, and memory access latency and speed. An additional challenge is to avoid the code duplication often inherent to supporting heterogeneous platforms. In this paper we present the first experience of vectorizing electromagnetic physics models developed for the GeantV project.

  10. Vector-Based Data Services for NASA Earth Science

    NASA Astrophysics Data System (ADS)

    Rodriguez, J.; Roberts, J. T.; Ruvane, K.; Cechini, M. F.; Thompson, C. K.; Boller, R. A.; Baynes, K.

    2016-12-01

    Vector data sources offer opportunities for mapping and visualizing science data in a way that allows for more customizable rendering and deeper data analysis than traditional raster images, and popular formats like GeoJSON and Mapbox Vector Tiles allow diverse types of geospatial data to be served in a high-performance and easily consumed package. Vector data is especially suited to highly dynamic mapping applications and visualization of complex datasets, while growing levels of support for vector formats and features in open-source mapping clients have made utilizing them easier and more powerful than ever. NASA's Global Imagery Browse Services (GIBS) is working to make NASA data more easily and conveniently accessible than ever by serving vector datasets via GeoJSON, Mapbox Vector Tiles, and raster images. This presentation will review these output formats and the services, including WFS, WMS, and WMTS, that can be used to access the data, as well as some ways in which vector sources can be utilized in popular open-source mapping clients like OpenLayers. Lessons learned from GIBS' recent move towards serving vector data will be discussed, as well as how to use GIBS open source software to create, configure, and serve vector data sources using Mapserver and the GIBS OnEarth Apache module.

  11. Evidence that explains absence of a latent period for Xylella fastidiosa in its sharpshooter vectors

    USDA-ARS?s Scientific Manuscript database

    The glassy-winged sharpshooter (GWSS), Homalodisca vitripennis (Germar), and other sharpshooter (Cicadelline) leafhoppers transmit Xylella fastidiosa (Xf), the causative agent of Pierce’s disease of grapevine and other scorch diseases. Past research has supported that vectors have virtually no late...

  12. Statistical learning algorithms for identifying contrasting tillage practices with landsat thematic mapper data

    USDA-ARS?s Scientific Manuscript database

    Tillage management practices have a direct impact on water holding capacity, evaporation, carbon sequestration, and water quality. This study examines the feasibility of two statistical learning algorithms, Least Square Support Vector Machine (LSSVM) and Relevance Vector Machine (RVM), for cla...

  13. Application of Classification Models to Pharyngeal High-Resolution Manometry

    ERIC Educational Resources Information Center

    Mielens, Jason D.; Hoffman, Matthew R.; Ciucci, Michelle R.; McCulloch, Timothy M.; Jiang, Jack J.

    2012-01-01

    Purpose: The authors present 3 methods of performing pattern recognition on spatiotemporal plots produced by pharyngeal high-resolution manometry (HRM). Method: Classification models, including the artificial neural networks (ANNs) multilayer perceptron (MLP) and learning vector quantization (LVQ), as well as support vector machines (SVM), were…

  14. A Language-Independent Approach to Automatic Text Difficulty Assessment for Second-Language Learners

    DTIC Science & Technology

    2013-08-01

    best-suited for regression. Our baseline uses z-normalized shallow length features and TF-LOG weighted vectors on bag-of-words for Arabic, Dari, English and Pashto. We compare Support Vector Machines and the Margin...football, whereas they are much less common in documents about opera). We used TF-LOG weighted word frequencies on bag-of-words for each document

  15. Support Vector Machines Trained with Evolutionary Algorithms Employing Kernel Adatron for Large Scale Classification of Protein Structures.

    PubMed

    Arana-Daniel, Nancy; Gallegos, Alberto A; López-Franco, Carlos; Alanís, Alma Y; Morales, Jacob; López-Franco, Adriana

    2016-01-01

    With the increasing power of computers, the amount of data that can be processed in small periods of time has grown exponentially, as has the importance of classifying large-scale data efficiently. Support vector machines have shown good results classifying large amounts of high-dimensional data, such as data generated by protein structure prediction, spam recognition, medical diagnosis, optical character recognition and text classification, etc. Most state-of-the-art approaches for large-scale learning use traditional optimization methods, such as quadratic programming or gradient descent, which makes the use of evolutionary algorithms for training support vector machines an area to be explored. The present paper proposes an approach that is simple to implement, based on evolutionary algorithms and the Kernel-Adatron, for solving large-scale classification problems, focusing on protein structure prediction. The functional properties of proteins depend upon their three-dimensional structures. Knowing the structures of proteins is crucial for biology and can lead to improvements in areas such as medicine, agriculture and biofuels.
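
    For orientation, the Kernel-Adatron at the heart of the proposed trainer is a simple clipped-gradient update on the dual variables. The sketch below shows a synchronous variant of that base update with an RBF kernel on toy data; the evolutionary search layer described in the abstract is not reproduced, and all parameter values are illustrative assumptions.

      # Minimal Kernel-Adatron sketch for binary classification with an RBF kernel.
      # Only the base Adatron update is shown, in a synchronous (batch) variant.
      import numpy as np

      def rbf_kernel(A, B, gamma=0.5):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * d2)

      def kernel_adatron(X, y, eta=0.1, epochs=100, gamma=0.5):
          K = rbf_kernel(X, X, gamma)
          alpha = np.zeros(len(y))
          for _ in range(epochs):
              margins = y * (K @ (alpha * y))                          # y_i * f(x_i)
              alpha = np.maximum(0.0, alpha + eta * (1.0 - margins))   # clipped Adatron update
          return alpha

      # Toy data: two Gaussian blobs, labels in {-1, +1}.
      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal(-1, 0.3, (20, 2)), rng.normal(+1, 0.3, (20, 2))])
      y = np.array([-1.0] * 20 + [1.0] * 20)
      alpha = kernel_adatron(X, y)
      pred = np.sign(rbf_kernel(X, X) @ (alpha * y))
      print("training accuracy:", (pred == y).mean())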

  16. Extraction of inland Nypa fruticans (Nipa Palm) using Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Alberto, R. T.; Serrano, S. C.; Damian, G. B.; Camaso, E. E.; Biagtan, A. R.; Panuyas, N. Z.; Quibuyen, J. S.

    2017-09-01

    Mangroves are considered one of the major habitats in the coastal ecosystem, providing many economic and ecological services to human society. Nypa fruticans (Nipa palm) is one of the important species of mangroves because of its versatility and uniqueness as a halophytic palm. However, nipas are not only adapted to saline areas; they can also manage to thrive away from the coastline, depending on the favorable soil types available in the area. Because of this, mapping of this species is not limited to nearshore areas but extends to inland areas where the species is present. The extraction of Nypa fruticans was carried out using the available LiDAR data. A Support Vector Machine (SVM) classification process was used to extract nipas in inland areas. The SVM classification process in mapping Nypa fruticans produced a high accuracy of over 95%. The Support Vector Machine classification process to extract inland nipas was proven to be effective by utilizing different terrain derivatives from LiDAR data.
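
    The classification step described here reduces, in outline, to training an SVM on per-pixel feature vectors derived from LiDAR terrain products. A minimal scikit-learn sketch of that step, using synthetic stand-in features (the column meanings and the labelling rule are invented for illustration):

      # Sketch: an SVM classifier on per-pixel terrain derivatives, standing in for
      # the LiDAR-derived features used in the paper. Data are synthetic placeholders.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC
      from sklearn.metrics import accuracy_score

      rng = np.random.default_rng(0)
      n = 1000
      X = rng.normal(size=(n, 3))                      # columns: e.g. elevation, slope, intensity
      y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # 1 = nipa, 0 = other cover (toy rule)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
      clf.fit(X_tr, y_tr)
      print("overall accuracy:", accuracy_score(y_te, clf.predict(X_te)))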

  17. Distributed collaborative probabilistic design for turbine blade-tip radial running clearance using support vector machine of regression

    NASA Astrophysics Data System (ADS)

    Fei, Cheng-Wei; Bai, Guang-Chen

    2014-12-01

    To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of a gas turbine, a distributed collaborative probabilistic design method based on support vector machine regression (called DCSRM) is proposed by integrating the distributed collaborative response surface method and the support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of the aeroengine high-pressure turbine (HPT) BTRRC is accomplished to verify the proposed DCSRM. The analysis results reveal that the optimal static blade-tip clearance of the HPT is obtained for designing the BTRRC, improving the performance and reliability of the aeroengine. The comparison of methods shows that the DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.
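
    The core idea of using support vector regression in probabilistic design is to replace the expensive response analysis with a cheap surrogate and then sample that surrogate heavily. The sketch below illustrates only that generic pattern; the response function, threshold, and input distributions are hypothetical, and the distributed collaborative formulation of DCSRM is not reproduced.

      # Sketch of the surrogate idea: fit support vector regression to a few
      # expensive response evaluations, then run cheap Monte Carlo sampling on
      # the surrogate to estimate a probability of exceeding a clearance limit.
      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)

      def expensive_response(x):          # stand-in for a costly clearance analysis
          return 1.0 + 0.3 * x[:, 0] - 0.2 * x[:, 1] + 0.05 * x[:, 0] * x[:, 1]

      X_train = rng.normal(size=(80, 2))              # a small design of experiments
      y_train = expensive_response(X_train)
      surrogate = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X_train, y_train)

      X_mc = rng.normal(size=(100_000, 2))            # cheap Monte Carlo on the surrogate
      clearance = surrogate.predict(X_mc)
      print("P(clearance < 0.5) ~", np.mean(clearance < 0.5))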

  18. Support vector machine firefly algorithm based optimization of lens system.

    PubMed

    Shamshirband, Shahaboddin; Petković, Dalibor; Pavlović, Nenad T; Ch, Sudheer; Altameem, Torki A; Gani, Abdullah

    2015-01-01

    Lens system design is an important factor in image quality. The main aspect of the lens system design methodology is the optimization procedure. Since optimization is a complex, nonlinear task, soft computing optimization algorithms can be used. There are many tools that can be employed to measure optical performance, but the spot diagram is the most useful. The spot diagram gives an indication of the image of a point object. In this paper, the spot size radius is considered an optimization criterion. An intelligent soft computing scheme, support vector machines (SVMs) coupled with the firefly algorithm (FFA), is implemented. The performance of the proposed estimators is confirmed with the simulation results. The results of the proposed SVM-FFA model have been compared with support vector regression (SVR), artificial neural networks, and genetic programming methods. The results show that the SVM-FFA model performs more accurately than the other methodologies. Therefore, SVM-FFA can be used as an efficient soft computing technique in the optimization of lens system designs.
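
    The role of the firefly algorithm here is essentially hyperparameter search for the SVM estimator. A full firefly implementation is beyond a short sketch, so the example below substitutes cross-validated random search over C, gamma, and epsilon on a synthetic spot-size objective; the data and parameter ranges are assumptions, not values from the paper.

      # Sketch: tuning SVR hyperparameters by cross-validated random search,
      # standing in for the firefly-algorithm search used in the paper.
      import numpy as np
      from scipy.stats import loguniform
      from sklearn.model_selection import RandomizedSearchCV
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)
      X = rng.uniform(-1, 1, size=(200, 4))                              # hypothetical lens parameters
      y = 0.1 + (X ** 2).sum(axis=1) + 0.01 * rng.standard_normal(200)   # toy spot-size radius

      search = RandomizedSearchCV(
          SVR(kernel="rbf"),
          param_distributions={"C": loguniform(1e-1, 1e3),
                               "gamma": loguniform(1e-2, 1e1),
                               "epsilon": loguniform(1e-3, 1e-1)},
          n_iter=30, cv=5, random_state=0,
      )
      search.fit(X, y)
      print(search.best_params_, round(search.best_score_, 3))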

  19. Using Support Vector Machines to Automatically Extract Open Water Signatures from POLDER Multi-Angle Data Over Boreal Regions

    NASA Technical Reports Server (NTRS)

    Pierce, J.; Diaz-Barrios, M.; Pinzon, J.; Ustin, S. L.; Shih, P.; Tournois, S.; Zarco-Tejada, P. J.; Vanderbilt, V. C.; Perry, G. L.; Brass, James A. (Technical Monitor)

    2002-01-01

    This study used Support Vector Machines to classify multiangle POLDER data. Boreal wetland ecosystems cover an estimated 90 × 10^6 ha, about 36% of global wetlands, and are a major source of trace gas emissions to the atmosphere. Four to 20 percent of the global emission of methane to the atmosphere comes from wetlands north of 4 degrees N latitude. Large uncertainties in emissions exist because of large spatial and temporal variation in the production and consumption of methane. Accurate knowledge of the areal extent of open water and inundated vegetation is critical to estimating magnitudes of trace gas emissions. Improvements in land cover mapping have been sought using physical-modeling approaches, neural networks, and active microwave, examples that demonstrate the difficulties of separating open water, inundated vegetation and dry upland vegetation. Here we examine the feasibility of using a support vector machine to classify POLDER data representing open water, inundated vegetation and dry upland vegetation.

  20. Color image segmentation with support vector machines: applications to road signs detection.

    PubMed

    Cyganek, Bogusław

    2008-08-01

    In this paper we propose an efficient color segmentation method which is based on the Support Vector Machine classifier operating in a one-class mode. The method has been developed especially for the road sign recognition system, although it can be used in other applications. The main advantage of the proposed method comes from the fact that the segmentation of characteristic colors is performed not in the original but in a higher-dimensional feature space. In this way, better data encapsulation with a linear hypersphere can usually be achieved. Moreover, the classifier does not try to capture the whole distribution of the input data, which is often difficult to achieve. Instead, the characteristic data samples, called support vectors, are selected, which allow construction of the tightest hypersphere that encloses the majority of the input data. Classification of a test sample then simply consists of measuring its distance to the centre of the found hypersphere. The experimental results show the high accuracy and speed of the proposed method.
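
    The segmentation idea can be sketched with an off-the-shelf one-class SVM: train it only on sample pixels of the target colour, then flag each pixel of a new image as inside or outside the learned region. The colour samples, kernel parameters, and test image below are synthetic placeholders.

      # Sketch: one-class SVM segmentation of a characteristic colour, trained only
      # on example pixels of the target colour (e.g. sign red).
      import numpy as np
      from sklearn.svm import OneClassSVM

      rng = np.random.default_rng(0)
      red_samples = rng.normal(loc=[200, 40, 40], scale=15, size=(500, 3)) / 255.0  # training colour samples

      occ = OneClassSVM(kernel="rbf", nu=0.05, gamma=10.0).fit(red_samples)

      # Classify the pixels of a (here random) test image reshaped to an (N, 3) array.
      image_pixels = rng.uniform(0, 1, size=(64 * 64, 3))
      mask = occ.predict(image_pixels).reshape(64, 64) == 1   # True where a pixel matches the colour model
      print("segmented pixels:", int(mask.sum()))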

  1. Prediction of B-cell linear epitopes with a combination of support vector machine classification and amino acid propensity identification.

    PubMed

    Wang, Hsin-Wei; Lin, Ya-Chi; Pai, Tun-Wen; Chang, Hao-Teng

    2011-01-01

    Epitopes are antigenic determinants that are useful because they induce B-cell antibody production and stimulate T-cell activation. Bioinformatics can enable rapid, efficient prediction of potential epitopes. Here, we designed a novel B-cell linear epitope prediction system called LEPS, Linear Epitope Prediction by Propensities and Support Vector Machine, that combined physico-chemical propensity identification and support vector machine (SVM) classification. We tested the LEPS on four datasets: AntiJen, HIV, a newly generated PC, and AHP, a combination of these three datasets. Peptides with globally or locally high physicochemical propensities were first identified as primitive linear epitope (LE) candidates. Then, candidates were classified with the SVM based on the unique features of amino acid segments. This reduced the number of predicted epitopes and enhanced the positive prediction value (PPV). Compared to four other well-known LE prediction systems, the LEPS achieved the highest accuracy (72.52%), specificity (84.22%), PPV (32.07%), and Matthews' correlation coefficient (10.36%).

  2. Sparse Solutions for Single Class SVMs: A Bi-Criterion Approach

    NASA Technical Reports Server (NTRS)

    Das, Santanu; Oza, Nikunj C.

    2011-01-01

    In this paper we propose an innovative learning algorithm - a variation of the One-class nu Support Vector Machines (SVMs) learning algorithm - to produce sparser solutions with much reduced computational complexity. The proposed technique returns an approximate solution, nearly as good as the solution set obtained by the classical approach, by minimizing the original risk function along with a regularization term. We introduce a bi-criterion optimization that helps guide the search towards the optimal set in much reduced time. The outcome of the proposed learning technique was compared with the benchmark one-class Support Vector Machines algorithm, which more often leads to solutions with redundant support vectors. Throughout the analysis, the problem size for both optimization routines was kept consistent. We have tested the proposed algorithm on a variety of data sources under different conditions to demonstrate its effectiveness. In all cases the proposed algorithm closely preserves the accuracy of standard one-class nu SVMs while reducing both training time and test time by several factors.

  3. Flexible processing and the design of grammar.

    PubMed

    Sag, Ivan A; Wasow, Thomas

    2015-02-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This "sign-based" conception of grammar has provided precise solutions to the key problems long thought to motivate movement-based analyses, has supported three decades of computational research developing large-scale grammar implementations, and is now beginning to play a role in computational psycholinguistics research that explores the use of underspecification in the incremental computation of partial meanings.

  4. The Incremental Utility of Emotion Regulation but Not Emotion Reactivity in Non-Suicidal Self-Injury

    PubMed Central

    Zelkowitz, Rachel L.; Cole, David A.; Han, Gloria T.; Tomarken, Andrew J.

    2016-01-01

    This study assessed the incremental utility of emotion reactivity and emotion regulation in relation to non-suicidal self-injury (NSSI). Participants included 379 college students aged 18-22 who completed self-report measures of emotion regulation, emotion reactivity, and NSSI. Emotion regulation was significantly related to NSSI both ignoring and controlling for reactivity, but the reverse was not true. Participants' use of NSSI for affect regulation appeared to moderate this relation. Findings support emotion regulation deficits as a target for intervention over-and-above heightened emotion reactivity, especially in those who use NSSI to regulate negative affect. PMID:26945972

  5. Landslide susceptibility mapping & prediction using Support Vector Machine for Mandakini River Basin, Garhwal Himalaya, India

    NASA Astrophysics Data System (ADS)

    Kumar, Deepak; Thakur, Manoj; Dubey, Chandra S.; Shukla, Dericks P.

    2017-10-01

    In recent years, various machine learning techniques have been applied for landslide susceptibility mapping. In this study, three different variants of the support vector machine, viz. SVM, Proximal Support Vector Machine (PSVM) and L2-Support Vector Machine - Modified Finite Newton (L2-SVM-MFN), have been applied to the Mandakini River Basin in Uttarakhand, India to carry out landslide susceptibility mapping. Eight thematic layers, such as elevation, slope, aspect, drainages, geology/lithology, buffer of thrusts/faults, buffer of streams and soil, along with the past landslide data, were mapped in a GIS environment and used for landslide susceptibility mapping in MATLAB. The study area, covering 1625 km2, has merely 0.11% of its area under landslides. There are 2009 pixels for past landslides, out of which 50% (1000) were considered as the training set while the remaining 50% formed the testing set. The performance of these techniques has been evaluated and the computational results show that L2-SVM-MFN obtains a higher value (0.829) of the area under the receiver operating characteristic curve (AUC) as compared to 0.807 for the PSVM model and 0.79 for SVM. The results obtained from the L2-SVM-MFN model are found to be superior to the other SVM prediction models and suggest the usefulness of this technique for the problem of landslide susceptibility mapping where training data are very limited. However, these techniques can be used for satisfactory determination of susceptible zones with these inputs.

  6. Improvements on ν-Twin Support Vector Machine.

    PubMed

    Khemchandani, Reshma; Saigal, Pooja; Chandra, Suresh

    2016-07-01

    In this paper, we propose two novel binary classifiers termed as "Improvements on ν-Twin Support Vector Machine: Iν-TWSVM and Iν-TWSVM (Fast)" that are motivated by the ν-Twin Support Vector Machine (ν-TWSVM). Similar to ν-TWSVM, Iν-TWSVM determines two nonparallel hyperplanes such that they are closer to their respective classes and are at least ρ distance away from the other class. The significant advantage of Iν-TWSVM over ν-TWSVM is that Iν-TWSVM solves one smaller-sized Quadratic Programming Problem (QPP) and one Unconstrained Minimization Problem (UMP), as compared to solving two related QPPs in ν-TWSVM. Further, Iν-TWSVM (Fast) avoids solving a smaller-sized QPP and transforms it into a unimodal function, which can be solved using line search methods; similar to Iν-TWSVM, the other problem is solved as a UMP. Due to their novel formulation, the proposed classifiers are faster than ν-TWSVM and have comparable generalization ability. Iν-TWSVM also implements the structural risk minimization (SRM) principle by introducing a regularization term, along with minimizing the empirical risk. The other properties of Iν-TWSVM, related to support vectors (SVs), are similar to those of ν-TWSVM. To test the efficacy of the proposed method, experiments have been conducted on a wide range of UCI datasets and a skewed variation of the NDC datasets. We have also given the application of Iν-TWSVM as a binary classifier for pixel classification of color images. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Novel solutions for an old disease: diagnosis of acute appendicitis with random forest, support vector machines, and artificial neural networks.

    PubMed

    Hsieh, Chung-Ho; Lu, Ruey-Hwa; Lee, Nai-Hsin; Chiu, Wen-Ta; Hsu, Min-Huei; Li, Yu-Chuan Jack

    2011-01-01

    Diagnosing acute appendicitis clinically is still difficult. We developed random forest, support vector machine, and artificial neural network models to diagnose acute appendicitis. Between January 2006 and December 2008, patients who had a consultation session with surgeons for suspected acute appendicitis were enrolled. Seventy-five percent of the data set was used to construct models including random forest, support vector machines, artificial neural networks, and logistic regression. Twenty-five percent of the data set was withheld to evaluate model performance. The area under the receiver operating characteristic curve (AUC) was used to evaluate performance, which was compared with that of the Alvarado score. Data from a total of 180 patients were collected, 135 used for training and 45 for testing. The mean age of patients was 39.4 years (range, 16-85). Final diagnosis revealed 115 patients with and 65 without appendicitis. The AUC of random forest, support vector machines, artificial neural networks, logistic regression, and Alvarado was 0.98, 0.96, 0.91, 0.87, and 0.77, respectively. The sensitivity, specificity, and positive and negative predictive values of random forest were 94%, 100%, 100%, and 87%, respectively. Random forest performed better than artificial neural networks, logistic regression, and the Alvarado score. We demonstrated that random forest can predict acute appendicitis with good accuracy and, deployed appropriately, can be an effective tool in clinical decision making. Copyright © 2011 Mosby, Inc. All rights reserved.
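
    The evaluation protocol, train several classifiers on 75% of the cases and compare them by AUC on the withheld 25%, can be sketched as follows; the features, sample generation, and model settings are illustrative assumptions rather than the study's configuration.

      # Sketch of the model-comparison protocol: several classifiers trained on a
      # 75% split and scored on the held-out 25% by AUC. Data are synthetic.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier
      from sklearn.svm import SVC

      X, y = make_classification(n_samples=180, n_features=8, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

      models = {
          "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
          "SVM": SVC(probability=True, random_state=0),
          "neural network": MLPClassifier(max_iter=2000, random_state=0),
          "logistic regression": LogisticRegression(max_iter=1000),
      }
      for name, model in models.items():
          model.fit(X_tr, y_tr)
          auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
          print(f"{name}: AUC = {auc:.2f}")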

  8. Clinical Trials Using Anti-CD19/CD28/CD3zeta CAR Gammaretroviral Vector-transduced Autologous T Lymphocytes KTE-C19

    Cancer.gov

    NCI supports clinical trials that test new and more effective ways to treat cancer. Find clinical trials studying anti-CD19/CD28/CD3zeta CAR gammaretroviral vector-transduced autologous T lymphocytes KTE-C19.

  9. Elucidating the Potential of Plant Rhabdoviruses as Vector Expressions Systems

    USDA-ARS?s Scientific Manuscript database

    Maize fine streak virus (MFSV) is a member of the genus Nucleorhabdovirus that is transmitted by the leafhopper Graminella nigrifons. The virus replicates in both its maize host and its insect vector. To determine whether Drosophila S2 cells support the production of full-length MFSV proteins, we ...

  10. Improving colon cancer screening in community clinics.

    PubMed

    Davis, Terry; Arnold, Connie; Rademaker, Alfred; Bennett, Charles; Bailey, Stacy; Platt, Daci; Reynolds, Cristalyn; Liu, Dachao; Carias, Edson; Bass, Pat; Wolf, Michael

    2013-11-01

    The authors evaluated the effectiveness and cost effectiveness of 2 interventions designed to promote colorectal cancer (CRC) screening in safety-net settings. A 3-arm, quasi-experimental evaluation was conducted among 8 clinics in Louisiana. Screening efforts included: 1) enhanced usual care, 2) literacy-informed education of patients, and 3) education plus nurse support. Overall, 961 average-risk patients ages 50 to 85 years were eligible for routine CRC screening and were recruited. Outcomes included CRC screening completion and incremental cost effectiveness using literacy-informed education of patients and education plus nurse support versus enhanced usual care. The baseline screening rate was <3%. After the interventions, the screening rate was 38.6% with enhanced usual care, 57.1% with education, and 60.6% with education that included additional nurse support. After adjusting for age, race, sex, and literacy, patients who received education alone were not more likely to complete screening than those who received enhanced usual care; and those who received additional nurse support were 1.60-fold more likely to complete screening than those who received enhanced usual care (95% confidence interval, 1.06-2.42; P = .024). The incremental cost per additional individual screened was $1337 for education plus nurse support over enhanced usual care. Fecal occult blood test rates were increased beyond enhanced usual care by providing brief education and nurse support but not by providing education alone. More cost-effective alternatives to nurse support need to be investigated. © 2013 American Cancer Society.

  11. Creating an automated trigger for sepsis clinical decision support at emergency department triage using machine learning

    PubMed Central

    Halpern, Yoni; Jernite, Yacine; Shapiro, Nathan I.; Nathanson, Larry A.

    2017-01-01

    Objective To demonstrate the incremental benefit of using free text data in addition to vital sign and demographic data to identify patients with suspected infection in the emergency department. Methods This was a retrospective, observational cohort study performed at a tertiary academic teaching hospital. All consecutive ED patient visits between 12/17/08 and 2/17/13 were included. No patients were excluded. The primary outcome measure was infection diagnosed in the emergency department defined as a patient having an infection related ED ICD-9-CM discharge diagnosis. Patients were randomly allocated to train (64%), validate (20%), and test (16%) data sets. After preprocessing the free text using bigram and negation detection, we built four models to predict infection, incrementally adding vital signs, chief complaint, and free text nursing assessment. We used two different methods to represent free text: a bag of words model and a topic model. We then used a support vector machine to build the prediction model. We calculated the area under the receiver operating characteristic curve to compare the discriminatory power of each model. Results A total of 230,936 patient visits were included in the study. Approximately 14% of patients had the primary outcome of diagnosed infection. The area under the ROC curve (AUC) for the vitals model, which used only vital signs and demographic data, was 0.67 for the training data set, 0.67 for the validation data set, and 0.67 (95% CI 0.65–0.69) for the test data set. The AUC for the chief complaint model which also included demographic and vital sign data was 0.84 for the training data set, 0.83 for the validation data set, and 0.83 (95% CI 0.81–0.84) for the test data set. The best performing methods made use of all of the free text. In particular, the AUC for the bag-of-words model was 0.89 for training data set, 0.86 for the validation data set, and 0.86 (95% CI 0.85–0.87) for the test data set. The AUC for the topic model was 0.86 for the training data set, 0.86 for the validation data set, and 0.85 (95% CI 0.84–0.86) for the test data set. Conclusion Compared to previous work that only used structured data such as vital signs and demographic information, utilizing free text drastically improves the discriminatory ability (increase in AUC from 0.67 to 0.86) of identifying infection. PMID:28384212

  12. Creating an automated trigger for sepsis clinical decision support at emergency department triage using machine learning.

    PubMed

    Horng, Steven; Sontag, David A; Halpern, Yoni; Jernite, Yacine; Shapiro, Nathan I; Nathanson, Larry A

    2017-01-01

    To demonstrate the incremental benefit of using free text data in addition to vital sign and demographic data to identify patients with suspected infection in the emergency department. This was a retrospective, observational cohort study performed at a tertiary academic teaching hospital. All consecutive ED patient visits between 12/17/08 and 2/17/13 were included. No patients were excluded. The primary outcome measure was infection diagnosed in the emergency department defined as a patient having an infection related ED ICD-9-CM discharge diagnosis. Patients were randomly allocated to train (64%), validate (20%), and test (16%) data sets. After preprocessing the free text using bigram and negation detection, we built four models to predict infection, incrementally adding vital signs, chief complaint, and free text nursing assessment. We used two different methods to represent free text: a bag of words model and a topic model. We then used a support vector machine to build the prediction model. We calculated the area under the receiver operating characteristic curve to compare the discriminatory power of each model. A total of 230,936 patient visits were included in the study. Approximately 14% of patients had the primary outcome of diagnosed infection. The area under the ROC curve (AUC) for the vitals model, which used only vital signs and demographic data, was 0.67 for the training data set, 0.67 for the validation data set, and 0.67 (95% CI 0.65-0.69) for the test data set. The AUC for the chief complaint model which also included demographic and vital sign data was 0.84 for the training data set, 0.83 for the validation data set, and 0.83 (95% CI 0.81-0.84) for the test data set. The best performing methods made use of all of the free text. In particular, the AUC for the bag-of-words model was 0.89 for training data set, 0.86 for the validation data set, and 0.86 (95% CI 0.85-0.87) for the test data set. The AUC for the topic model was 0.86 for the training data set, 0.86 for the validation data set, and 0.85 (95% CI 0.84-0.86) for the test data set. Compared to previous work that only used structured data such as vital signs and demographic information, utilizing free text drastically improves the discriminatory ability (increase in AUC from 0.67 to 0.86) of identifying infection.
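
    The free-text component of such a model can be sketched as a bag-of-words pipeline with unigram and bigram features feeding a linear SVM. The triage notes, labels, and vectorizer settings below are invented placeholders, not study data or the authors' exact configuration.

      # Sketch: bigram bag-of-words features plus a linear SVM to flag visits with
      # suspected infection. Notes and labels are toy placeholders.
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import LinearSVC

      notes = [
          "fever productive cough three days",
          "ankle pain after fall no fever",
          "dysuria and flank pain chills",
          "laceration left hand no signs of infection",
      ]
      labels = [1, 0, 1, 0]   # 1 = infection-related discharge diagnosis (toy labels)

      model = make_pipeline(
          CountVectorizer(ngram_range=(1, 2), binary=True),  # unigrams + bigrams
          LinearSVC(),
      )
      model.fit(notes, labels)
      print(model.predict(["cough and fever", "wrist pain after fall"]))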

  13. Towards a Decision Support System for Space Flight Operations

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Hogle, Charles; Ruszkowski, James

    2013-01-01

    The Mission Operations Directorate (MOD) at the Johnson Space Center (JSC) has put in place a Model Based Systems Engineering (MBSE) technological framework for the development and execution of the Flight Production Process (FPP). This framework has provided much added value and return on investment to date. This paper describes a vision for a model based Decision Support System (DSS) for the development and execution of the FPP and its design and development process. The envisioned system extends the existing MBSE methodology and technological framework which is currently in use. The MBSE technological framework currently in place enables the systematic collection and integration of data required for building an FPP model for a diverse set of missions. This framework includes the technology, people and processes required for rapid development of architectural artifacts. It is used to build a feasible FPP model for the first flight of spacecraft and for recurrent flights throughout the life of the program. This model greatly enhances our ability to effectively engage with a new customer. It provides a preliminary work breakdown structure, data flow information and a master schedule based on its existing knowledge base. These artifacts are then refined and iterated upon with the customer for the development of a robust end-to-end, high-level integrated master schedule and its associated dependencies. The vision is to enhance this framework to enable its application for uncertainty management, decision support and optimization of the design and execution of the FPP by the program. Furthermore, this enhanced framework will enable the agile response and redesign of the FPP based on observed system behavior. The discrepancy of the anticipated system behavior and the observed behavior may be due to the processing of tasks internally, or due to external factors such as changes in program requirements or conditions associated with other organizations that are outside of MOD. The paper provides a roadmap for the three increments of this vision. These increments include (1) hardware and software system components and interfaces with the NASA ground system, (2) uncertainty management and (3) re-planning and automated execution. Each of these increments provide value independently; but some may also enable building of a subsequent increment.

  14. Vectorization and parallelization of the finite strip method for dynamic Mindlin plate problems

    NASA Technical Reports Server (NTRS)

    Chen, Hsin-Chu; He, Ai-Fang

    1993-01-01

    The finite strip method is a semi-analytical finite element process which allows for a discrete analysis of certain types of physical problems by discretizing the domain of the problem into finite strips. This method decomposes a single large problem into m smaller independent subproblems when m harmonic functions are employed, thus yielding natural parallelism at a very high level. In this paper we address vectorization and parallelization strategies for the dynamic analysis of simply-supported Mindlin plate bending problems and show how to prevent potential conflicts in memory access during the assemblage process. The vector and parallel implementations of this method and the performance results of a test problem under scalar, vector, and vector-concurrent execution modes on the Alliant FX/80 are also presented.

  15. Multiaxis Thrust-Vectoring Characteristics of a Model Representative of the F-18 High-Alpha Research Vehicle at Angles of Attack From 0 deg to 70 deg

    NASA Technical Reports Server (NTRS)

    Asbury, Scott C.; Capone, Francis J.

    1995-01-01

    An investigation was conducted in the Langley 16-Foot Transonic Tunnel to determine the multiaxis thrust-vectoring characteristics of the F-18 High-Alpha Research Vehicle (HARV). A wingtip-supported, partially metric, 0.10-scale jet-effects model of an F-18 prototype aircraft was modified with hardware to simulate the thrust-vectoring control system of the HARV. Testing was conducted at free-stream Mach numbers ranging from 0.30 to 0.70, at angles of attack from 0° to 70°, and at nozzle pressure ratios from 1.0 to approximately 5.0. Results indicate that the thrust-vectoring control system of the HARV can successfully generate multiaxis thrust-vectoring forces and moments. During vectoring, resultant thrust vector angles were always less than the corresponding geometric vane deflection angle and were accompanied by large thrust losses. Significant external flow effects that were dependent on Mach number and angle of attack were noted during vectoring operation. Comparisons of the aerodynamic and propulsive control capabilities of the HARV configuration indicate that substantial gains in controllability are provided by the multiaxis thrust-vectoring control system.

  16. Breast cancer risk assessment and diagnosis model using fuzzy support vector machine based expert system

    NASA Astrophysics Data System (ADS)

    Dheeba, J.; Jaya, T.; Singh, N. Albert

    2017-09-01

    Classification of cancerous masses is a challenging task in many computerised detection systems. Cancerous masses are difficult to detect because these masses are obscured and subtle in mammograms. This paper investigates an intelligent classifier - the fuzzy support vector machine (FSVM) - applied to classify the tissues containing masses on mammograms for breast cancer diagnosis. The algorithm utilises texture features extracted using Laws texture energy measures and an FSVM to classify the suspicious masses. The new FSVM treats every sample as belonging to both the normal and abnormal classes, but with different memberships. In this way, the new FSVM has more generalisation ability to classify the masses in mammograms. The classifier analysed 219 clinical mammograms collected from a breast cancer screening laboratory. The tests made on the real clinical mammograms show that the proposed detection system has better discriminating power than the conventional support vector machine. With the best combination of FSVM and Laws texture features, the area under the receiver operating characteristic curve reached 0.95, which corresponds to a sensitivity of 93.27% with a specificity of 87.17%. The results suggest that detecting masses using FSVM contributes to computer-aided detection of breast cancer and can serve as a decision support system for radiologists.

  17. Foamy Virus Vector Carries a Strong Insulator in Its Long Terminal Repeat Which Reduces Its Genotoxic Potential

    PubMed Central

    2017-01-01

    ABSTRACT Strong viral enhancers in gammaretrovirus vectors have caused cellular proto-oncogene activation and leukemia, necessitating the use of cellular promoters in “enhancerless” self-inactivating integrating vectors. However, cellular promoters result in relatively low transgene expression, often leading to inadequate disease phenotype correction. Vectors derived from foamy virus, a nonpathogenic retrovirus, show higher preference for nongenic integrations than gammaretroviruses/lentiviruses and preferential integration near transcriptional start sites, like gammaretroviruses. We found that strong viral enhancers/promoters placed in foamy viral vectors caused extremely low immortalization of primary mouse hematopoietic stem/progenitor cells compared to analogous gammaretrovirus/lentivirus vectors carrying the same enhancers/promoters, an effect not explained solely by foamy virus' modest insertional site preference for nongenic regions compared to gammaretrovirus/lentivirus vectors. Using CRISPR/Cas9-mediated targeted insertion of analogous proviral sequences into the LMO2 gene and then measuring LMO2 expression, we demonstrate a sequence-specific effect of foamy virus, independent of insertional bias, contributing to reduced genotoxicity. We show that this effect is mediated by a 36-bp insulator located in the foamy virus long terminal repeat (LTR) that has high-affinity binding to the CCCTC-binding factor. Using our LMO2 activation assay, LMO2 expression was significantly increased when this insulator was removed from foamy virus and significantly reduced when the insulator was inserted into the lentiviral LTR. Our results elucidate a mechanism underlying the low genotoxicity of foamy virus, identify a novel insulator, and support the use of foamy virus as a vector for gene therapy, especially when strong enhancers/promoters are required. IMPORTANCE Understanding the genotoxic potential of viral vectors is important in designing safe and efficacious vectors for gene therapy. Self-inactivating vectors devoid of viral long-terminal-repeat enhancers have proven safe; however, transgene expression from cellular promoters is often insufficient for full phenotypic correction. Foamy virus is an attractive vector for gene therapy. We found foamy virus vectors to be remarkably less genotoxic, well below what was expected from their integration site preferences. We demonstrate that the foamy virus long terminal repeats contain an insulator element that binds CCCTC-binding factor and reduces its insertional genotoxicity. Our study elucidates a mechanism behind the low genotoxic potential of foamy virus, identifies a unique insulator, and supports the use of foamy virus as a vector for gene therapy. PMID:29046446

  18. Hybrid Model Based on Genetic Algorithms and SVM Applied to Variable Selection within Fruit Juice Classification

    PubMed Central

    Fernandez-Lozano, C.; Canto, C.; Gestal, M.; Andrade-Garda, J. M.; Rabuñal, J. R.; Dorado, J.; Pazos, A.

    2013-01-01

    Given the background of the use of Neural Networks in problems of apple juice classification, this paper aims at implementing a newly developed method in the field of machine learning: Support Vector Machines (SVM). Therefore, a hybrid model that combines genetic algorithms and support vector machines is suggested in such a way that, when using SVM as the fitness function of the Genetic Algorithm (GA), the most representative variables for a specific classification problem can be selected. PMID:24453933

  19. Predicting healthcare associated infections using patients' experiences

    NASA Astrophysics Data System (ADS)

    Pratt, Michael A.; Chu, Henry

    2016-05-01

    Healthcare associated infections (HAI) are a major threat to patient safety and are costly to health systems. Our goal is to predict the HAI performance of a hospital using the patients' experience responses as input. We use four classifiers, viz. random forest, naive Bayes, artificial feedforward neural networks, and the support vector machine, to perform the prediction of six types of HAI. The six types include blood stream, urinary tract, surgical site, and intestinal infections. Experiments show that the random forest and support vector machine perform well across the six types of HAI.

  20. Real-time model learning using Incremental Sparse Spectrum Gaussian Process Regression.

    PubMed

    Gijsberts, Arjan; Metta, Giorgio

    2013-05-01

    Novel applications in unstructured and non-stationary human environments require robots that learn from experience and adapt autonomously to changing conditions. Predictive models therefore not only need to be accurate, but should also be updated incrementally in real-time and require minimal human intervention. Incremental Sparse Spectrum Gaussian Process Regression is an algorithm that is targeted specifically for use in this context. Rather than developing a novel algorithm from the ground up, the method is based on the thoroughly studied Gaussian Process Regression algorithm, therefore ensuring a solid theoretical foundation. Non-linearity and a bounded update complexity are achieved simultaneously by means of a finite dimensional random feature mapping that approximates a kernel function. As a result, the computational cost for each update remains constant over time. Finally, algorithmic simplicity and support for automated hyperparameter optimization ensures convenience when employed in practice. Empirical validation on a number of synthetic and real-life learning problems confirms that the performance of Incremental Sparse Spectrum Gaussian Process Regression is superior with respect to the popular Locally Weighted Projection Regression, while computational requirements are found to be significantly lower. The method is therefore particularly suited for learning with real-time constraints or when computational resources are limited. Copyright © 2012 Elsevier Ltd. All rights reserved.
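
    Two ingredients make this approach work: a finite random Fourier feature map that approximates an RBF kernel, and a recursive, constant-cost per-sample update of the regression weights. The sketch below illustrates both in a generic way; it is not the authors' implementation, and all hyperparameters are illustrative assumptions.

      # Sketch: random Fourier features approximating an RBF kernel, combined with
      # a recursive (Sherman-Morrison) ridge-regression update per sample.
      import numpy as np

      class IncrementalRFFRegressor:
          def __init__(self, dim_in, n_features=100, lengthscale=1.0, lam=1.0, seed=0):
              rng = np.random.default_rng(seed)
              self.W = rng.standard_normal((dim_in, n_features)) / lengthscale
              self.b = rng.uniform(0, 2 * np.pi, n_features)
              self.scale = np.sqrt(2.0 / n_features)
              self.P = np.eye(n_features) / lam   # running inverse of (Phi^T Phi + lam I)
              self.w = np.zeros(n_features)

          def _phi(self, x):
              return self.scale * np.cos(x @ self.W + self.b)

          def update(self, x, y):
              phi = self._phi(x)
              Pphi = self.P @ phi
              k = Pphi / (1.0 + phi @ Pphi)       # Sherman-Morrison gain
              self.w += k * (y - phi @ self.w)    # constant-cost per-sample update
              self.P -= np.outer(k, Pphi)

          def predict(self, x):
              return self._phi(x) @ self.w

      # Stream samples of a nonlinear function and learn online.
      model = IncrementalRFFRegressor(dim_in=1, n_features=200, lengthscale=0.5)
      rng = np.random.default_rng(1)
      for _ in range(2000):
          x = rng.uniform(-3, 3, size=1)
          model.update(x, np.sin(2 * x[0]) + 0.05 * rng.standard_normal())
      print("prediction at x=1:", model.predict(np.array([1.0])), "target:", np.sin(2.0))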

  1. Vector control activities: Fiscal Year, 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-04-01

    The program is divided into two major components - operations and support studies. The support studies are designed to improve the operational effectiveness and efficiency of the control program and to identify other vector control problems requiring TVA attention and study. Nonchemical methods of control are emphasized and are supplemented with chemical measures as needed. TVA also cooperates with various concerned municipalities in identifying blood-sucking arthropod pest problems and demonstrating control techniques useful in establishing abatement programs, and provides technical assistance to other TVA programs and organizations. The program also helps Land Between The Lakes (LBL) plan and conduct vector control operations and tick control research. Specific program control activities and support studies are discussed.

  2. Improving mammography screening among the medically underserved.

    PubMed

    Davis, Terry C; Rademaker, Alfred; Bennett, Charles L; Wolf, Michael S; Carias, Edson; Reynolds, Cristalyn; Liu, Dachao; Arnold, Connie L

    2014-04-01

    We evaluated the effectiveness and cost-effectiveness of alternative interventions designed to promote mammography in safety-net settings. A three-arm, quasi-experimental evaluation was conducted among eight federally qualified health clinics in predominately rural Louisiana. Mammography screening efforts included: 1) enhanced care, 2) health literacy-informed education of patients, and 3) education plus nurse support. Outcomes included mammography screening completion within 6 months and incremental cost-effectiveness. Overall, 1,181 female patients ages 40 and over who were eligible for routine mammography were recruited. Baseline screening rates were < 10%. Post intervention screening rates were 55.7% with enhanced care, 51.8% with health literacy-informed education and 65.8% with education and nurse support. After adjusting for race, marital status, self-efficacy and literacy, patients receiving health-literacy informed education were not more likely to complete mammographic screening than those receiving enhanced care; those additionally receiving nurse support were 1.37-fold more likely to complete mammographic screening than those receiving the brief education (95% Confidence Interval 1.08-1.74, p = 0.01). The incremental cost per additional women screened was $2,457 for literacy-informed education with nurse support over literacy-informed education alone. Mammography rates were increased substantially over existing baseline rates in all three arms with the educational initiative, with nurse support and follow-up being the most effective option. However, it is not likely to be cost-effective or affordable in resource-limited clinics.

  3. Detection of distorted frames in retinal video-sequences via machine learning

    NASA Astrophysics Data System (ADS)

    Kolar, Radim; Liberdova, Ivana; Odstrcilik, Jan; Hracho, Michal; Tornow, Ralf P.

    2017-07-01

    This paper describes the detection of distorted frames in retinal sequences based on a set of global features extracted from each frame. The feature vector is subsequently used in a classification step, in which three types of classifiers are tested. The best classification accuracy, 96%, was achieved with the support vector machine approach.

  4. Context and Content Aware Routing of Managed Information Objects

    DTIC Science & Technology

    2014-05-01

    datatype. Siena, however, does not support incremental updates (i.e., subscription posting and deletion) and so updates must be done in batch mode...Although the present implementation of PUBSUB does not support the string datatype, its architecture is sufficiently versatile to accommodate this... datatype with the inclusion of additional data structures as described in Section 3. 3. PUBSUB Section 3.1 describes how PUBSUB organizes its database of

  5. Using the Relevance Vector Machine Model Combined with Local Phase Quantization to Predict Protein-Protein Interactions from Protein Sequences.

    PubMed

    An, Ji-Yong; Meng, Fan-Rong; You, Zhu-Hong; Fang, Yu-Hong; Zhao, Yu-Jun; Zhang, Ming

    2016-01-01

    We propose a novel computational method known as RVM-LPQ that combines the Relevance Vector Machine (RVM) model and Local Phase Quantization (LPQ) to predict PPIs from protein sequences. The main improvements result from representing protein sequences using the LPQ feature representation on a Position Specific Scoring Matrix (PSSM), reducing the influence of noise using Principal Component Analysis (PCA), and using a Relevance Vector Machine (RVM) based classifier. We perform 5-fold cross-validation experiments on Yeast and Human datasets, and we achieve very high accuracies of 92.65% and 97.62%, respectively, which is significantly better than in previous works. To further evaluate the proposed method, we compare it with the state-of-the-art support vector machine (SVM) classifier on the Yeast dataset. The experimental results demonstrate that our RVM-LPQ method is clearly better than the SVM-based method. The promising experimental results show the efficiency and simplicity of the proposed method, which can be an automatic decision support tool for future proteomics research.

  6. Study on vibration characteristics and fault diagnosis method of oil-immersed flat wave reactor in Arctic area converter station

    NASA Astrophysics Data System (ADS)

    Lai, Wenqing; Wang, Yuandong; Li, Wenpeng; Sun, Guang; Qu, Guomin; Cui, Shigang; Li, Mengke; Wang, Yongqiang

    2017-10-01

    Based on long-term vibration monitoring of the No. 2 oil-immersed flat wave reactor in the ±500 kV converter station in East Mongolia, the vibration signals in the normal state and in the core-loose fault state were recorded. Through time-frequency analysis of the signals, the vibration characteristics of the core-loose fault were obtained, and a fault diagnosis method based on the dual-tree complex wavelet transform (DT-CWT) and support vector machine (SVM) was proposed. The vibration signals were analyzed by DT-CWT, and the energy entropy of the vibration signals was taken as the feature vector; the support vector machine was trained and tested on the feature vectors, and accurate identification of the core-loose fault of the flat wave reactor was realized. Through the identification of many groups of normal and core-loose fault state vibration signals, the diagnostic accuracy reached 97.36%. The effectiveness and accuracy of the method for fault diagnosis of the flat wave reactor core are verified.

  7. Effect of 2,6-Bis-(1-hydroxy-1,1-diphenyl-methyl) Pyridine as Organic Additive in Sulfide NiMoP/γ-Al₂O₃ Catalyst for Hydrodesulfurization of Straight-Run Gas Oil.

    PubMed

    Santolalla-Vargas, Carlos Eduardo; Santes, Victor; Meneses-Domínguez, Erick; Escamilla, Vicente; Hernández-Gordillo, Agileo; Gómez, Elizabeth; Sánchez-Minero, Felipe; Escobar, José; Díaz, Leonardo; Goiz, Oscar

    2017-08-15

    The effect of 2,6-bis-(1-hydroxy-1,1-diphenyl-methyl) pyridine (BDPHP) in the preparation of NiMoP/γ-Al₂O₃ catalysts has been investigated in the hydrodesulfurization (HDS) of straight-run gas oil. The γ-Al₂O₃ support was modified by surface impregnation of a solution of BDPHP to afford BDPHP/Ni molar ratios (0.5 and 1.0) in the final composition. The highest activity for NiMoP materials was found when the molar ratio of BDPHP/Ni was 0.5. X-ray diffraction (XRD) results revealed that NiMoP (0.5) showed better dispersion of MoO₃ than NiMoP (1.0). Fourier transform infrared spectroscopy (FT-IR) results indicated that the organic additive interacts with the γ-Al₂O₃ surface and therefore rules out the presence of Mo or Ni complexes. Raman spectroscopy suggested a high Raman ratio for the NiMoP (0.5) sample. The increase of Mo=O species is related to a greater availability of Mo species for the formation of MoS₂. The temperature programmed reduction (TPR) results showed that NiMoP (0.5) displayed moderate metal-support interaction. Likewise, X-ray photoelectron spectroscopy (XPS) exhibited a higher sulfurization degree for NiMoP (0.5) compared with NiMoP (1.0). The improved MoO₃ dispersion, the moderate metal-support interaction, the increased sulfurization degree and the increase of Mo=O species brought about by the BDPHP incorporation resulted in higher gas oil HDS activity.

  8. Orthogonal vector algorithm to obtain the solar vector using the single-scattering Rayleigh model.

    PubMed

    Wang, Yinlong; Chu, Jinkui; Zhang, Ran; Shi, Chao

    2018-02-01

    Information obtained from a polarization pattern in the sky provides many animals like insects and birds with vital long-distance navigation cues. The solar vector can be derived from the polarization pattern using the single-scattering Rayleigh model. In this paper, an orthogonal vector algorithm, which utilizes the redundancy of the single-scattering Rayleigh model, is proposed. We use the intersection angles between the polarization vectors as the main criteria in our algorithm. The assumption that all polarization vectors can be considered coplanar is used to simplify the three-dimensional (3D) problem with respect to the polarization vectors in our simulation. The surface-normal vector of the plane, which is determined by the polarization vectors after translation, represents the solar vector. Unfortunately, the two-directionality of the polarization vectors makes the resulting solar vector ambiguous. One important result of this study is, however, that this apparent disadvantage has no effect on the complexity of the algorithm. Furthermore, two other universal least-squares algorithms were investigated and compared. A device was then constructed, which consists of five polarized-light sensors as well as a 3D attitude sensor. Both the simulation and experimental data indicate that the orthogonal vector algorithms, if used with a suitable threshold, perform equally well or better than the other two algorithms. Our experimental data reveal that if the intersection angles between the polarization vectors are close to 90°, the solar-vector angle deviations are small. The data also support the assumption of coplanarity. During the 51 min experiment, the mean of the measured solar-vector angle deviations was about 0.242°, as predicted by our theoretical model.
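
    A minimal sketch of the orthogonal-vector idea follows: under the single-scattering Rayleigh model, polarization vectors measured in different viewing directions are (approximately) perpendicular to the solar vector, so the solar vector can be estimated as the normal of the plane they span; the ± ambiguity noted above remains. Function names and the toy data are illustrative, not the paper's implementation.

        import numpy as np

        def solar_vector_from_polarization(p_vectors):
            """p_vectors: (n, 3) array of measured polarization direction vectors."""
            P = np.asarray(p_vectors, dtype=float)
            P /= np.linalg.norm(P, axis=1, keepdims=True)
            # The solar vector minimises sum_i (p_i . s)^2 -> smallest right-singular vector.
            _, _, vt = np.linalg.svd(P)
            s = vt[-1]
            return s / np.linalg.norm(s)

        # Toy check: generate noisy vectors perpendicular to a known sun direction.
        rng = np.random.default_rng(2)
        sun = np.array([0.3, 0.4, 0.866]); sun /= np.linalg.norm(sun)
        obs = []
        for _ in range(5):
            v = rng.normal(size=3)
            v -= v.dot(sun) * sun                  # project onto the plane perpendicular to the sun
            obs.append(v / np.linalg.norm(v) + 0.01 * rng.normal(size=3))
        print(solar_vector_from_polarization(obs))  # approximately +/- sun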

  9. Feature selection using a one dimensional naïve Bayes’ classifier increases the accuracy of support vector machine classification of CDR3 repertoires

    PubMed Central

    Cinelli, Mattia; Sun, Yuxin; Best, Katharine; Heather, James M.; Reich-Zeliger, Shlomit; Shifrut, Eric; Friedman, Nir; Shawe-Taylor, John; Chain, Benny

    2017-01-01

    Abstract Motivation: Somatic DNA recombination, the hallmark of vertebrate adaptive immunity, has the potential to generate a vast diversity of antigen receptor sequences. How this diversity captures antigen specificity remains incompletely understood. In this study we use high throughput sequencing to compare the global changes in T cell receptor β chain complementarity determining region 3 (CDR3β) sequences following immunization with ovalbumin administered with complete Freund’s adjuvant (CFA) or CFA alone. Results: The CDR3β sequences were deconstructed into short stretches of overlapping contiguous amino acids. The motifs were ranked according to a one-dimensional Bayesian classifier score comparing their frequency in the repertoires of the two immunization classes. The top ranking motifs were selected and used to create feature vectors which were used to train a support vector machine. The support vector machine achieved high classification scores in a leave-one-out validation test reaching >90% in some cases. Summary: The study describes a novel two-stage classification strategy combining a one-dimensional Bayesian classifier with a support vector machine. Using this approach we demonstrate that the frequency of a small number of linear motifs three amino acids in length can accurately identify a CD4 T cell response to ovalbumin against a background response to the complex mixture of antigens which characterize Complete Freund’s Adjuvant. Availability and implementation: The sequence data is available at www.ncbi.nlm.nih.gov/sra/?term=SRP075893. The Decombinator package is available at github.com/innate2adaptive/Decombinator. The R package e1071 is available at the CRAN repository https://cran.r-project.org/web/packages/e1071/index.html. Contact: b.chain@ucl.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28073756
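
    The two-stage idea can be sketched roughly as follows: count overlapping 3-mers in each repertoire, rank motifs by a simple difference-in-frequency score (standing in here for the one-dimensional Bayesian classifier score), keep the top motifs as features, and train an SVM evaluated by leave-one-out validation. The sequences, score, and cut-offs are illustrative assumptions, not the study's data or exact method.

        from collections import Counter
        import numpy as np
        from sklearn.model_selection import LeaveOneOut, cross_val_score
        from sklearn.svm import SVC

        def kmer_counts(seq, k=3):
            return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

        # Toy repertoires: lists of CDR3 amino-acid strings per sample, with class labels.
        samples = [["CASSLGG", "CASSQET", "CASRGDT"], ["CASSLAG", "CASSPGT", "CATSRDT"],
                   ["CASSQGG", "CASSLET", "CASRGET"], ["CASSPAG", "CATSLDT", "CASSRGT"]]
        y = np.array([0, 1, 0, 1])

        per_sample = [sum((kmer_counts(s) for s in rep), Counter()) for rep in samples]
        vocab = sorted(set().union(*per_sample))

        # Rank motifs by absolute difference of mean relative frequency between the two classes.
        freq = np.array([[c[m] for m in vocab] for c in per_sample], dtype=float)
        freq /= freq.sum(axis=1, keepdims=True)
        score = np.abs(freq[y == 0].mean(axis=0) - freq[y == 1].mean(axis=0))
        top = np.argsort(score)[::-1][:10]

        X = freq[:, top]
        acc = cross_val_score(SVC(kernel="linear"), X, y, cv=LeaveOneOut()).mean()
        print("leave-one-out accuracy:", acc)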

  10. A gene cassette for adapting Escherichia coli strains as hosts for att-Int-mediated rearrangement and pL expression vectors.

    PubMed

    Balakrishnan, R; Bolten, B; Backman, K C

    1994-01-28

    A cassette of genes from bacteriophage lambda, when carried on a derivative of bacteriophage Mu, renders strains of Escherichia coli (and in principle other Mu-sensitive bacteria) capable of supporting lambda-based expression vectors, such as rearrangement vectors and pL vectors. The gene cassette contains a temperature-sensitive allele of the repressor gene, cIts857, and a shortened leftward operon comprising oLpL, N, xis and int. Transfection and lysogenization of this cassette into various host bacteria is mediated by phage Mu functions. Examples of regulated expression of the gene encoding T4 DNA ligase are presented.

  11. Slotted rotatable target assembly and systematic error analysis for a search for long range spin dependent interactions from exotic vector boson exchange using neutron spin rotation

    NASA Astrophysics Data System (ADS)

    Haddock, C.; Crawford, B.; Fox, W.; Francis, I.; Holley, A.; Magers, S.; Sarsour, M.; Snow, W. M.; Vanderwerp, J.

    2018-03-01

    We discuss the design and construction of a novel target array of nonmagnetic test masses used in a neutron polarimetry measurement made in a search for possible new exotic spin-dependent neutron-atom interactions of Nature at sub-mm length scales. This target was designed to accept and efficiently transmit a transversely polarized slow neutron beam through a series of long open parallel slots bounded by flat rectangular plates. These openings possessed equal atom density gradients normal to the slots from the flat test masses, with dimensions optimized to achieve maximum sensitivity to an exotic spin-dependent interaction from vector boson exchanges with ranges in the mm-μm regime. The parallel slots were oriented differently in four quadrants that can be rotated about the neutron beam axis in discrete 90° increments using a Geneva drive. The spin rotation signals from the four quadrants were measured using a segmented neutron ion chamber to suppress possible systematic errors from stray magnetic fields in the target region. We discuss the per-neutron sensitivity of the target to the exotic interaction, the design constraints, the potential sources of systematic errors which could be present in this design, and our estimate of the achievable sensitivity using this method.

  12. Approaches to utilize mesenchymal progenitor cells as cellular vehicles.

    PubMed

    Pereboeva, L; Komarova, S; Mikheeva, G; Krasnykh, V; Curiel, D T

    2003-01-01

    Mammalian cells represent a novel vector approach for gene delivery that overcomes major drawbacks of viral and nonviral vectors and couples cell therapy with gene delivery. A variety of cell types have been tested in this regard, confirming that the ideal cellular vector system for ex vivo gene therapy has to comply with stringent criteria and is yet to be found. Several properties of mesenchymal progenitor cells (MPCs), such as easy access and simple isolation and propagation procedures, make these cells attractive candidates as cellular vehicles. In the current work, we evaluated the potential utility of MPCs as cellular vectors with the intent to use them in the cancer therapy context. When conventional adenoviral (Ad) vectors were used for MPC transduction, the highest transduction efficiency of MPCs was 40%. We demonstrated that Ad primary-binding receptors were poorly expressed on MPCs, while the secondary Ad receptors and integrins presented in sufficient amounts. By employing Ad vectors with incorporated integrin-binding motifs (Ad5lucRGD), MPC transduction was augmented tenfold, achieving efficient genetic loading of MPCs with reporter and anticancer genes. MPCs expressing thymidine kinase were able to exert a bystander killing effect on the cancer cell line SKOV3ip1 in vitro. In addition, we found that MPCs were able to support Ad replication, and thus can be used as cell vectors to deliver oncolytic viruses. Our results show that MPCs can foster expression of suicide genes or support replication of adenoviruses as potential anticancer therapeutic payloads. These findings are consistent with the concept that MPCs possess key properties that ensure their employment as cellular vehicles and can be used to deliver either therapeutic genes or viruses to tumor sites.

  13. Extending R packages to support 64-bit compiled code: An illustration with spam64 and GIMMS NDVI3g data

    NASA Astrophysics Data System (ADS)

    Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard

    2017-07-01

    Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user-interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is, besides performance, the limited access to low-level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. On the R side, users do not need to change existing code and may not even notice the extension. On the other hand, interfacing 64-bit compiled code efficiently is challenging. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.

  14. Concurrent validity and clinical usefulness of several individually administered tests of children's social-emotional cognition.

    PubMed

    McKown, Clark

    2007-03-01

    In this study, the validity of 5 tests of children's social-emotional cognition, defined as their encoding, memory, and interpretation of social information, was tested. Participants were 126 clinic-referred children between the ages of 5 and 17. All 5 tests were evaluated in terms of their (a) concurrent validity, (b) incremental validity, and (c) clinical usefulness in predicting social functioning. Tests included measures of nonverbal sensitivity, social language, and social problem solving. Criterion measures included parent and teacher report of social functioning. Analyses support the concurrent validity of all measures, and the incremental validity and clinical usefulness of tests of pragmatic language and problem solving.

  15. Pharmacoeconomics of ruxolitinib therapy in patients with myelofibrosis.

    PubMed

    Vandewalle, Björn; Andreozzi, Valeska; Almeida, João; Félix, Jorge

    2016-01-01

    Overall survival (OS) and other important clinical trial end-points seem increasingly more elusive in supporting the rapid and efficient incorporation of innovative cancer drugs in clinical practice. This study proposes a clinical-trial-based pharmacoeconomic framework to assess the therapeutic and economic value of ruxolitinib in patients with intermediate-2 or high-risk myelofibrosis. Individual patient-level, 144-week follow-up data from the COMFORT-II trial were used to account for the crossover effect on overall survival. Lifetime treatment benefits and costs were estimated considering detailed patterns of both ruxolitinib dose adjustments and blood transfusion needs. The authors estimate a 3.3-year increment in life expectancy (HR = 0.30; 95% CI = 0.17-0.55; p-value <0.001) and an incremental cost-effectiveness ratio of €40,000 per life year gained with the use of ruxolitinib. This study also demonstrates how valuable information from clinical trials can be used to support informed decisions about the early incorporation of innovative drugs.

  16. Testing the construct validity of willingness to pay valuations using objective information about risk and health benefit.

    PubMed

    Philips, Zoë; Whynes, David K; Avis, Mark

    2006-02-01

    This paper describes an experiment to test the construct validity of contingent valuation, by eliciting women's valuations for the NHS cervical cancer screening programme. It is known that, owing to low levels of knowledge of cancer and screening in the general population, women both over-estimate the risk of disease and the efficacy of screening. The study is constructed as a randomised experiment, in which one group is provided with accurate information about cervical cancer screening, whilst the other is not. The first hypothesis supporting construct validity, that controls who perceive greater benefits from screening will offer higher valuations, is substantiated. Both groups are then provided with objective information on an improvement to the screening programme, and are asked to value the improvement as an increment to their original valuations. The second hypothesis supporting construct validity, that controls who perceive the benefits of the programme to be high already will offer lower incremental valuations, is also substantiated. Copyright 2005 John Wiley & Sons, Ltd.

  17. Polynomial interpretation of multipole vectors

    NASA Astrophysics Data System (ADS)

    Katz, Gabriel; Weeks, Jeff

    2004-09-01

    Copi, Huterer, Starkman, and Schwarz introduced multipole vectors in a tensor context and used them to demonstrate that the first-year Wilkinson microwave anisotropy probe (WMAP) quadrupole and octopole planes align at roughly the 99.9% confidence level. In the present article, the language of polynomials provides a new and independent derivation of the multipole vector concept. Bézout’s theorem supports an elementary proof that the multipole vectors exist and are unique (up to rescaling). The constructive nature of the proof leads to a fast, practical algorithm for computing multipole vectors. We illustrate the algorithm by finding exact solutions for some simple toy examples and numerical solutions for the first-year WMAP quadrupole and octopole. We then apply our algorithm to Monte Carlo skies to independently reconfirm the estimate that the WMAP quadrupole and octopole planes align at the 99.9% level.
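
    As a compact restatement of the polynomial picture (our paraphrase of the Maxwell-type multipole-vector representation, not a quotation of the paper), a real harmonic polynomial of degree ℓ in three variables factors, up to a scalar and a radial remainder, into a product of ℓ linear forms:

        P_\ell(\mathbf{x}) \;=\; \lambda\,(\hat{\mathbf{v}}_1\cdot\mathbf{x})(\hat{\mathbf{v}}_2\cdot\mathbf{x})\cdots(\hat{\mathbf{v}}_\ell\cdot\mathbf{x}) \;+\; |\mathbf{x}|^2\, R_{\ell-2}(\mathbf{x})

    where the unit vectors \hat{\mathbf{v}}_i are the multipole vectors, \lambda is a scalar, and R_{\ell-2} is a homogeneous polynomial of degree ℓ-2; the existence and essential uniqueness of such a factorization (up to rescaling) is what the Bézout-based argument mentioned above establishes.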

  18. LANDMARK-BASED SPEECH RECOGNITION: REPORT OF THE 2004 JOHNS HOPKINS SUMMER WORKSHOP.

    PubMed

    Hasegawa-Johnson, Mark; Baker, James; Borys, Sarah; Chen, Ken; Coogan, Emily; Greenberg, Steven; Juneja, Amit; Kirchhoff, Katrin; Livescu, Karen; Mohan, Srividya; Muller, Jennifer; Sonmez, Kemal; Wang, Tianyu

    2005-01-01

    Three research prototype speech recognition systems are described, all of which use recently developed methods from artificial intelligence (specifically support vector machines, dynamic Bayesian networks, and maximum entropy classification) in order to implement, in the form of an automatic speech recognizer, current theories of human speech perception and phonology (specifically landmark-based speech perception, nonlinear phonology, and articulatory phonology). All three systems begin with a high-dimensional multiframe acoustic-to-distinctive feature transformation, implemented using support vector machines trained to detect and classify acoustic phonetic landmarks. Distinctive feature probabilities estimated by the support vector machines are then integrated using one of three pronunciation models: a dynamic programming algorithm that assumes canonical pronunciation of each word, a dynamic Bayesian network implementation of articulatory phonology, or a discriminative pronunciation model trained using the methods of maximum entropy classification. Log probability scores computed by these models are then combined, using log-linear combination, with other word scores available in the lattice output of a first-pass recognizer, and the resulting combination score is used to compute a second-pass speech recognition output.

  19. Experimental and computational prediction of glass transition temperature of drugs.

    PubMed

    Alzghoul, Ahmad; Alhalaweh, Amjad; Mahlin, Denny; Bergström, Christel A S

    2014-12-22

    Glass transition temperature (Tg) is an important inherent property of an amorphous solid material which is usually determined experimentally. In this study, the relation between Tg and melting temperature (Tm) was evaluated using a data set of 71 structurally diverse druglike compounds. Further, in silico models for prediction of Tg were developed based on calculated molecular descriptors and linear (multilinear regression, partial least-squares, principal component regression) and nonlinear (neural network, support vector regression) modeling techniques. The models based on Tm predicted Tg with an RMSE of 19.5 K for the test set. Among the five computational models developed herein the support vector regression gave the best result with RMSE of 18.7 K for the test set using only four chemical descriptors. Hence, two different models that predict Tg of drug-like molecules with high accuracy were developed. If Tm is available, a simple linear regression can be used to predict Tg. However, the results also suggest that support vector regression and calculated molecular descriptors can predict Tg with equal accuracy, already before compound synthesis.
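
    A sketch of the second modelling route described above, support vector regression on a handful of calculated descriptors, is shown below; the descriptor values, the synthetic Tg relation, and the hyperparameters are placeholders, not the four descriptors or model settings of the study.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR
        from sklearn.metrics import mean_squared_error

        rng = np.random.default_rng(3)
        X = rng.normal(size=(71, 4))                      # 4 molecular descriptors per compound
        Tg = 300 + 25 * X[:, 0] - 10 * X[:, 1] + rng.normal(scale=5, size=71)  # synthetic targets (K)

        X_tr, X_te, y_tr, y_te = train_test_split(X, Tg, test_size=0.3, random_state=0)
        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
        model.fit(X_tr, y_tr)
        rmse = np.sqrt(mean_squared_error(y_te, model.predict(X_te)))
        print("test RMSE (K): %.1f" % rmse)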

  20. Anticipatory Monitoring and Control of Complex Systems using a Fuzzy based Fusion of Support Vector Regressors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miltiadis Alamaniotis; Vivek Agarwal

    This paper places itself in the realm of anticipatory systems and envisions monitoring and control methods capable of making predictions over system-critical parameters. Anticipatory systems allow intelligent control of complex systems by predicting their future state. In the current work, an intelligent model aimed at implementing anticipatory monitoring and control in the energy industry is presented and tested. More particularly, a set of support vector regressors (SVRs) is trained using both historical and observed data. The trained SVRs are used to predict the future value of the system based on current operational system parameters. The predicted values are then input to a fuzzy logic based module where the values are fused to obtain a single value, i.e., the final system output prediction. The methodology is tested on real turbine degradation datasets. The outcome of the approach presented in this paper highlights its superiority over single support vector regressors. In addition, it is shown that appropriate selection of fuzzy sets and fuzzy rules plays an important role in improving system performance.

  1. Agricultural mapping using Support Vector Machine-Based Endmember Extraction (SVM-BEE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archibald, Richard K; Filippi, Anthony M; Bhaduri, Budhendra L

    Extracting endmembers from remotely sensed images of vegetated areas can present difficulties. In this research, we applied a recently developed endmember-extraction algorithm based on Support Vector Machines (SVMs) to the problem of semi-autonomous estimation of vegetation endmembers from a hyperspectral image. This algorithm, referred to as Support Vector Machine-Based Endmember Extraction (SVM-BEE), accurately and rapidly yields a computed representation of hyperspectral data that can accommodate multiple distributions. The number of distributions is identified without prior knowledge, based upon this representation. Prior work established that SVM-BEE is robustly noise-tolerant and can semi-automatically and effectively estimate endmembers; synthetic data and a geologic scene were previously analyzed. Here we compared the efficacies of the SVM-BEE and N-FINDR algorithms in extracting endmembers from a predominantly agricultural scene. SVM-BEE was able to estimate vegetation and other endmembers for all classes in the image, which N-FINDR failed to do. Classifications based on SVM-BEE endmembers were markedly more accurate compared with those based on N-FINDR endmembers.

  2. Exploring the capabilities of support vector machines in detecting silent data corruptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Subasi, Omer; Di, Sheng; Bautista-Gomez, Leonardo

    As the exascale era approaches, the increasing capacity of high-performance computing (HPC) systems with targeted power and energy budget goals introduces significant challenges in reliability. Silent data corruptions (SDCs), or silent errors, are one of the major sources that corrupt the execution results of HPC applications without being detected. In this paper, we explore a set of novel SDC detectors, leveraging epsilon-insensitive support vector machine regression, to detect SDCs that occur in HPC applications. The key contributions are threefold. (1) Our exploration takes temporal, spatial, and spatiotemporal features into account and analyzes different detectors based on different features. (2) We provide an in-depth study on the detection ability and performance with different parameters, and we optimize the detection range carefully. (3) Experiments with eight real-world HPC applications show that support-vector-machine-based detectors can achieve detection sensitivity (i.e., recall) up to 99% yet suffer a less than 1% false positive rate for most cases. Our detectors incur low performance overhead, 5% on average, for all benchmarks studied in this work.

  3. Exploring the capabilities of support vector machines in detecting silent data corruptions

    DOE PAGES

    Subasi, Omer; Di, Sheng; Bautista-Gomez, Leonardo; ...

    2018-02-01

    As the exascale era approaches, the increasing capacity of high-performance computing (HPC) systems with targeted power and energy budget goals introduces significant challenges in reliability. Silent data corruptions (SDCs), or silent errors, are one of the major sources that corrupt the execution results of HPC applications without being detected. In this paper, we explore a set of novel SDC detectors, leveraging epsilon-insensitive support vector machine regression, to detect SDCs that occur in HPC applications. The key contributions are threefold. (1) Our exploration takes temporal, spatial, and spatiotemporal features into account and analyzes different detectors based on different features. (2) We provide an in-depth study on the detection ability and performance with different parameters, and we optimize the detection range carefully. (3) Experiments with eight real-world HPC applications show that support-vector-machine-based detectors can achieve detection sensitivity (i.e., recall) up to 99% yet suffer a less than 1% false positive rate for most cases. Our detectors incur low performance overhead, 5% on average, for all benchmarks studied in this work.
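
    One detector in this spirit can be sketched as follows (a hedged illustration, not the paper's implementation): an epsilon-insensitive SVR predicts the next value of a monitored variable from a short temporal window, and a point is flagged as a possible silent data corruption when the prediction residual exceeds a threshold. The window length, threshold, and injected error are illustrative choices.

        import numpy as np
        from sklearn.svm import SVR

        def fit_temporal_svr(series, window=4, epsilon=0.01):
            X = np.array([series[i:i + window] for i in range(len(series) - window)])
            y = series[window:]
            return SVR(kernel="rbf", C=10.0, epsilon=epsilon).fit(X, y)

        def flag_sdc(model, series, window=4, threshold=0.05):
            X = np.array([series[i:i + window] for i in range(len(series) - window)])
            residual = np.abs(model.predict(X) - series[window:])
            return np.where(residual > threshold)[0] + window   # indices of suspect samples

        t = np.linspace(0, 4 * np.pi, 400)
        clean = np.sin(t)
        corrupted = clean.copy()
        corrupted[250] += 0.5                       # injected bit-flip-like error

        model = fit_temporal_svr(clean)
        print(flag_sdc(model, corrupted))           # indices around 250 should be flagged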

  4. Optimizing support vector machine learning for semi-arid vegetation mapping by using clustering analysis

    NASA Astrophysics Data System (ADS)

    Su, Lihong

    In remote sensing communities, support vector machine (SVM) learning has recently received increasing attention. SVM learning usually requires large memory and enormous amounts of computation time on large training sets. According to SVM algorithms, the SVM classification decision function is fully determined by support vectors, which compose a subset of the training sets. In this regard, a solution to optimize SVM learning is to efficiently reduce training sets. In this paper, a data reduction method based on agglomerative hierarchical clustering is proposed to obtain smaller training sets for SVM learning. Using a multiple angle remote sensing dataset of a semi-arid region, the effectiveness of the proposed method is evaluated by classification experiments with a series of reduced training sets. The experiments show that there is no loss of SVM accuracy when the original training set is reduced to 34% using the proposed approach. Maximum likelihood classification (MLC) also is applied on the reduced training sets. The results show that MLC can also maintain the classification accuracy. This implies that the most informative data instances can be retained by this approach.
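
    A minimal sketch of the training-set reduction idea follows: cluster each class with agglomerative hierarchical clustering, keep one representative (here the cluster centroid) per cluster, and train the SVM on the reduced set. The synthetic data, cluster counts, and use of centroids are illustrative assumptions, not the paper's exact procedure.

        import numpy as np
        from sklearn.cluster import AgglomerativeClustering
        from sklearn.datasets import make_classification
        from sklearn.svm import SVC

        X, y = make_classification(n_samples=3000, n_features=8, n_informative=5, random_state=0)

        def reduce_per_class(X, y, n_clusters=100):
            Xr, yr = [], []
            for label in np.unique(y):
                Xc = X[y == label]
                labels = AgglomerativeClustering(n_clusters=n_clusters).fit_predict(Xc)
                for c in range(n_clusters):
                    Xr.append(Xc[labels == c].mean(axis=0))   # cluster centroid as representative
                    yr.append(label)
            return np.array(Xr), np.array(yr)

        X_red, y_red = reduce_per_class(X, y)
        full = SVC().fit(X, y).score(X, y)
        reduced = SVC().fit(X_red, y_red).score(X, y)
        print("accuracy on all data: full=%.3f reduced=%.3f" % (full, reduced))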

  5. Clifford support vector machines for classification, regression, and recurrence.

    PubMed

    Bayro-Corrochano, Eduardo Jose; Arana-Daniel, Nancy

    2010-11-01

    This paper introduces the Clifford support vector machines (CSVM) as a generalization of the real and complex-valued support vector machines using the Clifford geometric algebra. In this framework, we handle the design of kernels involving the Clifford or geometric product. In this approach, one redefines the optimization variables as multivectors. This allows us to have a multivector as output. Therefore, we can represent multiple classes according to the dimension of the geometric algebra in which we work. We show that one can apply CSVM for classification and regression and also to build a recurrent CSVM. The CSVM is an attractive approach for the multiple input multiple output processing of high-dimensional geometric entities. We carried out comparisons between CSVM and the current approaches to solve multiclass classification and regression. We also study the performance of the recurrent CSVM with experiments involving time series. The authors believe that this paper can be of great use for researchers and practitioners interested in multiclass hypercomplex computing, particularly for applications in complex and quaternion signal and image processing, satellite control, neurocomputation, pattern recognition, computer vision, augmented virtual reality, robotics, and humanoids.

  6. Analysis of programming properties and the row-column generation method for 1-norm support vector machines.

    PubMed

    Zhang, Li; Zhou, WeiDa

    2013-12-01

    This paper deals with fast methods for training a 1-norm support vector machine (SVM). First, we define a specific class of linear programming with many sparse constraints, i.e., row-column sparse constraint linear programming (RCSC-LP). By nature, the 1-norm SVM is a sort of RCSC-LP. In order to construct subproblems for RCSC-LP and solve them, a family of row-column generation (RCG) methods is introduced. RCG methods belong to a category of decomposition techniques, and perform row and column generations in a parallel fashion. Specifically, for the 1-norm SVM, the maximum size of subproblems of RCG is identical with the number of Support Vectors (SVs). We also introduce a semi-deleting rule for RCG methods and prove the convergence of RCG methods when using the semi-deleting rule. Experimental results on toy data and real-world datasets illustrate that it is efficient to use RCG to train the 1-norm SVM, especially in the case of small SVs. Copyright © 2013 Elsevier Ltd. All rights reserved.
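
    To make the linear-programming view concrete, the sketch below writes the 1-norm SVM as a plain LP (the full problem, without the row-column generation speed-up described above) and solves it with scipy's linprog; the split w = u - v with u, v >= 0 linearises the 1-norm of the weight vector. Data and the regularisation constant are illustrative.

        import numpy as np
        from scipy.optimize import linprog

        def one_norm_svm(X, y, C=1.0):
            n, d = X.shape
            # variables: [u (d), v (d), b (1), xi (n)], objective sum(u)+sum(v)+C*sum(xi)
            c = np.concatenate([np.ones(2 * d), [0.0], C * np.ones(n)])
            Yx = y[:, None] * X
            # constraint y_i((u-v).x_i + b) + xi_i >= 1, rewritten as A_ub @ vars <= b_ub
            A_ub = np.hstack([-Yx, Yx, -y[:, None], -np.eye(n)])
            b_ub = -np.ones(n)
            bounds = [(0, None)] * (2 * d) + [(None, None)] + [(0, None)] * n
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
            w = res.x[:d] - res.x[d:2 * d]
            b = res.x[2 * d]
            return w, b

        rng = np.random.default_rng(4)
        X = np.vstack([rng.normal(-1, 1, (20, 2)), rng.normal(1, 1, (20, 2))])
        y = np.array([-1] * 20 + [1] * 20)
        w, b = one_norm_svm(X, y, C=1.0)
        print(w, b, np.mean(np.sign(X @ w + b) == y))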

  7. Application of support vector machines for copper potential mapping in Kerman region, Iran

    NASA Astrophysics Data System (ADS)

    Shabankareh, Mahdi; Hezarkhani, Ardeshir

    2017-04-01

    The first step in systematic exploration studies is mineral potential mapping, which involves classifying the study area into favorable and unfavorable parts. Support vector machines (SVM) are designed for supervised classification based on statistical learning theory; this method is named support vector classification (SVC). This paper describes an SVC model that combines regional-scale exploration data for copper potential mapping in the Kerman copper-bearing belt in the south of Iran. Data layers, or evidential maps, comprised six datasets, namely lithology, tectonics, airborne geophysics, ferric alteration, hydroxide alteration and geochemistry. The SVC modeling selected 2220 pixels as favorable zones, approximately 25 percent of the study area. Moreover, 66 out of 86 copper indices, approximately 78.6% of the total, were located in favorable zones. Another main goal of this study was to determine how each input affects the favorable output. For this purpose, the histogram of each normalized input dataset against its favorable output was drawn. The histograms showed that each information layer had a certain pattern. These patterns of the SVC results could be considered as regional copper exploration characteristics.

  8. Hybrid modelling based on support vector regression with genetic algorithms in forecasting the cyanotoxins presence in the Trasona reservoir (Northern Spain).

    PubMed

    García Nieto, P J; Alonso Fernández, J R; de Cos Juez, F J; Sánchez Lasheras, F; Díaz Muñiz, C

    2013-04-01

    Cyanotoxins, a kind of poisonous substance produced by cyanobacteria, are responsible for health risks in drinking and recreational waters. As a result, anticipating their presence is important to prevent risks. The aim of this study is to use a hybrid approach based on support vector regression (SVR) in combination with genetic algorithms (GAs), known as a genetic algorithm support vector regression (GA-SVR) model, in forecasting the cyanotoxin presence in the Trasona reservoir (Northern Spain). The GA-SVR approach is aimed at highly nonlinear biological problems with sharp peaks, and the tests carried out proved its high performance. Some physical-chemical parameters have been considered along with the biological ones. The results obtained are two-fold. In the first place, the significance of each biological and physical-chemical variable on the cyanotoxin presence in the reservoir is determined with success. Finally, a predictive model able to forecast the possible presence of cyanotoxins in the short term was obtained. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. The application of artificial neural networks and support vector regression for simultaneous spectrophotometric determination of commercial eye drop contents

    NASA Astrophysics Data System (ADS)

    Valizadeh, Maryam; Sohrabi, Mahmoud Reza

    2018-03-01

    In the present study, artificial neural networks (ANNs) and support vector regression (SVR) were applied as intelligent methods coupled with UV spectroscopy for the simultaneous quantitative determination of Dorzolamide (DOR) and Timolol (TIM) in eye drops. Several synthetic mixtures were analyzed to validate the proposed methods. At first, a neural network time series model, one type of artificial neural network, was employed and its efficiency was evaluated. Afterwards, a radial basis network was applied as another neural network. Results showed that the performance of this method is suitable for prediction. Finally, support vector regression was proposed to construct the Zilomole prediction model. Also, root mean square error (RMSE) and mean recovery (%) were calculated for the SVR method. Moreover, the proposed methods were compared to high-performance liquid chromatography (HPLC) as a reference method. A one-way analysis of variance (ANOVA) test at the 95% confidence level, applied to the comparison of the suggested and reference methods, showed that there were no significant differences between them. Also, the effect of interferences was investigated in spiked solutions.

  10. Bundles over nearly-Kahler homogeneous spaces in heterotic string theory

    NASA Astrophysics Data System (ADS)

    Klaput, Michael; Lukas, Andre; Matti, Cyril

    2011-09-01

    We construct heterotic vacua based on six-dimensional nearly-Kahler homogeneous manifolds and non-trivial vector bundles thereon. Our examples are based on three specific group coset spaces. It is shown how to construct line bundles over these spaces, compute their properties and build up vector bundles consistent with supersymmetry and anomaly cancelation. It turns out that the most interesting coset is SU(3)/U(1)2. This space supports a large number of vector bundles which lead to consistent heterotic vacua, some of them with three chiral families.

  11. Supervisor Health and Safety Support: Scale Development and Validation

    PubMed Central

    Butts, Marcus M.; Hurst, Carrie S.; Eby, Lillian T.

    2013-01-01

    Executive Summary Two studies were conducted to develop a psychometrically sound measure of supervisor health and safety support (SHSS). We identified three dimensions of supervisor support (physical health, psychological health, safety) and used Study 1 to develop items and establish content validity. Study 2 was used to establish the dimensionality of the new measure and provide criterion-related and discriminant validity evidence of the measure using supervisor and subordinate data. The measure had incremental validity in predicting employee performance and psychological strain outcomes above and beyond general work support variables. Implications of these findings and for workplace support theory and practice are discussed. PMID:24771991

  12. Spatially explicit multi-criteria decision analysis for managing vector-borne diseases

    PubMed Central

    2011-01-01

    The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular. PMID:22206355
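
    A minimal sketch of one MCDA scoring step, a weighted sum over normalised criterion layers, is shown below; the criteria, weights, and candidate-area scores are illustrative assumptions, not values from the paper, and real MCDA applications typically add stakeholder-derived weighting and sensitivity analysis.

        import numpy as np

        # Criterion weights (assumed already elicited and normalised to sum to 1)
        criteria = {"disease_risk": 0.4, "population_vulnerability": 0.25,
                    "public_acceptance": 0.2, "resource_availability": 0.15}

        # Rows = candidate intervention areas, columns in the order of `criteria`, scaled to 0-1
        scores = np.array([[0.9, 0.6, 0.4, 0.7],
                           [0.5, 0.8, 0.9, 0.6],
                           [0.7, 0.7, 0.6, 0.9]])
        weights = np.array(list(criteria.values()))
        priority = scores @ weights
        print(priority, "-> highest-priority area:", int(priority.argmax()))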

  13. A Mathematical and Sociological Analysis of Google Search Algorithm

    DTIC Science & Technology

    2013-01-16

    through the collective intelligence of the web to determine a page’s importance. Let v be a vector of R^N with N ≥ 8 billion. Any unit vector in R^N is... scrolled up by some artificial hits. Acknowledgment: The authors would like to thank Dr. John Lavery for his encouragement and support, which enabled them to
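
    The fragment above refers to PageRank-style importance vectors in R^N; a minimal power-iteration sketch on a tiny link matrix is given below as an illustration of that idea, not as the paper's analysis.

        import numpy as np

        def pagerank(adjacency, damping=0.85, tol=1e-10):
            n = adjacency.shape[0]
            out_degree = adjacency.sum(axis=1, keepdims=True)
            out_degree[out_degree == 0] = 1.0                 # guard against dangling nodes
            M = (adjacency / out_degree).T                    # column-stochastic link matrix
            v = np.full(n, 1.0 / n)
            while True:
                v_new = damping * M @ v + (1 - damping) / n
                if np.abs(v_new - v).sum() < tol:
                    return v_new
                v = v_new

        links = np.array([[0, 1, 1, 0],
                          [0, 0, 1, 0],
                          [1, 0, 0, 1],
                          [0, 0, 1, 0]], dtype=float)
        print(pagerank(links))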

  14. Automated Creation of Labeled Pointcloud Datasets in Support of Machine-Learning Based Perception

    DTIC Science & Technology

    2017-12-01

    computationally intensive 3D vector math and took more than ten seconds to segment a single LIDAR frame from the HDL-32e with the Dell XPS15 9650’s Intel...Core i7 CPU. Depth Clustering avoids the computationally intensive 3D vector math of Euclidean Clustering-based DON segmentation and, instead

  15. EMMA: An Extensible Mammalian Modular Assembly Toolkit for the Rapid Design and Production of Diverse Expression Vectors.

    PubMed

    Martella, Andrea; Matjusaitis, Mantas; Auxillos, Jamie; Pollard, Steven M; Cai, Yizhi

    2017-07-21

    Mammalian plasmid expression vectors are critical reagents underpinning many facets of research across biology, biomedical research, and the biotechnology industry. Traditional cloning methods often require laborious manual design and assembly of plasmids using tailored sequential cloning steps. This process can be protracted, complicated, expensive, and error-prone. New tools and strategies that facilitate the efficient design and production of bespoke vectors would help relieve a current bottleneck for researchers. To address this, we have developed an extensible mammalian modular assembly kit (EMMA). This enables rapid and efficient modular assembly of mammalian expression vectors in a one-tube, one-step golden-gate cloning reaction, using a standardized library of compatible genetic parts. The high modularity, flexibility, and extensibility of EMMA provide a simple method for the production of functionally diverse mammalian expression vectors. We demonstrate the value of this toolkit by constructing and validating a range of representative vectors, such as transient and stable expression vectors (transposon based vectors), targeting vectors, inducible systems, polycistronic expression cassettes, fusion proteins, and fluorescent reporters. The method also supports simple assembly of combinatorial libraries and hierarchical assembly for the production of larger multigenetic cargos. In summary, EMMA is compatible with automated production, and novel genetic parts can be easily incorporated, providing new opportunities for mammalian synthetic biology.

  16. Effects of Cucumber mosaic virus infection on vector and non-vector herbivores of squash.

    PubMed

    Mauck, Kerry E; De Moraes, Consuelo M; Mescher, Mark C

    2010-11-01

    Plant chemicals mediating interactions with insect herbivores seem a likely target for manipulation by insect-vectored plant pathogens. Yet, little is currently known about the chemical ecology of insect-vectored diseases or their effects on the ecology of vector and nonvector insects. We recently reported that a widespread plant pathogen, Cucumber mosaic virus (CMV), greatly reduces the quality of host plants (squash) for aphid vectors, but that aphids are nevertheless attracted to the odors of infected plants, which exhibit elevated emissions of a volatile blend otherwise similar to the odor of healthy plants. This finding suggests that exaggerating existing host-location cues can be a viable vector attraction strategy for pathogens that otherwise reduce host quality for vectors. Here we report additional data regarding the effects of CMV infection on plant interactions with a common nonvector herbivore, the squash bug, Anasa tristis, which is a pest in this system. We found that adult A. tristis females preferred to oviposit on healthy plants in the field, and that healthy plants supported higher populations of nymphs. Collectively, our recent findings suggest that CMV-induced changes in host plant chemistry influence the behavior of both vector and non-vector herbivores, with significant implications both for disease spread and for broader community-level interactions.

  17. Scorebox extraction from mobile sports videos using Support Vector Machines

    NASA Astrophysics Data System (ADS)

    Kim, Wonjun; Park, Jimin; Kim, Changick

    2008-08-01

    The scorebox plays an important role in understanding the content of sports videos. However, the tiny scorebox may give small-display viewers an uncomfortable experience in grasping the game situation. In this paper, we propose a novel framework to extract the scorebox from sports video frames. We first extract candidates by using accumulated intensity and edge information after a short learning period. Since there are various types of scoreboxes inserted in sports videos, multiple attributes need to be used for efficient extraction. Based on those attributes, the information gain is computed and the top three ranked attributes in terms of information gain are selected as a three-dimensional feature vector for Support Vector Machines (SVM) to distinguish the scorebox from other candidates, such as logos and advertisement boards. The proposed method is tested on various videos of sports games and experimental results show the efficiency and robustness of our proposed method.

  18. Discontinuity Detection in the Shield Metal Arc Welding Process

    PubMed Central

    Cocota, José Alberto Naves; Garcia, Gabriel Carvalho; da Costa, Adilson Rodrigues; de Lima, Milton Sérgio Fernandes; Rocha, Filipe Augusto Santos; Freitas, Gustavo Medeiros

    2017-01-01

    This work proposes a new methodology for the detection of discontinuities in the weld bead applied in Shielded Metal Arc Welding (SMAW) processes. The detection system is based on two sensors—a microphone and piezoelectric—that acquire acoustic emissions generated during the welding. The feature vectors extracted from the sensor dataset are used to construct classifier models. The approaches based on Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers are able to identify with a high accuracy the three proposed weld bead classes: desirable weld bead, shrinkage cavity and burn through discontinuities. Experimental results illustrate the system’s high accuracy, greater than 90% for each class. A novel Hierarchical Support Vector Machine (HSVM) structure is proposed to make feasible the use of this system in industrial environments. This approach presented 96.6% overall accuracy. Given the simplicity of the equipment involved, this system can be applied in the metal transformation industries. PMID:28489045

  19. Discontinuity Detection in the Shield Metal Arc Welding Process.

    PubMed

    Cocota, José Alberto Naves; Garcia, Gabriel Carvalho; da Costa, Adilson Rodrigues; de Lima, Milton Sérgio Fernandes; Rocha, Filipe Augusto Santos; Freitas, Gustavo Medeiros

    2017-05-10

    This work proposes a new methodology for the detection of discontinuities in the weld bead applied in Shielded Metal Arc Welding (SMAW) processes. The detection system is based on two sensors-a microphone and piezoelectric-that acquire acoustic emissions generated during the welding. The feature vectors extracted from the sensor dataset are used to construct classifier models. The approaches based on Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers are able to identify with a high accuracy the three proposed weld bead classes: desirable weld bead, shrinkage cavity and burn through discontinuities. Experimental results illustrate the system's high accuracy, greater than 90% for each class. A novel Hierarchical Support Vector Machine (HSVM) structure is proposed to make feasible the use of this system in industrial environments. This approach presented 96.6% overall accuracy. Given the simplicity of the equipment involved, this system can be applied in the metal transformation industries.

  20. Case Study: The Transformation of the Health Record; The Impact of Electronic Medical Records in a Military Treatment Facility

    DTIC Science & Technology

    2006-06-01

    technology, communication, and incremental and manageable deployment plans. Hospital Leadership Support is Essential AHLTA is supported by the senior leaders... Information Technology The importance of information technology and the desire to utilize it for improved health care outcomes is part of the NMCSD... technology applications had a direct positive impact on AHLTA’s deployment at NMCSD. Communication As previously discussed in the leadership

  1. Analysis of Particle Content of Recombinant Adeno-Associated Virus Serotype 8 Vectors by Ion-Exchange Chromatography

    PubMed Central

    Lock, Martin; Alvira, Mauricio R.

    2012-01-01

    Abstract Advances in adeno-associated virus (AAV)-mediated gene therapy have brought the possibility of commercial manufacturing of AAV vectors one step closer. To realize this prospect, a parallel effort with the goal of ever-increasing sophistication for AAV vector production technology and supporting assays will be required. Among the important release assays for a clinical gene therapy product, those monitoring potentially hazardous contaminants are most critical for patient safety. A prominent contaminant in many AAV vector preparations is vector particles lacking a genome, which can substantially increase the dose of AAV capsid proteins and lead to possible unwanted immunological consequences. Current methods to determine empty particle content suffer from inconsistency, are adversely affected by contaminants, or are not applicable to all serotypes. Here we describe the development of an ion-exchange chromatography-based assay that permits the rapid separation and relative quantification of AAV8 empty and full vector particles through the application of shallow gradients and a strong anion-exchange monolith chromatography medium. PMID:22428980

  2. Vector-borne diseases in Haiti: a review.

    PubMed

    Ben-Chetrit, Eli; Schwartz, Eli

    2015-01-01

    Haiti lies on the western third of the island of Hispaniola in the Caribbean, and is one of the poorest nations in the Western hemisphere. Haiti attracts a lot of medical attention and support due to severe natural disasters followed by disastrous health consequences. Vector-borne infections are still prevalent there, with some unique aspects compared with Latin American countries and other Caribbean islands. Although vector-borne viral diseases such as dengue and, recently, chikungunya can be found in many of the Caribbean islands, including Haiti, there is an apparent distinction in the vector-borne parasitic diseases. Contrary to neighboring Caribbean islands, Haiti is highly endemic for malaria, lymphatic filariasis and mansonellosis. Affected by repeated natural disasters, poverty and lack of adequate infrastructure, control of transmission within Haiti and prevention of dissemination of vector-borne pathogens to other regions is challenging. In this review we summarize some aspects concerning diseases caused by vector-borne pathogens in Haiti. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. A Shellcode Detection Method Based on Full Native API Sequence and Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Cheng, Yixuan; Fan, Wenqing; Huang, Wei; An, Jing

    2017-09-01

    Dynamically monitoring the behavior of a program is widely used to discriminate between benign programs and malware. It is usually based on the dynamic characteristics of a program, such as the API call sequence or API call frequency. The key innovation of this paper is to consider the full Native API sequence and use a support vector machine to detect shellcode. We also use a Markov chain to extract and digitize Native API sequence features. Our experimental results show that the method proposed in this paper has high accuracy and a low false-alarm rate.
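
    A hedged sketch of the feature idea described above: model each Native API call sequence as a first-order Markov chain, flatten its transition-probability matrix into a feature vector, and train an SVM. The API names, sequences, and labels below are made up for illustration.

        import numpy as np
        from sklearn.svm import SVC

        APIS = ["NtOpenFile", "NtReadFile", "NtWriteFile", "NtAllocateVirtualMemory", "NtClose"]
        IDX = {name: i for i, name in enumerate(APIS)}

        def transition_features(sequence):
            k = len(APIS)
            counts = np.zeros((k, k))
            for a, b in zip(sequence, sequence[1:]):
                counts[IDX[a], IDX[b]] += 1
            rows = counts.sum(axis=1, keepdims=True)
            rows[rows == 0] = 1.0
            return (counts / rows).ravel()              # flattened transition-probability matrix

        benign = [["NtOpenFile", "NtReadFile", "NtClose"]] * 10
        shellcode_like = [["NtAllocateVirtualMemory", "NtWriteFile", "NtAllocateVirtualMemory",
                           "NtWriteFile", "NtClose"]] * 10
        X = np.array([transition_features(s) for s in benign + shellcode_like])
        y = np.array([0] * 10 + [1] * 10)
        print(SVC(kernel="linear").fit(X, y).score(X, y))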

  4. Localization of U(1) gauge vector field on flat branes with five-dimension (asymptotic) AdS5 spacetime

    NASA Astrophysics Data System (ADS)

    Zhao, Zhen-Hua; Xie, Qun-Ying

    2018-05-01

    In order to localize the U(1) gauge vector field on a Randall-Sundrum-like braneworld model with an infinite extra dimension, we propose a new kind of non-minimal coupling between the U(1) gauge field and gravity. We consider three kinds of coupling methods, and they all support the localization of the zero mode. In addition, one of them can support the localization of massive modes. Moreover, the massive tachyonic modes can be excluded. Our method can be used not only in thin braneworld models but also in thick ones.

  5. Support vector machine multiuser receiver for DS-CDMA signals in multipath channels.

    PubMed

    Chen, S; Samingan, A K; Hanzo, L

    2001-01-01

    The problem of constructing an adaptive multiuser detector (MUD) is considered for direct sequence code division multiple access (DS-CDMA) signals transmitted through multipath channels. The emerging learning technique, called support vector machines (SVM), is proposed as a method of obtaining a nonlinear MUD from a relatively small training data block. Computer simulation is used to study this SVM MUD, and the results show that it can closely match the performance of the optimal Bayesian one-shot detector. Comparisons with an adaptive radial basis function (RBF) MUD trained by an unsupervised clustering algorithm are discussed.

  6. Estimation of perceptible water vapor of atmosphere using artificial neural network, support vector machine and multiple linear regression algorithm and their comparative study

    NASA Astrophysics Data System (ADS)

    Shastri, Niket; Pathak, Kamlesh

    2018-05-01

    The water vapor content in the atmosphere plays a very important role in climate. In this paper, the application of GPS signals in meteorology is discussed; this is a useful technique for estimating the perceptible water vapor of the atmosphere. Various algorithms, namely artificial neural network, support vector machine and multiple linear regression, are used to predict perceptible water vapor. Comparative studies in terms of root mean square error and mean absolute error are also carried out for all the algorithms.
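
    The comparison described above can be sketched as fitting multiple linear regression, support vector regression and a small neural network to the same data and reporting RMSE and MAE for each; the synthetic predictors and target below are placeholders, not GPS-derived quantities from the study.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.svm import SVR
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_absolute_error, mean_squared_error

        rng = np.random.default_rng(5)
        X = rng.normal(size=(500, 3))               # placeholder predictors (e.g. delay, temperature, pressure)
        y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.2, size=500)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        models = {"MLR": LinearRegression(), "SVR": SVR(),
                  "ANN": MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)}
        for name, m in models.items():
            pred = m.fit(X_tr, y_tr).predict(X_te)
            print(name, "RMSE=%.3f MAE=%.3f" % (np.sqrt(mean_squared_error(y_te, pred)),
                                                 mean_absolute_error(y_te, pred)))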

  7. Implementation of support vector machine for classification of speech marked hijaiyah letters based on Mel frequency cepstrum coefficient feature extraction

    NASA Astrophysics Data System (ADS)

    Adhi Pradana, Wisnu; Adiwijaya; Novia Wisesty, Untari

    2018-03-01

    The Support Vector Machine, commonly called SVM, is one method that can be used for data classification. SVM separates data from two different classes with a hyperplane. In this study, a system was built using SVM to develop Arabic speech recognition. In the development of the system, two kinds of speakers were tested: dependent speakers and independent speakers. The system achieves an accuracy of 85.32% for dependent speakers and 61.16% for independent speakers.
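
    A hedged sketch of such a pipeline follows: MFCC features averaged over frames for each utterance, then an SVM classifier. Synthetic tones stand in for recorded hijaiyah letters, and the sampling rate, number of coefficients and classes are illustrative assumptions.

        import numpy as np
        import librosa
        from sklearn.svm import SVC

        def mfcc_features(signal, sr=16000, n_mfcc=13):
            m = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)
            return m.mean(axis=1)                      # average MFCCs over frames

        sr = 16000
        t = np.linspace(0, 0.5, int(0.5 * sr), endpoint=False)
        rng = np.random.default_rng(6)
        # Two placeholder "letters" distinguished only by pitch
        class_a = [np.sin(2 * np.pi * 200 * t) + 0.05 * rng.normal(size=t.size) for _ in range(10)]
        class_b = [np.sin(2 * np.pi * 400 * t) + 0.05 * rng.normal(size=t.size) for _ in range(10)]

        X = np.array([mfcc_features(s.astype(np.float32), sr) for s in class_a + class_b])
        y = np.array([0] * 10 + [1] * 10)
        print(SVC(kernel="rbf").fit(X, y).score(X, y))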

  8. Developing an Initial Physical Function Item Bank from Existing Sources.

    ERIC Educational Resources Information Center

    Bode, Rita K.; Cella, David; Lai, Jin-shei; Heinemann, Allen W.

    2003-01-01

    Illustrates incremental item banking using health-related quality of life data collected from two samples of patients receiving cancer treatment (n=1,755 and n=1,544). Results support findings from previous studies that have equated separate instruments by co-calibrating their items. (SLD)

  9. Improved Design of Tunnel Supports : Volume 3 : Finite Element Analysis of the Peachtree Center Station in Atlanta

    DOT National Transportation Integrated Search

    1980-06-01

    Volume 3 contains the application of the three-dimensional (3-D) finite element program, Automatic Dynamic Incremental Nonlinear Analysis (ADINA), which was designed to replace the traditional 2-D plane strain analysis, to a specific location. The lo...

  10. Easy Attachment Of Panels To A Truss

    NASA Technical Reports Server (NTRS)

    Thomson, Mark; Gralewski, Mark

    1992-01-01

    Conceptual antenna dish, solar collector, or similar structure consists of hexagonal panels supported by truss erected in field. Truss built in increments to maintain access to panel-attachment nodes. Each panel brought toward truss at angle and attached to two nodes. Panel rotated into attachment at third node.

  11. Association between progression-free survival and health-related quality of life in oncology: a systematic review protocol

    PubMed Central

    Kovic, Bruno; Guyatt, Gordon; Brundage, Michael; Thabane, Lehana; Bhatnagar, Neera; Xie, Feng

    2016-01-01

    Introduction There is an increasing number of new oncology drugs being studied, approved and put into clinical practice based on improvement in progression-free survival, when no overall survival benefits exist. In oncology, the association between progression-free survival and health-related quality of life is currently unknown, despite its importance for patients with cancer, and the unverified assumption that longer progression-free survival indicates improved health-related quality of life. Thus far, only 1 study has investigated this association, providing insufficient evidence and inconclusive results. The objective of this study protocol is to provide increased transparency in supporting a systematic summary of the evidence bearing on this association in oncology. Methods and analysis Using the OVID platform in MEDLINE, Embase and Cochrane databases, we will conduct a systematic review of randomised controlled human trials addressing oncology issues published starting in 2000. A team of reviewers will, in pairs, independently screen and abstract data using standardised, pilot-tested forms. We will employ numerical integration to calculate mean incremental area under the curve between treatment groups in studies for health-related quality of life, along with total related error estimates, and a 95% CI around incremental area. To describe the progression-free survival to health-related quality of life association, we will construct a scatterplot for incremental health-related quality of life versus incremental progression-free survival. To estimate the association, we will use a weighted simple regression approach, comparing mean incremental health-related quality of life with either median incremental progression-free survival time or the progression-free survival HR, in the absence of overall survival benefit. Discussion Identifying direction and magnitude of association between progression-free survival and health-related quality of life is critically important in interpreting results of oncology trials. Systematic evidence produced from our study will contribute to improvement of patient care and practice of evidence-based medicine in oncology. PMID:27591026

  12. Economic evaluation of the artificial liver support system MARS in patients with acute-on-chronic liver failure

    PubMed Central

    Hessel, Franz P

    2006-01-01

    Background Acute-on-chronic liver failure (ACLF) is a life-threatening acute decompensation of a pre-existing chronic liver disease. The artificial liver support system MARS is an emerging therapeutic option that could be implemented in the routine care of these patients. The medical efficacy of MARS has been demonstrated in early clinical studies, but economic aspects have so far not been investigated. The objective of this study was to estimate the cost-effectiveness of MARS. Methods In a clinical cohort trial with a prospective follow-up of 3 years, 33 ACLF patients treated with MARS were compared with 46 controls. Survival, health-related quality of life, and direct medical costs for inpatient and outpatient treatment from a health-care system perspective were determined. Based on the differences in outcome and indirect costs, the cost-effectiveness of MARS, expressed as incremental costs per life-year gained and incremental costs per QALY gained, was estimated. Results The average initial intervention costs for MARS were 14600 EUR per patient treated. Direct medical costs over the 3-year follow-up were 40000 EUR per patient treated with MARS and 12700 EUR per control. The 3-year survival rate after MARS was 52% compared to 17% in controls. Kaplan-Meier analysis of cumulated survival probability showed a highly significant difference in favour of MARS. Incremental costs per life-year gained were 31400 EUR; incremental costs per QALY gained were 47200 EUR. Conclusion The results after 3 years of follow-up of the first economic evaluation study of MARS based on empirical patient data are presented. Although the initial treatment costs for MARS are high, the significantly better survival seen in this study led to reasonable costs per life-year gained. Further randomized controlled trials investigating the medical efficacy and the cost-effectiveness are recommended. PMID:17022815

  13. Cost effectiveness of pomalidomide in patients with relapsed and refractory multiple myeloma in Sweden.

    PubMed

    Borg, Sixten; Nahi, Hareth; Hansson, Markus; Lee, Dawn; Elvidge, Jamie; Persson, Ulf

    2016-05-01

    Multiple myeloma (MM) patients who have progressed following treatment with both bortezomib and lenalidomide have a poor prognosis. In this late stage, other effective alternatives are limited, and patients in Sweden are often left with best supportive care. Pomalidomide is a new anti-angiogenic and immunomodulatory drug for the treatment of MM. Our objective was to evaluate the cost effectiveness of pomalidomide as an add-on to best supportive care in patients with relapsed and refractory MM in Sweden. We developed a health-economic discrete event simulation model of a patient's course through stable disease and progressive disease, until death. It estimates life expectancy, quality-adjusted life years (QALYs) and costs from a societal perspective. Effectiveness data and utilities were taken from the MM-003 trial comparing pomalidomide plus low-dose dexamethasone with high-dose dexamethasone (HIDEX). Cost data were taken from official Swedish price lists, government sources and literature. The model estimates that, if a patient is treated with HIDEX, life expectancy is 1.12 years and the total cost is SEK 179 976 (€19 100), mainly indirect costs. With pomalidomide plus low-dose dexamethasone, life expectancy is 2.33 years, with a total cost of SEK 767 064 (€81 500), mainly in drug and indirect costs. Compared to HIDEX, pomalidomide treatment gives a QALY gain of 0.7351 and an incremental cost of SEK 587 088 (€62 400) consisting of increased drug costs (59%), incremental indirect costs (33%) and other healthcare costs (8%). The incremental cost-effectiveness ratio is SEK 798 613 (€84 900) per QALY gained. In a model of late-stage MM patients with a poor prognosis in the Swedish setting, pomalidomide is associated with a relatively high incremental cost per QALY gained. This model was accepted by the national Swedish reimbursement authority TLV, and pomalidomide was granted reimbursement in Sweden.
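
    The incremental cost-effectiveness ratio (ICER) quoted above is simply the incremental cost divided by the incremental effect. A minimal arithmetic check using only the figures quoted in the abstract (values hard-coded from the text, not taken from the underlying simulation model):

    ```python
    # ICER check using the figures quoted in the abstract (SEK).
    incremental_cost_sek = 587_088   # pomalidomide + low-dose dexamethasone vs. HIDEX
    qaly_gain = 0.7351               # incremental QALYs

    icer = incremental_cost_sek / qaly_gain
    # ~SEK 798,650 per QALY gained, matching the reported SEK 798,613 within input rounding.
    print(f"ICER: SEK {icer:,.0f} per QALY gained")
    ```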

  14. Parametric and Nonparametric Statistical Methods for Genomic Selection of Traits with Additive and Epistatic Genetic Architectures

    PubMed Central

    Howard, Réka; Carriquiry, Alicia L.; Beavis, William D.

    2014-01-01

    Parametric and nonparametric methods have been developed for purposes of predicting phenotypes. These methods are based on retrospective analyses of empirical data consisting of genotypic and phenotypic scores. Recent reports have indicated that parametric methods are unable to predict phenotypes of traits with known epistatic genetic architectures. Herein, we review parametric methods including least squares regression, ridge regression, Bayesian ridge regression, least absolute shrinkage and selection operator (LASSO), Bayesian LASSO, best linear unbiased prediction (BLUP), Bayes A, Bayes B, Bayes C, and Bayes Cπ. We also review nonparametric methods including Nadaraya-Watson estimator, reproducing kernel Hilbert space, support vector machine regression, and neural networks. We assess the relative merits of these 14 methods in terms of accuracy and mean squared error (MSE) using simulated genetic architectures consisting of completely additive or two-way epistatic interactions in an F2 population derived from crosses of inbred lines. Each simulated genetic architecture explained either 30% or 70% of the phenotypic variability. The greatest impact on estimates of accuracy and MSE was due to genetic architecture. Parametric methods were unable to predict phenotypic values when the underlying genetic architecture was based entirely on epistasis. Parametric methods were slightly better than nonparametric methods for additive genetic architectures. Distinctions among parametric methods for additive genetic architectures were incremental. Heritability, i.e., proportion of phenotypic variability, had the second greatest impact on estimates of accuracy and MSE. PMID:24727289
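
    As a rough illustration of the kind of comparison described above, the sketch below contrasts a parametric method (ridge regression) with a nonparametric one (RBF-kernel support vector regression) on simulated additive versus two-way epistatic genetic architectures. The marker matrix, effect sizes, and sample sizes are made up for illustration and do not reproduce the study's simulation design:

    ```python
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n, p = 300, 100
    X = rng.integers(0, 3, size=(n, p)).astype(float)   # toy marker matrix (0/1/2 genotypes)

    def simulate(architecture):
        if architecture == "additive":
            beta = rng.normal(size=p)
            signal = X @ beta
        else:  # two-way epistasis: products of randomly chosen marker pairs
            pairs = rng.choice(p, size=(10, 2), replace=False)
            signal = sum(X[:, i] * X[:, j] for i, j in pairs)
        noise = rng.normal(scale=signal.std(), size=n)   # roughly 50% of variance is genetic
        return signal + noise

    for architecture in ("additive", "epistatic"):
        y = simulate(architecture)
        for name, model in [("ridge", Ridge(alpha=1.0)), ("SVR(rbf)", SVR(C=10.0))]:
            r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
            print(f"{architecture:9s} {name:9s} mean CV R^2 = {r2:.2f}")
    ```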

  15. Energy-exchange collisions of dark-bright-bright vector solitons.

    PubMed

    Radhakrishnan, R; Manikandan, N; Aravinthan, K

    2015-12-01

    By constructing a more general form of the three-component dark-bright-bright mixed vector one-soliton solution of the generalized Manakov model with nine free real parameters, we find a dark component that guides the practically interesting bright-bright vector one-soliton into two different parametric domains giving rise to different physical situations. Moreover, our main investigation of the collision dynamics of such mixed vector solitons, by constructing the multisoliton solution of the generalized Manakov model with the help of the Hirota technique, reveals that the dark-bright-bright vector two-soliton supports energy-exchange collision dynamics. In particular, the dark component preserves its initial form and the energy-exchange collision property of the bright-bright vector two-soliton solution of the Manakov model during collision. In addition, the interactions between bound-state dark-bright-bright vector solitons reveal oscillations in their amplitudes. A similar kind of breathing effect was also experimentally observed in Bose-Einstein condensates. Some possible ways are theoretically suggested not only to control this breathing effect but also to manage the beating, bouncing, jumping, and attraction effects in the collision dynamics of dark-bright-bright vector solitons. The role of the multiple free parameters in our solution is examined to define the polarization vector, envelope speed, envelope width, envelope amplitude, grayness, and complex modulation of our solution. It is interesting to note that the polarization vector of our mixed vector one-soliton evolves on a sphere or a hyperboloid depending on the initial parametric choices.

  16. Using an object-based grid system to evaluate a newly developed EP approach to formulate SVMs as applied to the classification of organophosphate nerve agents

    NASA Astrophysics Data System (ADS)

    Land, Walker H., Jr.; Lewis, Michael; Sadik, Omowunmi; Wong, Lut; Wanekaya, Adam; Gonzalez, Richard J.; Balan, Arun

    2004-04-01

    This paper extends the classification approaches described in reference [1] in the following ways: (1) developing and evaluating a new method for evolving organophosphate nerve agent Support Vector Machine (SVM) classifiers using Evolutionary Programming, (2) conducting research experiments using a larger database of organophosphate nerve agents, and (3) upgrading the architecture to an object-based grid system for evaluating the classification of EP-derived SVMs. Due to the increased threat of chemical and biological weapons of mass destruction (WMD) from international terrorist organizations, a significant effort is underway to develop tools that can be used to detect and effectively combat biochemical warfare. This paper reports the integration of multi-array sensors with Support Vector Machines (SVMs) for the detection of organophosphate nerve agents using a grid computing system called Legion. Grid computing is the use of large collections of heterogeneous, distributed resources (including machines, databases, devices, and users) to support large-scale computations and wide-area data access. Finally, preliminary results using EP-derived support vector machines designed to operate on distributed systems have provided accurate classification results. In addition, distributed training architectures are 50 times faster than standard iterative training methods.
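
    The EP details and nerve-agent feature set are not given here, but the general idea of evolving SVM configurations can be sketched as a simple mutation-plus-selection loop over (C, gamma) scored by cross-validation. The dataset and EP parameters below are placeholders, not the authors' setup:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    X, y = make_classification(n_samples=300, n_features=20, random_state=1)  # stand-in for sensor data

    def fitness(log_c, log_gamma):
        clf = SVC(C=10**log_c, gamma=10**log_gamma)
        return cross_val_score(clf, X, y, cv=5).mean()

    # (mu + mu) evolutionary programming over log10(C) and log10(gamma)
    pop = [rng.uniform(-2, 3, size=2) for _ in range(10)]
    for generation in range(15):
        offspring = [ind + rng.normal(scale=0.3, size=2) for ind in pop]
        scored = sorted(pop + offspring, key=lambda ind: fitness(*ind), reverse=True)
        pop = scored[:10]                                 # keep the fittest individuals
    best = pop[0]
    print("best C=%.3g gamma=%.3g accuracy=%.3f" % (10**best[0], 10**best[1], fitness(*best)))
    ```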

  17. Detection of Hard Exudates in Colour Fundus Images Using Fuzzy Support Vector Machine-Based Expert System.

    PubMed

    Jaya, T; Dheeba, J; Singh, N Albert

    2015-12-01

    Diabetic retinopathy is a major cause of vision loss in diabetic patients. Currently, there is a need for making decisions using intelligent computer algorithms when screening a large volume of data. This paper presents an expert decision-making system designed using a fuzzy support vector machine (FSVM) classifier to detect hard exudates in fundus images. The optic discs in the colour fundus images are segmented to avoid false alarms using morphological operations and based on circular Hough transform. To discriminate between the exudates and the non-exudates pixels, colour and texture features are extracted from the images. These features are given as input to the FSVM classifier. The classifier analysed 200 retinal images collected from diabetic retinopathy screening programmes. The tests made on the retinal images show that the proposed detection system has better discriminating power than the conventional support vector machine. With the best combination of FSVM and features sets, the area under the receiver operating characteristic curve reached 0.9606, which corresponds to a sensitivity of 94.1% with a specificity of 90.0%. The results suggest that detecting hard exudates using FSVM contribute to computer-assisted detection of diabetic retinopathy and as a decision support system for ophthalmologists.
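
    The fuzzy SVM itself is not part of common libraries, but its core idea, down-weighting less reliable training samples via fuzzy membership values, can be approximated with per-sample weights in a standard SVM. The features and membership rule below are placeholders, not the paper's colour and texture features:

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 6))                      # stand-in colour/texture features per pixel
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)  # 1 = exudate

    # Fuzzy membership: samples far from their class centre get lower weight.
    centres = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
    dist = np.linalg.norm(X - centres[y], axis=1)
    membership = 1.0 - 0.9 * dist / dist.max()         # memberships in (0.1, 1.0]

    clf = SVC(kernel="rbf", C=10.0)
    clf.fit(X, y, sample_weight=membership)            # weighted hinge loss approximates FSVM behaviour
    print("training accuracy:", clf.score(X, y))
    ```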

  18. Postgenomic approaches to using corynebacteria as biocatalysts.

    PubMed

    Vertès, Alain A; Inui, Masayuki; Yukawa, Hideaki

    2012-01-01

    Corynebacterium glutamicum exhibits numerous ideal intrinsic attributes as a factory of primary and secondary metabolites. The versatile capabilities of this organism have long been implemented at the industrial scale to produce an array of amino acids at high yields and conversion rates, thereby enabling the development of an entire industry. The postgenomic era provides a new technological platform not only to further optimize the intrinsic attributes of C. glutamicum whole cells as biocatalysts, but also to dramatically expand the product portfolio that can be manufactured by this organism, from amino acids to commodity chemicals. This review addresses the methods and strain optimization strategies enabled by genomic information and associated techniques. Their implementation has provided important additional incremental improvements to the economics of industry-scale manufacturing in which C. glutamicum and its episomal elements are used as a performing host-vector system.

  19. Laser Anemometer Measurements of the Three-Dimensional Rotor Flow Field in the NASA Low-Speed Centrifugal Compressor

    NASA Technical Reports Server (NTRS)

    Hathaway, Michael D.; Chriss, Randall M.; Strazisar, Anthony J.; Wood, Jerry R.

    1995-01-01

    A laser anemometer system was used to provide detailed surveys of the three-dimensional velocity field within the NASA low-speed centrifugal impeller operating with a vaneless diffuser. Both laser anemometer and aerodynamic performance data were acquired at the design flow rate and at a lower flow rate. Flow path coordinates, detailed blade geometry, and pneumatic probe survey results are presented in tabular form. The laser anemometer data are presented in the form of pitchwise distributions of axial, radial, and relative tangential velocity on blade-to-blade stream surfaces at 5-percent-of-span increments, starting at 95-percent-of-span from the hub. The laser anemometer data are also presented as contour and wire-frame plots of throughflow velocity and vector plots of secondary velocities at all measurement stations through the impeller.

  20. The Evolutionary History and Spatiotemporal Dynamics of the NC Lineage of Citrus Tristeza Virus.

    PubMed

    Benítez-Galeano, María José; Castells, Matías; Colina, Rodney

    2017-10-12

    Citrus tristeza virus (CTV) is a major pathogen affecting citrus trees worldwide. However, few studies have focused on CTV's evolutionary history and geographic behavior. CTV is dispersed locally by an aphid vector and over long distances through the transport of contaminated plant material. With the aim of delving deeper into the evolution of the CTV-NC (New Clade) genotype, we estimated an evolutionary rate of 1.19 × 10⁻³ substitutions/site/year and dated the most recent common ancestor to 1977. Furthermore, the genotype originated in the United States, and a great expansion of the population was observed in Uruguay. This expansion phase could be a consequence of the increase in the number of naïve citrus trees in Uruguayan orchards accompanying the growth of the citrus industry in recent years.

  1. Efficiency Improvement of Action Acquisition in Two-Link Robot Arm Using Fuzzy ART with Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Kotani, Naoki; Taniguchi, Kenji

    An efficient learning method using Fuzzy ART with a Genetic Algorithm is proposed. Because reinforcement learning requires many trials before an agent acquires appropriate actions, the proposed method reduces the number of trials by reusing a policy acquired in other tasks. Fuzzy ART is an incremental unsupervised learning algorithm that responds to arbitrary sequences of analog or binary input vectors. The proposed method generates a policy by crossover or mutation when the agent observes unknown states, and selection controls the category proliferation problem of Fuzzy ART. The effectiveness of the proposed method was verified in a simulation of the reaching problem for a two-link robot arm, where it reduced both the number of trials and the number of states.
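
    A compact sketch of the Fuzzy ART category step referred to above (complement coding, choice function, vigilance test, fast learning); the GA crossover/mutation/selection parts are not reproduced, and the vigilance and choice parameters are illustrative:

    ```python
    import numpy as np

    def fuzzy_art_step(weights, x, rho=0.75, alpha=0.001, beta=1.0):
        """Present one input (values in [0, 1]); return updated weights and the winning category index."""
        I = np.concatenate([x, 1.0 - x])                      # complement coding
        if not weights:
            return [I.copy()], 0
        choice = [np.minimum(I, w).sum() / (alpha + w.sum()) for w in weights]
        for j in np.argsort(choice)[::-1]:                    # try categories by decreasing choice value
            w = weights[j]
            if np.minimum(I, w).sum() / I.sum() >= rho:       # vigilance test
                weights[j] = beta * np.minimum(I, w) + (1 - beta) * w   # fast learning when beta = 1
                return weights, j
        weights.append(I.copy())                              # no category resonates: create a new one
        return weights, len(weights) - 1

    weights = []
    for x in np.random.default_rng(0).random((20, 2)):        # toy 2-D inputs
        weights, j = fuzzy_art_step(weights, x)
    print("categories formed:", len(weights))
    ```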

  2. Validity and Reproducibility of an Incremental Sit-To-Stand Exercise Test for Evaluating Anaerobic Threshold in Young, Healthy Individuals.

    PubMed

    Nakamura, Keisuke; Ohira, Masayoshi; Yokokawa, Yoshiharu; Nagasawa, Yuya

    2015-12-01

    Sit-to-stand exercise (STS) is a common activity of daily living. The objectives of the present study were: 1) to assess the validity of aerobic fitness measurements based on anaerobic thresholds (ATs), during incremental sit-to-stand exercise (ISTS) with and without arm support compared with an incremental cycle-ergometer (CE) test; and 2) to examine the reproducibility of the AT measured during the ISTSs. Twenty-six healthy individuals randomly performed the ISTS and CE test. Oxygen uptakes at the AT (AT-VO2) and heart rate at the AT (AT-HR) were determined during the ISTSs and CE test, and repeated-measures analyses of variance and Tukey's post-hoc test were used to evaluate the differences between these variables. Pearson correlation coefficients were used to assess the strength of the relationship between AT-VO2 and AT-HR during the ISTSs and CE test. Data analysis yielded the following correlations: AT-VO2 during the ISTS with arm support and the CE test, r = 0.77 (p < 0.05); AT-VO2 during the ISTS without arm support and the CE test, r = 0.70 (p < 0.05); AT-HR during the ISTS with arm support and the CE test, r = 0.80 (p < 0.05); and AT-HR during the ISTS without arm support and the CE test, r = 0.66 (p < 0.05). The AT-VO2 values during the ISTS with arm support (18.5 ± 1.9 mL·min⁻¹·kg⁻¹) and the CE test (18.4 ± 1.8 mL·min⁻¹·kg⁻¹) were significantly higher than those during the ISTS without arm support (16.6 ± 1.8 mL·min⁻¹·kg⁻¹; p < 0.05). The AT-HR values during the ISTS with arm support (126 ± 10 bpm) and the CE test (126 ± 13 bpm) were significantly higher than those during the ISTS without arm support (119 ± 9 bpm; p < 0.05). The ISTS with arm support may provide a cardiopulmonary function load equivalent to the CE test; therefore, it is a potentially valid test for evaluating AT-VO2 and AT-HR in healthy, young adults. Key points: The ISTS is a simple test that varies only according to the frequency of standing up, and requires only a small space and a chair. The ISTS with arm support is valid and reproducible, and is a safe test for evaluating AT in healthy young adults. For evaluating the AT, the ISTS may serve as a valid alternative to conventional CPX, using either a cycle ergometer or treadmill, in cases where the latter methods are difficult to implement.

  3. Repeat Transduction in the Mouse Lung by Using Adeno-Associated Virus Vectors with Different Serotypes

    PubMed Central

    Halbert, Christine L.; Rutledge, Elizabeth A.; Allen, James M.; Russell, David W.; Miller, A. Dusty

    2000-01-01

    Vectors derived from adeno-associated virus type 2 (AAV2) promote gene transfer and expression in the lung; however, we have found that while gene expression can persist for at least 8 months in mice, it was reduced dramatically in rabbits over a period of 2 months. The efficiency and persistence of AAV2-mediated gene expression in the human lung have yet to be determined, but it seems likely that readministration will be necessary over the lifetime of an individual. Unfortunately, we have found that transduction by a second administration of an AAV2 vector is blocked, presumably due to neutralizing antibodies generated in response to the primary vector exposure. Here, we have explored the use of AAV2 vectors pseudotyped with capsid proteins from AAV serotypes 2, 3, and 6 for readministration in the mouse lung. We found that an AAV6 vector transduced airway epithelial and alveolar cells in the lung at rates that were at least as high as those of AAV2 pseudotype vectors, while transduction rates mediated by AAV3 were much lower. AAV6 pseudotype vector transduction was unaffected by prior administration of an AAV2 or AAV3 vector, and transduction by an AAV2 pseudotype vector was unaffected by prior AAV6 vector administration, showing that cross-reactive neutralizing antibodies against AAV2 and AAV6 are not generated in mice. Interestingly, while prior administration of an AAV2 vector completely blocked transduction by a second AAV2 pseudotype vector, prior administration of an AAV6 vector only partially inhibited transduction by a second administration of an AAV6 pseudotype vector. Analysis of sera obtained from mice and humans showed that AAV6 is less immunogenic than AAV2, which helps explain this finding. These results support the development of AAV6 vectors for lung gene therapy both alone and in combination with AAV2 vectors. PMID:10627564

  4. Ontology for Vector Surveillance and Management

    PubMed Central

    LOZANO-FUENTES, SAUL; BANDYOPADHYAY, ARITRA; COWELL, LINDSAY G.; GOLDFAIN, ALBERT; EISEN, LARS

    2013-01-01

    Ontologies, which are made up by standardized and defined controlled vocabulary terms and their interrelationships, are comprehensive and readily searchable repositories for knowledge in a given domain. The Open Biomedical Ontologies (OBO) Foundry was initiated in 2001 with the aims of becoming an “umbrella” for life-science ontologies and promoting the use of ontology development best practices. A software application (OBO-Edit; *.obo file format) was developed to facilitate ontology development and editing. The OBO Foundry now comprises over 100 ontologies and candidate ontologies, including the NCBI organismal classification ontology (NCBITaxon), the Mosquito Insecticide Resistance Ontology (MIRO), the Infectious Disease Ontology (IDO), the IDOMAL malaria ontology, and ontologies for mosquito gross anatomy and tick gross anatomy. We previously developed a disease data management system for dengue and malaria control programs, which incorporated a set of information trees built upon ontological principles, including a “term tree” to promote the use of standardized terms. In the course of doing so, we realized that there were substantial gaps in existing ontologies with regards to concepts, processes, and, especially, physical entities (e.g., vector species, pathogen species, and vector surveillance and management equipment) in the domain of surveillance and management of vectors and vector-borne pathogens. We therefore produced an ontology for vector surveillance and management, focusing on arthropod vectors and vector-borne pathogens with relevance to humans or domestic animals, and with special emphasis on content to support operational activities through inclusion in databases, data management systems, or decision support systems. The Vector Surveillance and Management Ontology (VSMO) includes >2,200 unique terms, of which the vast majority (>80%) were newly generated during the development of this ontology. One core feature of the VSMO is the linkage, through the has_vector relation, of arthropod species to the pathogenic microorganisms for which they serve as biological vectors. We also recognized and addressed a potential roadblock for use of the VSMO by the vector-borne disease community: the difficulty in extracting information from OBO-Edit ontology files (*.obo files) and exporting the information to other file formats. A novel ontology explorer tool was developed to facilitate extraction and export of information from the VSMO *.obo file into lists of terms and their associated unique IDs in *.txt or *.csv file formats. These lists can then be imported into a database or data management system for use as select lists with predefined terms. This is an important step to ensure that the knowledge contained in our ontology can be put into practical use. PMID:23427646

  5. Ontology for vector surveillance and management.

    PubMed

    Lozano-Fuentes, Saul; Bandyopadhyay, Aritra; Cowell, Lindsay G; Goldfain, Albert; Eisen, Lars

    2013-01-01

    Ontologies, which are made up by standardized and defined controlled vocabulary terms and their interrelationships, are comprehensive and readily searchable repositories for knowledge in a given domain. The Open Biomedical Ontologies (OBO) Foundry was initiated in 2001 with the aims of becoming an "umbrella" for life-science ontologies and promoting the use of ontology development best practices. A software application (OBO-Edit; *.obo file format) was developed to facilitate ontology development and editing. The OBO Foundry now comprises over 100 ontologies and candidate ontologies, including the NCBI organismal classification ontology (NCBITaxon), the Mosquito Insecticide Resistance Ontology (MIRO), the Infectious Disease Ontology (IDO), the IDOMAL malaria ontology, and ontologies for mosquito gross anatomy and tick gross anatomy. We previously developed a disease data management system for dengue and malaria control programs, which incorporated a set of information trees built upon ontological principles, including a "term tree" to promote the use of standardized terms. In the course of doing so, we realized that there were substantial gaps in existing ontologies with regards to concepts, processes, and, especially, physical entities (e.g., vector species, pathogen species, and vector surveillance and management equipment) in the domain of surveillance and management of vectors and vector-borne pathogens. We therefore produced an ontology for vector surveillance and management, focusing on arthropod vectors and vector-borne pathogens with relevance to humans or domestic animals, and with special emphasis on content to support operational activities through inclusion in databases, data management systems, or decision support systems. The Vector Surveillance and Management Ontology (VSMO) includes >2,200 unique terms, of which the vast majority (>80%) were newly generated during the development of this ontology. One core feature of the VSMO is the linkage, through the has vector relation, of arthropod species to the pathogenic microorganisms for which they serve as biological vectors. We also recognized and addressed a potential roadblock for use of the VSMO by the vector-borne disease community: the difficulty in extracting information from OBO-Edit ontology files (*.obo files) and exporting the information to other file formats. A novel ontology explorer tool was developed to facilitate extraction and export of information from the VSMO*.obo file into lists of terms and their associated unique IDs in *.txt or *.csv file formats. These lists can then be imported into a database or data management system for use as select lists with predefined terms. This is an important step to ensure that the knowledge contained in our ontology can be put into practical use.
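
    The export step described above, pulling term names and IDs out of an OBO-format file into a flat list, can be approximated with a few lines of standard-library code; the file names below are hypothetical, and this does not reproduce the ontology explorer tool itself:

    ```python
    import csv

    def obo_terms(path):
        """Yield (id, name) pairs from [Term] stanzas of an OBO-format file."""
        term_id = name = None
        in_term = False
        with open(path, encoding="utf-8") as fh:
            for line in fh:
                line = line.strip()
                if line == "[Term]":
                    in_term, term_id, name = True, None, None
                elif not line and in_term:
                    if term_id and name:
                        yield term_id, name
                    in_term = False
                elif in_term and line.startswith("id:"):
                    term_id = line[3:].strip()
                elif in_term and line.startswith("name:"):
                    name = line[5:].strip()
        if in_term and term_id and name:       # the file may not end with a blank line
            yield term_id, name

    with open("vsmo_terms.csv", "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(["id", "name"])
        writer.writerows(obo_terms("vsmo.obo"))   # hypothetical path to the VSMO .obo file
    ```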

  6. Calibration Test Set for a Phase-Comparison Digital Tracker

    NASA Technical Reports Server (NTRS)

    Boas, Amy; Li, Samuel; McMaster, Robert

    2007-01-01

    An apparatus that generates four signals at a frequency of 7.1 GHz having precisely controlled relative phases and equal amplitudes has been designed and built. This apparatus is intended mainly for use in computer-controlled automated calibration and testing of a phase-comparison digital tracker (PCDT) that measures the relative phases of replicas of the same X-band signal received by four antenna elements in an array. (The relative direction of incidence of the signal on the array is then computed from the relative phases.) The present apparatus can also be used to generate precisely phased signals for steering a beam transmitted from a phased antenna array. The apparatus includes a 7.1-GHz signal generator, the output of which is fed to a four-way splitter. Each of the four splitter outputs is attenuated by 10 dB and fed as input to a vector modulator, wherein DC bias voltages are used to control the in-phase (I) and quadrature (Q) signal components. The bias voltages are generated by digital-to-analog-converter circuits on a control board that receives its digital control input from a computer running a LabVIEW program. The outputs of the vector modulators are further attenuated by 10 dB, then presented at high-grade radio-frequency connectors. The attenuation reduces the effects of changing mismatch and reflections. The apparatus was calibrated in a process in which the bias voltages were first stepped through all possible IQ settings. Then in a reverse interpolation performed by use of MATLAB software, a lookup table containing 3,600 IQ settings, representing equal amplitude and phase increments of 0.1°, was created for each vector modulator. During operation of the apparatus, these lookup tables are used in calibrating the PCDT.
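
    A lookup table of the kind described, mapping a desired phase (at a fixed amplitude) to in-phase (I) and quadrature (Q) settings, can be sketched as below. In practice the table was built by reverse interpolation of measured responses; here an idealised vector modulator (I = A·cosφ, Q = A·sinφ) is assumed, so the table structure rather than the calibration procedure is what is illustrated:

    ```python
    import numpy as np

    phase_deg = np.arange(0.0, 360.0, 0.1)          # 0.1-degree phase increments -> 3,600 entries
    amplitude = 1.0                                  # single (equal) amplitude setting
    phi = np.radians(phase_deg)

    # Idealised vector-modulator settings; a real calibration would interpolate measured I/Q responses.
    lookup = np.column_stack([phase_deg, amplitude * np.cos(phi), amplitude * np.sin(phi)])
    print(lookup.shape)                              # (3600, 3): phase, I, Q
    print(lookup[1])                                 # settings for a 0.1-degree phase shift
    ```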

  7. Do other-reports of counterproductive work behavior provide an incremental contribution over self-reports? A meta-analytic comparison.

    PubMed

    Berry, Christopher M; Carpenter, Nichelle C; Barratt, Clare L

    2012-05-01

    Much of the recent research on counterproductive work behaviors (CWBs) has used multi-item self-report measures of CWB. Because of concerns over self-report measurement, there have been recent calls to collect ratings of employees' CWB from their supervisors or coworkers (i.e., other-raters) as alternatives or supplements to self-ratings. However, little is still known about the degree to which other-ratings of CWB capture unique and valid incremental variance beyond self-report CWB. The present meta-analysis investigates a number of key issues regarding the incremental contribution of other-reports of CWB. First, self- and other-ratings of CWB were moderately to strongly correlated with each other. Second, with some notable exceptions, self- and other-report CWB exhibited very similar patterns and magnitudes of relationships with a set of common correlates. Third, self-raters reported engaging in more CWB than other-raters reported them engaging in, suggesting other-ratings capture a narrower subset of CWBs. Fourth, other-report CWB generally accounted for little incremental variance in the common correlates beyond self-report CWB. Although many have viewed self-reports of CWB with skepticism, the results of this meta-analysis support their use in most CWB research as a viable alternative to other-reports. (PsycINFO Database Record (c) 2012 APA, all rights reserved).

  8. Spacecraft Fire Safety Demonstration

    NASA Technical Reports Server (NTRS)

    Urban, David L.; Ruff, Gary A.

    2016-01-01

    A presentation of the Saffire Experiment goals and scientific objectives for the Joint CSA/ESA/JAXA/NASA Increments 47 and 48 Science Symposium. The purpose of the presentation is to inform the ISS Cadre and the other investigators of the Saffire goals and objectives to enable them to best support a successful Saffire outcome.

  9. The Economics of Time in Learning.

    ERIC Educational Resources Information Center

    Christoffersson, Nils-Olaf

    A mathematical model supported by empirical findings has been used to develop a cost-effectiveness method that can be applied in evaluations of educational objectives and goals. Educational time allocation can be studied and developed into a micro-level economic theory of decision-making. Learning has been defined as increments which can be…

  10. Operation and Maintenance Support Information (OMSI) Creation, Management, and Repurposing With XML

    DTIC Science & Technology

    2004-09-01

    engines that cost tens of thousands of dollars. There are many middleware applications on the commercial and open-source market. The “Big Four” ... planners can begin an incremental planning effort early in the facility construction phase. This thesis provides a non-proprietary, no-cost solution to

  11. Support Vector Hazards Machine: A Counting Process Framework for Learning Risk Scores for Censored Outcomes.

    PubMed

    Wang, Yuanjia; Chen, Tianle; Zeng, Donglin

    2016-01-01

    Learning risk scores to predict dichotomous or continuous outcomes using machine learning approaches has been studied extensively. However, how to learn risk scores for time-to-event outcomes subject to right censoring has received little attention until recently. Existing approaches rely on inverse probability weighting or rank-based regression, which may be inefficient. In this paper, we develop a new support vector hazards machine (SVHM) approach to predict censored outcomes. Our method is based on predicting the counting process associated with the time-to-event outcomes among subjects at risk via a series of support vector machines. Introducing counting processes to represent time-to-event data leads to a connection between support vector machines in supervised learning and hazards regression in standard survival analysis. To account for different at risk populations at observed event times, a time-varying offset is used in estimating risk scores. The resulting optimization is a convex quadratic programming problem that can easily incorporate non-linearity using kernel trick. We demonstrate an interesting link from the profiled empirical risk function of SVHM to the Cox partial likelihood. We then formally show that SVHM is optimal in discriminating covariate-specific hazard function from population average hazard function, and establish the consistency and learning rate of the predicted risk using the estimated risk scores. Simulation studies show improved prediction accuracy of the event times using SVHM compared to existing machine learning methods and standard conventional approaches. Finally, we analyze two real world biomedical study data where we use clinical markers and neuroimaging biomarkers to predict age-at-onset of a disease, and demonstrate superiority of SVHM in distinguishing high risk versus low risk subjects.

  12. Support vector machine regression (SVR/LS-SVM)--an alternative to neural networks (ANN) for analytical chemistry? Comparison of nonlinear methods on near infrared (NIR) spectroscopy data.

    PubMed

    Balabin, Roman M; Lomakina, Ekaterina I

    2011-04-21

    In this study, we make a general comparison of the accuracy and robustness of five multivariate calibration models: partial least squares (PLS) regression or projection to latent structures, polynomial partial least squares (Poly-PLS) regression, artificial neural networks (ANNs), and two novel techniques based on support vector machines (SVMs) for multivariate data analysis: support vector regression (SVR) and least-squares support vector machines (LS-SVMs). The comparison is based on fourteen (14) different datasets: seven sets of gasoline data (density, benzene content, and fractional composition/boiling points), two sets of ethanol gasoline fuel data (density and ethanol content), one set of diesel fuel data (total sulfur content), three sets of petroleum (crude oil) macromolecules data (weight percentages of asphaltenes, resins, and paraffins), and one set of petroleum resins data (resins content). Vibrational (near-infrared, NIR) spectroscopic data are used to predict the properties and quality coefficients of gasoline, biofuel/biodiesel, diesel fuel, and other samples of interest. The four systems presented here range greatly in composition, properties, strength of intermolecular interactions (e.g., van der Waals forces, H-bonds), colloid structure, and phase behavior. Due to the high diversity of chemical systems studied, general conclusions about SVM regression methods can be made. We try to answer the following question: to what extent can SVM-based techniques replace ANN-based approaches in real-world (industrial/scientific) applications? The results show that both SVR and LS-SVM methods are comparable to ANNs in accuracy. Due to the much higher robustness of the former, the SVM-based approaches are recommended for practical (industrial) application. This has been shown to be especially true for complicated, highly nonlinear objects.
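
    The kind of comparison reported above can be sketched with scikit-learn on synthetic spectra; the toy data below are smooth random curves that merely stand in for NIR spectra and bear no relation to the fuel datasets used in the paper:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.svm import SVR
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n, wavelengths = 200, 400
    X = rng.normal(size=(n, wavelengths)).cumsum(axis=1)     # smooth, spectrum-like curves
    y = np.sin(X[:, 50]) + 0.3 * X[:, 200] + rng.normal(scale=0.1, size=n)  # nonlinear "property"

    models = {
        "PLS (10 comp.)": PLSRegression(n_components=10),
        "SVR (RBF)": make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.01)),
    }
    for name, model in models.items():
        r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
        print(f"{name:15s} mean CV R^2 = {r2:.2f}")
    ```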

  13. Snack food as a modulator of human resting-state functional connectivity.

    PubMed

    Mendez-Torrijos, Andrea; Kreitz, Silke; Ivan, Claudiu; Konerth, Laura; Rösch, Julie; Pischetsrieder, Monika; Moll, Gunther; Kratz, Oliver; Dörfler, Arnd; Horndasch, Stefanie; Hess, Andreas

    2018-04-04

    To elucidate the mechanisms of how snack foods may induce non-homeostatic food intake, we used resting state functional magnetic resonance imaging (fMRI), as resting state networks can individually adapt to experience after short time exposures. In addition, we used graph theoretical analysis together with machine learning techniques (support vector machine) to identifying biomarkers that can categorize between high-caloric (potato chips) vs. low-caloric (zucchini) food stimulation. Seventeen healthy human subjects with body mass index (BMI) 19 to 27 underwent 2 different fMRI sessions where an initial resting state scan was acquired, followed by visual presentation of different images of potato chips and zucchini. There was then a 5-minute pause to ingest food (day 1=potato chips, day 3=zucchini), followed by a second resting state scan. fMRI data were further analyzed using graph theory analysis and support vector machine techniques. Potato chips vs. zucchini stimulation led to significant connectivity changes. The support vector machine was able to accurately categorize the 2 types of food stimuli with 100% accuracy. Visual, auditory, and somatosensory structures, as well as thalamus, insula, and basal ganglia were found to be important for food classification. After potato chips consumption, the BMI was associated with the path length and degree in nucleus accumbens, middle temporal gyrus, and thalamus. The results suggest that high vs. low caloric food stimulation in healthy individuals can induce significant changes in resting state networks. These changes can be detected using graph theory measures in conjunction with support vector machine. Additionally, we found that the BMI affects the response of the nucleus accumbens when high caloric food is consumed.

  14. A multiple kernel support vector machine scheme for feature selection and rule extraction from gene expression data of cancer tissue.

    PubMed

    Chen, Zhenyu; Li, Jianping; Wei, Liwei

    2007-10-01

    Recently, gene expression profiling using microarray techniques has been shown to be a promising tool for improving the diagnosis and treatment of cancer. Gene expression data contain a high level of noise and an overwhelming number of genes relative to the number of available samples, which poses a great challenge for machine learning and statistical techniques. The support vector machine (SVM) has been used successfully to classify gene expression data of cancer tissue. In the medical field, it is crucial to give the user a transparent decision process, and explaining the computed solutions and presenting the extracted knowledge is a main obstacle for SVM. A multiple kernel support vector machine (MK-SVM) scheme, consisting of feature selection, rule extraction and prediction modeling, is proposed to improve the explanatory capacity of SVM. In this scheme, we show that the feature selection problem can be translated into an ordinary multiple-parameter learning problem, and a shrinkage approach, 1-norm-based linear programming, is proposed to obtain the sparse parameters and the corresponding selected features. We propose a novel rule extraction approach using the information provided by the separating hyperplane and support vectors to improve the generalization capacity and comprehensibility of rules and reduce the computational complexity. Two public gene expression datasets, a leukemia dataset and a colon tumor dataset, are used to demonstrate the performance of this approach. Using the small number of selected genes, MK-SVM achieves encouraging classification accuracy: more than 90% for both datasets. Moreover, very simple rules with linguistic labels are extracted. The rule sets have high diagnostic power because of their good classification performance.
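
    The 1-norm shrinkage idea described above, driving most feature weights to exactly zero so that only a few genes are selected, can be illustrated with an L1-penalised linear SVM from scikit-learn; this is a plain substitute for the paper's multiple-kernel formulation and uses synthetic data rather than the leukemia or colon sets:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import LinearSVC

    # Synthetic "expression" data: many genes, few informative, few samples.
    X, y = make_classification(n_samples=80, n_features=2000, n_informative=20,
                               n_redundant=0, random_state=0)

    # L1 penalty forces most coefficients to exactly zero, leaving a small set of selected genes.
    clf = LinearSVC(penalty="l1", dual=False, C=0.1, max_iter=5000).fit(X, y)
    selected = np.flatnonzero(clf.coef_.ravel())
    print(f"{selected.size} of {X.shape[1]} genes retained; training accuracy = {clf.score(X, y):.2f}")
    ```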

  15. Extraction and classification of 3D objects from volumetric CT data

    NASA Astrophysics Data System (ADS)

    Song, Samuel M.; Kwon, Junghyun; Ely, Austin; Enyeart, John; Johnson, Chad; Lee, Jongkyu; Kim, Namho; Boyd, Douglas P.

    2016-05-01

    We propose an Automatic Threat Detection (ATD) algorithm for Explosive Detection System (EDS) using our multistage Segmentation Carving (SC) followed by Support Vector Machine (SVM) classifier. The multi-stage Segmentation and Carving (SC) step extracts all suspect 3-D objects. The feature vector is then constructed for all extracted objects and the feature vector is classified by the Support Vector Machine (SVM) previously learned using a set of ground truth threat and benign objects. The learned SVM classifier has shown to be effective in classification of different types of threat materials. The proposed ATD algorithm robustly deals with CT data that are prone to artifacts due to scatter, beam hardening as well as other systematic idiosyncrasies of the CT data. Furthermore, the proposed ATD algorithm is amenable for including newly emerging threat materials as well as for accommodating data from newly developing sensor technologies. Efficacy of the proposed ATD algorithm with the SVM classifier is demonstrated by the Receiver Operating Characteristics (ROC) curve that relates Probability of Detection (PD) as a function of Probability of False Alarm (PFA). The tests performed using CT data of passenger bags shows excellent performance characteristics.
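
    The PD-versus-PFA characterisation mentioned above corresponds to standard ROC analysis, which scikit-learn provides directly; the sketch below uses synthetic, imbalanced features in place of the segmented-object features from CT data:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.metrics import roc_curve, auc

    X, y = make_classification(n_samples=1000, n_features=12, weights=[0.9, 0.1], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

    clf = SVC(kernel="rbf", probability=True).fit(X_tr, y_tr)
    scores = clf.predict_proba(X_te)[:, 1]
    pfa, pd, _ = roc_curve(y_te, scores)     # PFA = false-positive rate, PD = true-positive rate
    print("area under ROC curve:", round(auc(pfa, pd), 3))
    ```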

  16. Enlargement and contracture of C2-ceramide channels.

    PubMed

    Siskind, Leah J; Davoody, Amirparviz; Lewin, Naomi; Marshall, Stephanie; Colombini, Marco

    2003-09-01

    Ceramides are known to play a major regulatory role in apoptosis by inducing cytochrome c release from mitochondria. We have previously reported that ceramide, but not dihydroceramide, forms large and stable channels in phospholipid membranes and outer membranes of isolated mitochondria. C(2)-ceramide channel formation is characterized by conductance increments ranging from <1 to >200 nS. These conductance increments often represent the enlargement and contracture of channels rather than the opening and closure of independent channels. Enlargement is supported by the observation that many small conductance increments can lead to a large decrement. Also the initial conductances favor cations, but this selectivity drops dramatically with increasing total conductance. La(+3) causes rapid ceramide channel disassembly in a manner indicative of large conducting structures. These channels have a propensity to contract by a defined size (often multiples of 4 nS) indicating the formation of cylindrical channels with preferred diameters rather than a continuum of sizes. The results are consistent with ceramides forming barrel-stave channels whose size can change by loss or insertion of multiple ceramide columns.

  17. Predicting recidivism with the psychological inventory of criminal thinking styles and level of service inventory-revised: screening version.

    PubMed

    Walters, Glenn D

    2011-06-01

    Recidivism was evaluated in 178 male inmates administered the Psychological Inventory of Criminal Thinking Styles (PICTS) and scored on the Level of Service Inventory-Revised: Screening Version (LSI-R:SV) 1-55 months before their release from prison. Age, prior charges, the LSI-R:SV total score, and the PICTS General Criminal Thinking (GCT), Proactive Criminal Thinking (P), and Reactive Criminal Thinking (R) scores served as predictors of recidivism in follow-ups spanning 1-53 months. Age, prior charges, and the PICTS GCT and R scales consistently and incrementally predicted general recidivism (all charges), whereas prior charges and the PICTS R scale consistently and incrementally predicted serious recidivism (more serious charges). Although these results support the predictive efficacy and incremental validity of content-relevant self-report measures of criminality like the PICTS, they also indicate that the effect is modest and in need of further clarification. One area requiring further investigation is the potential role of the PICTS, particularly the R scale, as a dynamic risk factor.

  18. Enlargement and Contracture of C2-Ceramide Channels

    PubMed Central

    Siskind, Leah J.; Davoody, Amirparviz; Lewin, Naomi; Marshall, Stephanie; Colombini, Marco

    2003-01-01

    Ceramides are known to play a major regulatory role in apoptosis by inducing cytochrome c release from mitochondria. We have previously reported that ceramide, but not dihydroceramide, forms large and stable channels in phospholipid membranes and outer membranes of isolated mitochondria. C2-ceramide channel formation is characterized by conductance increments ranging from <1 to >200 nS. These conductance increments often represent the enlargement and contracture of channels rather than the opening and closure of independent channels. Enlargement is supported by the observation that many small conductance increments can lead to a large decrement. Also the initial conductances favor cations, but this selectivity drops dramatically with increasing total conductance. La+3 causes rapid ceramide channel disassembly in a manner indicative of large conducting structures. These channels have a propensity to contract by a defined size (often multiples of 4 nS) indicating the formation of cylindrical channels with preferred diameters rather than a continuum of sizes. The results are consistent with ceramides forming barrel-stave channels whose size can change by loss or insertion of multiple ceramide columns. PMID:12944273

  19. Dual tuning in creative processes: Joint contributions of intrinsic and extrinsic motivational orientations.

    PubMed

    Gong, Yaping; Wu, Junfeng; Song, Lynda Jiwen; Zhang, Zhen

    2017-05-01

    Intrinsic and extrinsic motivational orientations often coexist and can serve important functions. We develop and test a model in which intrinsic and extrinsic motivational orientations interact positively to influence personal creativity goal. Personal creativity goal, in turn, has a positive relationship with incremental creativity and an inverted U-shaped relationship with radical creativity. In a pilot study, we validated the personal creativity goal measure using 180 (Sample 1) and 69 (Sample 2) employees from a consulting firm. In the primary study, we tested the overall model using a sample of 657 research and development employees and their direct supervisors from an automobile firm. The results support the hypothesized model and yield several new insights. Intrinsic and extrinsic motivational orientations synergize with each other to strengthen personal creativity goal. Personal creativity goal in turn benefits incremental and radical creativity, but only up to a certain point for the latter. In addition to its linear indirect relationship with incremental creativity, intrinsic motivational orientation has an inverted U-shaped indirect relationship with radical creativity via personal creativity goal. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Navy Civilian (Civil Service) Billet Costs--FY 1981.

    DTIC Science & Technology

    1981-07-01

    [Table residue: undiscounted initial, annually recurring, and life-cycle costs for a repairman (wage leader, WG) billet at 1, 5, 10, 15, and 20 years.] ...and supporting each billet were established and are presented in 1-, 5-, 10-, 15-, and 20-year increments. ... This effort was conducted in support of Navy Decision Coordinating Paper Z1170-PN, subproject Z1170-PN.05 (Reducing

  1. Assembly and Functional Analysis of an S/MAR Based Episome with the Cystic Fibrosis Transmembrane Conductance Regulator Gene.

    PubMed

    De Rocco, Davide; Pompili, Barbara; Castellani, Stefano; Morini, Elena; Cavinato, Luca; Cimino, Giuseppe; Mariggiò, Maria A; Guarnieri, Simone; Conese, Massimo; Del Porto, Paola; Ascenzioni, Fiorentina

    2018-04-17

    Improving the efficacy of gene therapy vectors is still an important goal toward the development of safe and efficient gene therapy treatments. S/MAR (scaffold/matrix attached region)-based vectors are maintained extra-chromosomally in numerous cell types, which is similar to viral-based vectors. Additionally, when established as an episome, they show a very high mitotic stability. In the present study we tested the idea that addition of an S/MAR element to a CFTR (cystic fibrosis transmembrane conductance regulator) expression vector, may allow the establishment of a CFTR episome in bronchial epithelial cells. Starting from the observation that the S/MAR vector pEPI-EGFP (enhanced green fluorescence protein) is maintained as an episome in human bronchial epithelial cells, we assembled the CFTR vector pBQ-S/MAR. This vector, transfected in bronchial epithelial cells with mutated CFTR , supported long term wt CFTR expression and activity, which in turn positively impacted on the assembly of tight junctions in polarized epithelial cells. Additionally, the recovery of intact pBQ-S/MAR, but not the parental vector lacking the S/MAR element, from transfected cells after extensive proliferation, strongly suggested that pBQ-S/MAR was established as an episome. These results add a new element, the S/MAR, that can be considered to improve the persistence and safety of gene therapy vectors for cystic fibrosis pulmonary disease.

  2. Aedes hensilli as a Potential Vector of Chikungunya and Zika Viruses

    PubMed Central

    Ledermann, Jeremy P.; Guillaumot, Laurent; Yug, Lawrence; Saweyog, Steven C.; Tided, Mary; Machieng, Paul; Pretrick, Moses; Marfel, Maria; Griggs, Anne; Bel, Martin; Duffy, Mark R.; Hancock, W. Thane; Ho-Chen, Tai; Powers, Ann M.

    2014-01-01

    An epidemic of Zika virus (ZIKV) illness that occurred in July 2007 on Yap Island in the Federated States of Micronesia prompted entomological studies to identify both the primary vector(s) involved in transmission and the ecological parameters contributing to the outbreak. Larval and pupal surveys were performed to identify the major containers serving as oviposition habitat for the likely vector(s). Adult mosquitoes were also collected by backpack aspiration, light trap, and gravid traps at select sites around the capital city. The predominant species found on the island was Aedes (Stegomyia) hensilli. No virus isolates were obtained from the adult field material collected, nor did any of the immature mosquitoes that were allowed to emerge to adulthood contain viable virus or nucleic acid. Therefore, laboratory studies of the probable vector, Ae. hensilli, were undertaken to determine the likelihood of this species serving as a vector for Zika virus and other arboviruses. Infection rates of up to 86%, 62%, and 20% and dissemination rates of 23%, 80%, and 17% for Zika, chikungunya, and dengue-2 viruses respectively, were found supporting the possibility that this species served as a vector during the Zika outbreak and that it could play a role in transmitting other medically important arboviruses. PMID:25299181

  3. Building a computer program to support children, parents, and distraction during healthcare procedures.

    PubMed

    Hanrahan, Kirsten; McCarthy, Ann Marie; Kleiber, Charmaine; Ataman, Kaan; Street, W Nick; Zimmerman, M Bridget; Ersig, Anne L

    2012-10-01

    This secondary data analysis used data mining methods to develop predictive models of child risk for distress during a healthcare procedure. Data used came from a study that predicted factors associated with children's responses to an intravenous catheter insertion while parents provided distraction coaching. From the 255 items used in the primary study, 44 predictive items were identified through automatic feature selection and used to build support vector machine regression models. Models were validated using multiple cross-validation tests and by comparing variables identified as explanatory in the traditional versus support vector machine regression. Rule-based approaches were applied to the model outputs to identify overall risk for distress. A decision tree was then applied to evidence-based instructions for tailoring distraction to characteristics and preferences of the parent and child. The resulting decision support computer application, titled Children, Parents and Distraction, is being used in research. Future use will support practitioners in deciding the level and type of distraction intervention needed by a child undergoing a healthcare procedure.

  4. The maximum vector-angular margin classifier and its fast training on large datasets using a core vector machine.

    PubMed

    Hu, Wenjun; Chung, Fu-Lai; Wang, Shitong

    2012-03-01

    Although pattern classification has been extensively studied in the past decades, how to effectively solve the corresponding training on large datasets is a problem that still requires particular attention. Many kernelized classification methods, such as SVM and SVDD, can be formulated as the corresponding quadratic programming (QP) problems, but computing the associated kernel matrices requires O(n²) (or even up to O(n³)) computational complexity, where n is the size of the training patterns, which heavily limits the applicability of these methods for large datasets. In this paper, a new classification method called the maximum vector-angular margin classifier (MAMC) is first proposed based on the vector-angular margin to find an optimal vector c in the pattern feature space, and all the testing patterns can be classified in terms of the maximum vector-angular margin ρ between the vector c and all the training data points. Accordingly, it is proved that the kernelized MAMC can be equivalently formulated as the kernelized Minimum Enclosing Ball (MEB), which leads to a distinctive merit of MAMC, i.e., it has the flexibility of controlling the sum of support vectors like ν-SVC and may be extended to a maximum vector-angular margin core vector machine (MAMCVM) by connecting the core vector machine (CVM) method with MAMC such that the corresponding fast training on large datasets can be effectively achieved. Experimental results on artificial and real datasets are provided to validate the power of the proposed methods. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. What is the risk for exposure to vector-borne pathogens in United States national parks?

    PubMed

    Eisen, Lars; Wong, David; Shelus, Victoria; Eisen, Rebecca J

    2013-03-01

    United States national parks attract > 275 million visitors annually and collectively present risk of exposure for staff and visitors to a wide range of arthropod vector species (most notably fleas, mosquitoes, and ticks) and their associated bacterial, protozoan, or viral pathogens. We assessed the current state of knowledge for risk of exposure to vector-borne pathogens in national parks through a review of relevant literature, including internal National Park Service documents and organismal databases. We conclude that, because of lack of systematic surveillance for vector-borne pathogens in national parks, the risk of pathogen exposure for staff and visitors is unclear. Existing data for vectors within national parks were not based on systematic collections and rarely include evaluation for pathogen infection. Extrapolation of human-based surveillance data from neighboring communities likely provides inaccurate estimates for national parks because landscape differences impact transmission of vector-borne pathogens and human-vector contact rates likely differ inside versus outside the parks because of differences in activities or behaviors. Vector-based pathogen surveillance holds promise to define when and where within national parks the risk of exposure to infected vectors is elevated. A pilot effort, including 5-10 strategic national parks, would greatly improve our understanding of the scope and magnitude of vector-borne pathogen transmission in these high-use public settings. Such efforts also will support messaging to promote personal protection measures and inform park visitors and staff of their responsibility for personal protection, which the National Park Service preservation mission dictates as the core strategy to reduce exposure to vector-borne pathogens in national parks.

  6. Aircraft Engine Thrust Estimator Design Based on GSA-LSSVM

    NASA Astrophysics Data System (ADS)

    Sheng, Hanlin; Zhang, Tianhong

    2017-08-01

    In view of the need for a highly precise and reliable thrust estimator to achieve direct thrust control of an aircraft engine, a GSA-LSSVM-based thrust estimator design is proposed, building on support vector regression (SVR) in the form of the least squares support vector machine (LSSVM) combined with a new optimization algorithm, the gravitational search algorithm (GSA), through integrated modelling and parameter optimization. The results show that, compared with the particle swarm optimization (PSO) algorithm, GSA finds the unknown optimization parameters better and gives the developed model better prediction and generalization ability. The model can therefore predict aircraft engine thrust more accurately and thus meets the needs of direct thrust control of an aircraft engine.
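
    LS-SVM regression differs from standard SVR in that training reduces to solving a single linear system rather than a quadratic program. A minimal RBF-kernel implementation is sketched below; the GSA step that tunes the kernel width and regularisation parameter is not reproduced, and the toy data merely stand in for engine measurements:

    ```python
    import numpy as np

    def lssvm_fit(X, y, gamma=1.0, reg=10.0):
        """LS-SVM regression: solve [[0, 1^T], [1, K + I/reg]] [b; alpha] = [0; y]."""
        K = np.exp(-gamma * np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2))
        n = len(y)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / reg
        sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
        return sol[0], sol[1:]                       # bias b, dual coefficients alpha

    def lssvm_predict(X_train, alpha, b, X_new, gamma=1.0):
        K = np.exp(-gamma * np.sum((X_new[:, None, :] - X_train[None, :, :]) ** 2, axis=2))
        return K @ alpha + b

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(100, 1))            # toy input (e.g. one engine parameter)
    y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=100)
    b, alpha = lssvm_fit(X, y)
    print("fit RMSE:", np.sqrt(np.mean((lssvm_predict(X, alpha, b, X) - y) ** 2)))
    ```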

  7. Vision based nutrient deficiency classification in maize plants using multi class support vector machines

    NASA Astrophysics Data System (ADS)

    Leena, N.; Saju, K. K.

    2018-04-01

    Nutritional deficiencies in plants are a major concern for farmers as they affect productivity and thus profit. This work aims to classify nutritional deficiencies in maize plants in a non-destructive manner using image processing and machine learning techniques. The colored images of the leaves are analyzed and classified with the multi-class support vector machine (SVM) method. Several images of maize leaves with known deficiencies of nitrogen, phosphorous and potassium (NPK) are used to train the SVM classifier prior to the classification of test images. The results show that the method was able to classify and identify nutritional deficiencies.
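
    As an illustration of the pipeline described above, the following is a minimal sketch of multi-class SVM classification in Python with scikit-learn. The random six-dimensional "colour statistics" and the three deficiency labels are invented placeholders standing in for features extracted from segmented leaf images; they are not the authors' data or exact settings.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import classification_report

      rng = np.random.default_rng(0)
      classes = ["nitrogen", "phosphorous", "potassium"]          # NPK deficiency labels
      # synthetic stand-ins for per-leaf colour features (e.g. mean/std of colour channels)
      X = np.vstack([rng.normal(loc=i, scale=0.5, size=(40, 6)) for i in range(3)])
      y = np.repeat(classes, 40)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
      clf = SVC(kernel="rbf", C=10.0, gamma="scale",
                decision_function_shape="ovr")                    # one-vs-rest multi-class SVM
      clf.fit(X_tr, y_tr)
      print(classification_report(y_te, clf.predict(X_te)))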

  8. Research on Classification of Chinese Text Data Based on SVM

    NASA Astrophysics Data System (ADS)

    Lin, Yuan; Yu, Hongzhi; Wan, Fucheng; Xu, Tao

    2017-09-01

    Data mining has important application value in today’s industry and academia, and text classification is a very important technology within it. At present, there are many mature algorithms for text classification; KNN, NB, AB, SVM, decision trees and other classification methods all show good classification performance. The support vector machine (SVM) is a well-established classifier in machine learning research. This paper studies the classification performance of the SVM method on Chinese text data and applies it to the classification of Chinese text, aiming to connect academic research with practical application.
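
    A minimal sketch of the kind of SVM text classification pipeline described above is shown below, using TF-IDF features and a linear SVM in scikit-learn. The tiny pre-segmented corpus and its labels are invented placeholders; a real experiment would use a large, properly tokenised Chinese corpus.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import LinearSVC

      # toy whitespace-segmented Chinese documents with invented topic labels
      docs = ["经济 增长 放缓", "股市 上涨 明显", "球队 赢得 比赛", "球员 转会 完成"]
      labels = ["finance", "finance", "sports", "sports"]

      model = make_pipeline(TfidfVectorizer(), LinearSVC())
      model.fit(docs, labels)
      print(model.predict(["比赛 结果 公布"]))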

  9. Objective research of auscultation signals in Traditional Chinese Medicine based on wavelet packet energy and support vector machine.

    PubMed

    Yan, Jianjun; Shen, Xiaojing; Wang, Yiqin; Li, Fufeng; Xia, Chunming; Guo, Rui; Chen, Chunfeng; Shen, Qingwei

    2010-01-01

    This study aims at utilising the Wavelet Packet Transform (WPT) and the Support Vector Machine (SVM) algorithm to carry out objective analysis and quantitative research on auscultation in Traditional Chinese Medicine (TCM) diagnosis. First, Wavelet Packet Decomposition (WPD) at level 6 was employed to split the auscultation signals into more elaborate frequency bands. Then statistical analysis was performed on the Wavelet Packet Energy (WPE) features extracted from the WPD coefficients. Furthermore, SVM-based pattern recognition was used to distinguish the statistical feature values of the mixed subject sample groups. Finally, the experimental results showed that the classification accuracies were at a high level.
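
    The following is a minimal sketch of the level-6 wavelet-packet-energy plus SVM pipeline outlined above, written with PyWavelets and scikit-learn. The synthetic sinusoidal signals stand in for auscultation recordings, and the wavelet ('db4') and SVM settings are assumptions rather than the authors' exact parameters.

      import numpy as np
      import pywt
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      def wpe_features(signal, level=6, wavelet="db4"):
          """Energy of every terminal node of a level-6 wavelet packet decomposition."""
          wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
          return np.array([np.sum(node.data ** 2) for node in wp.get_level(level, "natural")])

      rng = np.random.default_rng(1)
      t = np.linspace(0, 1, 1024)
      # two artificial "subject groups" differing in dominant frequency
      group_a = [np.sin(2 * np.pi * 40 * t) + 0.3 * rng.standard_normal(t.size) for _ in range(20)]
      group_b = [np.sin(2 * np.pi * 90 * t) + 0.3 * rng.standard_normal(t.size) for _ in range(20)]

      X = np.vstack([wpe_features(s) for s in group_a + group_b])
      y = np.r_[np.zeros(20), np.ones(20)]
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
      print(cross_val_score(clf, X, y, cv=5).mean())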

  10. Electrocardiographic signals and swarm-based support vector machine for hypoglycemia detection.

    PubMed

    Nuryani, Nuryani; Ling, Steve S H; Nguyen, H T

    2012-04-01

    Cardiac arrhythmia relating to hypoglycemia is suggested as a cause of death in diabetic patients. This article introduces electrocardiographic (ECG) parameters for artificially induced hypoglycemia detection. In addition, a hybrid technique of swarm-based support vector machine (SVM) is introduced for hypoglycemia detection using the ECG parameters as inputs. In this technique, a particle swarm optimization (PSO) is proposed to optimize the SVM to detect hypoglycemia. In an experiment using medical data of patients with Type 1 diabetes, the introduced ECG parameters show significant contributions to the performance of the hypoglycemia detection and the proposed detection technique performs well in terms of sensitivity and specificity.
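
    A minimal sketch of a swarm-optimised SVM in the spirit of the hybrid technique above is given below: particles encode (log10 C, log10 gamma) and the fitness is cross-validated accuracy. The swarm size, inertia and acceleration constants, search bounds and the synthetic data are all illustrative assumptions, not the settings of the study.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      X, y = make_classification(n_samples=200, n_features=10, random_state=0)
      rng = np.random.default_rng(0)

      def fitness(p):                                   # p = (log10 C, log10 gamma)
          return cross_val_score(SVC(C=10 ** p[0], gamma=10 ** p[1]), X, y, cv=3).mean()

      n, lo, hi = 10, np.array([-2.0, -4.0]), np.array([3.0, 1.0])
      pos = rng.uniform(lo, hi, size=(n, 2))
      vel = np.zeros_like(pos)
      pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
      gbest = pbest[pbest_f.argmax()]

      for _ in range(15):                               # a handful of PSO iterations
          r1, r2 = rng.random((2, n, 1))
          vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
          pos = np.clip(pos + vel, lo, hi)
          f = np.array([fitness(p) for p in pos])
          improved = f > pbest_f
          pbest[improved], pbest_f[improved] = pos[improved], f[improved]
          gbest = pbest[pbest_f.argmax()]

      print("best (C, gamma):", 10 ** gbest, "CV accuracy:", pbest_f.max())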

  11. Identification of handwriting by using the genetic algorithm (GA) and support vector machine (SVM)

    NASA Astrophysics Data System (ADS)

    Zhang, Qigui; Deng, Kai

    2016-12-01

    As portable digital cameras and camera phones become more and more popular, there is a pressing need to let people shoot at any time and to identify and store handwritten characters. In this paper, a genetic algorithm (GA) and a support vector machine (SVM) are used for the identification of handwriting. Compared with conventional parameter-optimization methods, this technique overcomes two defects: first, the tendency to become trapped in a local optimum; second, the loss of classification and prediction efficiency when searching for the best parameters over a large range. As the experimental results suggest, GA-SVM has a higher recognition rate.

  12. Optimization of Support Vector Machine (SVM) for Object Classification

    NASA Technical Reports Server (NTRS)

    Scholten, Matthew; Dhingra, Neil; Lu, Thomas T.; Chao, Tien-Hsin

    2012-01-01

    The Support Vector Machine (SVM) is a powerful algorithm, useful for classifying data into categories. The SVMs implemented in this research were used as classifiers for the final stage in a Multistage Automatic Target Recognition (ATR) system. A single-kernel SVM known as SVMlight and a modified version known as an SVM with K-Means Clustering were used. These SVM algorithms were tested as classifiers under varying conditions: image noise levels varied, and the orientation of the targets changed. The classifiers were then optimized to demonstrate their maximum potential as classifiers. Results demonstrate the reliability of SVM as a method for classification. From trial to trial, SVM produces consistent results.

  13. DOA Finding with Support Vector Regression Based Forward-Backward Linear Prediction.

    PubMed

    Pan, Jingjing; Wang, Yide; Le Bastard, Cédric; Wang, Tianzhen

    2017-05-27

    Direction-of-arrival (DOA) estimation has drawn considerable attention in array signal processing, particularly with coherent signals and a limited number of snapshots. Forward-backward linear prediction (FBLP) is able to directly deal with coherent signals. Support vector regression (SVR) is robust with small samples. This paper proposes the combination of the advantages of FBLP and SVR in the estimation of DOAs of coherent incoming signals with low snapshots. The performance of the proposed method is validated with numerical simulations in coherent scenarios, in terms of different angle separations, numbers of snapshots, and signal-to-noise ratios (SNRs). Simulation results show the effectiveness of the proposed method.

  14. Applications of Support Vector Machines In Chemo And Bioinformatics

    NASA Astrophysics Data System (ADS)

    Jayaraman, V. K.; Sundararajan, V.

    2010-10-01

    Conventional linear and nonlinear tools for classification, regression and data-driven modeling are being replaced at a rapid pace by newer techniques and tools based on artificial intelligence and machine learning. While the linear techniques are not applicable to inherently nonlinear problems, these newer methods serve as attractive alternatives for solving real-life problems. Support Vector Machine (SVM) classifiers are a set of universal feed-forward-network-based classification algorithms formulated from statistical learning theory and the structural risk minimization principle. SVM regression closely follows the classification methodology. In this work, recent applications of SVM in chemo- and bioinformatics are described with suitable illustrative examples.

  15. Support vector machine based classification of fast Fourier transform spectroscopy of proteins

    NASA Astrophysics Data System (ADS)

    Lazarevic, Aleksandar; Pokrajac, Dragoljub; Marcano, Aristides; Melikechi, Noureddine

    2009-02-01

    Fast Fourier transform spectroscopy has proved to be a powerful method for the study of the secondary structure of proteins, since peak positions and their relative amplitudes are affected by the number of hydrogen bridges that sustain this secondary structure. However, to the best of our knowledge, the method has not yet been used for identification of proteins within a complex matrix like a blood sample. The principal reason is the apparent similarity of protein infrared spectra, with the actual differences usually masked by the solvent contribution and other interactions. In this paper, we propose a novel machine learning based method that uses protein spectra for classification and identification of such proteins within a given sample. The proposed method uses principal component analysis (PCA) to identify the most important linear combinations of the original spectral components and then employs a support vector machine (SVM) classification model applied to such identified combinations to categorize proteins into one of the given groups. Our experiments have been performed on a set of four different proteins, namely: Bovine Serum Albumin, Leptin, Insulin-like Growth Factor 2 and Osteopontin. Our proposed method of applying principal component analysis along with support vector machines exhibits excellent classification accuracy when identifying proteins using their infrared spectra.
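
    A minimal sketch of the PCA-then-SVM scheme described above follows, using a scikit-learn pipeline. The synthetic Gaussian-peak "spectra" are placeholders for infrared protein spectra, and the number of retained components and kernel settings are assumptions.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      n_per_class, n_points = 30, 500
      # four artificial "protein classes" with slightly shifted spectral peaks plus noise
      X = np.vstack([np.exp(-0.5 * ((np.arange(n_points) - 200 - 20 * k) / 30.0) ** 2)
                     + 0.05 * rng.standard_normal((n_per_class, n_points))
                     for k in range(4)])
      y = np.repeat(np.arange(4), n_per_class)

      model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
      print(cross_val_score(model, X, y, cv=5).mean())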

  16. A Wireless Electronic Nose System Using a Fe2O3 Gas Sensing Array and Least Squares Support Vector Regression

    PubMed Central

    Song, Kai; Wang, Qi; Liu, Qi; Zhang, Hongquan; Cheng, Yingguo

    2011-01-01

    This paper describes the design and implementation of a wireless electronic nose (WEN) system which can online detect the combustible gases methane and hydrogen (CH4/H2) and estimate their concentrations, either singly or in mixtures. The system is composed of two wireless sensor nodes—a slave node and a master node. The former comprises a Fe2O3 gas sensing array for the combustible gas detection, a digital signal processor (DSP) system for real-time sampling and processing of the sensor array data, and a wireless transceiver unit (WTU) by which the detection results can be transmitted to the master node connected to a computer. A type of Fe2O3 gas sensor insensitive to humidity is developed for resistance to environmental influences. A threshold-based least squares support vector regression (LS-SVR) estimator is implemented on the DSP for classification and concentration measurements. Experimental results confirm that LS-SVR produces higher accuracy compared with artificial neural networks (ANNs) and a faster convergence rate than the standard support vector regression (SVR). The designed WEN system effectively achieves gas mixture analysis in a real-time process. PMID:22346587
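
    Since LS-SVR replaces the usual SVR quadratic programme by a single linear system, it is compact enough to sketch directly. The following minimal numpy implementation, with an RBF kernel, invented kernel width, regularisation value and synthetic "sensor response" data, illustrates the idea only and is not the estimator deployed on the DSP.

      import numpy as np

      def rbf(A, B, sigma=1.0):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-d2 / (2 * sigma ** 2))

      def lssvr_fit(X, y, gamma=10.0, sigma=1.0):
          """Solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
          n = len(y)
          A = np.zeros((n + 1, n + 1))
          A[0, 1:], A[1:, 0] = 1.0, 1.0
          A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
          sol = np.linalg.solve(A, np.r_[0.0, y])
          return sol[0], sol[1:]                          # bias b, dual weights alpha

      def lssvr_predict(Xtr, b, alpha, Xte, sigma=1.0):
          return rbf(Xte, Xtr, sigma) @ alpha + b

      rng = np.random.default_rng(3)
      X = rng.uniform(0, 5, size=(60, 1))                 # stand-in for sensor-array features
      y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(60)
      b, alpha = lssvr_fit(X, y)
      print(lssvr_predict(X, b, alpha, np.array([[1.0], [4.0]])))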

  17. Diagnosis of Chronic Kidney Disease Based on Support Vector Machine by Feature Selection Methods.

    PubMed

    Polat, Huseyin; Danaei Mehr, Homay; Cetin, Aydin

    2017-04-01

    As Chronic Kidney Disease progresses slowly, early detection and effective treatment are the only cure to reduce the mortality rate. Machine learning techniques are gaining significance in medical diagnosis because of their classification ability with high accuracy rates. The accuracy of classification algorithms depends on the use of correct feature selection algorithms to reduce the dimension of datasets. In this study, the Support Vector Machine classification algorithm was used to diagnose Chronic Kidney Disease. To diagnose Chronic Kidney Disease, two essential types of feature selection methods, namely wrapper and filter approaches, were chosen to reduce the dimension of the Chronic Kidney Disease dataset. In the wrapper approach, the classifier subset evaluator with the greedy stepwise search engine and the wrapper subset evaluator with the Best First search engine were used. In the filter approach, the correlation feature selection subset evaluator with the greedy stepwise search engine and the filtered subset evaluator with the Best First search engine were used. The results showed that the Support Vector Machine classifier using the filtered subset evaluator with the Best First search engine feature selection method has a higher accuracy rate (98.5%) in the diagnosis of Chronic Kidney Disease compared to the other selected methods.
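
    As a rough analogue of the filter approach above, the sketch below places a univariate filter-style feature selector in front of an SVM using scikit-learn; SelectKBest is a stand-in for the WEKA-style evaluators named in the abstract, and the synthetic data and k=10 are illustrative.

      from sklearn.datasets import make_classification
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      # synthetic stand-in for the 24-attribute Chronic Kidney Disease dataset
      X, y = make_classification(n_samples=400, n_features=24, n_informative=8, random_state=0)
      model = make_pipeline(SelectKBest(f_classif, k=10), SVC(kernel="rbf", gamma="scale"))
      print(cross_val_score(model, X, y, cv=5).mean())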

  18. SVM Classifier - a comprehensive java interface for support vector machine classification of microarray data.

    PubMed

    Pirooznia, Mehdi; Deng, Youping

    2006-12-12

    Graphical user interface (GUI) software promotes novelty by allowing users to extend the functionality. SVM Classifier is a cross-platform graphical application that handles very large datasets well. The purpose of this study is to create a GUI application that allows SVM users to perform SVM training, classification and prediction. The GUI provides user-friendly access to state-of-the-art SVM methods embodied in the LIBSVM implementation of the Support Vector Machine. We implemented the java interface using standard swing libraries. We used sample data from a breast cancer study for testing classification accuracy. We achieved 100% accuracy in classification among the BRCA1-BRCA2 samples with the RBF kernel of SVM. We have developed a java GUI application that allows SVM users to perform SVM training, classification and prediction. We have demonstrated that support vector machines can accurately classify genes into functional categories based upon expression data from DNA microarray hybridization experiments. Among the different kernel functions that we examined, the SVM that uses a radial basis kernel function provides the best performance. The SVM Classifier is available at http://mfgn.usm.edu/ebl/svm/.

  19. A support vector regression-firefly algorithm-based model for limiting velocity prediction in sewer pipes.

    PubMed

    Ebtehaj, Isa; Bonakdari, Hossein

    2016-01-01

    Sediment transport without deposition is an essential consideration in the optimum design of sewer pipes. In this study, a novel method based on a combination of support vector regression (SVR) and the firefly algorithm (FFA) is proposed to predict the minimum velocity required to avoid sediment settling in pipe channels, which is expressed as the densimetric Froude number (Fr). The efficiency of support vector machine (SVM) models depends on the suitable selection of SVM parameters. In this particular study, FFA is used to determine these SVM parameters. The parameters that effectively influence the Fr calculation are identified by employing dimensional analysis. The different dimensionless variables along with the models are introduced. The best performance is attributed to the model that employs the sediment volumetric concentration (C(V)), ratio of relative median diameter of particles to hydraulic radius (d/R), dimensionless particle number (D(gr)) and overall sediment friction factor (λ(s)) parameters to estimate Fr. The performance of the SVR-FFA model is compared with genetic programming, an artificial neural network and existing regression-based equations. The results indicate the superior performance of SVR-FFA (mean absolute percentage error = 2.123%; root mean square error = 0.116) compared with the other methods.

  20. Rainfall-induced Landslide Susceptibility assessment at the Longnan county

    NASA Astrophysics Data System (ADS)

    Hong, Haoyuan; Zhang, Ying

    2017-04-01

    Landslides are a serious disaster in Longnan county, China; therefore, landslide susceptibility assessment is a useful tool for government and decision making. The main objective of this study is to investigate and compare the frequency ratio, support vector machines, and logistic regression. Longnan county (Jiangxi province, China) was selected as the case study. First, a landslide inventory map with 354 landslide locations was constructed. The landslide locations were then randomly divided in a 70/30 ratio for training and validating the models. Second, fourteen landslide conditioning factors were prepared, such as slope, aspect, altitude, topographic wetness index (TWI), stream power index (SPI), sediment transport index (STI), plan curvature, lithology, distance to faults, distance to rivers, distance to roads, land use, normalized difference vegetation index (NDVI), and rainfall. Using the frequency ratio, support vector machines, and logistic regression, a total of three landslide susceptibility models were constructed. Finally, the overall performance of the resulting models was assessed and compared using the receiver operating characteristic (ROC) curve technique. The results showed that the support vector machines model is the best model in the study area: the success rate is 88.39 % and the prediction rate is 84.06 %.

  1. Human action recognition with group lasso regularized-support vector machine

    NASA Astrophysics Data System (ADS)

    Luo, Huiwu; Lu, Huanzhang; Wu, Yabei; Zhao, Fei

    2016-05-01

    The bag-of-visual-words (BOVW) and Fisher kernel are two popular models in human action recognition, and the support vector machine (SVM) is the most commonly used classifier for the two models. We show two kinds of group structure in the feature representations constructed by BOVW and the Fisher kernel, respectively; since the structural information of a feature representation can be seen as a prior for the classifier, it can improve the performance of the classifier, which has been verified in several areas. However, the standard SVM employs L2-norm regularization in its learning procedure, which penalizes each variable individually and cannot express the structural information of the feature representation. We replace the L2-norm regularization with group lasso regularization in the standard SVM, and a group lasso regularized-support vector machine (GLRSVM) is proposed. Then, we embed the group structural information of the feature representation into GLRSVM. Finally, we introduce an algorithm to solve the optimization problem of GLRSVM by the alternating direction method of multipliers. The experiments evaluated on the KTH, YouTube, and Hollywood2 datasets show that our method achieves promising results and improves on the state-of-the-art methods on the KTH and YouTube datasets.

  2. Fuzzy Nonlinear Proximal Support Vector Machine for Land Extraction Based on Remote Sensing Image

    PubMed Central

    Zhong, Xiaomei; Li, Jianping; Dou, Huacheng; Deng, Shijun; Wang, Guofei; Jiang, Yu; Wang, Yongjie; Zhou, Zebing; Wang, Li; Yan, Fei

    2013-01-01

    Currently, remote sensing technologies are widely employed in the dynamic monitoring of land. This paper presents an algorithm named fuzzy nonlinear proximal support vector machine (FNPSVM) based on ETM+ remote sensing imagery. The algorithm is applied to extract various types of land in the city of Da’an in northern China. Two multi-category strategies, namely “one-against-one” and “one-against-rest”, for this algorithm are described in detail and then compared. A fuzzy membership function is presented to reduce the effects of noise or outliers on the data samples. The approaches to feature extraction and feature selection, and several key parameter settings, are also given. Numerous experiments were carried out to evaluate its performance, including various accuracies (overall accuracy and kappa coefficient), stability, training speed, and classification speed. The FNPSVM classifier was compared to three other classifiers, including the maximum likelihood classifier (MLC), a back propagation neural network (BPN), and the proximal support vector machine (PSVM), under different training conditions. The impacts of the selection of training samples, testing samples and features on the four classifiers were also evaluated in these experiments. PMID:23936016

  3. A Real-Time Interference Monitoring Technique for GNSS Based on a Twin Support Vector Machine Method.

    PubMed

    Li, Wutao; Huang, Zhigang; Lang, Rongling; Qin, Honglei; Zhou, Kai; Cao, Yongbin

    2016-03-04

    Interference can severely degrade the performance of Global Navigation Satellite System (GNSS) receivers. As the first step of any GNSS anti-interference measure, interference monitoring for GNSS is essential and necessary. Since interference monitoring can be considered as a classification problem, a real-time interference monitoring technique based on the Twin Support Vector Machine (TWSVM) is proposed in this paper. A TWSVM model is established, and the TWSVM is solved by the Least Squares Twin Support Vector Machine (LSTWSVM) algorithm. Interference monitoring indicators are analyzed to extract features from the interfered GNSS signals. The experimental results show that the chosen observations can be used as interference monitoring indicators. The interference monitoring performance of the proposed method is verified using the GPS L1 C/A code signal and compared with that of the standard SVM. The experimental results indicate that the TWSVM-based interference monitoring is much faster than the conventional SVM. Furthermore, the training time of TWSVM is on the millisecond (ms) level and the monitoring time is on the microsecond (μs) level, which makes the proposed approach usable in practical interference monitoring applications.

  4. Recursive feature selection with significant variables of support vectors.

    PubMed

    Tsai, Chen-An; Huang, Chien-Hsun; Chang, Ching-Wei; Chen, Chun-Houh

    2012-01-01

    The development of DNA microarrays lets researchers screen thousands of genes simultaneously and also helps determine high- and low-expression level genes in normal and disease tissues. Selecting relevant genes for cancer classification is an important issue. Most gene selection methods use univariate ranking criteria and arbitrarily choose a threshold to select genes. However, the parameter setting may not be compatible with the selected classification algorithms. In this paper, we propose a new gene selection method (SVM-t) based on the use of t-statistics embedded in a support vector machine. We compared the performance to two similar SVM-based methods: SVM recursive feature elimination (SVMRFE) and recursive support vector machine (RSVM). The three methods were compared based on extensive simulation experiments and analyses of two published microarray datasets. In the simulation experiments, we found that the proposed method is more robust in selecting informative genes than SVMRFE and RSVM and capable of attaining good classification performance when the variations of informative and noninformative genes are different. In the analysis of the two microarray datasets, the proposed method yields better performance in identifying fewer genes with good prediction accuracy, compared to SVMRFE and RSVM.
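
    For reference, the SVM-RFE baseline mentioned above can be sketched in a few lines with scikit-learn: a linear SVM is fitted repeatedly and the lowest-weight features are eliminated at each step. The synthetic "expression matrix" and the number of selected genes are illustrative, and this is the baseline method, not the proposed SVM-t procedure.

      from sklearn.datasets import make_classification
      from sklearn.feature_selection import RFE
      from sklearn.svm import SVC

      # synthetic stand-in for a gene expression matrix (100 samples, 200 genes)
      X, y = make_classification(n_samples=100, n_features=200, n_informative=10, random_state=0)
      selector = RFE(SVC(kernel="linear", C=1.0), n_features_to_select=10, step=0.1)
      selector.fit(X, y)
      print("selected feature indices:", selector.get_support(indices=True))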

  5. T-wave end detection using neural networks and Support Vector Machines.

    PubMed

    Suárez-León, Alexander Alexeis; Varon, Carolina; Willems, Rik; Van Huffel, Sabine; Vázquez-Seisdedos, Carlos Román

    2018-05-01

    In this paper we propose a new approach for detecting the end of the T-wave in the electrocardiogram (ECG) using Neural Networks and Support Vector Machines. Both, Multilayer Perceptron (MLP) neural networks and Fixed-Size Least-Squares Support Vector Machines (FS-LSSVM) were used as regression algorithms to determine the end of the T-wave. Different strategies for selecting the training set such as random selection, k-means, robust clustering and maximum quadratic (Rényi) entropy were evaluated. Individual parameters were tuned for each method during training and the results are given for the evaluation set. A comparison between MLP and FS-LSSVM approaches was performed. Finally, a fair comparison of the FS-LSSVM method with other state-of-the-art algorithms for detecting the end of the T-wave was included. The experimental results show that FS-LSSVM approaches are more suitable as regression algorithms than MLP neural networks. Despite the small training sets used, the FS-LSSVM methods outperformed the state-of-the-art techniques. FS-LSSVM can be successfully used as a T-wave end detection algorithm in ECG even with small training set sizes. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. A Temperature Compensation Method for Piezo-Resistive Pressure Sensor Utilizing Chaotic Ions Motion Algorithm Optimized Hybrid Kernel LSSVM.

    PubMed

    Li, Ji; Hu, Guoqing; Zhou, Yonghong; Zou, Chong; Peng, Wei; Alam Sm, Jahangir

    2016-10-14

    A piezo-resistive pressure sensor is made of silicon, the nature of which is considerably influenced by ambient temperature. The effect of temperature should be eliminated during the working period if a linear output is expected. To deal with this issue, an approach consisting of a hybrid kernel Least Squares Support Vector Machine (LSSVM) optimized by a chaotic ions motion algorithm is presented. To achieve excellent learning and generalization performance, a hybrid kernel function, constructed from a local kernel (the Radial Basis Function (RBF) kernel) and a global kernel (the polynomial kernel), is incorporated into the Least Squares Support Vector Machine. The chaotic ions motion algorithm is introduced to find the best hyper-parameters of the Least Squares Support Vector Machine. Temperature data from a calibration experiment are used to validate the proposed method. With attention to algorithm robustness and engineering applications, the compensation results show that the proposed scheme outperforms the other compared methods on several performance measures, such as maximum absolute relative error, minimum absolute relative error, and the mean and variance of the averaged value over fifty runs. Furthermore, the proposed temperature compensation approach lays a foundation for more extensive research.
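
    The hybrid-kernel idea above (a weighted sum of a local RBF kernel and a global polynomial kernel) can be sketched by passing a callable kernel to an epsilon-SVR, used here as a stand-in for the LSSVM of the paper. The mixing weight, kernel parameters and synthetic temperature-drift data are assumptions for illustration only.

      import numpy as np
      from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel
      from sklearn.svm import SVR

      def hybrid_kernel(X, Y, w=0.7, gamma=0.5, degree=2):
          """K = w * RBF (local) + (1 - w) * polynomial (global)."""
          return w * rbf_kernel(X, Y, gamma=gamma) + (1 - w) * polynomial_kernel(X, Y, degree=degree)

      rng = np.random.default_rng(4)
      T = rng.uniform(-20, 80, size=(100, 1))             # ambient temperature, deg C
      drift = 0.02 * T[:, 0] + 0.0005 * T[:, 0] ** 2 + 0.05 * rng.standard_normal(100)

      model = SVR(kernel=hybrid_kernel, C=10.0).fit(T, drift)
      print(model.predict(np.array([[0.0], [25.0], [60.0]])))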

  7. A Real-Time Interference Monitoring Technique for GNSS Based on a Twin Support Vector Machine Method

    PubMed Central

    Li, Wutao; Huang, Zhigang; Lang, Rongling; Qin, Honglei; Zhou, Kai; Cao, Yongbin

    2016-01-01

    Interference can severely degrade the performance of Global Navigation Satellite System (GNSS) receivers. As the first step of any GNSS anti-interference measure, interference monitoring for GNSS is essential and necessary. Since interference monitoring can be considered as a classification problem, a real-time interference monitoring technique based on the Twin Support Vector Machine (TWSVM) is proposed in this paper. A TWSVM model is established, and the TWSVM is solved by the Least Squares Twin Support Vector Machine (LSTWSVM) algorithm. Interference monitoring indicators are analyzed to extract features from the interfered GNSS signals. The experimental results show that the chosen observations can be used as interference monitoring indicators. The interference monitoring performance of the proposed method is verified using the GPS L1 C/A code signal and compared with that of the standard SVM. The experimental results indicate that the TWSVM-based interference monitoring is much faster than the conventional SVM. Furthermore, the training time of TWSVM is on the millisecond (ms) level and the monitoring time is on the microsecond (μs) level, which makes the proposed approach usable in practical interference monitoring applications. PMID:26959020

  8. ATLS Hypovolemic Shock Classification by Prediction of Blood Loss in Rats Using Regression Models.

    PubMed

    Choi, Soo Beom; Choi, Joon Yul; Park, Jee Soo; Kim, Deok Won

    2016-07-01

    In our previous study, our input data set consisted of 78 rats, the blood loss in percent as a dependent variable, and 11 independent variables (heart rate, systolic blood pressure, diastolic blood pressure, mean arterial pressure, pulse pressure, respiration rate, temperature, perfusion index, lactate concentration, shock index, and a new index (lactate concentration/perfusion)). Machine learning methods for multicategory classification were applied to a rat model of acute hemorrhage to predict the four Advanced Trauma Life Support (ATLS) hypovolemic shock classes for triage in that previous study. However, multicategory classification is much more difficult and complicated than binary classification. We introduce a simple approach for classifying ATLS hypovolaemic shock class by predicting blood loss in percent using support vector regression and multivariate linear regression (MLR). We also compared the performance of the classification models using absolute and relative vital signs. The accuracies of the support vector regression and MLR models with relative values when predicting blood loss in percent were 88.5% and 84.6%, respectively. These were better than the best accuracy of 80.8% of the direct multicategory classification using the support vector machine one-versus-one model in our previous study for the same validation data set. Moreover, the simple MLR models with both absolute and relative values could provide the basis for a future clinical decision support system for ATLS classification. The perfusion index and new index were more appropriate with relative changes than absolute values.
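
    A minimal sketch of the regression-then-threshold idea described above is shown below: blood loss in percent is predicted with SVR and with multivariate linear regression, and the prediction is mapped to an ATLS class through fixed cut-offs. The synthetic vital-sign data and the 15/30/40 % cut-offs are illustrative assumptions, not the study's data or protocol.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.svm import SVR
      from sklearn.model_selection import train_test_split

      def atls_class(loss_percent):
          return np.digitize(loss_percent, [15, 30, 40]) + 1    # classes 1..4

      rng = np.random.default_rng(5)
      n = 150
      vitals = rng.standard_normal((n, 11))                     # 11 relative vital-sign features
      blood_loss = np.clip(20 + 10 * vitals[:, 0] - 5 * vitals[:, 1]
                           + 3 * rng.standard_normal(n), 0, 60)

      X_tr, X_te, y_tr, y_te = train_test_split(vitals, blood_loss, random_state=0)
      for name, reg in [("SVR", SVR(kernel="rbf", C=10.0)), ("MLR", LinearRegression())]:
          pred = reg.fit(X_tr, y_tr).predict(X_te)
          acc = np.mean(atls_class(pred) == atls_class(y_te))
          print(name, "class accuracy:", round(acc, 3))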

  9. Decision support system for diabetic retinopathy using discrete wavelet transform.

    PubMed

    Noronha, K; Acharya, U R; Nayak, K P; Kamath, S; Bhandary, S V

    2013-03-01

    Prolonged duration of the diabetes may affect the tiny blood vessels of the retina causing diabetic retinopathy. Routine eye screening of patients with diabetes helps to detect diabetic retinopathy at the early stage. It is very laborious and time-consuming for the doctors to go through many fundus images continuously. Therefore, decision support system for diabetic retinopathy detection can reduce the burden of the ophthalmologists. In this work, we have used discrete wavelet transform and support vector machine classifier for automated detection of normal and diabetic retinopathy classes. The wavelet-based decomposition was performed up to the second level, and eight energy features were extracted. Two energy features from the approximation coefficients of two levels and six energy values from the details in three orientations (horizontal, vertical and diagonal) were evaluated. These features were fed to the support vector machine classifier with various kernel functions (linear, radial basis function, polynomial of orders 2 and 3) to evaluate the highest classification accuracy. We obtained the highest average classification accuracy, sensitivity and specificity of more than 99% with support vector machine classifier (polynomial kernel of order 3) using three discrete wavelet transform features. We have also proposed an integrated index called Diabetic Retinopathy Risk Index using clinically significant wavelet energy features to identify normal and diabetic retinopathy classes using just one number. We believe that this (Diabetic Retinopathy Risk Index) can be used as an adjunct tool by the doctors during the eye screening to cross-check their diagnosis.
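
    A minimal sketch of the feature-extraction step described above follows: a level-2 2-D discrete wavelet transform yields two approximation energies and six detail energies (three orientations at two levels), which feed a polynomial-kernel SVM. The random "fundus images", the 'db1' wavelet and the classifier settings are placeholders, not the clinical data or exact parameters of the study.

      import numpy as np
      import pywt
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      def dwt_energy_features(img):
          cA2, details2, details1 = pywt.wavedec2(img, "db1", level=2)
          cA1 = pywt.wavedec2(img, "db1", level=1)[0]
          feats = [np.sum(cA2 ** 2), np.sum(cA1 ** 2)]           # approximation energies, levels 2 and 1
          feats += [np.sum(band ** 2) for band in details2]      # horizontal, vertical, diagonal (level 2)
          feats += [np.sum(band ** 2) for band in details1]      # horizontal, vertical, diagonal (level 1)
          return np.array(feats)

      rng = np.random.default_rng(6)
      normal = [rng.normal(0.5, 0.05, (64, 64)) for _ in range(25)]
      dr = [rng.normal(0.5, 0.15, (64, 64)) for _ in range(25)]  # "lesion-rich" images: more texture
      X = np.vstack([dwt_energy_features(im) for im in normal + dr])
      y = np.r_[np.zeros(25), np.ones(25)]
      clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3, gamma="scale"))
      print(cross_val_score(clf, X, y, cv=5).mean())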

  10. Extrapolation methods for vector sequences

    NASA Technical Reports Server (NTRS)

    Smith, David A.; Ford, William F.; Sidi, Avram

    1987-01-01

    This paper derives, describes, and compares five extrapolation methods for accelerating convergence of vector sequences or transforming divergent vector sequences to convergent ones. These methods are the scalar epsilon algorithm (SEA), vector epsilon algorithm (VEA), topological epsilon algorithm (TEA), minimal polynomial extrapolation (MPE), and reduced rank extrapolation (RRE). MPE and RRE are first derived and proven to give the exact solution for the right 'essential degree' k. Then, Brezinski's (1975) generalization of the Shanks-Schmidt transform is presented; the generalized form leads from systems of equations to TEA. The necessary connections are then made with SEA and VEA. The algorithms are extended to the nonlinear case by cycling, the error analysis for MPE and VEA is sketched, and the theoretical support for quadratic convergence is discussed. Strategies for practical implementation of the methods are considered.
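
    As an illustration of one of these transforms, the sketch below implements minimal polynomial extrapolation (MPE) in its usual least-squares formulation and applies it to a small linear fixed-point iteration. The test problem and the implementation details are assumptions for illustration; they do not reproduce the derivations of the paper.

      import numpy as np

      def mpe(xs):
          """Minimal polynomial extrapolation of a list of k+2 vector iterates."""
          X = np.column_stack(xs)                    # n x (k+2) matrix of iterates
          U = np.diff(X, axis=1)                     # first differences u_j = x_{j+1} - x_j
          c, *_ = np.linalg.lstsq(U[:, :-1], -U[:, -1], rcond=None)
          c = np.append(c, 1.0)                      # convention c_k = 1
          gamma = c / c.sum()                        # normalised coefficients
          return X[:, :len(gamma)] @ gamma           # extrapolated limit

      rng = np.random.default_rng(7)
      n = 8
      M = 0.9 * rng.random((n, n)) / n               # iteration matrix with spectral radius < 1
      f = rng.random(n)
      exact = np.linalg.solve(np.eye(n) - M, f)

      x, iterates = np.zeros(n), []
      for _ in range(8):                             # a few fixed-point iterates x <- M x + f
          x = M @ x + f
          iterates.append(x)

      print("plain iterate error:", np.linalg.norm(iterates[-1] - exact))
      print("MPE error:          ", np.linalg.norm(mpe(iterates) - exact))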

  11. Vector Fluxgate Magnetometer (VMAG) Development for DSX

    DTIC Science & Technology

    2008-05-19

    AFRL-RV-HA-TR-2008-1108. UCLA Institute of... Approved for Public Release; Distribution Unlimited. UCLA is building a three-axis fluxgate magnetometer for the Air... The fluxgate magnetometer provides the necessary data to support both the Space Weather (SWx) specification and mapping requirements and the WPIx

  12. Single inverted terminal repeats of the Junonia coenia Densovirus promotes somatic chromosomal integration of vector plasmids in insect cells and supports high efficiency expression

    USDA-ARS?s Scientific Manuscript database

    Plasmids that contain a disrupted genome of the Junonia coenia densovirus (JcDNV) integrate into the chromosomes of the somatic cells of insects. When subcloned individually, both the P9 inverted terminal repeat (P9-ITR) and the P93-ITR promote the chromosomal integration of vector plasmids in insec...

  13. Ability of herpes simplex virus vectors to boost immune responses to DNA vectors and to protect against challenge by simian immunodeficiency virus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaur, Amitinder; Sanford, Hannah B.; Garry, Deirdre

    2007-01-20

    The immunogenicity and protective capacity of replication-defective herpes simplex virus (HSV) vector-based vaccines were examined in rhesus macaques. Three macaques were inoculated with recombinant HSV vectors expressing Gag, Env, and a Tat-Rev-Nef fusion protein of simian immunodeficiency virus (SIV). Three other macaques were primed with recombinant DNA vectors expressing Gag, Env, and a Pol-Tat-Nef-Vif fusion protein prior to boosting with the HSV vectors. Robust anti-Gag and anti-Env cellular responses were detected in all six macaques. Following intravenous challenge with wild-type, cloned SIV239, peak and 12-week plasma viremia levels were significantly lower in vaccinated compared to control macaques. Plasma SIV RNA in vaccinated macaques was inversely correlated with anti-Rev ELISPOT responses on the day of challenge (P value < 0.05), anti-Tat ELISPOT responses at 2 weeks post challenge (P value < 0.05) and peak neutralizing antibody titers pre-challenge (P value 0.06). These findings support continued study of recombinant herpesviruses as a vaccine approach for AIDS.

  14. Amino acid "little Big Bang": representing amino acid substitution matrices as dot products of Euclidian vectors.

    PubMed

    Zimmermann, Karel; Gibrat, Jean-François

    2010-01-04

    Sequence comparisons make use of a one-letter representation for amino acids, the necessary quantitative information being supplied by the substitution matrices. This paper deals with the problem of finding a representation that provides a comprehensive description of amino acid intrinsic properties consistent with the substitution matrices. We present a Euclidian vector representation of the amino acids, obtained by the singular value decomposition of the substitution matrices. The substitution matrix entries correspond to the dot products of amino acid vectors. We apply this vector encoding to the study of the relative importance of various amino acid physicochemical properties upon the substitution matrices. We also characterize and compare the PAM and BLOSUM series of substitution matrices. This vector encoding introduces a Euclidian metric in the amino acid space, consistent with the substitution matrices. Such a numerical description of the amino acids is useful when intrinsic properties of amino acids are needed, for instance when building sequence profiles or finding consensus sequences, or when using machine learning algorithms such as Support Vector Machines and Neural Networks.
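
    As a toy illustration of the dot-product representation described above, the sketch below factorises a small symmetric score matrix by an eigendecomposition so that each residue gets a Euclidian vector whose dot products reproduce the scores. The 4x4 matrix is an invented stand-in for a BLOSUM/PAM block, and clipping negative eigenvalues (when present) makes the reconstruction approximate; the paper's actual procedure operates on the full substitution matrices.

      import numpy as np

      S = np.array([[ 4, -1, -2,  0],      # toy "substitution scores" for 4 residues
                    [-1,  5,  0, -2],
                    [-2,  0,  6, -3],
                    [ 0, -2, -3,  7]], dtype=float)

      w, V = np.linalg.eigh(S)             # S = V diag(w) V^T
      w = np.clip(w, 0, None)              # keep only the positive-semidefinite part
      vectors = V * np.sqrt(w)             # row i = Euclidian vector of residue i

      print(np.round(vectors @ vectors.T, 2))          # approximately reproduces S
      print("residue 0 vs 1 score:", vectors[0] @ vectors[1])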

  15. Structural Analysis of Biodiversity

    PubMed Central

    Sirovich, Lawrence; Stoeckle, Mark Y.; Zhang, Yu

    2010-01-01

    Large, recently-available genomic databases cover a wide range of life forms, suggesting opportunity for insights into genetic structure of biodiversity. In this study we refine our recently-described technique using indicator vectors to analyze and visualize nucleotide sequences. The indicator vector approach generates correlation matrices, dubbed Klee diagrams, which represent a novel way of assembling and viewing large genomic datasets. To explore its potential utility, here we apply the improved algorithm to a collection of almost 17000 DNA barcode sequences covering 12 widely-separated animal taxa, demonstrating that indicator vectors for classification gave correct assignment in all 11000 test cases. Indicator vector analysis revealed discontinuities corresponding to species- and higher-level taxonomic divisions, suggesting an efficient approach to classification of organisms from poorly-studied groups. As compared to standard distance metrics, indicator vectors preserve diagnostic character probabilities, enable automated classification of test sequences, and generate high-information density single-page displays. These results support application of indicator vectors for comparative analysis of large nucleotide data sets and raise prospect of gaining insight into broad-scale patterns in the genetic structure of biodiversity. PMID:20195371

  16. U.S. Army Research Institute Program in Basic Research-FY 2010

    DTIC Science & Technology

    2010-11-01

    Report excerpt (recovered fragments): "Do learning protocols support learning strategies and outcomes? The role of cognitive and metacognitive prompts," Learning and Instruction (2007); "Achievement in Complex Learning Environments as a Function of Information Processing Ability"; "Development and Validation of a Situational Judgment Test to Predict Attrition Incrementally Over General Cognitive Ability and a Forced-Choice..."

  17. Religion and Education in Ireland: Growing Diversity--or Losing Faith in the System?

    ERIC Educational Resources Information Center

    Rougier, Nathalie; Honohan, Iseult

    2015-01-01

    This paper examines the evolution of the state-supported denominational education system in Ireland in the context of increasing social diversity, and considers the capacity for incremental change in a system of institutional pluralism hitherto dominated by a single religion. In particular, we examine challenges to the historical arrangements…

  18. Journalism Education in India: Quest for Professionalism or Incremental Responses

    ERIC Educational Resources Information Center

    Bharthur, Sanjay Parthasarathy

    2017-01-01

    Journalism education in India is framed within the higher education system, comprising programs in the universities, both government-supported and media-backed private institutions, as well as in-service and short-term courses offered by press associations and other organizations. They are offered at different levels from certificate to diploma to…

  19. Diagnostic performance of smear microscopy and incremental yield of Xpert in detection of pulmonary tuberculosis in Rwanda.

    PubMed

    Ngabonziza, Jean Claude Semuto; Ssengooba, Willy; Mutua, Florence; Torrea, Gabriela; Dushime, Augustin; Gasana, Michel; Andre, Emmanuel; Uwamungu, Schifra; Nyaruhirira, Alaine Umubyeyi; Mwaengo, Dufton; Muvunyi, Claude Mambo

    2016-11-08

    The tuberculosis control program of Rwanda is currently phasing in light emitting diode-fluorescent microscopy (LED-FM) as an alternative to Ziehl-Neelsen (ZN) smear microscopy. This, alongside the newly introduced Xpert (Cepheid, Sunnyvale, CA, USA), is expected to improve the diagnosis of tuberculosis and the detection of rifampicin resistance in patients at health facilities. We assessed the accuracy of smear microscopy and the incremental sensitivity of Xpert at tuberculosis laboratories in Rwanda. This was a cross-sectional study involving four laboratories performing ZN and four laboratories performing LED-FM microscopy; the laboratories comprised four intermediate (ILs) and four peripheral (PLs) laboratories. After smear microscopy, the left-over portions of a single early-morning sputum sample from 648 participants were tested using Xpert and mycobacterial culture as a reference standard. The sensitivity of each test was compared and the incremental sensitivity of Xpert after a negative smear was assessed. A total of 96 presumptive pulmonary tuberculosis participants were culture positive for M. tuberculosis. The overall sensitivity in PLs of ZN was 55.1 % (40.2-69.3 %), of LED-FM was 37 % (19.4-57.6 %) and of Xpert was 77.6 % (66.6-86.4 %), whereas in ILs the same value for ZN was 58.3 % (27.7-84.8 %), for LED-FM was 62.5 % (24.5-91.5 %) and for Xpert was 90 % (68.3-98.8 %). The sensitivity of all tests was significantly higher among HIV-negative individuals (all tests p < 0.05). The overall incremental sensitivity of Xpert over smear microscopy was 32.3 %; p < 0.0001. The incremental sensitivity of Xpert was statistically significant for both smear methods at PLs (32.9 %; p = 0.001) but not at the ILs (30 %; p = 0.125). Our findings from the early implementation of LED-FM did not reveal a significant increment in sensitivity compared to the method being phased out (ZN). This study showed a significant incremental sensitivity for Xpert over both smear methods at peripheral centers, where the majority of TB patients are diagnosed. Overall our findings support the recommendation of Xpert as an initial diagnostic test in adults and children presumed to have TB.

  20. Atmospheric response to Saharan dust deduced from ECMWF reanalysis increments

    NASA Astrophysics Data System (ADS)

    Kishcha, P.; Alpert, P.; Barkan, J.; Kirchner, I.; Machenhauer, B.

    2003-04-01

    This study focuses on the atmospheric temperature response to dust deduced from a new source of data - the European Reanalysis (ERA) increments. These increments are the systematic errors of global climate models, generated in the reanalysis procedure. The model errors result not only from the lack of desert dust but also from a complex combination of many kinds of model errors. Over the Sahara desert the dust radiative effect is believed to be a predominant model defect which should significantly affect the increments. This dust effect was examined by considering the correlation between the increments and remotely-sensed dust. Comparisons were made between April temporal variations of the ERA analysis increments and the variations of the Total Ozone Mapping Spectrometer aerosol index (AI) between 1979 and 1993. A distinctive structure was identified in the distribution of correlation, composed of three nested areas with high positive correlation (> 0.5), low correlation, and high negative correlation (< -0.5). The innermost positive correlation area (PCA) is a large area near the center of the Sahara desert; for some local maxima inside this area the correlation even exceeds 0.8. The outermost negative correlation area (NCA) is not uniform. It consists of some areas over the eastern and western parts of North Africa with a relatively small amount of dust. Inside those areas both positive and negative high correlations exist at pressure levels ranging from 850 to 700 hPa, with the peak values near 775 hPa. Dust-forced heating (cooling) inside the PCA (NCA) is accompanied by changes in the static stability of the atmosphere above the dust layer. The reanalysis data of the European Center for Medium Range Weather Forecast (ECMWF) suggest that the PCA (NCA) corresponds mainly to anticyclonic (cyclonic) flow, negative (positive) vorticity, and downward (upward) airflow. These facts indicate an interaction between dust-forced heating/cooling and atmospheric circulation. The April correlation results are supported by the analysis of the vertical distribution of dust concentration, derived from the 24-hour dust prediction system at Tel Aviv University (website: http://earth.nasa.proj.ac.il/dust/current/). For other months the analysis is more complicated because of the substantial increase in humidity along with the northward progress of the ITCZ and its significant impact on the increments.

  1. Efficient boundary hunting via vector quantization

    NASA Astrophysics Data System (ADS)

    Diamantini, Claudia; Panti, Maurizio

    2001-03-01

    A great amount of information about a classification problem is contained in those instances falling near the decision boundary. This intuition dates back to the earliest studies in pattern recognition, and in the more recent adaptive approaches to the so called boundary hunting, such as the work of Aha et alii on Instance Based Learning and the work of Vapnik et alii on Support Vector Machines. The last work is of particular interest, since theoretical and experimental results ensure the accuracy of boundary reconstruction. However, its optimization approach has heavy computational and memory requirements, which limits its application on huge amounts of data. In the paper we describe an alternative approach to boundary hunting based on adaptive labeled quantization architectures. The adaptation is performed by a stochastic gradient algorithm for the minimization of the error probability. Error probability minimization guarantees the accurate approximation of the optimal decision boundary, while the use of a stochastic gradient algorithm defines an efficient method to reach such approximation. In the paper comparisons to Support Vector Machines are considered.

  2. Automatic sleep staging using multi-dimensional feature extraction and multi-kernel fuzzy support vector machine.

    PubMed

    Zhang, Yanjun; Zhang, Xiangmin; Liu, Wenhui; Luo, Yuxi; Yu, Enjia; Zou, Keju; Liu, Xiaoliang

    2014-01-01

    This paper employed clinical Polysomnographic (PSG) data, mainly comprising all-night Electroencephalogram (EEG), Electrooculogram (EOG) and Electromyogram (EMG) signals of subjects, and adopted the American Academy of Sleep Medicine (AASM) clinical staging manual as the standard to realize automatic sleep staging. The authors extracted eighteen different features of EEG, EOG and EMG in the time and frequency domains to construct feature vectors according to the existing literature as well as clinical experience. By adopting self-learning on the sleep samples, the weights of the linear combination and the parameters of the multiple kernels of the fuzzy support vector machine (FSVM) were learned and the multi-kernel FSVM (MK-FSVM) was constructed. The overall agreement between the experts' scores and the results presented was 82.53%. Compared with previous results, the accuracy of N1 was improved to some extent while the accuracies of the other stages were similar, which well reflected the sleep structure. The staging algorithm proposed in this paper is transparent and worth further investigation.

  3. Development and Validation of a Multidimensional Measure of Family Supportive Supervisor Behaviors (FSSB)

    PubMed Central

    Hammer, Leslie B.; Kossek, Ellen Ernst; Yragui, Nanette L.; Bodner, Todd E.; Hanson, Ginger C.

    2011-01-01

    Due to growing work-family demands, supervisors need to effectively exhibit family supportive supervisor behaviors (FSSB). Drawing on social support theory and using data from two samples of lower wage workers, the authors develop and validate a measure of FSSB, defined as behaviors exhibited by supervisors that are supportive of families. FSSB is conceptualized as a multidimensional superordinate construct with four subordinate dimensions: emotional support, instrumental support, role modeling behaviors, and creative work-family management. Results from multilevel confirmatory factor analyses and multilevel regression analyses provide evidence of construct, criterion-related, and incremental validity. The authors found FSSB to be significantly related to work-family conflict, work-family positive spillover, job satisfaction, and turnover intentions over and above measures of general supervisor support. PMID:21660254

  4. Circulating levels of C-reactive protein, interleukin-6 and tumor necrosis factor-α and risk of colorectal adenomas: a meta-analysis.

    PubMed

    Zhang, Xiaoqian; Liu, Shanglong; Zhou, Yanbing

    2016-09-27

    Results from publications on the inflammatory markers C-reactive protein (CRP), interleukin-6 (IL-6) and tumor necrosis factor-α (TNF-α) and the risk of colorectal adenomas are not consistent. A meta-analysis was conducted to explore the above-mentioned associations. Relevant studies were identified by a search of Embase, Medline and PubMed through February 2016. A random effect model was adopted to combine study-specific odds ratios (OR) and 95% confidence intervals (95% CI). Between-study heterogeneity and publication bias were assessed. Dose-response relationships were assessed by restricted cubic splines. Nineteen observational studies were included. For highest vs. lowest levels, results from this meta-analysis did not support an association between circulating levels of CRP [OR (95% CI): 1.15 (0.94-1.40)], IL-6 [1.17 (0.94-1.46)] or TNF-α [0.99 (0.75-1.31)] and the risk of colorectal adenomas. The findings were supported by sensitivity analysis and subgroup analysis. In the dose-response analysis, the risk of colorectal adenomas increased by 2% [1.02 (0.97-1.08)] for each 1 mg/L increment in circulating CRP levels, 9% [1.09 (0.91-1.31)] for each 1 ng/L increment in circulating IL-6 levels, and 6% [1.06 (0.93-1.21)] for each 1 pg/mL increment in circulating TNF-α levels. Moderate between-study heterogeneity was found. No evidence of publication bias was found. Circulating levels of CRP, IL-6 and TNF-α might therefore not be useful biomarkers for identifying colorectal adenomas.

  5. On a concurrent element-by-element preconditioned conjugate gradient algorithm for multiple load cases

    NASA Technical Reports Server (NTRS)

    Watson, Brian; Kamat, M. P.

    1990-01-01

    Element-by-element preconditioned conjugate gradient (EBE-PCG) algorithms have been advocated for use in parallel/vector processing environments as being superior to the conventional LDL^T decomposition algorithm for single load cases. Although there may be some advantages in using such algorithms for a single load case, when it comes to situations involving multiple load cases, the LDL^T decomposition algorithm would appear to be decidedly more cost-effective. The authors have outlined an EBE-PCG algorithm suitable for multiple load cases and compared its effectiveness to the highly efficient LDL^T decomposition scheme. The proposed algorithm offers almost no advantages over the LDL^T algorithm for the linear problems investigated on the Alliant FX/8. However, there may be some merit in the algorithm in solving nonlinear problems with load incrementation, but that remains to be investigated.
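
    The trade-off discussed above can be illustrated on a small symmetric positive-definite system: an iterative, Jacobi-preconditioned conjugate gradient solve per load case versus a single factorisation reused for all right-hand sides. The tridiagonal test matrix below is an illustrative stand-in for a finite-element stiffness matrix, and sparse LU is used as a stand-in for an LDL^T factorisation.

      import numpy as np
      from scipy.sparse import diags
      from scipy.sparse.linalg import cg, LinearOperator, splu

      n, n_load_cases = 500, 10
      K = diags([-1, 2.05, -1], [-1, 0, 1], shape=(n, n), format="csc")  # SPD model "stiffness"
      loads = np.random.default_rng(8).standard_normal((n, n_load_cases))

      # (a) preconditioned CG: one iterative solve per load case
      M = LinearOperator((n, n), matvec=lambda r: r / K.diagonal())      # Jacobi preconditioner
      x_cg = np.column_stack([cg(K, loads[:, j], M=M)[0] for j in range(n_load_cases)])

      # (b) factorise once, then back-substitute once per load case
      lu = splu(K)
      x_direct = np.column_stack([lu.solve(loads[:, j]) for j in range(n_load_cases)])

      print("max difference between the two strategies:", np.abs(x_cg - x_direct).max())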

  6. The design of a wind tunnel VSTOL fighter model incorporating turbine powered engine simulators

    NASA Technical Reports Server (NTRS)

    Bailey, R. O.; Maraz, M. R.; Hiley, P. E.

    1981-01-01

    A wind-tunnel model of a supersonic VSTOL fighter aircraft configuration has been developed for use in the evaluation of airframe-propulsion system aerodynamic interactions. The model may be employed with conventional test techniques, where configuration aerodynamics are measured in a flow-through mode and incremental nozzle-airframe interactions are measured in a jet-effects mode, and with the Compact Multimission Aircraft Propulsion Simulator which is capable of the simultaneous simulation of inlet and exhaust nozzle flow fields so as to allow the evaluation of the extent of inlet and nozzle flow field coupling. The basic configuration of the twin-engine model has a geometrically close-coupled canard and wing, and a moderately short nacelle with nonaxisymmetric vectorable exhaust nozzles near the wing trailing edge, and may be converted to a canardless configuration with an extremely short nacelle. Testing is planned to begin in the summer of 1982.

  7. Life tables and reproductive parameters of Lutzomyia spinicrassa (Diptera: Psychodidae) under laboratory conditions.

    PubMed

    Escovar, Jesús; Bello, Felio J; Morales, Alberto; Moncada, Ligia; Cárdenas, Estrella

    2004-10-01

    Lutzomyia spinicrassa is a vector of Leishmania braziliensis in Colombia. This sand fly has a broad geographical distribution in Colombia and Venezuela and it is found mainly in coffee plantations. Baseline biological growth data for L. spinicrassa were obtained under experimental laboratory conditions. The development time from egg to adult ranged from 59 to 121 days, with an average of 12.74 weeks. Based on cohorts of 100 females, a horizontal life table was constructed. The following predictive parameters were obtained: net rate of reproduction (8.4 females per cohort female), generation time (12.74 weeks), intrinsic rate of population increase (0.17), and finite rate of population increment (1.18). The reproductive value for each age class of the cohort females was calculated. Vertical life tables were elaborated and mortality was described for the generation obtained from the field cohort. In addition, for two successive generations, additive variance and heritability for fecundity were estimated.

  8. Color discrimination across four life decades assessed by the Cambridge Colour Test.

    PubMed

    Paramei, Galina V

    2012-02-01

    Color discrimination was estimated using the Cambridge Colour Test (CCT) in 160 normal trichromats of four life decades, 20-59 years of age. For each age cohort, medians and tolerance limits of the CCT parameters are tabulated. Compared across the age cohorts (Kruskal-Wallis test), the Trivector test showed increases in the three vectors, Protan, Deutan, and Tritan, with advancing age; the Ellipses test revealed significant elongation of the major axes of all three ellipses but no changes in either the axis ratio or the angle of the ellipse major axis. Multiple comparisons (Mann-Whitney test) between the cohorts of four age decades (20+,…,50+) revealed initial benign deterioration of color discrimination in the 40+ decade, as an incremental loss of discrimination along the Deutan axis (Trivector test), and in the 50+ decade, as an elongation of the major axes of all three ellipses (Ellipses test). © 2012 Optical Society of America

  9. Electric field observations of equatorial bubbles

    NASA Technical Reports Server (NTRS)

    Aggson, T. L.; Maynard, N. C.; Hanson, W. B.; Saba, Jack L.

    1992-01-01

    Results from the double floating probe experiment performed on the San Marco D satellite are presented, with emphasis on the observation of large incremental changes in the convective electric field vector at the boundary of equatorial plasma bubbles. Attention is given to isolated bubble structures in the upper ionospheric F region; the observed bubble encounters are divided into two types - type I (live bubbles) and type II (dead bubbles). Type I bubbles show varying degrees of plasma depletion and large upward velocities ranging up to 1000 km/s. The geometry of these bubbles is such that the spacecraft orbit may cut them where they are tilting either eastward or (more often) westward. Type II bubbles exhibit plasma density depletion but no appreciable upward convection. Both types of events are usually surrounded by a halo of plasma turbulence, which can extend considerably beyond the region of plasma depletion.

  10. Memory persistency and nonlinearity in daily mean dew point across India

    NASA Astrophysics Data System (ADS)

    Ray, Rajdeep; Khondekar, Mofazzal Hossain; Ghosh, Koushik; Bhattacharjee, Anup Kumar

    2016-04-01

    An effort has been made in this work to characterize and estimate the memory persistence of the daily mean dew point time series obtained from seven different weather stations, viz. Kolkata, Chennai (Madras), New Delhi, Mumbai (Bombay), Bhopal, Agartala and Ahmedabad, representing different geographical zones in India. Hurst exponent values reveal an anti-persistent behaviour of these dew point series. To corroborate the Hurst exponent values, five different scaling methods have been used and the corresponding results compared to synthesize a finer and more reliable conclusion. The present analysis also indicates that the variation in daily mean dew point is governed by a non-stationary process with stationary increments. The delay vector variance (DVV) method has been exploited to investigate nonlinearity, and the present calculation confirms the presence of a deterministic nonlinear profile in the daily mean dew point time series of the seven stations.
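
    For orientation, a rescaled-range (R/S) estimate of the Hurst exponent, one of the simpler scaling methods of the kind referred to above, can be sketched as follows. The white-noise series is a placeholder for a dew point anomaly series, and the window sizes are illustrative; the paper's five scaling methods are not reproduced here.

      import numpy as np

      def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128, 256)):
          """Slope of log(R/S) against log(window size)."""
          rs_means = []
          for w in window_sizes:
              rs_vals = []
              for start in range(0, len(x) - w + 1, w):
                  seg = x[start:start + w]
                  dev = np.cumsum(seg - seg.mean())
                  r = dev.max() - dev.min()              # range of cumulative deviations
                  s = seg.std(ddof=1)
                  if s > 0:
                      rs_vals.append(r / s)
              rs_means.append(np.mean(rs_vals))
          slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
          return slope

      rng = np.random.default_rng(9)
      series = rng.standard_normal(4096)                 # uncorrelated noise: H should be near 0.5
      print("estimated Hurst exponent:", round(hurst_rs(series), 3))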

  11. Atrial fibrillation detection by heart rate variability in Poincare plot.

    PubMed

    Park, Jinho; Lee, Sangwook; Jeon, Moongu

    2009-12-11

    Atrial fibrillation (AFib) is one of the prominent causes of stroke, and its risk increases with age. We need to detect AFib correctly as early as possible to avoid medical disaster, because it is likely to proceed into a more serious form in a short time. A portable AFib monitoring system would be helpful to many older people, because we cannot predict when a patient will have a spasm of AFib. We analyzed heart beat variability from inter-beat intervals obtained by a wavelet-based detector and constructed a Poincare plot from the inter-beat intervals. By analyzing the plot, we extracted three feature measures characterizing AFib and non-AFib: the number of clusters, the mean stepping increment of inter-beat intervals, and the dispersion of the points around a diagonal line in the plot. We divided the distribution of the number of clusters into two parts and calculated the mean value of the lower part by the k-means clustering method. We classified data whose number of clusters is more than one and less than this mean value as non-AFib data. In the other case, we tried to discriminate AFib from non-AFib using a support vector machine with the other feature measures: the mean stepping increment and the dispersion of the points in the Poincare plot. We found that the Poincare plot from non-AFib data showed some pattern, while the plot from AFib data showed an irregularly irregular shape. In the case of non-AFib data, the definite pattern in the plot manifested itself with some limited number of clusters or one closely packed cluster. In the case of AFib data, the number of clusters in the plot was one or too many. We evaluated the accuracy using leave-one-out cross-validation. Mean sensitivity and mean specificity were 91.4% and 92.9%, respectively. Because pulse beats of the ventricles are less likely to be influenced by baseline wandering and noise, we used the inter-beat intervals to diagnose AFib. We visually displayed the regularity of the inter-beat intervals by way of the Poincare plot. We tried to design an automated algorithm which does not require any human intervention or any specific threshold, and could be installed in a portable AFib monitoring system.
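
    A minimal sketch of the Poincare-plot feature idea above is given below: from a series of inter-beat (RR) intervals it computes the mean stepping increment, the dispersion about the diagonal, and a k-means inertia used as a crude stand-in for the cluster count of the paper, then feeds the features to an SVM. The synthetic RR series are placeholders for real ECG-derived intervals.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      def poincare_features(rr, n_clusters=6):
          pts = np.column_stack([rr[:-1], rr[1:]])                   # Poincare plot points
          stepping = np.mean(np.abs(np.diff(rr)))                    # mean stepping increment
          dispersion = np.std((pts[:, 1] - pts[:, 0]) / np.sqrt(2))  # spread about the diagonal
          inertia = KMeans(n_clusters=n_clusters, n_init=5,
                           random_state=0).fit(pts).inertia_         # crude clustering summary
          return np.array([stepping, dispersion, inertia])

      rng = np.random.default_rng(10)
      normal = [0.8 + 0.03 * np.sin(np.arange(200) / 5) + 0.01 * rng.standard_normal(200)
                for _ in range(20)]                                  # fairly regular rhythm
      afib = [0.8 + 0.15 * rng.standard_normal(200) for _ in range(20)]   # irregularly irregular
      X = np.vstack([poincare_features(rr) for rr in normal + afib])
      y = np.r_[np.zeros(20), np.ones(20)]
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
      print(cross_val_score(clf, X, y, cv=5).mean())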

  12. Incremental Refinement of FAÇADE Models with Attribute Grammar from 3d Point Clouds

    NASA Astrophysics Data System (ADS)

    Dehbi, Y.; Staat, C.; Mandtler, L.; Plümer, L.

    2016-06-01

    Data acquisition using unmanned aerial vehicles (UAVs) has received increasing attention in recent years. In the field of building reconstruction in particular, the incremental interpretation of such data is a demanding task. In this context, formal grammars play an important role in the top-down identification and reconstruction of building objects. Up to now, the available approaches have expected offline data in order to parse an a priori known grammar. For mapping on demand, an on-the-fly reconstruction based on UAV data is required, and an incremental interpretation of the data stream is inevitable. This paper presents an incremental parser of grammar rules for automatic 3D building reconstruction. The parser enables model refinement based on new observations with respect to a weighted attribute context-free grammar (WACFG); the falsification or rejection of hypotheses is supported as well. The parser can handle and adapt parse trees acquired from previous interpretations or predictions. Parse trees derived so far are updated iteratively using transformation rules, and a diagnostic step searches for mismatches between current and new nodes. Prior knowledge on façades, given by probability densities as well as architectural patterns, is incorporated. Since normal distributions cannot always be assumed, the derivation of location and shape parameters of building objects is based on kernel density estimation (KDE). While the level of detail is continuously improved, geometrical, semantic and topological consistency is ensured.
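
    To make the KDE step concrete, the fragment below (assuming SciPy, which the abstract does not name) estimates the location of a facade parameter, here a hypothetical window width, from noisy observations without assuming a normal distribution; the values are invented for illustration.

```python
# Hedged sketch: kernel density estimation of a facade parameter,
# taking the density mode as the location estimate. Values are illustrative.
import numpy as np
from scipy.stats import gaussian_kde

observations = np.array([1.18, 1.22, 1.21, 1.60, 1.19, 1.23, 1.58, 1.20])  # window widths in metres
kde = gaussian_kde(observations)

grid = np.linspace(observations.min(), observations.max(), 200)
mode = grid[np.argmax(kde(grid))]   # location estimate without a Gaussian assumption
print(f"estimated window width: {mode:.2f} m")
```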

  13. Health level seven interoperability strategy: big data, incrementally structured.

    PubMed

    Dolin, R H; Rogers, B; Jaffe, C

    2015-01-01

    Describe how the HL7 Clinical Document Architecture (CDA), a foundational standard in US Meaningful Use, contributes to a "big data, incrementally structured" interoperability strategy, whereby data structured incrementally gets large amounts of data flowing faster. We present cases showing how this approach is leveraged for big data analysis. To support the assertion that semi-structured narrative in CDA format can be a useful adjunct in an overall big data analytic approach, we present two case studies. The first assesses an organization's ability to generate clinical quality reports using coded data alone vs. coded data supplemented by CDA narrative. The second leverages CDA to construct a network model for referral management, from which additional observations can be gleaned. The first case shows that coded data supplemented by CDA narrative resulted in significant variances in calculated performance scores. In the second case, we found that the constructed network model enables the identification of differences in patient characteristics among different referral work flows. The CDA approach goes after data indirectly, by focusing first on the flow of narrative, which is then incrementally structured. A quantitative assessment of whether this approach will lead to a greater flow of data and ultimately a greater flow of structured data vs. other approaches is planned as a future exercise. Along with growing adoption of CDA, we are now seeing the big data community explore the standard, particularly given its potential to supply analytic engines with volumes of data previously not possible.

  14. Identification of platelet refractoriness in oncohematologic patients

    PubMed Central

    Ferreira, Aline Aparecida; Zulli, Roberto; Soares, Sheila; de Castro, Vagner; Moraes-Souza, Helio

    2011-01-01

    OBJECTIVES: To identify the occurrence and the causes of platelet refractoriness in oncohematologic patients. INTRODUCTION: Platelet refractoriness (unsatisfactory post-transfusion platelet increment) is a severe problem that impairs the treatment of oncohematologic patients and is not routinely investigated in most Brazilian services. METHODS: Forty-four episodes of platelet concentrate transfusion were evaluated in 16 patients according to the following parameters: corrected count increment, clinical conditions and detection of anti-platelet antibodies by the platelet immunofluorescence test (PIFT) and panel reactive antibodies against human leukocyte antigen class I (PRA-HLA). RESULTS: Of the 16 patients evaluated (median age: 53 years), nine (56%) were women, seven of them with a history of pregnancy. An unsatisfactory increment was observed in 43% of the transfusion events, being more frequent in transfusions of random platelet concentrates (54%). Platelet refractoriness was confirmed in three patients (19%), who presented immunologic and non-immunologic causes. Alloantibodies were identified in eight patients (50%) by the PIFT and in three (19%) by the PRA-HLA. Among alloimmunized patients, nine (64%) had a history of transfusion, and three as a result of pregnancy (43%). Of the former, two were refractory (29%). No significant differences were observed, probably as a result of the small sample size. CONCLUSION: The high rate of unsatisfactory platelet increment, refractoriness and alloimmunization observed supports the need to set up protocols for the investigation of this complication in all chronically transfused patients, a fundamental requirement for the guarantee of adequate management. PMID:21437433

  15. Growth response and sapwood hydraulic properties of young lodgepole pine following repeated fertilization.

    PubMed

    Amponsah, Isaac G; Lieffers, Victor J; Comeau, Philip G; Brockley, Robert P

    2004-10-01

    We examined how tree growth and hydraulic properties of branches and boles are influenced by periodic (about 6 years) and annual fertilization in two juvenile lodgepole pine (Pinus contorta Dougl. var. latifolia Engelm.) stands in the interior of British Columbia, Canada. Mean basal area (BA), diameter at breast height (DBH) and height increments and percent earlywood and sapwood hydraulic parameters of branches and boles were measured 7 or 8 years after the initial treatments at Sheridan Creek and Kenneth Creek. At Sheridan Creek, fertilization significantly increased BA and DBH increments, but had no effect on height increment. At Kenneth Creek, fertilization increased BA, but fertilized trees had significantly lower height increments than control trees. Sapwood permeability was greater in lower branches of repeatedly fertilized trees than in those of control trees. Sapwood permeabilities of the lower branches of trees in the control, periodic and annual treatments were 0.24 × 10⁻¹², 0.35 × 10⁻¹² and 0.45 × 10⁻¹² m² at Kenneth Creek; and 0.41 × 10⁻¹², 0.54 × 10⁻¹² and 0.65 × 10⁻¹² m² at Sheridan Creek, respectively. Annual fertilization tended to increase leaf specific conductivities and Huber values of the lower branches of trees at both study sites. We conclude that, in trees fertilized annually, the higher flow capacity of lower branches may reduce the availability of water to support annual growth of the leader and upper branches.

  16. Earth Observation and Indicators Pertaining to Determinants of Health- An Approach to Support Local Scale Characterization of Environmental Determinants of Vector-Borne Diseases

    NASA Astrophysics Data System (ADS)

    Kotchi, Serge Olivier; Brazeau, Stephanie; Ludwig, Antoinette; Aube, Guy; Berthiaume, Philippe

    2016-08-01

    Environmental determinants (EVDs) have been identified as key determinants of health (DoH) for the emergence and re-emergence of several vector-borne diseases. Maintaining ongoing acquisition of data related to EVDs at local scale and for large regions constitutes a significant challenge. Earth observation (EO) satellites offer a framework to overcome this challenge. However, EO image analysis methods commonly used to estimate EVDs are time- and resource-consuming. Moreover, variations in microclimatic conditions combined with high landscape heterogeneity limit the effectiveness of climatic variables derived from EO. In this study, we describe what DoH and EVDs are, the impacts of EVDs on vector-borne diseases in the context of global environmental change, and the need to characterize EVDs of vector-borne diseases at local scale together with its challenges; finally, we propose an approach based on EO images to estimate, at local scale, indicators pertaining to EVDs of vector-borne diseases.

  17. Design of a muscle cell-specific expression vector utilising human vascular smooth muscle alpha-actin regulatory elements.

    PubMed

    Keogh, M C; Chen, D; Schmitt, J F; Dennehy, U; Kakkar, V V; Lemoine, N R

    1999-04-01

    The facility to direct tissue-specific expression of therapeutic gene constructs is desirable for many gene therapy applications. We describe the creation of a muscle-selective expression vector which supports transcription in vascular smooth muscle, cardiac muscle and skeletal muscle, while it is essentially silent in other cell types such as endothelial cells, hepatocytes and fibroblasts. Specific transcriptional regulatory elements have been identified in the human vascular smooth muscle cell (VSMC) alpha-actin gene, and used to create an expression vector which directs the expression of genes in cis to muscle cells. The vector contains an enhancer element we have identified in the 5' flanking region of the human VSMC alpha-actin gene involved in mediating VSMC expression. Heterologous pairing experiments have shown that the enhancer does not interact with the basal transcription complex recruited at the minimal SV40 early promoter. Such a vector has direct application in the modulation of VSMC proliferation associated with intimal hyperplasia/restenosis.

  18. Vector Blood Meals Are an Early Indicator of the Effectiveness of the Ecohealth Approach in Halting Chagas Transmission in Guatemala

    PubMed Central

    Pellecer, Mariele J.; Dorn, Patricia L.; Bustamante, Dulce M.; Rodas, Antonieta; Monroy, M. Carlota

    2013-01-01

    A novel method using vector blood meal sources to assess the impact of control efforts on the risk of transmission of Chagas disease was tested in the village of El Tule, Jutiapa, Guatemala. Control used Ecohealth interventions, where villagers ameliorated the factors identified as most important for transmission. First, after an initial insecticide application, house walls were plastered. Later, bedroom floors were improved and domestic animals were moved outdoors. Only vector blood meal sources revealed the success of the first interventions: human blood meals declined from 38% to 3% after insecticide application and wall plastering. Following all interventions both vector blood meal sources and entomological indices revealed the reduction in transmission risk. These results indicate that vector blood meals may reveal effects of control efforts early on, effects that may not be apparent using traditional entomological indices, and provide further support for the Ecohealth approach to Chagas control in Guatemala. PMID:23382165

  19. Quantum optimization for training support vector machines.

    PubMed

    Anguita, Davide; Ridella, Sandro; Rivieccio, Fabio; Zunino, Rodolfo

    2003-01-01

    Refined concepts, such as Rademacher estimates of model complexity and nonlinear criteria for weighting empirical classification errors, represent recent and promising approaches to characterize the generalization ability of Support Vector Machines (SVMs). The advantages of those techniques lie in both improving the SVM representation ability and yielding tighter generalization bounds. On the other hand, they often make Quadratic-Programming algorithms no longer applicable, and SVM training cannot benefit from efficient, specialized optimization techniques. The paper considers the application of Quantum Computing to solve the problem of effective SVM training, especially in the case of digital implementations. The presented research compares the behavioral aspects of conventional and enhanced SVMs; experiments on both synthetic and real-world problems support the theoretical analysis. At the same time, the related differences between Quadratic-Programming and Quantum-based optimization techniques are considered.

  20. Arbitrary norm support vector machines.

    PubMed

    Huang, Kaizhu; Zheng, Danian; King, Irwin; Lyu, Michael R

    2009-02-01

    Support vector machines (SVM) are state-of-the-art classifiers. Typically L2-norm or L1-norm is adopted as a regularization term in SVMs, while other norm-based SVMs, for example, the L0-norm SVM or even the L(infinity)-norm SVM, are rarely seen in the literature. The major reason is that L0-norm describes a discontinuous and nonconvex term, leading to a combinatorially NP-hard optimization problem. In this letter, motivated by Bayesian learning, we propose a novel framework that can implement arbitrary norm-based SVMs in polynomial time. One significant feature of this framework is that only a sequence of sequential minimal optimization problems needs to be solved, thus making it practical in many real applications. The proposed framework is important in the sense that Bayesian priors can be efficiently plugged into most learning methods without knowing the explicit form. Hence, this builds a connection between Bayesian learning and kernel machines. We derive the theoretical framework, demonstrate how our approach works on the L0-norm SVM as a typical example, and perform a series of experiments to validate its advantages. Experimental results on nine benchmark data sets are very encouraging. The implemented L0-norm is competitive with or even better than the standard L2-norm SVM in terms of accuracy, but with a reduced number of support vectors (9.46% of the number on average). When compared with another sparse model, the relevance vector machine, our proposed algorithm also demonstrates better sparse properties with a training speed over seven times faster.

  1. Condition Assessment of Foundation Piles and Utility Poles Based on Guided Wave Propagation Using a Network of Tactile Transducers and Support Vector Machines

    PubMed Central

    Yu, Yang; Niederleithinger, Ernst; Li, Jianchun; Wiggenhauser, Herbert

    2017-01-01

    This paper presents a novel non-destructive testing and health monitoring system using a network of tactile transducers and accelerometers for the condition assessment and damage classification of foundation piles and utility poles. While in traditional pile integrity testing an impact hammer with broadband frequency excitation is typically used, the proposed testing system utilizes an innovative excitation system based on a network of tactile transducers to induce controlled narrow-band frequency stress waves. Thereby, the simultaneous excitation of multiple stress wave types and modes is avoided (or at least reduced), and targeted waveforms can be generated. The new testing system enables the testing and monitoring of foundation piles and utility poles where the top is inaccessible, making it suitable, for example, for the condition assessment of pile structures with obstructed heads and of poles with live wires. For system validation, the new system was experimentally tested on nine timber and concrete poles that were inflicted with several types of damage. The tactile transducers were excited with continuous sine wave signals of 1 kHz frequency. Support vector machines were employed together with advanced signal processing algorithms to distinguish recorded stress wave signals from pole structures with different types of damage. The results show that using fast Fourier transform signals, combined with principal component analysis as the input feature vector for support vector machine (SVM) classifiers with different kernel functions, can achieve damage classification with accuracies of 92.5% ± 7.5%. PMID:29258274
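
    The feature chain reported in the abstract (FFT magnitudes, PCA reduction, SVM classifiers with different kernels) can be sketched as below; this is not the authors' pipeline, it assumes NumPy and scikit-learn, and the stress-wave signals and damage labels are synthetic placeholders.

```python
# Hedged sketch: FFT magnitude spectra -> PCA -> SVM damage classification.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
signals = rng.normal(size=(40, 2048))        # 40 recorded stress-wave signals (synthetic)
labels = rng.integers(0, 3, size=40)         # hypothetical damage classes

features = np.abs(np.fft.rfft(signals, axis=1))   # FFT magnitude spectra as features

for kernel in ("linear", "rbf"):                  # SVM classifiers with different kernels
    clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel=kernel))
    clf.fit(features, labels)
    print(kernel, clf.score(features, labels))
```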

  2. Helicity statistics in homogeneous and isotropic turbulence and turbulence models

    NASA Astrophysics Data System (ADS)

    Sahoo, Ganapati; De Pietro, Massimo; Biferale, Luca

    2017-02-01

    We study the statistical properties of helicity in direct numerical simulations of fully developed homogeneous and isotropic turbulence and in a class of turbulence shell models. We consider correlation functions based on combinations of vorticity and velocity increments that are not invariant under mirror symmetry. We also study the scaling properties of high-order structure functions based on the moments of the velocity increments projected on a subset of modes with either positive or negative helicity (chirality). We show that mirror symmetry is recovered at small scales, i.e., chiral terms are subleading and they are well captured by a dimensional argument plus anomalous corrections. These findings are also supported by a high-Reynolds-number study of helical shell models with the same chiral symmetry as the Navier-Stokes equations.

  3. The Major Antigenic Membrane Protein of “Candidatus Phytoplasma asteris” Selectively Interacts with ATP Synthase and Actin of Leafhopper Vectors

    PubMed Central

    Galetto, Luciana; Bosco, Domenico; Balestrini, Raffaella; Genre, Andrea; Fletcher, Jacqueline; Marzachì, Cristina

    2011-01-01

    Phytoplasmas, uncultivable phloem-limited phytopathogenic wall-less bacteria, represent a major threat to agriculture worldwide. They are transmitted in a persistent, propagative manner by phloem-sucking Hemipteran insects. Phytoplasma membrane proteins are in direct contact with hosts and are presumably involved in determining vector specificity. Such a role has been proposed for phytoplasma transmembrane proteins encoded by circular extrachromosomal elements, at least one of which is a plasmid. Little is known about the interactions between major phytoplasma antigenic membrane protein (Amp) and insect vector proteins. The aims of our work were to identify vector proteins interacting with Amp and to investigate their role in transmission specificity. In controlled transmission experiments, four Hemipteran species were identified as vectors of “Candidatus Phytoplasma asteris”, the chrysanthemum yellows phytoplasmas (CYP) strain, and three others as non-vectors. Interactions between a labelled (recombinant) CYP Amp and insect proteins were analysed by far Western blots and affinity chromatography. Amp interacted specifically with a few proteins from vector species only. Among Amp-binding vector proteins, actin and both the α and β subunits of ATP synthase were identified by mass spectrometry and Western blots. Immunofluorescence confocal microscopy and Western blots of plasma membrane and mitochondrial fractions confirmed the localisation of ATP synthase, generally known as a mitochondrial protein, in plasma membranes of midgut and salivary gland cells in the vector Euscelidius variegatus. The vector-specific interaction between phytoplasma Amp and insect ATP synthase is demonstrated for the first time, and this work also supports the hypothesis that host actin is involved in the internalization and intracellular motility of phytoplasmas within their vectors. Phytoplasma Amp is hypothesized to play a crucial role in insect transmission specificity. PMID:21799902

  4. What is the Risk for Exposure to Vector-Borne Pathogens in United States National Parks?

    PubMed Central

    EISEN, LARS; WONG, DAVID; SHELUS, VICTORIA; EISEN, REBECCA J.

    2015-01-01

    United States national parks attract >275 million visitors annually and collectively present risk of exposure for staff and visitors to a wide range of arthropod vector species (most notably fleas, mosquitoes, and ticks) and their associated bacterial, protozoan, or viral pathogens. We assessed the current state of knowledge for risk of exposure to vector-borne pathogens in national parks through a review of relevant literature, including internal National Park Service documents and organismal databases. We conclude that, because of lack of systematic surveillance for vector-borne pathogens in national parks, the risk of pathogen exposure for staff and visitors is unclear. Existing data for vectors within national parks were not based on systematic collections and rarely include evaluation for pathogen infection. Extrapolation of human-based surveillance data from neighboring communities likely provides inaccurate estimates for national parks because landscape differences impact transmission of vector-borne pathogens and human-vector contact rates likely differ inside versus outside the parks because of differences in activities or behaviors. Vector-based pathogen surveillance holds promise to define when and where within national parks the risk of exposure to infected vectors is elevated. A pilot effort, including 5–10 strategic national parks, would greatly improve our understanding of the scope and magnitude of vector-borne pathogen transmission in these high-use public settings. Such efforts also will support messaging to promote personal protection measures and inform park visitors and staff of their responsibility for personal protection, which the National Park Service preservation mission dictates as the core strategy to reduce exposure to vector-borne pathogens in national parks. PMID:23540107

  5. Spacebased Estimation of Moisture Transport in Marine Atmosphere Using Support Vector Regression

    NASA Technical Reports Server (NTRS)

    Xie, Xiaosu; Liu, W. Timothy; Tang, Benyang

    2007-01-01

    An improved algorithm is developed based on support vector regression (SVR) to estimate horizontal water vapor transport integrated through the depth of the atmosphere (Θ) over the global ocean from observations of the surface wind-stress vector by QuikSCAT, cloud drift wind vectors derived from the Multi-angle Imaging SpectroRadiometer (MISR) and geostationary satellites, and precipitable water from the Special Sensor Microwave/Imager (SSM/I). The statistical relation is established between the input parameters (the surface wind stress, the 850 mb wind, the precipitable water, time and location) and the target data (Θ calculated from rawinsondes and reanalysis of a numerical weather prediction model). The results are validated with independent daily rawinsonde observations, monthly mean reanalysis data, and through regional water balance. This study clearly demonstrates the improvement of Θ derived from satellite data using SVR over previous data sets based on linear regression and neural networks. The SVR methodology reduces both mean bias and standard deviation compared with rawinsonde observations. It agrees better with observations from synoptic to seasonal time scales, and compares more favorably with the reanalysis data on seasonal variations. Only the SVR result can achieve the water balance over South America. The rationale for the advantage of the SVR method and the impact of adding the upper-level wind are also discussed.
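
    The regression step can be illustrated as follows; this is a schematic stand-in (assuming scikit-learn) for the paper's SVR, with random placeholders for the satellite-derived inputs and the rawinsonde/reanalysis targets.

```python
# Hedged sketch: support vector regression from satellite-derived inputs
# (wind stress, 850 mb wind, precipitable water, time, location) to the
# integrated moisture transport Theta. Data are random placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 7))    # e.g. wind-stress u/v, 850 mb u/v, precipitable water, day, latitude
theta = rng.normal(size=500)     # target Theta from rawinsondes / reanalysis (synthetic)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X, theta)
print(model.predict(X[:5]))      # estimated moisture transport for the first five samples
```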

  6. Importance of murine study design for testing toxicity of retroviral vectors in support of phase I trials.

    PubMed

    Will, Elke; Bailey, Jeff; Schuesler, Todd; Modlich, Ute; Balcik, Brenden; Burzynski, Ben; Witte, David; Layh-Schmitt, Gerlinde; Rudolph, Cornelia; Schlegelberger, Brigitte; von Kalle, Christof; Baum, Christopher; Sorrentino, Brian P; Wagner, Lars M; Kelly, Patrick; Reeves, Lilith; Williams, David A

    2007-04-01

    Although retroviral vectors are one of the most widely used vehicles for gene transfer, there is no uniformly accepted pre-clinical model defined to assess their safety, in particular their risk related to insertional mutagenesis. In the murine pre-clinical study presented here, 40 test and 10 control mice were transplanted with ex vivo manipulated bone marrow cells to assess the long-term effects of the transduction of hematopoietic cells with the retroviral vector MSCV-MGMT(P140K)wc. Test mice had significant gene marking 8-12 months post-transplantation with an average of 0.93 vector copies per cell and 41.5% of peripheral blood cells expressing the transgene MGMT(P140K), thus confirming persistent vector expression. Unexpectedly, six test mice developed malignant lymphoma. No vector was detected in the tumor cells of five animals with malignancies, indicating that the malignancies were not caused by insertional mutagenesis or MGMT(P140K) expression. Mice from a concurrent study with a different transgene also revealed additional cases of vector-negative lymphomas of host origin. We conclude that the background tumor formation in this mouse model complicates safety determination of retroviral vectors and propose an improved study design that we predict will increase the relevance and accuracy of interpretation of pre-clinical mouse studies.

  7. How a haemosporidian parasite of bats gets around: the genetic structure of a parasite, vector and host compared.

    PubMed

    Witsenburg, F; Clément, L; López-Baucells, A; Palmeirim, J; Pavlinić, I; Scaravelli, D; Ševčík, M; Dutoit, L; Salamin, N; Goudet, J; Christe, P

    2015-02-01

    Parasite population structure is often thought to be largely shaped by that of its host. In the case of a parasite with a complex life cycle, two host species, each with their own patterns of demography and migration, spread the parasite. However, the population structure of the parasite is predicted to resemble only that of the most vagile host species. In this study, we tested this prediction in the context of a vector-transmitted parasite. We sampled the haemosporidian parasite Polychromophilus melanipherus across its European range, together with its bat fly vector Nycteribia schmidlii and its host, the bent-winged bat Miniopterus schreibersii. Based on microsatellite analyses, the wingless vector, and not the bat host, was identified as the least structured population and should therefore be considered the most vagile host. Genetic distance matrices were compared for all three species based on a mitochondrial DNA fragment. Both host and vector populations followed an isolation-by-distance pattern across the Mediterranean, but not the parasite. Mantel tests found no correlation between the parasite and either the host or vector populations. We therefore found no support for our hypothesis; the parasite population structure matched neither vector nor host. Instead, we propose a model where the parasite's gene flow is represented by the added effects of host and vector dispersal patterns. © 2015 John Wiley & Sons Ltd.

  8. The Joint Space Operations Center (JSpOC) Mission System (JMS) and the Advanced Research, Collaboration, and Application Development Environment (ARCADE)

    NASA Astrophysics Data System (ADS)

    Johnson, K.; Kim, R.; Echeverry, J.

    The Joint Space Operations Center (JSpOC) is a command and control center focused on executing the Space Control mission of the Joint Functional Component Command for Space (JFCC-SPACE) to ensure freedom of action of United States (US) space assets, while preventing adversary use of space against the US. To accomplish this, the JSpOC tasks a network of space surveillance sensors to collect Space Situational Awareness (SSA) data on resident space objects (RSOs) in near earth and deep space orbits. SSA involves the ingestion of data sources and use of algorithms and tools to build, maintain, and disseminate situational awareness of RSOs in space. On the heels of emergent and complex threats to space assets, the JSpOC's capabilities are limited by legacy systems and CONOPs. The JSpOC Mission System (JMS) aims to consolidate SSA efforts across US agencies, international partners, and commercial partners. The JMS program is intended to deliver a modern service-oriented architecture (SOA) based infrastructure with increased process automation and improved tools to remove the current barriers to JSpOC operations. JMS has been partitioned into several developmental increments. Increment 1, completed and operational in early 2013, and Increment 2, which is expected to be completed in 2016, will replace the legacy Space Defense Operations Center (SPADOC) and Astrodynamics Support Workstation (ASW) capabilities. In 2017 JMS Increment 3 will continue to provide additional SSA and C2 capabilities that will require development of new applications and procedures as well as the exploitation of new data sources. Most importantly, Increment 3 is uniquely postured to evolve the JSpOC into the centralized and authoritative source for all Space Control applications by using its SOA to aggregate information and capabilities from across the community. To achieve this goal, Scitor Corporation has supported the JMS Program Office as it has entered into a partnership with AFRL/RD (Directed Energy) and AFRL/RV (Space Vehicles) to create the Advanced Research, Collaboration, and Application Development Environment (ARCADE). The ARCADE formalizes capability development processes that hitherto have been ad hoc, slow to address the evolving space threat environment, and not easily repeatable. Therefore, the purpose of the ARCADE is to: (1) serve as a centralized testbed for all research and development (R&D) activities related to JMS applications, including algorithm development, data source exposure, service orchestration, and software services, and provide developers reciprocal access to relevant tools and data to accelerate technology development, (2) allow the JMS program to communicate user capability priorities and requirements to developers, (3) facilitate collaboration among developers who otherwise would not collaborate due to organizational, policy, or geographical barriers, and (4) support market research efforts by identifying outstanding performers that are available to shepherd into the formal transition process. Over the last several years Scitor Corporation has provided systems engineering support to the JMS Increment 3 Program Office, and has worked with AFRL/RV and AFRL/RD to create a high performance computing environment and SOA at both unclassified and classified levels that together allow developers to develop applications in an environment similar to the version of JMS currently in use by the JSpOC operators. 
Currently the ARCADE is operational in an unclassified environment via the High Performance Computing Modernization Program (HPCMP) Portal on DREN. The ARCADE also exists on SECRET and TOP SECRET environments on multiple networks. This presentation will cover the following topics: (1) Scitor's role in shaping the ARCADE into its current form, (2) the ARCADE's value proposition for potential technology developers, and (3) the ARCADE's value proposition for the Government. These topics will be discussed by way of several case studies: a JMS Prototype activity, integration of the Search and Determine Integrated Environment (SADIE) system into the ARCADE, and developer challenge opportunities using the ARCADE. The contents of this presentation will be UNCLASSIFIED.

  9. Association between progression-free survival and health-related quality of life in oncology: a systematic review protocol.

    PubMed

    Kovic, Bruno; Guyatt, Gordon; Brundage, Michael; Thabane, Lehana; Bhatnagar, Neera; Xie, Feng

    2016-09-02

    There is an increasing number of new oncology drugs being studied, approved and put into clinical practice based on improvement in progression-free survival, when no overall survival benefits exist. In oncology, the association between progression-free survival and health-related quality of life is currently unknown, despite its importance for patients with cancer, and the unverified assumption that longer progression-free survival indicates improved health-related quality of life. Thus far, only one study has investigated this association, providing insufficient evidence and inconclusive results. The objective of this study protocol is to provide increased transparency in supporting a systematic summary of the evidence bearing on this association in oncology. Using the OVID platform in MEDLINE, Embase and Cochrane databases, we will conduct a systematic review of randomised controlled human trials addressing oncology issues published starting in 2000. A team of reviewers will, in pairs, independently screen and abstract data using standardised, pilot-tested forms. We will employ numerical integration to calculate mean incremental area under the curve between treatment groups in studies for health-related quality of life, along with total related error estimates, and a 95% CI around incremental area. To describe the progression-free survival to health-related quality of life association, we will construct a scatterplot for incremental health-related quality of life versus incremental progression-free survival. To estimate the association, we will use a weighted simple regression approach, comparing mean incremental health-related quality of life with either median incremental progression-free survival time or the progression-free survival hazard ratio (HR), in the absence of overall survival benefit. Identifying direction and magnitude of association between progression-free survival and health-related quality of life is critically important in interpreting results of oncology trials. Systematic evidence produced from our study will contribute to improvement of patient care and practice of evidence-based medicine in oncology.
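
    One plausible realization of the planned numerical-integration step, computing the incremental area under the health-related quality-of-life curve between two arms with the trapezoidal rule, is sketched below; it assumes SciPy, and the time points and scores are invented for illustration only.

```python
# Hedged sketch: incremental area under the HRQoL curve between two arms,
# computed with the trapezoidal rule. All numbers are illustrative.
import numpy as np
from scipy.integrate import trapezoid

months = np.array([0, 3, 6, 9, 12])
hrqol_arm_a = np.array([70, 72, 74, 73, 71])   # mean HRQoL scores, treatment arm
hrqol_arm_b = np.array([70, 69, 68, 66, 65])   # mean HRQoL scores, control arm

incremental_auc = trapezoid(hrqol_arm_a, months) - trapezoid(hrqol_arm_b, months)
print(incremental_auc)                          # incremental area between the arms
```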

  10. The Voronoi spatio-temporal data structure

    NASA Astrophysics Data System (ADS)

    Mioc, Darka

    2002-04-01

    Current GIS models cannot integrate the temporal dimension of spatial data easily. Indeed, current GISs do not support incremental (local) addition and deletion of spatial objects, and they cannot support the temporal evolution of spatial data. Spatio-temporal facilities would be very useful in many GIS applications: harvesting and forest planning, cadastre, urban and regional planning, and emergency planning. The spatio-temporal model that can overcome these problems is based on a topological model - the Voronoi data structure. Voronoi diagrams are irregular tessellations of space that adapt to spatial objects; they are therefore a synthesis of raster and vector spatial data models. The main advantage of the Voronoi data structure is its support for local and sequential map updates, which allows the system to automatically record each event and the map updates performed. These map updates are executed through map construction commands that are composed of atomic actions (geometric algorithms for addition, deletion, and motion of spatial objects) on the dynamic Voronoi data structure. The formalization of map commands led to the development of a spatial language comprising a set of atomic operations or constructs on spatial primitives (points and lines), powerful enough to define complex operations. This resulted in a new formal model for spatio-temporal change representation, where each update is uniquely characterized by the numbers of newly created and inactivated Voronoi regions. This is used for the extension of the model towards the hierarchical Voronoi data structure. In this model, spatio-temporal changes induced by map updates are preserved in a hierarchical data structure that combines events and corresponding changes in topology. This hierarchical Voronoi data structure has an implicit time ordering of events visible through changes in topology, and it is equivalent to an event structure that can support temporal data without precise temporal information. This formal model of spatio-temporal change representation is currently applied to retroactive map updates and visualization of map evolution. It offers new possibilities in the domains of temporal GIS, transaction processing, spatio-temporal queries, spatio-temporal analysis, map animation and map visualization.
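
    The kind of local, incremental update described above can be illustrated with SciPy's Qhull bindings (an assumption of this sketch, not something used in the original work): new spatial objects are inserted without rebuilding the whole tessellation.

```python
# Hedged sketch: incremental insertion of points into a Voronoi tessellation.
import numpy as np
from scipy.spatial import Voronoi

points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
vor = Voronoi(points, incremental=True)        # keep Qhull open for further insertions
print(len(vor.point_region), "point regions initially")

vor.add_points(np.array([[0.25, 0.75]]))       # incremental (local) map update
print(len(vor.point_region), "point regions after insertion")
vor.close()
```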

  11. Speech sound classification and detection of articulation disorders with support vector machines and wavelets.

    PubMed

    Georgoulas, George; Georgopoulos, Voula C; Stylios, Chrysostomos D

    2006-01-01

    This paper proposes a novel integrated methodology to extract features and classify speech sounds with the intent to detect the possible existence of a speech articulation disorder in a speaker. Articulation, in effect, is the specific and characteristic way that an individual produces speech sounds. A methodology to process the speech signal, extract features and finally classify the signal and detect articulation problems in a speaker is presented. The use of support vector machines (SVMs) for the classification of speech sounds and detection of articulation disorders is introduced. The proposed method is implemented on a data set where different sets of features and different schemes of SVMs are tested, leading to satisfactory performance.

  12. HYBRID NEURAL NETWORK AND SUPPORT VECTOR MACHINE METHOD FOR OPTIMIZATION

    NASA Technical Reports Server (NTRS)

    Rai, Man Mohan (Inventor)

    2005-01-01

    System and method for optimization of a design associated with a response function, using a hybrid neural net and support vector machine (NN/SVM) analysis to minimize or maximize an objective function, optionally subject to one or more constraints. As a first example, the NN/SVM analysis is applied iteratively to design of an aerodynamic component, such as an airfoil shape, where the objective function measures deviation from a target pressure distribution on the perimeter of the aerodynamic component. As a second example, the NN/SVM analysis is applied to data classification of a sequence of data points in a multidimensional space. The NN/SVM analysis is also applied to data regression.

  13. Hybrid Neural Network and Support Vector Machine Method for Optimization

    NASA Technical Reports Server (NTRS)

    Rai, Man Mohan (Inventor)

    2007-01-01

    System and method for optimization of a design associated with a response function, using a hybrid neural net and support vector machine (NN/SVM) analysis to minimize or maximize an objective function, optionally subject to one or more constraints. As a first example, the NN/SVM analysis is applied iteratively to design of an aerodynamic component, such as an airfoil shape, where the objective function measures deviation from a target pressure distribution on the perimeter of the aerodynamic component. As a second example, the NN/SVM analysis is applied to data classification of a sequence of data points in a multidimensional space. The NN/SVM analysis is also applied to data regression.

  14. Prediction of mutagenic toxicity by combination of Recursive Partitioning and Support Vector Machines.

    PubMed

    Liao, Quan; Yao, Jianhua; Yuan, Shengang

    2007-05-01

    The prediction of toxicity is important and necessary because the measurement of toxicity is typically time-consuming and expensive. In this paper, the Recursive Partitioning (RP) method was used to select descriptors. RP and Support Vector Machines (SVM) were used to construct structure-toxicity relationship models, an RP model and an SVM model, respectively. The performances of the two models differ. The prediction accuracies of the RP model are 80.2% for mutagenic compounds in MDL's toxicity database, 83.4% for compounds in CMC and 84.9% for agrochemicals in an in-house database, respectively. Those of the SVM model are 81.4%, 87.0% and 87.3%, respectively.
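
    A rough analogue of the two-stage RP + SVM scheme, not the authors' implementation, is sketched below with scikit-learn: a decision tree (standing in for recursive partitioning) ranks descriptors, the top-ranked descriptors are kept, and an SVM is trained on the reduced set. The descriptor matrix and mutagenicity labels are synthetic.

```python
# Hedged sketch: tree-based descriptor selection followed by SVM classification.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 50))        # 200 compounds x 50 descriptors (synthetic)
y = rng.integers(0, 2, size=200)      # 1 = mutagenic, 0 = non-mutagenic (synthetic)

tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X, y)   # recursive-partitioning stand-in
keep = np.argsort(tree.feature_importances_)[-10:]                     # ten most informative descriptors

svm = SVC(kernel="rbf").fit(X[:, keep], y)
print(svm.score(X[:, keep], y))
```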

  15. StruLocPred: structure-based protein subcellular localisation prediction using multi-class support vector machine.

    PubMed

    Zhou, Wengang; Dickerson, Julie A

    2012-01-01

    Knowledge of protein subcellular locations can help decipher a protein's biological function. This work proposes new features: a sequence-based feature, Hybrid Amino Acid Pair (HAAP), and two structure-based features, Secondary Structural Element Composition (SSEC) and solvent accessibility state frequency. A multi-class Support Vector Machine is developed to predict the locations. Testing on two established data sets yields better prediction accuracies than the best available systems. Comparisons with existing methods show comparable results to ESLPred2. When StruLocPred is applied to the entire Arabidopsis proteome, over 77% of proteins with known locations match the prediction results. An implementation of this system is at http://wgzhou.ece.iastate.edu/StruLocPred/.

  16. Identification of cigarette smoke inhalations from wearable sensor data using a Support Vector Machine classifier.

    PubMed

    Lopez-Meyer, Paulo; Tiffany, Stephen; Sazonov, Edward

    2012-01-01

    This study presents a subject-independent model for the detection of smoke inhalations from wearable sensors capturing characteristic hand-to-mouth gestures and changes in breathing patterns during cigarette smoking. Wearable sensors were used to detect the proximity of the hand to the mouth and to acquire the respiratory patterns. The waveforms of the sensor signals were used as features to build a Support Vector Machine classification model. Across a data set of 20 enrolled participants, the precision of correct identification of smoke inhalations was found to be >87%, with a resulting recall of >80%. These results suggest that it is possible to analyze smoking behavior by means of a wearable and non-invasive sensor system.

  17. An Auto-flag Method of Radio Visibility Data Based on Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Dai, Hui-mei; Mei, Ying; Wang, Wei; Deng, Hui; Wang, Feng

    2017-01-01

    The Mingantu Ultrawide Spectral Radioheliograph (MUSER) has entered a test observation stage. Following the construction of the data acquisition and storage system, it is urgent to automatically flag and eliminate abnormal visibility data so as to improve the imaging quality. In this paper, according to the observational records, we create a credible visibility set, and further obtain a corresponding flagging model for the visibility data by using the support vector machine (SVM) technique. The results show that the SVM is a robust approach for flagging the MUSER visibility data, and can attain an accuracy of about 86%. Meanwhile, this method will not be affected by solar activities, such as flare eruptions.

  18. Detection of Dendritic Spines Using Wavelet Packet Entropy and Fuzzy Support Vector Machine.

    PubMed

    Wang, Shuihua; Li, Yang; Shao, Ying; Cattani, Carlo; Zhang, Yudong; Du, Sidan

    2017-01-01

    The morphology of dendritic spines is highly correlated with neuronal function, which makes automated analysis of spine morphology valuable for dendritic spine research. However, manually labelling spine types for statistical analysis is laborious. In this work, we propose an approach based on the combination of wavelet contour analysis for backbone detection, wavelet packet entropy, and a fuzzy support vector machine for spine classification. The experiments show that this approach is promising: the average detection accuracy reaches 97.3% for "mushroom" spines, 94.6% for "stubby" spines, and 97.2% for "thin" spines.
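
    The wavelet packet entropy feature can be sketched as below, assuming PyWavelets and scikit-learn, with a plain SVM standing in for the paper's fuzzy SVM and with random 1-D contour signatures in place of real spine data.

```python
# Hedged sketch: wavelet packet entropy of a 1-D contour signature + SVM.
import numpy as np
import pywt
from sklearn.svm import SVC

def wavelet_packet_entropy(signal, wavelet="db4", level=3):
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    energies = np.array([np.sum(node.data ** 2) for node in wp.get_level(level)])
    p = energies / energies.sum()
    return -np.sum(p * np.log2(p + 1e-12))      # Shannon entropy of sub-band energies

rng = np.random.default_rng(3)
contours = [rng.normal(size=256) for _ in range(30)]    # hypothetical spine contour signatures
labels = rng.integers(0, 3, size=30)                    # mushroom / stubby / thin (synthetic)

X = np.array([[wavelet_packet_entropy(c)] for c in contours])
clf = SVC(kernel="rbf").fit(X, labels)                  # plain SVM in place of a fuzzy SVM
print(clf.score(X, labels))
```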

  19. Prediction on sunspot activity based on fuzzy information granulation and support vector machine

    NASA Astrophysics Data System (ADS)

    Peng, Lingling; Yan, Haisheng; Yang, Zhigang

    2018-04-01

    In order to analyze the fluctuation range of sunspots, a combined prediction method based on fuzzy information granulation (FIG) and support vector machine (SVM) is put forward. Firstly, FIG is employed to granulate the sample data and extract valid information from each window, namely the minimum, average and maximum values of the window. Secondly, a forecasting model is built with SVM for each granulated component, and a cross-validation method is used to optimize the model parameters. Finally, the fluctuation range of sunspots is forecasted with the optimized SVM model. A case study demonstrates that the model has high accuracy and can effectively predict the fluctuation range of sunspots.
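
    A toy version of the granulate-then-forecast idea is given below, assuming scikit-learn: each window of the series is fuzzified into (minimum, average, maximum), and a separate support vector regressor forecasts each component of the next window. The series and window length are placeholders, not sunspot records.

```python
# Hedged sketch: fuzzy-information-granulation style windowing + SVR forecast.
import numpy as np
from sklearn.svm import SVR

series = np.abs(np.sin(np.arange(600) / 20.0)) * 100        # stand-in for a sunspot-number series
window = 30
granules = np.array([[w.min(), w.mean(), w.max()]           # (min, average, max) per window
                     for w in np.split(series, len(series) // window)])

X, y = granules[:-1], granules[1:]                          # map each granule to the next one
forecast = [SVR(kernel="rbf", C=100).fit(X, y[:, k]).predict(granules[-1:])[0]
            for k in range(3)]
print("forecast fluctuation range (min, mean, max):", forecast)
```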

  20. An assessment of support vector machines for land cover classification

    USGS Publications Warehouse

    Huang, C.; Davis, L.S.; Townshend, J.R.G.

    2002-01-01

    The support vector machine (SVM) is a group of theoretically superior machine learning algorithms. It was found competitive with the best available machine learning algorithms in classifying high-dimensional data sets. This paper gives an introduction to the theoretical development of the SVM and an experimental evaluation of its accuracy, stability and training speed in deriving land cover classifications from satellite images. The SVM was compared to three other popular classifiers, including the maximum likelihood classifier (MLC), neural network classifiers (NNC) and decision tree classifiers (DTC). The impacts of kernel configuration on the performance of the SVM and of the selection of training data and input variables on the four classifiers were also evaluated in this experiment.

  1. The Scaling of Broadband Shock-Associated Noise with Increasing Temperature

    NASA Technical Reports Server (NTRS)

    Miller, Steven A.

    2012-01-01

    A physical explanation for the saturation of broadband shock-associated noise (BBSAN) intensity with increasing jet stagnation temperature has eluded investigators. An explanation is proposed for this phenomenon with the use of an acoustic analogy. For this purpose the acoustic analogy of Morris and Miller is examined. To isolate the relevant physics, the scaling of BBSAN at the peak intensity level at the sideline (psi = 90 degrees) observer location is examined. Scaling terms are isolated from the acoustic analogy and the result is compared using a convergent nozzle with the experiments of Bridges and Brown and using a convergent-divergent nozzle with the experiments of Kuo, McLaughlin, and Morris at four nozzle pressure ratios in increments of total temperature ratios from one to four. The equivalent source within the framework of the acoustic analogy for BBSAN is based on local field quantities at shock wave shear layer interactions. The equivalent source, combined with accurate calculations of the propagation of sound through the jet shear layer using an adjoint vector Green's function solver of the linearized Euler equations, allows for predictions that retain the scaling with respect to stagnation pressure and allows for the accurate saturation of BBSAN with increasing stagnation temperature. This is a minor change to the source model relative to the previously developed models. The full development of the scaling term is shown. The sources and vector Green's function solver are informed by steady Reynolds-Averaged Navier-Stokes solutions. These solutions are examined as a function of stagnation temperature at the first shock wave shear layer interaction. It is discovered that saturation of BBSAN with increasing jet stagnation temperature occurs due to a balance between the amplification of the sound propagation through the shear layer and the source term scaling.

  2. Chromosome preference of disease genes and vectorization for the prediction of non-coding disease genes.

    PubMed

    Peng, Hui; Lan, Chaowang; Liu, Yuansheng; Liu, Tao; Blumenstein, Michael; Li, Jinyan

    2017-10-03

    Disease-related protein-coding genes have been widely studied, but disease-related non-coding genes remain largely unknown. This work introduces a new vector to represent diseases, and applies the newly vectorized data in a positive-unlabeled learning algorithm to predict and rank disease-related long non-coding RNA (lncRNA) genes. This novel vector representation for diseases consists of two sub-vectors: the first is composed of 45 elements characterizing the information entropies of the disease gene distribution over 45 chromosome substructures. This idea is supported by our observation that some substructures (e.g., the chromosome 6 p-arm) are highly preferred by disease-related protein-coding genes, while some (e.g., the 21 p-arm) are not favored at all. The second sub-vector is 30-dimensional, characterizing the distribution of disease-gene-enriched KEGG pathways in comparison with our manually created pathway groups. The second sub-vector complements the first one to differentiate between various diseases. Our prediction method outperforms the state-of-the-art methods on benchmark datasets for prioritizing disease-related lncRNA genes. The method also works well when only the sequence information of an lncRNA gene is known, or even when a given disease has no currently recognized long non-coding genes.
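
    One illustrative reading (not the authors' exact construction) of the 45-element sub-vector is sketched below with NumPy: the fraction of a disease's known genes falling in each chromosome substructure is converted into a per-substructure entropy term. Gene counts are hypothetical.

```python
# Hedged sketch: a 45-element entropy sub-vector for one disease.
import numpy as np

n_substructures = 45
rng = np.random.default_rng(4)
gene_counts = rng.integers(0, 20, size=n_substructures)    # hypothetical disease-gene counts per substructure

p = gene_counts / max(gene_counts.sum(), 1)                 # distribution over substructures
entropy_subvector = -p * np.log2(np.where(p > 0, p, 1.0))   # -p*log2(p), zero where p == 0
print(entropy_subvector.round(3))
```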

  3. Chromosome preference of disease genes and vectorization for the prediction of non-coding disease genes

    PubMed Central

    Peng, Hui; Lan, Chaowang; Liu, Yuansheng; Liu, Tao; Blumenstein, Michael; Li, Jinyan

    2017-01-01

    Disease-related protein-coding genes have been widely studied, but disease-related non-coding genes remain largely unknown. This work introduces a new vector to represent diseases, and applies the newly vectorized data in a positive-unlabeled learning algorithm to predict and rank disease-related long non-coding RNA (lncRNA) genes. This novel vector representation for diseases consists of two sub-vectors: the first is composed of 45 elements characterizing the information entropies of the disease gene distribution over 45 chromosome substructures. This idea is supported by our observation that some substructures (e.g., the chromosome 6 p-arm) are highly preferred by disease-related protein-coding genes, while some (e.g., the 21 p-arm) are not favored at all. The second sub-vector is 30-dimensional, characterizing the distribution of disease-gene-enriched KEGG pathways in comparison with our manually created pathway groups. The second sub-vector complements the first one to differentiate between various diseases. Our prediction method outperforms the state-of-the-art methods on benchmark datasets for prioritizing disease-related lncRNA genes. The method also works well when only the sequence information of an lncRNA gene is known, or even when a given disease has no currently recognized long non-coding genes. PMID:29108274

  4. Gene Therapy Vectors with Enhanced Transfection Based on Hydrogels Modified with Affinity Peptides

    PubMed Central

    Shepard, Jaclyn A.; Wesson, Paul J.; Wang, Christine E.; Stevans, Alyson C.; Holland, Samantha J.; Shikanov, Ariella; Grzybowski, Bartosz A.; Shea, Lonnie D.

    2011-01-01

    Regenerative strategies for damaged tissue aim to present biochemical cues that recruit and direct progenitor cell migration and differentiation. Hydrogels capable of localized gene delivery are being developed to provide a support for tissue growth, and as a versatile method to induce the expression of inductive proteins; however, the duration, level, and localization of expression is often insufficient for regeneration. We thus investigated the modification of hydrogels with affinity peptides to enhance vector retention and increase transfection within the matrix. PEG hydrogels were modified with lysine-based repeats (K4, K8), which retained approximately 25% more vector than control peptides. Transfection increased 5- to 15-fold with K8 and K4 respectively, over the RDG control peptide. K8- and K4-modified hydrogels bound similar quantities of vector, yet the vector dissociation rate was reduced for K8, suggesting excessive binding that limited transfection. These hydrogels were subsequently applied to an in vitro co-culture model to induce NGF expression and promote neurite outgrowth. K4-modified hydrogels promoted maximal neurite outgrowth, likely due to retention of both the vector and the NGF. Thus, hydrogels modified with affinity peptides enhanced vector retention and increased gene delivery, and these hydrogels may provide a versatile scaffold for numerous regenerative medicine applications. PMID:21514659

  5. Separating generalized anxiety disorder from major depression using clinical, hormonal, and structural MRI data: A multimodal machine learning study.

    PubMed

    Hilbert, Kevin; Lueken, Ulrike; Muehlhan, Markus; Beesdo-Baum, Katja

    2017-03-01

    Generalized anxiety disorder (GAD) is difficult to recognize and hard to separate from major depression (MD) in clinical settings. Biomarkers might support diagnostic decisions. This study used machine learning on multimodal biobehavioral data from a sample of GAD, MD and healthy subjects to differentiate subjects with a disorder from healthy subjects (case-classification) and to differentiate GAD from MD (disorder-classification). Subjects with GAD (n = 19), MD without GAD (n = 14), and healthy comparison subjects (n = 24) were included. The sample was matched regarding age, sex, handedness and education and was free of psychopharmacological medication. Binary support vector machines were used within a nested leave-one-out cross-validation framework. Clinical questionnaires, cortisol release, gray matter (GM), and white matter (WM) volumes were used as input data separately and in combination. Questionnaire data were well suited for case-classification but not disorder-classification (accuracies: 96.40%, p < .001; 56.58%, p > .22). The opposite pattern was found for imaging data (case-classification GM/WM: 58.71%, p = .09 / 43.18%, p > .66; disorder-classification GM/WM: 68.05%, p = .034 / 58.27%, p > .15) and for cortisol data (38.02%, p = .84; 74.60%, p = .009). All data combined achieved 90.10% accuracy (p < .001) for case-classification and 67.46% accuracy (p = .0268) for disorder-classification. In line with previous evidence, classification of GAD was difficult using clinical questionnaire data alone. Cortisol and GM volume data in particular were able to provide incremental value for the classification of GAD. Findings suggest that neurobiological biomarkers are a useful target for further research to delineate their potential contribution to diagnostic processes.
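
    The classification setup described above, binary SVMs inside a nested leave-one-out cross-validation, can be sketched with scikit-learn as follows; the features and labels are random placeholders for the questionnaire, cortisol and volumetric inputs.

```python
# Hedged sketch: nested cross-validation (inner grid search, outer leave-one-out)
# around a linear SVM on multimodal features. Data are synthetic.
import numpy as np
from sklearn.model_selection import GridSearchCV, LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)
X = rng.normal(size=(57, 20))          # 57 subjects x 20 multimodal features (synthetic)
y = rng.integers(0, 2, size=57)        # 1 = patient, 0 = healthy comparison subject

inner = GridSearchCV(make_pipeline(StandardScaler(), SVC(kernel="linear")),
                     param_grid={"svc__C": [0.1, 1.0, 10.0]},   # inner loop tunes C
                     cv=5)
accuracy = cross_val_score(inner, X, y, cv=LeaveOneOut())       # outer leave-one-out loop
print(accuracy.mean())
```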

  6. Underfed stoker boiler for burning bituminous coal and other solid fuel particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcotte, R.P.; Dumont, J.W. Jr.

    1987-10-06

    An automatic stoker boiler is described for space or process heating with steam or hot water. The boiler includes a heat transfer compartment having a water inlet and an outlet for steam or hot water, an exhaust, a combustion chamber, a transverse partition in the chamber, drive and driven shafts below the chamber, sprockets supported by the shaft and an endless belt of the link type trained about the sprockets. There are also means to deliver underfire air upwardly through the upper course. The upper portion has a throat opening adjacent to the second end, heat exchanging passageways extending through the compartment, means to deliver overfire air into the chamber, means to deliver solid fuel particles to the upper course adjacent to the first end, means in the exhaust operable to induce draft in the upper portion and control means operable to effect the advance of the belt. There are means operable to deliver solid fuel to the upper course in predetermined, proportional increments, means to vary the induced draft by predetermined, proportional increments and means to adjust the underfire air volume by predetermined, proportional increments.

  7. Incrementally developing a cultural and regulatory infrastructure for reusable launch vehicles

    NASA Astrophysics Data System (ADS)

    Simberg, Rand

    1998-01-01

    At this point, technology is perhaps the least significant barrier to the development of the high-flight-rate reusable launchers needed for low-cost space access. Far more daunting are the issues of regulatory regimes, needed markets, and public/investor perception of their feasibility. The approach on which the government is currently focused (X-33) assumes that the conditions necessary to support a new Shuttle-class reusable launch vehicle will be in place at the end of the X-33 development. For a number of reasons (market size, lack of confidence in the technology, regulations designed for expendable vehicles, difficulties in capital formation), such an approach may prove too rapid a leap to succeed. More incremental steps, both experimental and operational, could offer a higher-probability path to cheap access through reusables. Such incrementalism, via intermediate vehicles (possibly multi-stage) exploiting suborbital and smaller-payload markets, could gradually acclimatize the public, regulatory and investment communities to reusable launchers and build the confidence needed for subsequent steps toward truly cheap access, while providing lower-cost access much sooner.

  8. SAMS Acceleration Measurements on Mir (NASA Increment 4)

    NASA Technical Reports Server (NTRS)

    DeLombard, Richard

    1998-01-01

    During NASA Increment 4 (January to May 1997), about 5 gigabytes of acceleration data were collected by the Space Acceleration Measurement System (SAMS) onboard the Russian Space Station, Mir. The data were recorded on 28 optical disks which were returned to Earth on STS-84. During this increment, SAMS data were collected in the Priroda module to support the Mir Structural Dynamics Experiment (MiSDE), the Binary Colloidal Alloy Tests (BCAT), Angular Liquid Bridge (ALB), Candle Flames in Microgravity (CFM), Diffusion Controlled Apparatus Module (DCAM), Enhanced Dynamic Load Sensors (EDLS), Forced Flow Flame Spreading Test (FFFT), Liquid Metal Diffusion (LMD), Protein Crystal Growth in Dewar (PCG/Dewar), Queen's University Experiments in Liquid Diffusion (QUELD), and Technical Evaluation of MIM (TEM). This report points out some of the salient features of the microgravity environment to which these experiments were exposed. Also documented are mission events of interest such as the docked phase of STS-84 operations, a Progress engine burn, Soyuz vehicle docking and undocking, and Progress vehicle docking. This report presents an overview of the SAMS acceleration measurements recorded by 10 Hz and 100 Hz sensor heads. The analyses included herein complement those presented in previous summary reports prepared by the Principal Investigator Microgravity Services (PIMS) group.

  9. Comparative Effectiveness of Two Sight-Word Reading Interventions for a Student with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Mulé, Christina M.; Volpe, Robert J.; Fefer, Sarah; Leslie, Laurel K.; Luiselli, Jim

    2015-01-01

    Traditional drill and practice (TDP) and incremental rehearsal (IR) are flashcard drill techniques for teaching sight words to students. Although both have extensive research support, no study to date has compared these methods with children who have autism spectrum disorder (ASD). Utilizing an adaptive alternating treatments design, the present…

  10. Failure? Isn't It Time to Slay the Design-Dragon?

    ERIC Educational Resources Information Center

    Winkler, Dietmar R.

    2009-01-01

    Design education is caught in a closed cycle: it replicates the most common design practice, which in turn pursues awards based on incremental change supported by professional organizations and trade journals, and which then feeds back into education as forms for imitation. This is the educational failure the paper cites. It takes to task the…

  11. Presenting Big Data in Google Earth with KML

    NASA Astrophysics Data System (ADS)

    Hagemark, B.

    2006-12-01

    KML 2.1 and Google Earth 4 provide support for streaming very large datasets, with "smart" loading of data at multiple levels of resolution and incremental updates to previously loaded data. This presentation demonstrates this technology for use with the Google Earth KML geometry and image primitives and shows some techniques and tools for creating this KML.
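    The incremental, resolution-dependent loading described above is typically expressed in KML by pairing a NetworkLink with a Region, so that the viewer only fetches a child file once its region becomes visible at sufficient screen size. The sketch below writes one such link from Python; the coordinates, level-of-detail thresholds, namespace version, and file names are illustrative assumptions, not values from the presentation.

```python
# Sketch: emit a KML NetworkLink guarded by a Region, so a viewer such as
# Google Earth only requests "tile.kml" once the region is on screen at
# sufficient resolution. Coordinates and file names are placeholders.
KML = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <NetworkLink>
    <name>Example incremental tile</name>
    <Region>
      <LatLonAltBox>
        <north>38.0</north><south>37.0</south>
        <east>-121.0</east><west>-122.0</west>
      </LatLonAltBox>
      <Lod>
        <minLodPixels>128</minLodPixels>
        <maxLodPixels>-1</maxLodPixels>
      </Lod>
    </Region>
    <Link>
      <href>tile.kml</href>
      <viewRefreshMode>onRegion</viewRefreshMode>
    </Link>
  </NetworkLink>
</kml>
"""

with open("doc.kml", "w", encoding="utf-8") as fh:
    fh.write(KML)
```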

  12. Inquiring Scaffolds in Laboratory Tasks: An Instance of a "Worked Laboratory Guide Effect"?

    ERIC Educational Resources Information Center

    Schmidt-Borcherding, Florian; Hänze, Martin; Wodzinski, Rita; Rincke, Karsten

    2013-01-01

    The study explores whether established support devices for paper-pencil problem solving, namely worked examples and incremental scaffolds, are applicable to laboratory tasks. N = 173 grade eight students solved a physics laboratory task in dyads in one of three conditions. In condition A (unguided problem solving), students were asked to determine the…

  13. Active Reading Documents (ARDs): A Tool to Facilitate Meaningful Learning through Reading

    ERIC Educational Resources Information Center

    Dubas, Justin M.; Toledo, Santiago A.

    2015-01-01

    Presented here is a practical tool called the Active Reading Document (ARD) that can give students the necessary incentive to engage with the text/readings. By designing the tool to incrementally develop student understanding of the material through reading using Marzano's Taxonomy as a framework, the ARD offers support through scaffolding as they…

  14. 77 FR 9653 - Comment Sought on Potential Data for Connect America Fund Phase One Incremental Support

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-17

    ... Internet by accessing the ECFS: http://fjallfoss.fcc.gov/ecfs2/. Paper Filers: Parties who choose to file..., or via the Internet at http://www.bcpiweb.com. 1. On November 18, 2011, the Commission released the... things, the Commission established a transitional mechanism to distribute high cost universal service...

  15. Rapid Development of Custom Software Architecture Design Environments

    DTIC Science & Technology

    1999-08-01

    the tools themselves. This dissertation describes a new approach to capturing and using architectural design expertise in software architecture design environments... A language and tools are presented for capturing and encapsulating software architecture design expertise within a conceptual framework... of architectural styles and design rules. The design expertise thus captured is supported with an incrementally configurable software architecture

  16. Design of Clinical Support Systems Using Integrated Genetic Algorithm and Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Chen, Yung-Fu; Huang, Yung-Fa; Jiang, Xiaoyi; Hsu, Yuan-Nian; Lin, Hsuan-Hung

    A clinical decision support system (CDSS) provides knowledge and specific information for clinicians to enhance diagnostic efficiency and improve healthcare quality. An appropriate CDSS can markedly improve patient safety, improve healthcare quality, and increase cost-effectiveness. The support vector machine (SVM) is believed to be superior to traditional statistical and neural network classifiers; however, it is critical to determine a suitable combination of SVM parameters to obtain good classification performance. A genetic algorithm (GA) can find an optimal solution within an acceptable time and is faster than a greedy algorithm with an exhaustive searching strategy. By taking advantage of the GA's ability to quickly select salient features and adjust SVM parameters, a method using an integrated GA and SVM (IGS), which differs from the traditional approach of using the GA only for feature selection and the SVM only for classification, was used to design CDSSs for prediction of successful ventilation weaning, diagnosis of patients with severe obstructive sleep apnea, and discrimination of different cell types from Pap smears. The results show that IGS performs better than methods using the SVM alone or a linear discriminator.
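    A minimal sketch of the integrated idea follows: a single chromosome encodes both a feature mask and the SVM parameters C and gamma, and fitness is cross-validated accuracy. The encoding, genetic operators, and data set (scikit-learn's breast cancer data as a stand-in for clinical records) are assumptions for illustration, not the authors' implementation.

```python
# Sketch of "integrated GA + SVM": each chromosome encodes a feature mask
# together with the SVM cost C and RBF width gamma, and fitness is
# cross-validated accuracy. Encoding, operators and data are illustrative.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X, y = load_breast_cancer(return_X_y=True)        # stand-in clinical data
n_features = X.shape[1]

def decode(chrom):
    mask = chrom[:n_features].astype(bool)
    C = 10 ** (chrom[n_features] * 4 - 2)          # C in [1e-2, 1e2]
    gamma = 10 ** (chrom[n_features + 1] * 4 - 4)  # gamma in [1e-4, 1]
    return mask, C, gamma

def fitness(chrom):
    mask, C, gamma = decode(chrom)
    if not mask.any():
        return 0.0
    clf = SVC(C=C, gamma=gamma, kernel="rbf")
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

pop = rng.random((16, n_features + 2))
pop[:, :n_features] = (pop[:, :n_features] > 0.5).astype(float)

for generation in range(10):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:8]]     # truncation selection
    children = []
    for _ in range(8):
        a, b = parents[rng.integers(8, size=2)]
        cut = rng.integers(1, n_features + 1)
        child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
        flip = rng.random(n_features) < 0.05        # bit-flip mutation on mask
        child[:n_features] = np.where(flip, 1 - child[:n_features], child[:n_features])
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
mask, C, gamma = decode(best)
print(f"selected {mask.sum()} features, C={C:.3g}, gamma={gamma:.3g}")
```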

  17. Prediction of Skin Sensitization with a Particle Swarm Optimized Support Vector Machine

    PubMed Central

    Yuan, Hua; Huang, Jianping; Cao, Chenzhong

    2009-01-01

    Skin sensitization is the most commonly reported occupational illness, causing much suffering to a wide range of people. Identification and labeling of environmental allergens is urgently required to protect people from skin sensitization. The guinea pig maximization test (GPMT) and murine local lymph node assay (LLNA) are the two most important in vivo models for identification of skin sensitizers. In order to reduce the number of animal tests, quantitative structure-activity relationships (QSARs) are strongly encouraged in the assessment of skin sensitization of chemicals. This paper has investigated the skin sensitization potential of 162 compounds with LLNA results and 92 compounds with GPMT results using a support vector machine. A particle swarm optimization algorithm was implemented for feature selection from a large number of molecular descriptors calculated by Dragon. For the LLNA data set, the classification accuracies are 95.37% and 88.89% for the training and the test sets, respectively. For the GPMT data set, the classification accuracies are 91.80% and 90.32% for the training and the test sets, respectively. The classification performances were greatly improved compared to those reported in the literature, indicating that the support vector machine optimized by particle swarm in this paper is competent for the identification of skin sensitizers. PMID:19742136
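    The sketch below illustrates binary particle swarm optimization for descriptor selection with an SVM as the fitness function, using a sigmoid transfer function to turn velocities into bit-flip probabilities. The swarm settings and the synthetic descriptor matrix are assumptions; they do not reproduce the Dragon descriptors or the parameters used in the study.

```python
# Sketch of binary PSO for descriptor selection with an SVM classifier as the
# fitness function. Data and swarm parameters are illustrative placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=160, n_features=50, n_informative=8,
                           random_state=1)          # stand-in descriptor matrix
n_particles, n_features, n_iters = 15, X.shape[1], 20

def fitness(bits):
    mask = bits.astype(bool)
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(kernel="rbf"), X[:, mask], y, cv=3).mean()

position = (rng.random((n_particles, n_features)) > 0.5).astype(float)
velocity = rng.normal(scale=0.1, size=(n_particles, n_features))
personal_best = position.copy()
personal_score = np.array([fitness(p) for p in position])
global_best = personal_best[personal_score.argmax()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, n_features))
    velocity = (0.7 * velocity
                + 1.5 * r1 * (personal_best - position)
                + 1.5 * r2 * (global_best - position))
    # Sigmoid transfer function turns velocities into bit probabilities.
    position = (rng.random((n_particles, n_features))
                < 1 / (1 + np.exp(-velocity))).astype(float)
    scores = np.array([fitness(p) for p in position])
    improved = scores > personal_score
    personal_best[improved] = position[improved]
    personal_score[improved] = scores[improved]
    global_best = personal_best[personal_score.argmax()].copy()

print(f"selected {int(global_best.sum())} of {n_features} descriptors, "
      f"CV accuracy {personal_score.max():.2%}")
```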

  18. Pharmaceutical Raw Material Identification Using Miniature Near-Infrared (MicroNIR) Spectroscopy and Supervised Pattern Recognition Using Support Vector Machine

    PubMed Central

    Hsiung, Chang; Pederson, Christopher G.; Zou, Peng; Smith, Valton; von Gunten, Marc; O’Brien, Nada A.

    2016-01-01

    Near-infrared spectroscopy, as a rapid and non-destructive analytical technique, offers great advantages for pharmaceutical raw material identification (RMID) to fulfill the quality and safety requirements in the pharmaceutical industry. In this study, we demonstrated the use of portable miniature near-infrared (MicroNIR) spectrometers for NIR-based pharmaceutical RMID and solved two challenges in this area, model transferability and large-scale classification, with the aid of support vector machine (SVM) modeling. We used a set of 19 pharmaceutical compounds, including various active pharmaceutical ingredients (APIs) and excipients, together with six MicroNIR spectrometers to test model transferability. For the test of large-scale classification, we used another set of 253 pharmaceutical compounds comprising both chemically and physically different APIs and excipients. We compared SVM with conventional chemometric modeling techniques, including soft independent modeling of class analogy, partial least squares discriminant analysis, linear discriminant analysis, and quadratic discriminant analysis. SVM modeling using a linear kernel, especially when combined with a hierarchical scheme, exhibited excellent performance in both model transferability and large-scale classification. Hence, ultra-compact, portable and robust MicroNIR spectrometers coupled with SVM modeling can make on-site and in situ pharmaceutical RMID for large-volume applications highly achievable. PMID:27029624
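    A minimal sketch of linear-kernel SVM classification of NIR spectra is given below, with a simple standard normal variate (SNV) correction applied to each spectrum. The synthetic spectra, wavelength count, and number of classes are placeholders; the hierarchical scheme mentioned above is not reproduced here.

```python
# Sketch of linear-kernel SVM classification of NIR-like spectra after a
# per-spectrum standard normal variate (SNV) correction. Spectra are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

rng = np.random.default_rng(7)
n_classes, per_class, n_wavelengths = 20, 30, 125
centers = rng.normal(size=(n_classes, n_wavelengths))
X = np.vstack([c + 0.1 * rng.normal(size=(per_class, n_wavelengths)) for c in centers])
y = np.repeat(np.arange(n_classes), per_class)

def snv(spectra):
    # Standard normal variate: center and scale each spectrum individually.
    return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

X_train, X_test, y_train, y_test = train_test_split(
    snv(X), y, test_size=0.3, stratify=y, random_state=0)
clf = LinearSVC(C=1.0, max_iter=5000).fit(X_train, y_train)
print(f"hold-out accuracy on {n_classes} classes: {clf.score(X_test, y_test):.2%}")
```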

  19. Emergency Department Visit Forecasting and Dynamic Nursing Staff Allocation Using Machine Learning Techniques With Readily Available Open-Source Software.

    PubMed

    Zlotnik, Alexander; Gallardo-Antolín, Ascensión; Cuchí Alfaro, Miguel; Pérez Pérez, María Carmen; Montero Martínez, Juan Manuel

    2015-08-01

    Although emergency department visit forecasting can be useful for nurse staff planning, previous research has focused on models that lacked sufficient resolution and realistic error metrics for these predictions to be applied in practice. Using data from a 1100-bed specialized care hospital with 553,000 patients assigned to its healthcare area, forecasts with different prediction horizons, from 2 to 24 weeks ahead, at an 8-hour granularity, were generated with an open-source software package using support vector regression, M5P, and stratified average time-series models. As overstaffing and understaffing errors have different implications, error metrics and potential personnel monetary savings were calculated with a custom validation scheme that simulated the subsequent generation of predictions over a 4-year period. Results were then compared with a generalized estimating equation regression. Support vector regression and M5P models were found to be superior to the stratified average model at the 95% confidence level. Our findings suggest that medium and severe understaffing situations could be reduced by more than an order of magnitude, and average yearly savings of up to €683,500 could be achieved, if dynamic nursing staff allocation were performed with support vector regression instead of the static staffing levels currently in use.
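    As a sketch of how support vector regression can be applied to such a forecasting problem, the example below builds lagged features on an 8-hour grid and predicts two weeks ahead on a synthetic visit series. The lag choices, horizon, and kernel settings are illustrative assumptions, not the paper's configuration.

```python
# Sketch of support vector regression for visit-count forecasting using lagged
# features on an 8-hour grid. The series, lags, horizon and SVR settings are
# illustrative placeholders, not the study's models or data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(3)
t = np.arange(3 * 365 * 3)                       # three years of 8-hour bins
visits = (60 + 15 * np.sin(2 * np.pi * t / 3)    # daily pattern (3 bins/day)
          + 10 * np.sin(2 * np.pi * t / 21)      # weekly pattern (21 bins/week)
          + rng.normal(scale=5, size=t.size))

lags = [1, 2, 3, 21, 42]                         # recent bins, one and two weeks back
horizon = 21 * 2                                 # predict two weeks (42 bins) ahead
rows = range(max(lags), len(visits) - horizon)
X = np.array([[visits[i - lag] for lag in lags] for i in rows])
y = np.array([visits[i + horizon] for i in rows])

split = int(0.8 * len(X))                        # chronological train/test split
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
model.fit(X[:split], y[:split])
mae = np.abs(model.predict(X[split:]) - y[split:]).mean()
print(f"mean absolute error, {horizon // 3}-day-ahead forecast: "
      f"{mae:.1f} visits per 8-hour shift")
```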

  20. A Fast SVM-Based Tongue's Colour Classification Aided by k-Means Clustering Identifiers and Colour Attributes as Computer-Assisted Tool for Tongue Diagnosis.

    PubMed

    Kamarudin, Nur Diyana; Ooi, Chia Yee; Kawanabe, Tadaaki; Odaguchi, Hiroshi; Kobayashi, Fuminori

    2017-01-01

    In tongue diagnosis, the colour information of the tongue body carries valuable information regarding the state of disease and its correlation with the internal organs. Qualitatively, practitioners may have difficulty in their judgement due to unstable lighting conditions and the limited ability of the naked eye to capture the exact colour distribution on the tongue, especially a tongue with multicoloured substance. To overcome this ambiguity, this paper presents a two-stage tongue multicolour classification based on a support vector machine (SVM) whose support vectors are reduced by our proposed k-means clustering identifiers and a red colour range for precise tongue colour diagnosis. In the first stage, k-means clustering is used to cluster a tongue image into four clusters: image background (black), deep red region, red/light red region, and transitional region. In the second-stage classification, red/light red tongue images are further classified into red tongue or light red tongue based on the red colour range derived in our work. Overall, the true-rate classification accuracy of the proposed two-stage classification in diagnosing red, light red, and deep red tongue colours is 94%. The number of support vectors in the SVM is reduced by 41.2%, and the execution time for one image is 48 seconds.
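    A compact sketch of the two-stage idea follows: k-means first partitions an image's pixels into colour clusters, and an SVM then classifies the image from its sorted cluster-centre colours. The synthetic images, labels, and feature choice are assumptions for illustration, not the paper's pipeline, which additionally uses a derived red colour range.

```python
# Sketch of the two-stage idea: k-means clusters an image's RGB pixels, then an
# SVM classifies the image from its cluster-centre colours. Data are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(5)

def cluster_features(image_pixels, k=4):
    # Stage 1: cluster RGB pixels, return the cluster centres sorted by redness.
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(image_pixels)
    centres = km.cluster_centers_
    return centres[np.argsort(centres[:, 0])].ravel()   # sort by R channel

def fake_image(mean_rgb):
    # Synthetic "image": 500 random RGB pixels around a class-specific colour.
    return np.clip(rng.normal(mean_rgb, 25, size=(500, 3)), 0, 255)

images = [fake_image([180, 60, 70]) for _ in range(20)] + \
         [fake_image([220, 150, 150]) for _ in range(20)]
labels = np.array([0] * 20 + [1] * 20)                   # 0: red, 1: light red

X = np.array([cluster_features(img) for img in images])
clf = SVC(kernel="rbf", gamma="scale").fit(X[::2], labels[::2])   # train on half
print(f"accuracy on held-out half: {clf.score(X[1::2], labels[1::2]):.2%}")
```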
